Translation:
Mom, the abyss is staring into me!
He started!
You often just want to go with what’s popular, since hardware vendors will only provide APIs for select languages.
Well, and depending on the field, you may need to get certifications for your toolchain and such, so then you have to use what’s popular.
In my corner of the embedded world, it feels like everyone is practically jumping to integrate Rust. In the sense that vendors which haven’t had to innovate for 10+ years will suddenly publish a Rust API out of the blue. And I say “out of the blue”, but I also regularly hear from other devs that they’ve been pestering the vendors to provide a Rust API, or have even started writing their own wrappers around the C APIs.
And while it’s certainly a factor that Rust is good, in my experience they generally just want to get away from C. Even our management is well aware that C is a liability.
I guess I should add that while I say “jumping”, this is the embedded world, where everything moves extremely slowly, so we’re talking about a multi-year jump. In our field, you need to get certifications for your toolchain and code quality, for example, so a lot of work is necessary to formalize all of that.
Yeah, particularly the broadcasting really irks me.
That is an opinion you can hold for yourself and then make compromises as you encounter reality. I do expect programmers to hold strong opinions.
But when you broadcast it, you strip yourself of the option to make compromises. You’re just saying something which is going to be wrong in one way or another in most situations. I do expect programmers to be smarter than that.


Calling someone “smooth-brained” always felt backwards.
Like, I understand that brains are supposed to be wrinkly and there’s an actual disorder where the brain doesn’t have those wrinkles, which leads to developmental delays: https://en.wikipedia.org/wiki/Lissencephaly
But it still sounds to me like you’re just calling them “smooth” as in “cool”.


I mean, for me, it’s also mostly a matter of us doing embedded(-adjacent) software dev. So far, my company would hardly ever choose one stack over another for performance/efficiency reasons. But yeah, maybe that is going to change in the future.


Large shared codebases never reflect a single design, but are always in some intermediate state between different software designs. How the codebase will hang together after an individual change is thus way more important than what ideal “north star” you’re driving towards.
Yeah, learned this the hard way. Came up with an architecture to strive for 1½ years ago. We shipped the last remaining refactorings two weeks ago. It has been a ride. Mostly a ride of perpetually being low-priority, because refactorings always are.
In retrospect, it would’ve likely been better to go for a half-assed architecture that requires less of a diff, while still enabling us to ship similar features. It’s not like the new architecture is a flawless fit either, after 1½ years of evolving requirements.
And ultimately, architecture needs to serve the team. What does not serve the team is 1½ years of architectural limbo.
Yeah, I don’t think there’s a concrete joke in this. Many comics are just a light chuckle as you flip through the newspaper…


I mean, don’t get me wrong, I also find startup time important, particularly with CLIs. But high memory usage slows down your application in other ways, too (not just other applications on the system). You will have more L1, L2, etc. cache misses. And the OS is more likely to page/swap out more of your memory onto the hard drive.
Of course, I can’t sit in front of an application and tell that it was a non-local NUMA memory access that caused a particular slowness either, so I can understand not really being able to care about iterative improvements. But yeah, that is also why I quite like using an efficient stack outright. It just makes computers feel as fast as they should be, without me having to worry about it.
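For illustration, here’s a toy Rust sketch (made-up sizes, not a rigorous benchmark) of the cache effect alone: the same number of pseudo-random reads, but the large working set no longer fits into the CPU caches:

```rust
use std::time::Instant;

// Toy sketch: perform the same number of pseudo-random reads against a
// small and a large working set. The small one fits into L1 cache, the
// large one does not, so it runs into far more cache misses per read.
fn time_reads(data: &[u64], label: &str) {
    let len = data.len();
    let mut sum = 0u64;
    let mut idx = 0usize;
    let start = Instant::now();
    for _ in 0..10_000_000 {
        sum = sum.wrapping_add(data[idx]);
        // Pseudo-random walk (an LCG), so the hardware prefetcher
        // can't hide the misses behind a predictable access pattern.
        idx = idx
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407)
            % len;
    }
    // Print the checksum so the compiler can't optimize the loop away.
    println!("{label}: {:?} (checksum: {sum})", start.elapsed());
}

fn main() {
    let small = vec![1u64; 4 * 1024]; // 32 KiB, fits into L1 cache
    let large = vec![1u64; 8 * 1024 * 1024]; // 64 MiB, blows past L3
    time_reads(&small, "small working set");
    time_reads(&large, "large working set");
}
```

I’d expect the second call to come out noticeably slower per read, purely because of the cache misses.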
I heavily considered ending this comment with this dumbass meme:

Then I realized, I’m responding to someone called “Caveman”. Might’ve been subconscious influence there. 😅


Nah, I did understand it like that.
I posted that comment, because people generally wildly underestimate how common depression is. Because of that, and because they don’t know how to deal with depression, they will look for other “explanations”, like yeah, sure, my kid’s just lazy. And they do that even though these other “explanations” are extremely toxic and make the depression worse.
So, this basic fact, that depression is among the most common disorders, would be part of my rebuttal. In hopes that she can accept that and help you work through it, like an adult should, rather than playing this silly game of peek-a-boo where she hopes depression doesn’t exist so long as she doesn’t believe in it. That is not helping anyone.


I don’t know what part of that is supposed to be an insult.
And the article may have talked of such stark differences, but I didn’t. I’m just saying that the resource usage is noticeably lower.


Yeah, you need to do tree-shaking with JavaScript to get rid of unused library code: https://developer.mozilla.org/en-US/docs/Glossary/Tree_shaking
I would expect larger corporate projects to do so. It is something that one needs to know about and configure, but if one senior webdev works on a project, they’ll set it up pretty quickly.


Major depressive disorder affected approximately 163 million people in 2017 (2% of the global population). The percentage of people who are affected at one point in their life varies from 7% in Japan to 21% in France.
https://en.wikipedia.org/wiki/Major_depressive_disorder#Epidemiology


This isn’t Reddit. You don’t need to talk in absolutes.
Similar to WittyShizard, my experience is very different. Said Rust application uses 1200 dependencies and I think around 50 MB RAM. We had a Kotlin application beforehand, which used around 300 dependencies and 1 GB RAM, I believe. I would expect a JavaScript application of similar complexity to use a similar amount or more RAM.
And more efficient languages do have an effect on RAM usage. In Rust, for example, the iterator adapters you chain between .iter() and .collect() are lazy, so only the final collection gets allocated rather than an intermediate copy at every step.
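As a toy sketch of what I mean (contrived numbers, obviously):

```rust
fn main() {
    let inputs = vec![1_u32, 2, 3, 4, 5];

    // .iter() borrows the elements, and .map()/.filter() are lazy
    // adapters, so no intermediate Vec is created per step. Only the
    // final .collect() allocates, once, for the end result.
    let outputs: Vec<u32> = inputs
        .iter()
        .map(|x| x * x)
        .filter(|x| x % 2 == 1)
        .collect();

    assert_eq!(outputs, vec![1, 9, 25]);
}
```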

Yeah, gonna be interesting. Software companies working on consumer software often don’t need to care, because:
I can see somewhat of a shift happening for software that companies develop for themselves, though. At $DAYJOB, we have an application written in Rust and you can practically see the dollar signs lighting up in the eyes of management when you tell them “just get the cheapest device to run it on” and “it’s hardly going to incur cloud hosting costs”.
Obviously this alone rarely leads to management deciding to rewrite an application/service in a more efficient language, but it certainly makes them more open to devs wanting to use these languages. Well, and who knows what happens if the prices for Raspberry Pis and cloud hosting and such end up skyrocketing similarly.


More Bs means it’s softer, so more graphite will get onto your paper when you draw a line, which makes it darker.
More Hs means it’s harder, so less graphite. The advantage is that it doesn’t get used up as quickly and you can draw finer lines, although the latter is kind of a given either way, since you’re using a mechanical pencil.


It might genuinely taste different to you, and not just be a matter of preference. There are gene variations that alter the taste: https://www.bbc.com/news/health-50387126


Personally, I also found that it’s something you have to get used to. I used to barely eat greens and wouldn’t feel terribly great if I did.
Then, earlier this year, I spent a month eating lots of greens. I’m guessing my gut microbiome adjusted, because yeah, now I can eat a whole salad bowl for a meal and if I don’t have greens at home for a few days, I will start to feel unwell.
I mean, there is a reason why that is so pervasive. It makes you set up the dev environment and figure out how to run a program, which is a step that you cannot skip.
Of course, you should go for a slightly more complex project afterwards.
It’s like those headlines “bicyclist hit by car” or similar, where you might think the car developed a mind of its own. I guess, we are on our way there, though, with self-driving cars and such…