

Yeah, small detail.


WORSHIP DOOM


I remember that with Opera (before the switch to Chromium) I was able to open literally 100+ tabs on a machine with 1 gig of RAM. Sure, the web was simpler back then, but not by much.


Regardless of the OS, if you’re using the computer for anything productive, the application software, not the OS, will eat the majority of the RAM anyway. If you’re looking at the minimum requirements, chances are you don’t plan to do anything besides browse the web with 5 tabs open.
It sucks though, I agree - software should get more efficient over time, just like hardware does. Out of curiosity, do we have anything more specific, i.e. how they tested that, what apps were running and so on? Or maybe they now deem that more things should be running?
The Moon landing was staged, but Stanley Kubrick wanted to film on location.


Can’t not think of that George Carlin quote: “If crime fighters fight crime and firefighters fight fire, what do freedom fighters fight?”
32 gigs is quite a show-off in these trying times…


General-purpose computers have been fast enough and have had enough memory for a decade now. Twelve years ago I bought a quad-core (8-thread) laptop with 16 GB of RAM, a 1 TB SSD, and 2 GB of VRAM. Around the same time I built a NAS with an HP Gen8 microserver, also with 16 GB of RAM for ZFS. That one I recently upgraded with a better CPU for 20 €. Both of these machines still perform really well for most tasks. I haven’t upgraded my phone in 5 years, and my tablet in 8 years. These start to show their age because of the small amount of RAM built in. Last week I bought high-end EIZO monitors from 8 years ago for 50 €. These are fine!
Ask yourself, are you even doing things that are limited by your hardware? If you are limited by hardware, could buying a last generation high end machine fill your needs? If you need vast amounts of computing power, renting cloud computing might be a solution as well.
Yeah, I generally agree with your points. I dislike the push toward planned obsolescence with everything. I’m also trying to get the most life out of the things I have, and I buy a little less often, even if under normal market conditions I could afford new things whenever I want.
I’m a hobbyist photographer (so almost everything I throw at the hobby is out of pocket) and recently made the jump to higher-megapixel cameras (the megapixel increase wasn’t a requirement, more of a side effect). I have a pretty adequate AM4 PC, but suddenly its 32 gigs of RAM don’t quite cut it. I could’ve maybe bought 64 back then, but opted not to. It’s still a workable situation, just not ideal. I also had to replace a dead SSD recently (the Phison controller ordeal) and swallowed the increased prices on that too (because the old one was “luckily” just a few months out of warranty).
As for the RAM, before the price boom I could’ve gotten a decent 64 or even 96 GiB DDR5 kit for 500-ish EUR (to go with a new CPU and mobo) - now both cost 1500+. Upgrading the existing kit wouldn’t be easy either, because when I built it I hunted down a very specific combination of frequency and timings - matching that now would cost as much as the whole thing did brand new. I should’ve jumped to AM5 last year - I could’ve even sold the current parts at a profit now, but who would’ve known… At this point it’s one market crisis after another - maybe we should just buy and never look back at the prices the next day.


Glad I’m stocked on memory cards that should last me for a while.
There is, however, a bigger problem that’s not addressed - manufacturers seemingly only playing nice with big corporations while screwing the end customer.
That’s pretty heavy, sorry to hear (read) you had to go through that. If you don’t mind me asking, why did you need full anaesthesia? I also had my wisdom teeth pulled out recently and all the doctors I asked said local is fine unless you have a serious reason.
The point of trackpads (and even more so of trackpoints) is that they’re faster to get to from typing position - you move your hand back a bit (or even just the index finger) instead of moving across the whole keyboard. That’s not something that would go high on the checklist when gaming - it’s usually one hand planted on WASD, the other on the mouse and hardly any going back and forth.
But I’d also spend 2 days writing a script to avoid spending 2 hours doing something tedious.
“Two days of debugging can save you 10 minutes of reading the documentation.”
Yet another invention driven by laziness - just like the remote control. Where would we be without that?


IMO it’s important to recognise that both are valid in different scenarios. If you want to click through and change something that’s actually doable with a couple of clicks, that’s fine. If you want to do this through the CLI, it’s also fine - if you’re someone who’s done 10 deployments today and configured the same thing, it would be muscle memory even if it’s 5 commands.
Says won’t use anything made by Linus Torvalds. Uses Git.
No email would be fine for most people, but then there’d be the small number of folks who’ll raise all hell when they forget their passwords and/or security questions and can’t get in…
Quick! Break something!
That also kills the Petri dish.