If software worked well in 2005 on PCs with 2 GB of RAM and CPUs/GPUs vastly weaker than modern ones, why not write modern software the way that software was written? Why not leverage powerful hardware when needed, but keep resource demands low the rest of the time?
What are the reasons it might not work? What problems come with this approach? What architectural (and other) downgrades would it entail?
Note: I was not around at that time.


I’m a big fan of this approach to software; it works. PHP 5, cgi-bin scripts, Perl spaghetti, etc. are lit for hobby work.
The tradeoff is that you have to pay close attention to doing things securely, and you have to hand-roll much more of your codebase instead of relying on external packages.
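For example, here's a minimal sketch of a page in the old cgi-bin/PHP 5 style (the `name` parameter and the greeting are hypothetical, made up for illustration). It shows where that security burden lives: escaping is entirely your job, and forgetting a single `htmlspecialchars()` call leaves an XSS hole.

```php
<?php
// Hypothetical hello-page in the 2005 style: no framework,
// no templating engine, one file on disk served directly.
// The 'name' query parameter is an assumption for illustration.
$name = isset($_GET['name']) ? $_GET['name'] : 'world';

// Security is manual: every piece of user input must be escaped
// by hand before it touches HTML output. Skip this call and the
// page is trivially XSS-able.
echo 'Hello, ' . htmlspecialchars($name, ENT_QUOTES, 'UTF-8') . '!';
```

The same discipline applies everywhere user input flows: SQL (era-appropriate escaping functions, or prepared statements today), shell-outs, file paths. Nothing in this kind of stack does it for you, which is exactly the tradeoff described above.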