…and that’s why you need 16GB and a decent CPU to navigate the web
A tab hoarder like me needs 64GB; I got it when it was cheap.
What's utterly stupid is that with modern compression and rendering techniques - if it weren't for developers shipping a whole-ass library to prod for one function that saves 8 lines of code (the kind of thing sketched below)… 56k would still be usable for light browsing and access. It'd still be slow… but far from literally impossible, like it is now.
The sheer amount of “fat” on some (most) sites and applications is just depressing.
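To make the "library for one function" complaint concrete, here's a minimal TypeScript sketch. The library and function (lodash's isEmpty) are stand-ins chosen for illustration - the post above doesn't name any specific dependency.

```ts
// Illustrative only: pulling an entire utility library into the bundle
// for one trivial check (lodash/isEmpty is an assumed stand-in example).
import _ from "lodash";

export function hasItems(xs: unknown[] | null | undefined): boolean {
  // The whole library ships to the client for this single call.
  return !_.isEmpty(xs);
}

// The same check, hand-rolled in a couple of lines, with zero dependencies.
export function hasItemsLean(xs: unknown[] | null | undefined): boolean {
  return Array.isArray(xs) && xs.length > 0;
}
```

Bundlers can tree-shake some of this away, but plenty of production sites still ship the whole package - which is the "fat" being complained about.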
Back in the day, when we were all amazed at Yahoo!'s loading speed, I pulled the homepage HTML. 79K. Imagine that.
Good luck watching a video on 56k
Seems someone said it before me… But you missed the point.
I’ll respond to your statement generally though.
Basic survival on 56k was doable. Shoutcast or Pandora could even be streamed, with occasional buffering, while browsing lighter sites. On the topic of video - low-quality 240p would be "manageable" again, thanks to modern compression.
Was it a good experience? Rarely. Was it passable? Certainly; and if a site optimised for load time and reduced bandwidth, it could even get near a broadband "experience" with some caching tricks (see the header sketch after this post).
I'm not saying everyone needs to be a code god and build a 96k FPS… But optimizing comes from understanding what you are writing and how it works. All this bloat is the result of laziness and a looser grasp of the fundamentals. As for why we should take a harder look at optimization:
Datacenter / cloud costs are rising… Smaller footprint - smaller bill.
Worldwide hardware costs are rising… Fewer people will be building fire-breathing monsters. Better optimization - better user experience - more users. Recent examples of poor optimization: Fallout and early Cyberpunk 2077.
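As a sketch of the "caching tricks" mentioned above - assuming Node's built-in http module, a made-up /assets/ path for fingerprinted files, and a hard-coded ETag, none of which come from the posts:

```ts
// Minimal sketch: long-lived caching for hashed assets, cheap revalidation
// for HTML. Paths, durations, and the ETag value are assumptions.
import { createServer } from "node:http";

// Made-up page version tag; a real app would derive this from content.
const PAGE_ETAG = '"v1"';

const server = createServer((req, res) => {
  if (req.url?.startsWith("/assets/")) {
    // Fingerprinted filenames (e.g. app.3f9c2b.js) never change, so the
    // browser can cache them for a year and skip the request on repeat visits.
    res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
    res.end("...asset bytes...");
    return;
  }

  // HTML: always revalidate, but answer an unchanged page with a tiny 304
  // instead of a full transfer - exactly what a 56k link needs.
  res.setHeader("Cache-Control", "no-cache");
  res.setHeader("ETag", PAGE_ETAG);
  if (req.headers["if-none-match"] === PAGE_ETAG) {
    res.statusCode = 304;
    res.end();
    return;
  }
  res.end("<html>...page body...</html>");
});

server.listen(8080);
```

The point is the split: on repeat visits only a few kilobytes of HTML (or a 304) move over the wire, which is what makes a slow link feel closer to broadband.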
That’s not what he is saying.
I mean, the text on a website isn't what makes 56k unusable.
It's only images and video that take up space, and the libraries used on websites are all cached at this point, so they're hardly relevant to ongoing usage of a site.
You're underestimating the text part a lot. The sheer amount of stuff downloaded for most sites is insane. It's not just the raw data (although that's still pretty significant, especially when things haven't been cached yet).
But there are often HUNDREDS of resources loaded. Each one needs a GET, even just to validate the cache, which often fails. Some can be done in parallel. All of it requires a bunch of shitty, slow HTML/CSS/JS compute. It's stupid. It's why loading a page on a 16-core system with a gigabit link still takes 5+ seconds instead of the ~200ms it should (rough numbers below). Which adds up.
Now do that over 56k.
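To put rough numbers on the request-count point above, here's a back-of-envelope sketch. Every figure (request count, header size, round-trip times, parallelism) is an assumption for illustration, not a measurement.

```ts
// Back-of-envelope: what hundreds of cache-validation round trips cost.
// All numbers below are assumptions, not measurements.
const requests = 200;     // resources referenced by a heavy page
const headerBytes = 700;  // request + response headers per round trip
const parallel = 6;       // typical per-host connection limit in browsers

function overheadSeconds(rttMs: number, bitsPerSecond: number): number {
  const transfer = (headerBytes * 8) / bitsPerSecond; // header transfer time
  const perRequest = rttMs / 1000 + transfer;         // one conditional GET
  return (requests / parallel) * perRequest;          // batches of parallel requests
}

console.log("56k dial-up:  ", overheadSeconds(150, 56_000).toFixed(1), "s");
console.log("gigabit fibre:", overheadSeconds(15, 1_000_000_000).toFixed(1), "s");
// Even if every single response is a 304 "not modified", the round trips
// alone eat seconds on dial-up - before any HTML/CSS/JS work starts.
```

With these assumed numbers that's roughly 8 seconds of pure request overhead on 56k versus about half a second on fibre, before a single byte of actual content or any rendering work.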