• ripcord@lemmy.world · 13 hours ago

    You’re underestimating the text part a lot. The sheer number of things downloaded for most sites is insane. It’s not just the raw data (although that’s still pretty significant, especially when things haven’t been cached yet).

    But there are often HUNDREDS of requests per page. Each one needs a GET, even just to revalidate the cache, and that revalidation often fails. Some can be done in parallel. All of it requires a bunch of shitty slow HTML/CSS/JS compute. It’s stupid. It’s why loading a page on a 16-core system with a gigabit internet link still takes like 5+ seconds instead of the ~200ms it should. Which adds up.

    Now do that over 56k.
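
    For what it’s worth, here’s a minimal sketch of what one of those cache revalidations costs, using Python’s requests library (the URL is just a placeholder, not from any real site): even when the cache is still good, every asset is a full request/response round trip, and that multiplies by hundreds of assets — which is brutal over a 56k link.

```python
import requests

# First fetch: the server returns the resource plus a validator (ETag).
# https://example.com/app.css is a placeholder URL for illustration.
first = requests.get("https://example.com/app.css")
etag = first.headers.get("ETag")

# Revalidation: a "cached" asset still costs a GET and a round trip.
# If the validator matches, the server replies 304 Not Modified with no body;
# if it doesn't (or the server ignores it), the whole asset is re-downloaded.
headers = {"If-None-Match": etag} if etag else {}
second = requests.get("https://example.com/app.css", headers=headers)

if second.status_code == 304:
    print("Cache still valid — but we still paid for the round trip.")
else:
    print(f"Cache miss — re-downloaded {len(second.content)} bytes.")
```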