I’m guessing that we’ll see lots of news in 2026 about companies hiring developers to improve the memory efficiency of their software because they just can’t get enough servers. And the servers they do get will have less RAM than they wanted.

  • four@lemmy.zip · 7 days ago

    Maybe something like this will happen for some critical proprietary software, at Cloudflare etc. But I wouldn’t expect any benefits for software run on your personal machine, unless the shortage goes on for 10 years.

    RAM is still cheaper than developers, at least up to a point.

    • Ephera@lemmy.ml · 7 days ago

      Yeah, and for consumer software, the companies developing it don’t have to pay for the RAM, so unless high memory usage becomes a widespread reason to switch away from their software, they don’t have to care at all.

  • marble@sh.itjust.works · 7 days ago

    No, plenty of software already runs like shit and never got optimised. Until people find “not running like shit” more of a selling point than “has a new AI feature”, commercial software will keep doing the latter.

  • fruitycoder@sh.itjust.works · 6 days ago

    Just saw a paper about shifting AI training to NVMe drives. But the penalty, even with direct-from-GPU access, was high.
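
    A minimal sketch of the general idea (not that paper’s actual method): memory-map the training data from an NVMe drive so only the current batch sits in RAM, paying disk-read latency in exchange. The file name, dtype, and shape here are made up for illustration.

    ```python
    import numpy as np

    # Hypothetical dataset file sitting on an NVMe drive; name and shape
    # are illustrative. The OS pages slices in on demand, so the full
    # array never has to live in RAM at once.
    data = np.memmap("train_data.f32", dtype=np.float32, mode="r",
                     shape=(1_000_000, 512))

    batch_size = 256
    for start in range(0, len(data), batch_size):
        # Copy just this batch into RAM; this read is where the penalty lives.
        batch = data[start:start + batch_size].copy()
        # ... feed `batch` to the training step ...
    ```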

    If this trend looks to go on past a few quarters, I would expect new software or features to prioritize memory efficiency. I’m sure some cool cats in the FOSS world might independently be motivated by this as part of their contributions, too.

    Honestly, from an ecosystem side, it would be cool to see more green computing metrics being added to big projects.
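
    As a hedged example of what one such metric could look like: on Linux, per-package CPU energy is exposed through the RAPL powercap interface, so a project could report joules per test run or per request. The path below varies by machine and usually needs elevated permissions to read; the workload is a stand-in.

    ```python
    import time
    from pathlib import Path

    # RAPL energy counter for CPU package 0, in microjoules. The exact path
    # differs between machines and often requires root to read.
    RAPL = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

    def read_energy_uj() -> int:
        return int(RAPL.read_text())

    before = read_energy_uj()
    time.sleep(1.0)  # stand-in for the workload being measured
    after = read_energy_uj()

    # Note: the counter wraps around periodically; a real tool would handle that.
    print(f"package energy: {(after - before) / 1e6:.3f} J")
    ```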