• wewbull@feddit.uk · 1 day ago

    It’s when the coffers of Microsoft, Amazon, Meta and the investment banks dry up. All of them are losing billions every month, but it’s all driven by fewer than 10 companies. Nvidia is lapping up the money, of course, but once the AI companies stop buying GPUs in crazy numbers it’s going to be a rocky ride down.

    • astanix@lemmy.world · 1 day ago

      Is it like crypto where cpus were good and then gpus and then FPGAs then ASICs? Or is this different?

      • steelrat@lemmy.world · 9 hours ago

        Wildly different, though similar in that ASICs were tuned to specific crypto tasks; here, everyone’s making custom silicon for neural nets and such.

        I wouldn’t plan on it going away. Apple put optimized neural net chips in its latest phones. Same with Samsung.

      • wewbull@feddit.uk · 1 day ago

        I think it’s different. The fundamental operation of all these models is multiplying big matrices of numbers together. GPUs are already optimised for this. Crypto was trying to make the algorithm fit the GPU rather than it being a natural fit.

        With FPGAs you take a 10x loss in clock speed but can have precisely the algorithm you want. ASICs then give you the clock speed back.

        GPUs are already ASICs that implement the ideal operation for ML/AI, so FPGAs would be a step backwards.
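A minimal sketch of the point above, using NumPy (an assumption for illustration, not anything the commenter used): the workhorse operation in these models is a dense matrix multiply, exactly the GEMM that GPU tensor cores are built around.

```python
import numpy as np

# Toy "linear layer": the core op in transformer/MLP models is a dense
# matrix multiply -- the operation GPUs are already optimised to do.
rng = np.random.default_rng(0)
batch, d_in, d_out = 4, 8, 16

x = rng.standard_normal((batch, d_in))   # activations
W = rng.standard_normal((d_in, d_out))   # learned weights

y = x @ W                                # one GEMM: (4, 8) @ (8, 16) -> (4, 16)
print(y.shape)                           # (4, 16)
```

Crypto hashing, by contrast, is bit-twiddling that merely tolerates a GPU, which is why FPGAs and then ASICs could beat it.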

      • brucethemoose@lemmy.world · 1 day ago

        If bitnet or some other technical innovation pans out? Straight to ASICs, yeah.

        Future smartphones will probably be pretty good at running them.
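A hypothetical sketch of why bitnet-style models invite ASICs (NumPy here is just for illustration): if weights are constrained to {-1, 0, +1}, the matrix multiply degenerates into additions and subtractions, so fixed-function silicon needs no hardware multipliers at all.

```python
import numpy as np

# BitNet-style ternary weights: each entry is -1, 0, or +1, so a matmul
# reduces to adds/subtracts -- cheap to bake into an ASIC.
rng = np.random.default_rng(1)
x = rng.standard_normal((2, 4))          # activations
W = rng.integers(-1, 2, size=(4, 3))     # ternary weight matrix

# Multiplier-free form: add columns where w = +1, subtract where w = -1.
y = np.stack([
    x[:, W[:, j] == 1].sum(axis=1) - x[:, W[:, j] == -1].sum(axis=1)
    for j in range(W.shape[1])
], axis=1)

assert np.allclose(y, x @ W)             # matches the ordinary matmul
print(y.shape)                           # (2, 3)
```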

      • cley_faye@lemmy.world · 1 day ago

        It’s probably different. The crypto bubble couldn’t actually do much in the way of useful things.

        Now, I’m saying that with a HUGE grain of salt, but there are decent applications for LLMs (let’s not call them AI). Unfortunately, those uses aren’t really in the sights of any of the businesses pouring tons of money into their “AI” offerings.

        I kinda hope we’ll get better LLM hardware to operate privately, using ethically sourced models, because some of this stuff is really neat. But that’s not the push they’re going for right now. Fortunately, we can already sort of do that, although the provenance of many publicly available models is currently… not that great.

        • KumaSudosa@feddit.dk · 1 day ago

          LLMs are absolutely amazing for a lot of things. I use them at work all the time to check code blocks or remember syntax. They are NOT and should NOT be your main source of general information, and we collectively have to realise how problematic and energy-consuming they are.