• 4am@lemmy.zip · 1 day ago

    ML tech isn’t bad, just like blockchain isn’t bad. It’s the gross capitalist opportunism happening around it that makes it overpromise (to the detriment of quality), overbuild (to the detriment of the environment), overuse (to the detriment of the economy), and overstimulate (to the detriment of mental health), all while stealing the hard work of basically all of humanity up to this point. Like it or not, the material reality is that these companies should be compensating artists and authors alike for their work being used for training.

    And with all that power, they do THIS fucking shit with it.

    I’m all for local LLMs as assistants, autocompletes, and reference librarians; but just like the web, they were better when you had to be a fucking turbo-nerd to get them working.

    • brucethemoose@lemmy.world · 1 day ago (edited)

      One unique thing I’ve observed is that the big firms, especially the US ones, seem to miss all the cool innovations coming out of LLM research papers… unless it’s in-house, of course.

      So there’s also an insular corporate ‘don’t innovate, just scale up’ culture poisoning them. The models don’t have to be so expensive and big to be useful tools.

      • AldinTheMage@ttrpg.network · 21 hours ago

        There’s also the fact that a lot of the big firms seem to be interested in it mainly as a way to get more user data. People will share some pretty sensitive info with an LLM that they wouldn’t otherwise provide.

        Running locally is definitely the way to go, if you’re going to use them.