• ShadowRam@fedia.io
    2 days ago

    Well, if people started calling it what it is, a weighted random text generator, then maybe they’d stop relying on it for anything serious…
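
    The “weighted random text generator” framing can literally be sketched as a toy bigram model: count which word follows which in some text, then repeatedly sample the next word weighted by those counts. A hypothetical minimal example (nothing like a real transformer, just the sampling idea):

    ```python
    import random
    from collections import defaultdict

    def train_bigrams(text):
        """Count how often each word follows each other word."""
        counts = defaultdict(lambda: defaultdict(int))
        words = text.split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
        return counts

    def generate(counts, start, length, rng=random):
        """Sample a continuation, weighting each next word by its count."""
        out = [start]
        for _ in range(length):
            nxt = counts.get(out[-1])
            if not nxt:
                break  # dead end: no observed successor
            words = list(nxt)
            weights = [nxt[w] for w in words]
            out.append(rng.choices(words, weights=weights)[0])
        return " ".join(out)

    corpus = "the cat sat on the mat and the cat ran"
    model = train_bigrams(corpus)
    print(generate(model, "the", 5))
    ```

    A real LLM replaces the count table with a neural network conditioned on the whole context, but the last step is still a weighted random draw over possible next tokens.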

    • hendrik@palaver.p3x.de
      2 days ago

      Yeah, my point was more that this doesn’t have anything to do with AI or the technology itself. I mean, whether AI is good or bad or doesn’t really work… their guardrails did work exactly as intended and flagged the account hundreds of times for suicidal thoughts, at least according to these articles. So it’s more a business decision not to intervene, and has little to do with what AI is and what it can do.

      (Unless the system comes with too many false positives. That’d be a problem with the technology. But that doesn’t seem to be discussed anywhere.)

      • Axolotl@feddit.it
        2 days ago

        I wonder what a keyboard with that kind of enhanced autocomplete would be like to use… provided, of course, that the autocomplete runs locally and the app is open source

        • ferrule@sh.itjust.works
          8 hours ago

          There are voice-to-text apps that run a model on your phone. A few more cores on our devices, or some more optimisations to the models, and we can run an LLM. The problem is battery life and heat.

          • Axolotl@feddit.it
            2 hours ago

            I once ran some models on my phone through Termux. I tried Llama 3.2 with 1B and 3B parameters and they ran pretty well; 8B was slow. I tried DeepSeek-R1: 1.5B ran well, 7B was slow.

            For text prediction, Llama 1B may be enough

            Now, this is on a €300-400 phone (Honor Magic 6 Lite)
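
            That 1B/3B-fast-but-8B-slow pattern matches simple memory arithmetic: generating each token streams essentially all the weights through RAM, so speed is roughly bounded by model size over memory bandwidth. A rough sketch (the 4-bit quantisation and ~20 GB/s bandwidth figures are my assumptions, not measurements from that phone):

            ```python
            def model_size_gb(params_billions, bits_per_weight=4):
                """Approximate weight size: parameters x bits, ignoring overhead."""
                return params_billions * 1e9 * bits_per_weight / 8 / 1e9

            def tokens_per_second(size_gb, bandwidth_gb_s):
                """Upper bound: each token reads every weight once from RAM."""
                return bandwidth_gb_s / size_gb

            for p in (1, 3, 8):
                size = model_size_gb(p)
                # assume ~20 GB/s effective memory bandwidth (mid-range phone guess)
                print(f"{p}B @ 4-bit: ~{size:.1f} GB, <= {tokens_per_second(size, 20):.0f} tok/s")
            ```

            By this estimate an 8B model is about 8x slower per token than a 1B one, before you even hit thermal throttling.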

    • AnarchistArtificer@slrpnk.net
      2 days ago

      I like how the computational linguist Emily Bender refers to them: “synthetic text extruders”.

      The word “extruder” makes me think about meat processing that makes stuff like chicken nuggets.