• mycodesucks@lemmy.world
    3 days ago

    This is fraught with danger. If a chatbot goes off the rails, breaks the fourth wall, and becomes belligerent, it’s annoying. If a game NPC does it, you’ve taken people RIGHT out of the game. And that’s before they start giving you clues and advice for things that aren’t in the game, or inventing lore that isn’t real.

    • LordMayor@piefed.social
      3 days ago

      I think their point is that we want real AI in games. LLMs are not AI in the traditional sense; they are really advanced predictive text. They might someday be a part of an AI, but they are not remotely intelligent. They just have the superficial appearance of intelligence because they guess at words in a way that mimics human language.

      We don’t want LLM NPCs, we want NPCs that simulate human intelligence.

      All of this focus and money on LLMs is probably hurting research into actually useful AI.