Haven't we been making jokes about dumb AI in games for literal decades before LLMs came along?
The thing is, I want my NPCs to be a curated combination of things:
Environmental awareness: be aware of the dragon in the room, and still have scripted interactions, but triggered by detecting more complex environmental changes.
More dynamic dialogue options, like tone of voice changing depending on whether earlier dialogue made them angry.
Reacting to the player fighting in a sensible way.
Still all coded and developed with human passion, but appropriately incorporating AI into more complex detecting/reacting/interacting mechanics.
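The kind of curated detect/react NPC behavior described above can be sketched without any LLM at all, as a small state machine. This is a toy illustration; all the names and events here are hypothetical, and a real game would wire this into the engine's perception and dialogue systems.

```python
from enum import Enum, auto

class Mood(Enum):
    CALM = auto()
    AFRAID = auto()
    ANGRY = auto()

class NPC:
    """Toy NPC: scripted reactions triggered by detected events,
    with a mood state that changes the tone of later dialogue."""

    def __init__(self, name):
        self.name = name
        self.mood = Mood.CALM

    def perceive(self, event):
        # Scripted interactions, but triggered by environmental changes.
        if event == "dragon_enters_room":
            self.mood = Mood.AFRAID
            return f"{self.name}: By the gods... a dragon! Take cover!"
        if event == "player_draws_weapon":
            # Reacting to the player fighting in a sensible way,
            # conditioned on how earlier interactions went.
            if self.mood is Mood.ANGRY:
                return f"{self.name}: I warned you. Guards!"
            return f"{self.name}: Whoa, easy there. No need for that."
        return None

    def hear_dialogue(self, line, insulting=False):
        # Earlier dialogue shifts mood, which changes tone of voice later.
        if insulting:
            self.mood = Mood.ANGRY
        if self.mood is Mood.ANGRY:
            return f"{self.name} (coldly): What do you want now?"
        return f"{self.name} (warmly): Good to see you, friend."
```

Everything stays hand-authored and deterministic; the "AI" part is only in how richly the triggers are detected, not in generating the lines themselves.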
Instead, what Nvidia (who dictates the game industry now) cares about is a "chatbot module" they can sell to game engines (like Unreal) so lazy developers can turn digital objects into a character AI persona.
Somehow, the moment LLMs became able to impress, we stopped caring about the other 99% of AI that isn't a conversation simulator.
This is fraught with danger. If a chatbot goes off the rails, breaks the fourth wall, and becomes belligerent, it's annoying. If a game NPC does it, you've taken people RIGHT out of the game. And that's before they start giving you clues and advice for things that aren't in the game, or inventing lore that isn't real.
I think their point is that we want real AI in games. LLMs are not AI in the traditional sense. They are really advanced predictive text. They might someday be a part of an AI, but they are not remotely intelligent. They just have the superficial appearance of intelligence because they guess at words in a way that mimics human language.
We don't want LLM NPCs; we want NPCs that simulate human intelligence.
All of this focus and money on LLMs is probably hurting research into actually useful AI.
I like single-player RPGs.
I very much care; I would love good NPC AI.