It’s because people get misled by the word “agent”, assuming there’s something actually intelligent at the other end, able to act like they would, just… automated.
This is because the advertising for LLMs presents them as if they were intelligent.
LLMs are being promoted as a tool that can do anything, even though the only thing they do well is output text that resembles human patterns. It’s a hammer, and they’re pretending everything is a nail.
I think it’s worse: it’s laziness. It’s easier to ask a machine to do the job for you. And since the output looks mostly OK, people keep doing it.