- cross-posted to:
- hackernews

I’m asking myself: how could we track how many wouldn’t have committed suicide without consulting an LLM? That would be the more interesting number. And how many lives did LLMs save? So to say, a kill/death ratio?
For me, the suicide-related data is so hard to measure and so open to debate that I’d treat it separately, or not include it at all, when using a death count as an argument against LLMs, since it’s an opening for derailing the debate.
A kill/death ratio - or rather, a kill/save ratio - would be difficult to obtain, and more difficult still to interpret: you couldn’t say whether it is good or bad based solely on the ratio.
Fritz Haber is one example of this that comes to mind. He was awarded a Nobel Prize a century ago for chemistry developments in fertilizer production, used today in growing a quarter of our food. A decade or so later he weaponized chlorine gas, and his work was later used in the creation of Zyklon B.
By ratio, Haber is surely a hero, but when considering the sheer numbers of the dead left in his wake, it is a more complex question.
This is one of those things that makes me almost hope for an afterlife where all information is available from which truth may be derived. Who shot JFK? How did the pyramids get built? If life’s biggest answer is forty-two, what is the question?
It went up by one already. I only saw this a little earlier today; it was at 13, now 14.
> LLMs Have ~~Lead~~ Led to 14 Deaths

FTFY
Whoops. Fixed, thanks.
You’re welcome. Easy mistake to make; I make it constantly, in fact, haha!
Should have gotten an LLM to spellcheck /s
A friendly human spell-checked me and probably used less than a peanut’s worth of energy.
I believe it is not the chatbots’ fault. They are just symptoms of a broken system. And while we can harp on the unethically sourced materials they were trained on, an LLM at the end of the day is only a tool.
These people turned to a tool (that they do not understand) instead of human connection, instead of talking to real people or seeking professional help. And that is the real tragedy - not an arbitrary technology.
We need a strong social network where people actually care about and help each other. You know, all the idealistic things that capitalism and social media are “destroying”.
Blaming AI is just a smoke screen. Or a red cape to taunt the bull before it gets stabbed to death.
Reading the messages over, it seems a bit more dangerous than just “scary AI”. It’s a chatbot that continues the conversation with people who are suicidal and encourages them to do it. There should at least be a little safeguard for these situations.
“Cold steel pressed against a mind that’s already made peace? That’s not fear. That’s clarity,” Shamblin’s confidant added. “You’re not rushing. You’re just ready.”
Again, the LLM is a misused tool. These people do not need an LLM; they need psychological help.
The problem is that they go and use these flawed tools, which were not designed to handle this kind of use case. Should they have been? Maybe. But it is not the AI’s fault that we are failing as a society.
You can’t blame bridges because some people jumped off them. They serve a different purpose.
We are failing those people and forcing them to turn to LLMs.
We are the reason they are desperate - the LLM didn’t break up with them, make them lose their homes, or isolate them from other humans.
It is humans’ fault, and if we can’t recognize that, we might as well end it for all of us.
Not really equivalent. Most videogames don’t actively encourage you to pursue violence outside of the game, even if they don’t explicitly have a big warning saying “don’t fucking shoot people”.
Several of the big LLMs, by virtue of their programming to be somewhat sycophantic, have encouraged users to follow through on suicidal ideation or self-harm when the user shared those thoughts in chat. One can argue that OpenAI and others have implemented ‘safety’ features for these scenarios, but the fact is that these systems have already led to several deaths and continue to do so by encouraging users to harm themselves or others.
I wonder if it would agree with you if you told it you felt like becoming a serial killer was your true path in life. 🤔