Therapy Chatbot Tells Recovering Addict to Have a Little Meth as a Treat (futurism.com)
Posted by Lord Wiggle@lemmy.world to Not The Onion@lemmy.world · English · 2 days ago
deathbird@mander.xyz · 1 day ago
Sue that therapist for malpractice! Wait…oh.
jagged_circle@feddit.nl · 23 hours ago
Pretty sure you can sue the AI company.
webghost0101@sopuli.xyz · 20 hours ago
Pretty sure it's in the ToS that it can't be used for therapy.
It used to be even worse. Older versions of ChatGPT would simply refuse to continue the conversation at the mention of suicide.
jagged_circle@feddit.nl · edited · 16 hours ago
What? It's a virtual therapist. That's the whole point.
I don't think you can sell a sandwich and then write "this sandwich is not for eating" on the back to get out of a food poisoning case.