• French75@slrpnk.net

    Headline is super misleading… the article says that ChatGPT told him it couldn’t give him drug advice and that he should seek help. He actually got good advice from ChatGPT, but he didn’t like it and didn’t trust it, then spent months trying to get ChatGPT to give him the dodgy advice he wanted.

    Of course ChatGPT shouldn’t be giving that sort of advice, but man, that headline is as misleading as it gets. He literally didn’t trust the advice ChatGPT gave him to seek help.