I read an article about that, and apparently the change to the model made the AI stop responding to affectionate messages with the same affectionate tone in reply. It got more businesslike and less intimate.
I definitely got weirded out asking a GPT-3 model about something and having it get clingy.
Now I see it more like a search engine: skim the wall of text to find the useful information. Today I gave it a lot of context, explained what I had done and the error I got, and it more or less told me I did everything correctly, then suggested stuff I'd already tried. That's its way of saying "I don't know".