I definitely got weirded out asking a GPT-3 model about something once, and it got clingy.
Now I treat it more like a search engine: I skim the wall of text to find the useful information. Today I gave it plenty of context, explained what I had done and the error I got, and it more or less told me I had done everything correctly, then suggested things I had already tried. That's its way of saying "I don't know."