I think they don’t have a deep understanding of AI, ML, AGI, or GPTs, but are basically thinking rationally within what they know.
AGI isn’t exactly universally agreed on, but imo it’s a pretty big deal. Computers actually performing cognitive tasks at the level of an adult human would be incredible. But given how GPTs actually work, mimicking their training data with a bit of statistics thrown in to spit out new token orderings… I don’t think the idea that a GPT could ever possibly achieve AGI makes any sense.
If you distilled all of human knowledge down into a compact representation and wrote an efficient algorithm to search that dataset and respond to new input, is that algorithm doing cognition? I guess it’s more of a philosophical question than a technical one, but “yes” seems like a really unserious answer to me.