I was trying to explain to them that large language model progress will start to slow down: probabilistic networks aren’t good at anything that needs reproducible results, the data requirements are starting to balloon, and restricting them reduces their creativity.
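To illustrate the reproducibility point, here’s a toy sketch (made-up tokens and probabilities, nothing from a real model): greedy decoding is deterministic, while sampling from the same distribution can give a different answer each run.

```python
import random

# Toy next-token distribution a language model might produce.
# Tokens and probabilities are invented for illustration.
probs = {"cat": 0.5, "dog": 0.3, "fish": 0.2}

def greedy(probs):
    """Deterministic: always picks the highest-probability token."""
    return max(probs, key=probs.get)

def sample(probs):
    """Probabilistic: draws a token, so repeated runs can differ."""
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights)[0]

print(greedy(probs))   # always "cat"
print(sample(probs))   # varies from run to run
```

Temperature-zero decoding in real LLM APIs is essentially the `greedy` branch; anything above that is the `sample` branch, which is where run-to-run reproducibility goes away.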
They brought up things like Folding@home and other neural network models. So I explained that those are Markov state models, not generative AI.
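For anyone unfamiliar with the distinction: a Markov state model describes transition probabilities between discrete states and lets you compute things like long-run occupancy; it doesn’t generate content. A minimal sketch with two made-up conformational states and invented transition probabilities:

```python
# Toy Markov state model: a protein hopping between two states.
# Rows are per-step transition probabilities (numbers invented
# for illustration, not from any real simulation).
T = {"folded":   {"folded": 0.9, "unfolded": 0.1},
     "unfolded": {"folded": 0.4, "unfolded": 0.6}}

# Power iteration: propagate a distribution until it converges
# to the stationary distribution (long-run time in each state).
dist = {"folded": 0.5, "unfolded": 0.5}
for _ in range(200):
    dist = {s: sum(dist[p] * T[p][s] for p in T) for s in T}

print(dist)  # converges to {'folded': 0.8, 'unfolded': 0.2}
```

The model’s whole output is a probability over states, which is a very different object from a generative model’s sampled text or images.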
Then they brought up Nvidia DLSS, and I explained that DLSS is also a CNN/Transformer model, not a generative one.
So I guess this is a long way of saying that I don’t know if a lot of people understand the distinction between generative AI and a neural network, in all the flavors that comes in, and I think that conflation leads to a lot of the artificial hype AI gets.
Aren’t neural networks AI by definition, if we take the academic definition into account?
I know that a thermostat is an AI, because it reacts to a stimulus (current temperature) and takes an action (starts heating) based on its state. Which is the formal AI definition.
Wait. That actually means transformers are not AI by definition. Hmm, I need to look into it some more.
EDIT: I was confusing things, that’s the definition of AI Agent. I’ll go research the AI definition some more :D
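The thermostat behaviour being described fits in a few lines of explicit code, which is exactly why it usually isn’t counted as AI. A hypothetical sketch (setpoint and hysteresis values are made up):

```python
# Minimal reflex-style thermostat: reacts to a stimulus (current
# temperature) with an action based on a fixed setpoint.
# An explicit, trivially-written rule like this is a control
# system, not something most definitions would call AI.

def thermostat(current_temp: float, setpoint: float = 20.0,
               hysteresis: float = 0.5) -> str:
    """Return the heating action for a temperature reading."""
    if current_temp < setpoint - hysteresis:
        return "heat_on"
    if current_temp > setpoint + hysteresis:
        return "heat_off"
    return "no_change"

print(thermostat(18.0))  # "heat_on"
print(thermostat(22.0))  # "heat_off"
print(thermostat(20.2))  # "no_change"
```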
Yes, neural networks are, usually, AI, but no, thermostats are not AI.
The definition of AI is more or less “a machine that can accomplish something that an intelligent thing like a human can do but which would be unfeasible or impossible to create an explicit algorithm for the machine to follow in order to accomplish it.”
So natural language translation is AI: before it became usable in the 2000s, this was seen as something that only humans could do. Producing meaningful text and recognisable images from scratch or a prompt is AI for the same reason.
On the threadiverse people equate AI with Artificial General Intelligence, i.e. something capable of true reasoning, something we might call “understanding” (not a concept I’ll attempt to define, but think of that ability which LLMs lack in spite of being able to produce text as if they had it). But this is ahistorical.
Most people don’t know the difference, and the AI hype comes from equating prediction engines/simulated AI with neural networks in general.
But if you look at LLMs from a business and political point of view, you might see, like me, that the point of “AI” hasn’t been to create artificial intelligence but rather to replace low-tier workers and influence impressionable people.
That’s what AI told me so it’s the objective truth.
Considering I saw an article headline about a study involving a neural network labeled as “AI”, definitely not.