Edit: There’s a blog post on the failings of the study and the communication around it: Clickbait Neuroscience: Lessons from “Your Brain on ChatGPT” – Mind Brain Education
EEG revealed significant differences in brain connectivity: Brain-only participants exhibited the strongest, most distributed networks; Search Engine users showed moderate engagement; and LLM users displayed the weakest connectivity. Cognitive activity scaled down in relation to external tool use.
Over four months, LLM users consistently underperformed at neural, linguistic, and behavioral levels.
This lines up with my completely unscientific observation that the people who have started relying heavily on AI are dumbasses.
I’ve got some anecdotal evidence to suggest that people who go to the gym a lot also tend to be stronger. You know, there could be some connection between exercise and muscle growth… Someone should probably investigate that link, could be something there.
Also lines up with my observation of how my cognitive skills declined while using it. It just stops you from thinking on your own.
I recently used AI to make a plugin using a programming language I’ve never used before.
When working on similar projects in the past, I get to the end with a new understanding of the language and its capabilities.
This time, I think it took about 10% longer, was no easier overall (although each step seemed easier to start), and I feel like I still have basically no understanding of the language or its features.
If I were to start over, I’d still have to start learning from the beginning. AI has saved me no time at all, and the plugin still has plenty of subtle bugs I need to remove by hand.
I truly fear the technical debt we are accruing through AI. As experts die out, new ones will be less and less capable due to lack of experience, because AI might help you get the job done, but you gain no experience from it.
But, but, that Nvidia guy said they are cognitive amplifier tools. :|
maybe he calls his net worth “cognition”
Kind of hilarious that the comments in this post are doing exactly what the linked post points out as the big pitfall of studies like this.
I mean this feels like an obvious result. Are we thinking this is ground breaking?
Like comparing 3 ways of making a table: hand tools, power tools, or paying someone to make it for you. Then asking which required the most effort.
I mean I know it’s important to confirm even expected results but this feels so intuitive.
Well, rather than being about effort, it's more like: in the end, which of these people best understands how to make a table?
I mean this feels like an obvious result.
I’m glad scientists think otherwise, or we would still be living in the dark ages.
In other news, dark ages are a myth disproven by science.
Fair enough, let me rephrase that:
I’m glad scientists think otherwise, or we would still be living in the bronze age.
I’m sorry, that came off very passive-aggressive. I really shouldn’t post at 3 am when I can’t sleep.
The whole dark ages and golden ages thing is just very annoying. It was made up during the renaissance, in part as a useful tool to go “look how shit everything is, I will make it great and amazing like it was before”, and it’s still a favorite of populists (in reference to whatever time is most suitable). It’s been repeated so much that it actually works; everyone kinda just accepted it. But when you dig into it, the middle ages weren’t really worse in terms of invention/art/etc. than the renaissance, nor was there a big stagnation after the decline of the roman empire. People always made art, new inventions and great achievements, along with cruelty, bloodshed and other awful things. But this has been a relatively recent shift in historical research, so it’s not that well known.
Well, for one, the historical field is part of the humanities, and for another, dark ages are certainly a thing, just not the medieval dark age. The Greek Dark Ages, for example, are most certainly a thing.
It seems intuitive, but 2000 years ago people also thought, for example, that the eye emitting some sort of substance that interacts with the surrounding environment was an intuitive explanation for how vision works, so intuitions need to be tested.
And thus, ray-tracing was born.
Damnit. Eye lasers would have been so much cooler though.
No, it’s more complex than that. In the final session, participants were swapped between the LLM and Brain-only groups, and the effects persisted to some extent.
So it’s worse than “using an LLM doesn’t engage your brain”, it’s more like “using an LLM actually makes you dumber.”
It is obvious, but we have so many liars lying to everyone and each other about AI that they get away with it. Skilled bullshitters shitting up clouds of smoke and using every manipulation tactic in the world.
Some hard evidence makes it easier to prevent their damage.
It’s so easy to pontificate instead of actually doing studies, right?
Maybe I was not eloquent enough, but I don’t object to the research being done. I think, though, that the result is the expected one and therefore not noteworthy enough to post on here. Doubtless this paper will end up highly cited.
That’s a great observation!..
Fuck yeah. Let’s do it!
I mean, I did say this in a different thread, lol
:)
I bet my connectivity would be off the charts with activity, because any time I see results from it, I get enraged and wonder how something can be so stupid.
I only prompt questions for things I already know… more of a confirmation thing, and more often than not, it’s wrong.
The medium is the massage, more true than ever.
I am an awful proofreader. My problem has always been that I am blind to the mistakes, I mean grammar and usage type things. I really thought an LLM would help me with that, but it causes more problems than it fixes, so I quit using it.
This study is almost 6 months old. Why post it now?
Because I found it now and searching didn’t show anyone posting it before…