The problem is that education is now a means to get a certificate of good job potential. As long as that is true, LLMs will be a problem. If all that matters is the metric (marks or getting the degree), then it's rational to use a tool that can achieve a good score with little effort on your part.
Like so many other things, education has been commodified. All that matters is some quantifiable metric, not whether anything was actually learned. The same thing is being done to academia: citations are all that matter, not what you did or how good it was. The metric that was meant to help you assess good work without reading it yourself has itself become what it means to do good work. This kind of environment incentivises cheating to a massive degree.
This is my take too. I'd add that school is also an abusive, coercive environment.
Relatively few students seem to feel that the work is urgent or that they need to sharpen their own minds. We are struggling to absorb the lessons in discipline that used to come from having to complete complicated work on a tight deadline, because chatbots promise to complete our tasks in seconds.
We need the next generation to retain critical thinking abilities. Ideally, we want kids who can give synthesized answers without computer aid. Now we just feed those thoughts into a chatbot and maybe it'll tell us how.
This is why I'm not an AI advocate (among several other reasons, specifically the magnification of mental illness).
Don’t use it then