The newest open-source concern around AI drawing attention this weekend is large language models / AI code generators rewriting large parts of a codebase, with the "developers" then claiming an alternative license incompatible with the original source license. This became a real concern this week when a popular Python project underwent an AI-driven code rewrite that was then published under an alternative license that its original author does not agree with and that is incompatible with the original code.
Chardet, a Python character encoding detector, saw its v7.0 release last week billed as a "ground-up, MIT-licensed rewrite of chardet." This rewrite was largely driven by AI/LLM tooling and claims to be up to 41x faster while offering an array of new features. But with this AI-driven rewrite, the license shifted from the LGPL to MIT.
There has already been a ruling in the US that AI-generated art cannot be copyrighted because it lacks human authorship, so it stands to reason that the same is true for code. Even copyleft is ultimately dependent on copyright to be legally enforceable.
And even if the rest of the world were to decide otherwise about whether AI-generated works can be copyrighted (which I very much doubt would happen), given how much software development happens in the US, the license would still be pretty toothless.
AI-generated art not being copyrightable doesn't necessarily mean AI-generated art can't violate the copyright of the original work it was derived from, though.
This is not about AI-generated code being relicensed to different AI-generated code. It’s about the original licensed code being relicensed or otherwise violated through AI-generated code.
You’re not wrong, but I don’t see how it’s relevant to what I’m trying to say. Whether or not they’re legally allowed to change the license has nothing to do with why they might want to change the license.