

How could pleasing older women ever bring dishonor‽
Mohammed himself did as much with Khadija!
Be the change the world needs by replacing oil with pleasure as Saudi Arabia’s strongest export!




They’ll claim that because someone is LGBTQ+, their birth certificate is fraudulent and that counts as enough evidence for denaturalization.


I doubt that. New services that host the open models are cropping up all the time. They’re like VPS hosting providers (in fact, existing VPS hosts will soon break out into that space too).
It’s not like Big AI has some huge advantage over the open source models. In fact, for images they’re a little bit behind!
The FOSS coding models are getting pretty fantastic and they get better all the time. It seems like once a month a new, free model comes out that eclipses the previous generation.


The mistakes it makes depend on the model and the language. The GPT5 models can make horrific mistakes though, randomly removing huge swaths of code for no reason. Every time it happens I’m like, “what the actual fuck?” Undoing the last change and trying again usually fixes it though 🤷
They all make horrific security mistakes quite often. Though that’s probably because they’re trained on human code that is *also* chock full of security mistakes (former security consultant, so I’m super biased on that front haha).


Schrödinger’s AI: it’s both useless shit that can only generate “slop” and, at the same time, so effective that it’s the reason behind 50,000 layoffs and is going to take everyone’s jobs.


You want to see someone using, say, VS Code to write something with, say, Claude Code?
There’s probably a thousand videos of that.
More interesting: I watched someone who was super cheap trying to use multiple AIs to code a project because he kept running out of free credits. Every now and again he’d switch accounts and use up those free credits.
That was an amazing dance, let me tell ya! Glorious!
I asked him which one he’d pay for if he had unlimited money and he said Claude Code. He has the $20/month plan but only uses it in special situations because he’ll run out of credits too fast. $20 really doesn’t get you much with Anthropic 🤷
That inspired me to try out all the code assist AIs and their respective plugins/CLI tools. He’s right: Claude Code was the best by a HUGE margin.
Gemini 3.0 is supposed to be nearly as good but I haven’t tried it yet so I dunno.
Now that I’ve said all that: I am severely disappointed in this article because it doesn’t say which AI models were used. In fact, the study authors don’t even know what AI models were used. So it’s 430 pull requests of random origin, made at some point in 2025.
For all we know, half of those could’ve been made with the Copilot gpt5-mini that everyone gets for free when they install the Copilot extension in VS Code.


The Trump administration is really trying for more Charlie Kirk-style Leopards Eating Faces moments with this one.


Anyone who can make you believe absurdities can make you commit atrocities. -Voltaire
I first heard it when I was a kid but didn’t truly understand it until 9/11. The more time goes on, the more our shitty timeline proves it to be true.


Good games are orthogonal to AI usage. It’s possible to have a great game that was written with AI using AI-generated assets. Just as much as it’s possible to have a shitty one.
If AI makes creating games easier, we’re likely to see 1000 shitty games for every good one. But at the same time we’re also likely to see successful games made by people who had great ideas but never had the capital or skills to bring them to life before.
I can’t predict the future of AI but it’s easy to imagine a state where everyone has the power to make a game for basically no cost. Good or bad, that’s where we’re heading.
If making great games doesn’t require a shitton of capital, the ones who are most likely to suffer are the rich AAA game studios. Basically, the capitalists. Because when capital isn’t necessary to get something done anymore, capital becomes less useful.
Effort builds skill but it does not build quality. You could put in a ton of effort and still fail or just make something terrible. What breeds success is iteration (and luck). Because AI makes iteration faster and easier, it’s likely we’re going to see a lot of great things created using it.


FYI: Stuff like this is for automated testing, not “playing games for you” 🤣
Also, I won’t consider it realistic until it can type out, “lol git gud scrub” after ganking someone who just spawned.


I meant that the current reactor designs were made using AI running in big data centers with supercomputers/GPUs.


“Finish high school, get a job, get married, have kids, go to church. Those are all in your control,” -Ben Shapiro
…all within the comfort of your parents’ home (if they even have one big enough for you, your wife, and your kids). Because that’s all they can afford.
Also: WTF does going to church have to do with anything‽ That’s not going to land you a good job (blessed are the poor)! There are no marketable skills to be learned from going to church either (unless you want to socialize with pedophiles and other sex offenders in order to better understand Trump’s social circle; see: “pastor arrested”).


8% of total global aviation emissions doesn’t put it in second place. It’s not even in the top 100. I don’t think it ever will be… because building huge data centers takes years, and by the time there are enough data centers to make a huge dent, the previous AI data centers will have been used to make fusion power a reality.
Today’s fusion reactor designs were all made thanks to AI. The kind running in big data centers.
It takes a lot of computing power to simulate fusion reactor designs!

I use gen AI every day and I find it extremely useful. But there are degrees to every model’s effectiveness. For example, I have a wide selection of AI models (for coding) at my disposal from OpenAI, Google, Anthropic, etc., and nearly every open source model that exists. If I want to do something simple like change a light theme (CSS) to a dark one, I can do that with gpt5-mini, gpt-oss:120b, or any of the other fast/cheap models… because it’s a simple task.
If I need to do something complicated that requires a lot of planning and architecture, I’m going to use the best model(s) available for that sort of thing (currently Claude Sonnet/Opus or Gemini Pro… the new 3.0 version; the old 2.5 sucked ass). Even then I’ll take a skeptical view of everything it generates and make sure my prompts only tell it to do one little thing at a time, verifying everything at each step.
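To make that “match the model to the task” idea concrete, here’s a rough Python sketch of how routing could look. Everything in it is a placeholder: the endpoint URL, the API key, the model names, and the keyword check. It just assumes something speaking the common OpenAI-style chat completions format:

```python
# Rough sketch: send simple prompts to a cheap/fast model and complex ones
# to a stronger model. Endpoint URL, API key, and model names are placeholders;
# any OpenAI-compatible server (local or hosted) accepts this general shape.
import requests

API_URL = "http://localhost:8000/v1/chat/completions"  # hypothetical endpoint
API_KEY = "sk-placeholder"

CHEAP_MODEL = "gpt-oss-120b"    # fine for "flip this CSS theme" type tasks
STRONG_MODEL = "claude-sonnet"  # for anything needing planning/architecture

def looks_complex(prompt: str) -> bool:
    # Naive heuristic purely for illustration; in practice *you* decide.
    keywords = ("architecture", "refactor", "design", "migrate", "plan")
    return len(prompt) > 500 or any(k in prompt.lower() for k in keywords)

def ask(prompt: str) -> str:
    model = STRONG_MODEL if looks_complex(prompt) else CHEAP_MODEL
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

print(ask("Change this light CSS theme to a dark one: ..."))
```

In real life the “is this complex?” decision is me, not a heuristic, and I still verify every diff the model hands back.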
What I’m saying is that AI is an effective tool depending on the use case/complexity. Do I trust the big game publishers to use AI effectively like this? FUCK NO. Huge negative response to that question.
Here’s how I suspect that they’ll use generative AI:
There’s a lot of room in gaming for fun and useful generative AI but folks like Tim Sweeney will absolutely be choosing the villain route.


You bring up a great point! When someone does that, painting a replica and passing it off as their own, what law have they violated? They’ve committed fraud. That’s a counterfeit.
Is making a counterfeit stealing? No! It’s counterfeiting. That’s its own category of law.
It’s also a violation of the owner’s copyright, but let’s talk about that too: If I pay an artist to copy someone’s work, who is the copyright violator? Me, or the artist who painted it? Neither! It’s a trick question, because copyright law only comes into force when something is distributed. As long as those works are never distributed to or viewed by the public, it’s neither here nor there.
The way AI works is the same as if you took a book you purchased, threw it in a blender, then started pasting chunks of words from it into a ransom note.


Woah! Piracy is not considered stealing. The MPAA and RIAA made that argument over and over and over again in the 90s and early 2000s and they lost. Thank the gods!
You would download a car!
If piracy was stealing, we’d all be waiting for our chance to watch TV shows in a queue of thousands.
Copyright violations are not theft. They never were and they never will be. Because no one is deprived of anything when something is copied. In theory, there could’ve been a lost sale as a result but study after study has shown that piracy actually improves sales of copyrighted works.
When an AI is trained on images that YOU, the artist, posted to the public Internet for the world to see, all it does is nudge some floating point values up or down by a tiny amount, like 0.01. That’s it! That’s all it does.
How can that be considered “stealing”‽ It’s absurd.
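If you’re curious what that “nudge a float” actually looks like, here’s a toy sketch of one training step. The single weight and all the numbers are made up purely for illustration; real models repeat this across billions of weights at once:

```python
# Toy illustration of a single training step: one weight, one example.
# Real models do this across billions of weights, but each individual
# weight just gets nudged up or down by a tiny amount.
learning_rate = 0.01

weight = 0.5                # some parameter inside the model
prediction = weight * 2.0   # pretend forward pass on one training example
target = 1.2                # what the training data says it "should" be

error = prediction - target          # how wrong we were
gradient = 2.0 * error * 2.0         # d(error^2)/d(weight) for this toy model
weight -= learning_rate * gradient   # the "nudge": a small float adjustment

print(weight)  # 0.5 became 0.508 -- the training image itself is stored nowhere
```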


That’s not “upscaling”. That’s having the AI color it in for you, like a comic artist who has a colorist (a person whose literal job is coloring someone else’s line art).
Upscaling just makes the image bigger (resolution-wise). It uses the same exact technology as regular AI image generation though 🤷
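If you’ve never seen it, here’s roughly what AI upscaling looks like using the diffusers library and Stability’s public 4x upscaler. Treat the exact API as approximate (it shifts between library versions) and the file names as placeholders:

```python
# Rough sketch of diffusion-based upscaling -- same family of tech as image
# generation. The model id is Stability's public 4x upscaler; file names are
# placeholders.
import torch
from PIL import Image
from diffusers import StableDiffusionUpscalePipeline

pipe = StableDiffusionUpscalePipeline.from_pretrained(
    "stabilityai/stable-diffusion-x4-upscaler", torch_dtype=torch.float16
).to("cuda")

low_res = Image.open("sprite_128x128.png").convert("RGB")  # placeholder input

# Note it still takes a text prompt: it's literally an image generator being
# steered to produce a bigger version of what it's shown.
high_res = pipe(prompt="pixel art character, clean lines", image=low_res).images[0]
high_res.save("sprite_512x512.png")
```

The fact that it wants a prompt at all kind of gives away that it’s the same machinery as image generation.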
There are degrees to everything. AI haters are at the point where they’re arguing with digital artists over what counts as art, and it’s getting insane.


Don’t say, “stolen”. It’s the wrong word. “Copied” is closer but really, “trained an AI model with images freely available on the Internet” is more accurate but doesn’t sound sinister.
When you steal something, the original owner doesn’t have it anymore. AIs aren’t stealing anything. They’re sort of copying things, but again, not really. At the heart of every LLM or image model is a random number generator. They aren’t really capable of copying things exactly unless the source material somehow gets a ridiculously high “score” during training, such as a really popular book that gets quoted in a million places on the Internet and in other literature (and news articles, magazines, etc.… anything that was used to train the AI).
Someone figured out that there are so many Harry Potter quotes and copies in OpenAI’s training set that you could trick it into outputting something like 70% of the first book, one very long and specific prompt at a time (thousands of times). That’s because of how the scoring works, not because of any sort of malicious intent to violate copyright on the part of OpenAI.
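For anyone who wants to see the “random number generator at the heart” part, here’s a toy sketch of how the next token gets picked. The vocabulary and the scores are invented for illustration:

```python
# Toy sketch of sampling the next token. The vocabulary and scores are made up;
# the point is that a random draw sits between the model's scores and the output.
import math
import random

# Pretend the model just scored a few candidate next tokens (higher = more likely).
logits = {"wizard": 3.1, "wand": 2.4, "muggle": 1.7, "toaster": 0.2}
temperature = 0.8  # lower = more deterministic, higher = more random

# Softmax with temperature turns scores into probabilities.
scaled = {tok: math.exp(score / temperature) for tok, score in logits.items()}
total = sum(scaled.values())
probs = {tok: val / total for tok, val in scaled.items()}

# The actual choice is a weighted random draw, not a lookup of stored text.
next_token = random.choices(list(probs), weights=list(probs.values()), k=1)[0]
print(next_token, probs)
```

When one token’s score towers over everything else (the Harry Potter situation), the random draw stops mattering and you get near-verbatim output; otherwise you get something different every time.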
Nobody’s stuff is being stolen.


You know it was a joke, right? 😀
Pam Bondi couldn’t put together a Lego set let alone a cash bounty system.
The GOP should just set up a copy of the press room at the White House in a retirement home where everything is painted gold. Every day, they can place dementia Don in front of the podium and tell him, “there’s a million viewers.”
He’ll keep himself busy like that until a bigger stroke than the last takes him down to hell.