Father, Hacker (Information Security Professional), Open Source Software Developer, Inventor, and 3D printing enthusiast

  • 2 Posts
  • 373 Comments
Joined 3 years ago
Cake day: June 23rd, 2023


  • You want to see someone using, say, VS Code to write something with, say, Claude Code?

    There’s probably a thousand videos of that.

    More interesting: I watched someone who was super cheap trying to use multiple AIs to code a project because he kept running out of free credits. Every now and again he’d switch accounts and use up those free credits.

    That was an amazing dance, let me tell ya! Glorious!

    I asked him which one he’d pay for if he had unlimited money and he said Claude Code. He has the $20/month plan but only uses it in special situations because he’ll run out of credits too fast. $20 really doesn’t get you much with Anthropic 🤷

    That inspired me to try out all the code assist AIs and their respective plugins/CLI tools. He’s right: Claude Code was the best by a HUGE margin.

    Gemini 3.0 is supposed to be nearly as good but I haven’t tried it yet so I dunno.

    Now that I’ve said all that: I am severely disappointed in this article because it doesn’t say which AI models were used. In fact, the study authors don’t even know what AI models were used. So it’s 430 pull requests of random origin, made at some point in 2025.

    For all we know, half of those could’ve been made with the Copilot gpt5-mini that everyone gets for free when they install the Copilot extension in VS Code.




  • Good games are orthogonal to AI usage. It’s possible to have a great game that was written with AI using AI-generated assets. Just as much as it’s possible to have a shitty one.

    If AI makes creating games easier, we’re likely to see 1000 shitty games for every good one. But at the same time we’re also likely to see successful games made by people who had great ideas but never had the capital or skills to bring them to life before.

    I can’t predict the future of AI but it’s easy to imagine a state where everyone has the power to make a game for basically no cost. Good or bad, that’s where we’re heading.

    If making great games doesn’t require a shitton of capital, the ones who are most likely to suffer are the rich AAA game studios. Basically, the capitalists. Because when capital isn’t necessary to get something done anymore, capital becomes less useful.

    Effort builds skill but it does not build quality. You could put in a ton of effort and still fail or just make something terrible. What breeds success is iteration (and luck). Because AI makes iteration faster and easier, it’s likely we’re going to see a lot of great things created using it.




  • “Finish high school, get a job, get married, have kids, go to church. Those are all in your control,” -Ben Shapiro

…all within the comfort of your parents’ home (if they even have one big enough for you, your wife, and your kids). Because that’s all they can afford.

Also: WTF does going to church have to do with anything‽ That’s not going to land you a good job (blessed are the poor)! There are no marketable skills to be learned from going to church either (unless you want to socialize with pedophiles and other sex offenders in order to better understand Trump’s social circle; see: “pastor arrested”).



  • I use gen AI every day and I find it extremely useful. But there are degrees to every model’s effectiveness. For example, I have a wide selection of AI models (for coding) at my disposal from OpenAI, Google, Anthropic, etc., and nearly every open source model that exists. If I want to do something simple like change a light theme (CSS) to a dark one, I can do that with gpt5-mini, gpt-oss:120b, or any of the other fast/cheap models… because it’s a simple task.

If I need to do something complicated that requires a lot of planning and architecture, I’m going to use the best model(s) available for that sort of thing (currently Claude Sonnet/Opus or Gemini Pro… the new 3.0 version; the old 2.5 sucked ass). Even then I will take a skeptical view of everything it generates and make sure my prompts only tell it to do one little thing at a time, verifying everything at each step.
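    For the curious, here’s roughly what that kind of routing looks like as code. A minimal sketch: the model names, keywords, and threshold are made up for illustration, not any vendor’s real API:

    ```python
    # Minimal sketch: route a coding task to a cheap or strong model based on
    # a crude complexity guess. Model names and thresholds are illustrative only.
    CHEAP_MODEL = "gpt5-mini"      # fast and cheap; fine for simple edits
    STRONG_MODEL = "claude-opus"   # slow and expensive; for planning/architecture

    def pick_model(task: str) -> str:
        """Long, multi-step asks go to the strong model; everything else goes cheap."""
        complex_words = ("architecture", "refactor", "design", "migrate", "plan")
        is_complex = len(task.split()) > 50 or any(w in task.lower() for w in complex_words)
        return STRONG_MODEL if is_complex else CHEAP_MODEL

    print(pick_model("Change the light CSS theme to a dark one"))          # -> gpt5-mini
    print(pick_model("Plan the plugin architecture and migrate the API"))  # -> claude-opus
    ```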

What I’m saying is that AI is an effective tool depending on the use case and complexity. Do I trust the big game publishers to use AI effectively like this? FUCK NO.

    Here’s how I suspect that they’ll use generative AI:

    • Instead of using a gen AI model to interpolate steps between frames (which is most effective for 2D or 2.5D stuff), they will use a video model to generate the whole thing from scratch, 8-10 second clips at a time. Complete with all the inconsistencies and random bullshit that it creates. The person in charge will slap a “good enough” sticker on it and it’ll ship like that.
    • Instead of viewing the code generated by AI with a critical eye, they will merely rely on basic unit tests and similar. If it passes the test, it’ll ship. We can expect loads of “how did this even happen?” bugs from that in the near future (not just in games).
    • Instead of using image models to generate or improve things like textures (so they line up properly), they’ll have them generate whole scenes. Because that saves time and time is money! And that’s all that matters to them. Even though there will be absolutely insane and obvious inconsistencies that piss off gamers.
    • Instead of paying people to use AI to help them translate text, they’ll just throw the text at the AI and call it a day. With no verification or improvements by humans whatsoever.
    • They’ll pay 3rd parties for things like “AI cheat checking” and it will ban people left and right who were not cheating but will do nothing to stop actual cheaters (just like every anti-cheat that ever existed).
    • They will use AI bots for astroturfing and ad campaigns.
    • They will use poorly-made AI chat bots for completely unhelpful, useless support. People will jailbreak these and use them for even more nefarious purposes inside of games (because security folks won’t be paying as much attention in that space).

    There’s a lot of room in gaming for fun and useful generative AI but folks like Tim Sweeney will absolutely be choosing the villain route.


  • You bring up a great point! When someone does that (painting a replica and passing it off as their own), what law have they violated? They’ve committed fraud. That’s a counterfeit.

Is making a counterfeit stealing? No! It’s counterfeiting. That is its own category of law.

It’s also a violation of the owner’s copyright, but let’s talk about that too: If I pay an artist to copy someone’s work, who is the copyright violator? Me, or the artist who painted it? Neither! It’s a trick question, because copyright law only comes into force when something is distributed. As long as those works are never distributed to or viewed by the public, it’s neither here nor there.

The way AI works is the same as if you took a book you purchased, threw it in a blender, then started pasting chunks of words from it into a ransom note.
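    Here’s that analogy as literal code, just for fun (a toy, obviously; the book text is a stand-in):

    ```python
    # Toy version of the blender analogy: chop up a purchased text,
    # then paste random chunks of words into a "ransom note".
    import random

    book = "it was the best of times it was the worst of times " * 3
    words = book.split()
    random.shuffle(words)
    ransom_note = " ".join(words[:10])
    print(ransom_note)
    ```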


  • Woah! Piracy is not considered stealing. The MPAA and RIAA made that argument over and over and over again in the 90s and early 2000s and they lost. Thank the gods!

    You would download a car!

If piracy were stealing, we’d all be waiting for our chance to watch TV shows in a queue of thousands.

Copyright violations are not theft. They never were and they never will be. Because no one is deprived of anything when something is copied. In theory, there could’ve been a lost sale as a result, but study after study has shown that piracy actually improves sales of copyrighted works.

When an AI is trained on images that YOU—the artist—posted to the public Internet for the world to see, all the training does is increment or decrement floating point values by something like 0.01. That’s it! That’s all it does.

    How can that be considered “stealing”‽ It’s absurd.
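    To make that concrete, here’s a toy version of what a single training step does to a model’s weights. Pure illustration (real training computes gradients; the directions here are random), but the scale of the change is the point:

    ```python
    # Toy illustration: one training example nudges each weight up or down
    # by a tiny amount. Nothing resembling the image itself is stored.
    import random

    weights = [random.uniform(-1, 1) for _ in range(8)]  # stand-in for billions of weights

    def training_step(weights, learning_rate=0.01):
        # Real training gets the +/- direction from a gradient; we fake it here
        # just to show the size of the update a single example produces.
        return [w + random.choice((-1, 1)) * learning_rate for w in weights]

    print(weights[:3])
    weights = training_step(weights)
    print(weights[:3])  # each value moved by ~0.01; the image is long gone
    ```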



  • Don’t say “stolen”. It’s the wrong word. “Copied” is closer, but really, “trained an AI model with images freely available on the Internet” is the most accurate; it just doesn’t sound sinister.

When you steal something, the original owner doesn’t have it anymore. AIs aren’t stealing anything. They’re sort of copying things, but again, not really. At the heart of every LLM or image model is a random number generator. They aren’t really capable of copying things exactly unless the source material somehow gets a ridiculously high “score” during training. Such as a really popular book that gets quoted in a million places on the Internet and in other literature (and news articles, magazines, etc… anything that was used to train the AI).
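    If you want to see the dice-rolling part, here’s the gist of how a model picks its next token. Simplified sketch: real samplers add temperature, top-p, and so on, and the scores here are invented:

    ```python
    # Simplified next-token sampling: the model outputs scores, softmax turns
    # them into probabilities, and a random draw picks the winner.
    import math
    import random

    def sample_next_token(logits: dict) -> str:
        exps = {tok: math.exp(score) for tok, score in logits.items()}
        total = sum(exps.values())
        probs = {tok: e / total for tok, e in exps.items()}
        # the random number generator at the heart of it all
        return random.choices(list(probs), weights=list(probs.values()))[0]

    # A token seen constantly in training gets a much higher score, so it
    # usually (but not always) wins the draw.
    print(sample_next_token({"wand": 3.5, "pencil": 0.2, "spoon": 0.1}))
    ```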

Someone figured out that there are so many Harry Potter quotes and copies in OpenAI’s training set that you could trick it into outputting something like 70% of the first book, one very long and specific prompt at a time (thousands of times). That’s because of how the scoring works, not because of any sort of malicious intent to violate copyright on the part of OpenAI.

    Nobody’s stuff is being stolen.





  • We learned this lesson in the 90s: If you put something on the (public) Internet, assume it will be scraped (and copied and used in various ways without your consent). If you don’t want that, don’t put it on the Internet.

    There’s all sorts of clever things you can do to prevent scraping but none of them are 100% effective and all have negative tradeoffs.

For reference, the big AI players aren’t scraping the Internet to train their LLMs anymore. That creates too many problems, not the least of which is making yourself vulnerable to poisoning. If an AI is scraping your content at this point, it’s either amateurs or they’re just indexing it like Google would (or both), so the AI knows where to find it without having to rely on 3rd parties like Google.

    Remember: Scraping the Internet is everyone’s right. Trying to stop it is futile and only benefits the biggest of the big search engines/companies.
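    For what it’s worth, here’s what a well-behaved scraper looks like, using nothing but Python’s standard library. The URL and user agent are just examples:

    ```python
    # Minimal polite scraper: check robots.txt before fetching a page.
    from urllib import request, robotparser

    URL = "https://example.com/some/page"
    AGENT = "my-little-indexer/1.0"

    rp = robotparser.RobotFileParser("https://example.com/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt

    if rp.can_fetch(AGENT, URL):
        req = request.Request(URL, headers={"User-Agent": AGENT})
        html = request.urlopen(req).read()
        print(f"fetched {len(html)} bytes")
    else:
        print("robots.txt says no; a polite scraper stops here")
    ```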