• rozodru@lemmy.world · 3 days ago

    It’s already happening. A lot of places are now realizing that advocating “vibe coding” and what have you is generating a lot of broken shit and tech debt. I’m a front-end/back-end dev consultant, been doing it for a couple of decades now, and lately most of my contracts have been for fixing, refactoring, or straight-up rebuilding stuff built by a vibe coder.

    Nothing produced by AI scales. None of it is encrypted, everything is exploitable, and eventually it all breaks. Example call I got last week: a startup had decided to set up their own Mastodon instance for marketing reasons or whatever, and they left the setup and configuration to their vibe coder, who essentially had Claude Code set it up for them. Basically build it out locally, then throw it into some Docker containers on the server; a real backwards-ass way to do it. The problem was that on a weekly basis it was completely maxing out the storage on the server, so it would crash and take down whatever other services they had on there (namely their own Gitea instance).

    Turns out the vibe coder in charge of setting this thing up either used Claude Code (doubtful) or went straight to Claude.ai to have it walk them through the setup process. What was happening was that all the images, videos, and cached media were going into some extra .config dir and that was it: nothing was getting cleaned out, it was all just being thrown into some random directory and sitting there gradually growing. The fix was painfully easy: clean it out, make sure the cached media goes into the proper dirs, and as a safety net run a cron job once a week or so to clean it.
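    For anyone curious, the weekly cleanup boils down to something like this. It's a minimal sketch, assuming a standard Dockerized Mastodon setup; the container name and script path are placeholders, so adjust for your own deployment. Mastodon's own tootctl admin tool does the actual cache pruning:

    ```python
    #!/usr/bin/env python3
    """Weekly Mastodon media-cache cleanup, meant to be run from cron.

    Assumptions: the instance runs in a Docker container (name below is a
    placeholder) and tootctl is on the PATH inside that container.
    """
    import shutil
    import subprocess

    CONTAINER = "mastodon_web"   # placeholder container name, adjust to your setup
    KEEP_DAYS = "7"              # how many days of cached remote media to keep


    def tootctl(*args: str) -> None:
        """Run a tootctl command inside the Mastodon container."""
        subprocess.run(
            ["docker", "exec", CONTAINER, "tootctl", *args],
            check=True,
        )


    if __name__ == "__main__":
        # Drop cached remote media and link-preview cards older than KEEP_DAYS.
        tootctl("media", "remove", "--days", KEEP_DAYS)
        tootctl("preview_cards", "remove", "--days", KEEP_DAYS)

        # Log remaining disk space so a filling disk shows up before things crash.
        usage = shutil.disk_usage("/")
        print(f"free: {usage.free / 1e9:.1f} GB of {usage.total / 1e9:.1f} GB")
    ```

    Wire it into cron with something like `0 4 * * 0 /usr/local/bin/mastodon_cleanup.py` and the storage stops creeping.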

    Digging around, the same company had set up pretty much all their instances for various things the same way. A couple of their apps had major security holes, because AI really doesn’t care about or know what to do with that stuff. It was a mess.

    And it’s not just that company. Like I said, most of my calls for work now amount to being a sort of digital janitor for AI and vibe coders. I’ve dropped these companies some hints, saying, “Look, hiring this dude who touts being a vibe coder is going to cost you way more money and tech debt in the long run than saving a few bucks right now. Get rid of them and hire devs who actually know what they’re doing.” But most of these CEOs and CTOs only think in the short term. A year from now they’ll all be collectively fucked. Expect a LOT more stories like the recent Tea app breach to come out. Everyone’s data is at risk currently. I wouldn’t sign up for shit using my ID or anything right now.

    • phutatorius@lemmy.zip · 1 hour ago

      That’s consistent with our experience using AI “assistants.” If it’s a common problem, the training set will be large enough that there’s a chance the AI will return a correct answer, though without contextual knowledge that might be important. But in a case like that, you could also just go look it up on Stack Overflow. And if it’s not a common problem, the AI-proposed solution is likely to be crap, and unlikely to take into account nonfunctional requirements, architectural guidelines, maintainability, or best practice.

      My own principle is that if AI was involved at any step in the coding process, that code needs to be tested even more than usual, because programmers who stay in the business learn over time not to do stupid things, and AI doesn’t. When an AI makes some stupid coding suggestion, there’s no feedback loop telling it that it fucked up.

      I wouldn’t sign up for shit using my ID or anything right now.

      That’s some sound advice there.

    • ansiz@lemmy.world · 3 days ago

      If this company you worked with was deployed on AWS, then they likely used Q for the deployment; it’s pretty integrated into AWS and they hype it constantly.