• fubarx@lemmy.world · +11 / −2 · 1 day ago

    I’ve been using these for constrained, boring development tasks since they first came out. “Pro” versions too. Things like converting code from one language to another, or adding small features to existing code bases. Things I don’t really want to spend weeks learning when I know I’ll only be doing them once. They work fine if you take baby steps, do functional/integration testing as you go (don’t trust their unit tests–they’re worthless), and review EVERYTHING generated. Also, make sure you have a good, working repo version you can always revert to.

    Another good use is for starting boilerplate scaffolding (like, a web server with a login page, a basic web UI, or REST APIs). But the minute you go high-level, they just shit the bed.
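    To make the “boilerplate scaffolding” point concrete, here is a minimal sketch of the kind of starting point meant: a web server with a login endpoint and a basic REST API. The framework choice (Flask), the route names, and the in-memory credential store are my own illustrative assumptions, not from the comment; a real app would hash passwords and use sessions.

```python
# Minimal scaffolding sketch: a login endpoint plus one REST API route.
# Everything here (routes, the USERS dict) is illustrative placeholder code.
from flask import Flask, request, jsonify

app = Flask(__name__)

# Toy in-memory credential store -- real code needs password hashing.
USERS = {"alice": "s3cret"}

@app.route("/login", methods=["POST"])
def login():
    # Accept JSON credentials and check them against the toy store.
    data = request.get_json(force=True)
    if USERS.get(data.get("username")) == data.get("password"):
        return jsonify({"status": "ok"}), 200
    return jsonify({"status": "unauthorized"}), 401

@app.route("/api/items", methods=["GET"])
def list_items():
    # A basic REST endpoint returning static data.
    return jsonify({"items": ["a", "b"]})
```

    This is roughly the level where such tools do fine; the hard “last 25%” described below (scaling, security, monitoring, compliance) is exactly what a sketch like this leaves out.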

    The key point in that article is the “90%” one (in my experience it’s more like 75%). Taking a project from POC/tire-kicking/prototype to production is HARD. All the shortcuts you took to get to the end fast have to be re-done. Sometimes, you have to re-architect the whole thing to scale up to multiple users vs just a couple. There’s security, and realtime monitoring, and maybe compliance/regulatory things to worry about. That’s where these tools offer no help (or worse, hallucinate bad help).

    Ultimately, there’s no substitute for battle-tested, scar-tissued, human experience.

  • codeinabox@programming.dev (OP) · +55 · 3 days ago

    I think the most interesting, and also concerning, point is the eighth point, that people may become busier than ever.

    > After guiding way too many hobby projects through Claude Code over the past two months, I’m starting to think that most people won’t become unemployed due to AI—they will become busier than ever. Power tools allow more work to be done in less time, and the economy will demand more productivity to match.
    >
    > Consider the advent of the steam shovel, which allowed humans to dig holes faster than a team using hand shovels. It made existing projects faster and new projects possible. But think about the human operator of the steam shovel. Suddenly, we had a tireless tool that could work 24 hours a day if fueled up and maintained properly, while the human piloting it would need to eat, sleep, and rest.
    >
    > In fact, we may end up needing new protections for human knowledge workers using these tireless information engines to implement their ideas, much as unions rose as a response to industrial production lines over 100 years ago. Humans need rest, even when machines don’t.

    This does sound very much like what Cory Doctorow refers to as a reverse-centaur, where the developer’s responsibility becomes overseeing the AI tool.

    • Clent@lemmy.dbzer0.com · +7 · 1 day ago

      Knowledge-worker burnout is real. The idea that this is some inevitable future is nonsense. It feels like a hedge from someone who bought the hype.

      Any company that tries it will end up cannibalizing productivity.

      It’s like claiming that if supercars became affordable, everyone would of course start traveling at 120 mph on average. In reality you’d get wasted energy and fools wrapping themselves around trees.

    • andioop@programming.dev · +1 · edited · 23 hours ago

      I think that is one of my problems with AI. People who were once creators, and who were the only means of production (either you learned the skills and became a creator yourself, or you hired one), actually made things. Now that a machine can do it, they babysit it at best. If they enjoyed the process of creation, something was lost. I would hate washing clothes by hand and am glad I have a washing machine to do it for me. But if I enjoyed handwashing, I’m sure I’d end up running the washing machines at a laundromat job anyway, because it’s more efficient, lamenting the lost enjoyment and becoming a glorified babysitter. And in my personal life I might not even have the time to wash clothes by hand even if I liked it, so I’d keep using the machine. It’s much easier to make time for something when that is the only way it can get done; much easier to find time to handwash clothes when that is the only way to have clean ones without buying new.

      I admit I now work a non-tech career. And my motivation to do my own hobby projects went way down, because “well, AI can do it, since it’s greenfield,” but I also do not want to use AI, so nothing gets done. A small hobby project I could learn from used to mean building something new that only I could build (or that I’d have to hire out). Now that something else can do it, I feel like I am handicapping myself just so I can learn (even though I have never even used AI to code), which feels very different. Like having to learn math without calculators while knowing they exist, versus learning math before calculators were invented, when you couldn’t have that complaint. Yes, I know, abacuses, but I hope you get my point. The project went from something only I (or someone else with the skills) could have done, so it made sense for me to do it, to “well, I have to do it to learn the skills, but it would have been faster to just get AI to do it.” (If that is even true! Maybe I am being overly influenced by bots pushing AI, and it would actually be faster for ME to do it.) That makes it feel way less motivating to try myself.

      And that does not even get into whether AI can even do it. So many comments in dev communities (which may or may not be bots, I’m not sure) seem to have changed their tune, from “AI slows me down; it’s faster to do it yourself, making mistakes that are easier to catch, than to use the hallucinator, whose mistakes look correct and are therefore harder to find,” to “actually you have to use it, it’s just a tool, and it’s getting better every day; get on board or get left behind; maybe it sucks at brownfield projects in industry, but it can whip up your hobby project or quick web UI much faster than you can.” I’m not sure if this is actual people changing their opinions as they spend more time with the tech and it improves (and is it improving? There are both reports of it getting better and reports of new models being worse).

      “Go try the tools to assess them and form your own opinion then!” I’m fresh out of a CS degree and don’t feel knowledgeable enough at coding to judge whether the output is flawed. I trust expert opinions (or at least more experienced opinions, which I assume most users of programming.dev have) over my own, which usually helps but is awful when the experts conflict. As for just learning to code well enough to become qualified to judge for myself, I’m still hearing conflicting things! “It’ll speed up your learning 5x; it helped me with this thing that otherwise would have taken hours of forum-scrolling and trial and error” versus “it makes so many mistakes, especially when you are learning and do not yet know how to catch them; you shouldn’t use it.” I personally do not feel great about trusting something that (as far as I know) does not actually cite its sources, that does not know and is not speaking from experience. I hate the idea of trusting a black box where I cannot look at the insides to see how it came to its output, or ask an expert who knows more than me what the insides are doing (yes, cars are black boxes to me because I do not understand how they work, but I know someone does!). I hate the idea of trusting something nondeterministic by design to be correct all the time, or having to be its babysitter, corrector, and reviewer instead of just making the thing myself. But maybe it has advanced further than I thought?

      End result: too many conflicting ideas, paralysis, do nothing. (And even if AI does actually make you faster, doing it without AI is still infinitely faster than doing nothing.) God, I hope typing that conclusion helps motivate me to get back on those projects, building back up the skills I have not been using at work.

      I don’t know what to think anymore. I do not want to fight a losing battle and be a stupid Luddite, while also recognizing that the Luddites were not exactly “all tech bad” but “who is the tech doing things for, and who is it doing things to.” I want to get ahead and not have things done to me, without also promoting a tool that does things to others. I do not want to get automated out of a job. “You’re so smart, andioop, you are sure to get a good job!”—only to see knowledge workers threatened, while being autistic, so the social skills are not so hot (I can work on improving! But I feel I’ll probably never approach the level of a non-autistic person applying the same effort, nor will I ever pick up some nonverbal cues), and not liking the idea of manual labor. I also recognize that the people who could change things do not care about the people getting automated out of a job, so I have to figure out how to reskill, while holding a full-time job, into something that will not just be automated away too (but everything I am good at is stuff people claim AI can do). I think about how disruptors, even of middlemen, can take out not only fat cats but also little guys like me who learned to do that kind of job. Maybe being forced to do something else for society is good, but families and lives depend on a job, and the unregulated surveillance-capitalism society I live in does not always have us doing meaningful work that contributes something useful, so maybe the disruption does not actually force us to pick something more useful to society either.

      Still don’t like AI, but I’m also scared of being wrong and getting eaten because of my own bias, of not being willing to move on and change with the times at the old age of my twenties. I’m not even sure how I would change with the times.

    • MonkeMischief@lemmy.today · +21 · 2 days ago

      This is exactly why I laughed out loud, incredulously, at Dell’s “AI powered laptop” commercial that promises you will “free up so much time for the things you love” by using AI.

      From washing machines to robot assembly, we’re still buying that old lie??

      The tools improve, the expectations increase, the wage stays the same.

      • lad@programming.dev · +3 · 2 days ago

        Good point. I wonder if the elasticity of demand will be affected by the AI bubble bursting (I expect there will be a recession and demand will get less elastic, but I’m no economist).

  • pageflight@piefed.social · +37 / −2 · 3 days ago

    > Since 1990, I’ve programmed in BASIC, C, Visual Basic, PHP, ASP, Perl, Python, Ruby, MUSHcode, and some others. I am not an expert in any of these languages—I learned just enough to get the job done. I have developed my own hobby games over the years using BASIC, Torque Game Engine, and Godot,

    I think this is where AI unquestionably shines: switching languages/projects frequently, on personal projects.

    > so I have some idea of what makes a good architecture for a modular program that can be expanded over time.

    But I actually draw the opposite conclusion. Architecture and maintainability are where AI is pretty poor, and those needs are vastly different, and far more important, in a 100–1000-person, ten-year production system.

    • Clent@lemmy.dbzer0.com · +2 · 1 day ago

      Admits they aren’t an expert, yet claims to know what good architecture is.

      That is about the most competency one can expect from anyone pushing this sort of tech.

      Actual experts know it’s severely limited. Dunning-Krugers cannot.

    • DacoTaco@lemmy.world · +7 · 3 days ago

      Agreed. As an ex-technical lead and co-architect, I also find that what AI produces is often very poor architectural design, and I wouldn’t want it to touch that, ever.

  • footfaults@lemmygrad.ml · +8 / −1 · 3 days ago

    This guy just vibe-coded a bunch of slop, based on high-quality training data (everyone’s code on GitHub, probably including lots of Unity projects, Godot, etc.). It’s sort of disgusting to me.

    • Mikina@programming.dev · +4 · edited · 2 days ago

      It’s slowly starting to show up. I think I’ve seen quite a lot of new tools and projects pop up, from World of Warcraft addons and CIs, through a “game engine based on Tesla’s Aether theory,” to secure loginless messengers.

      I remember that a few months ago the refrain was “If vibe coding is so good, where are the AI-coded projects?”, and I’m starting to feel, at least anecdotally and over the past few weeks, that they are slowly starting to surface.

      As a DJ, AI music has made it extremely difficult for me to build sets, since I really don’t want to support AI music. If FOSS vibe-coded apps start popping up the way AI music did, it’s going to suck, especially for someone who often likes to look for new tools and cool software, mostly around cybersecurity. Vetting tools as safe to use is already pretty difficult in that space.

      Thankfully, most of them will just have an agents.md or ./claude, so I know I can disregard them outright. Unfortunately, it seems Bitwarden is one of those :(