• mindbleach@sh.itjust.works

    This is why programmers aren’t rending our garments over generative AI. We’ve been trying to trivialize our profession for sixty fucking years. The best we’ve done is compilers, high-level languages, and autocomplete.

    I’m not sure LLMs will do better than BASIC. Neural networks will, for sure, and soon. We’ve proven that data alone can train the robot to do any damn thing. But LLMs are the wrong answer to nearly every question. They don’t know how to say “I don’t know.” They can’t look at their recent output and go “wait, that’s off.” They lose track of what they’re doing, just by doing it.

    The fact that they even almost work demonstrates the raw power of backpropagation and scale. But when the hype bubble bursts, diffusion’s gonna stick around, and LLMs mostly won’t. I’m not even sure they’re the right answer for the RPG NPC dialog generators that absolutely everyone expected.