• mindbleach@sh.itjust.works · 6 days ago

    Ctrl-F “Chinese.”

    Yeah this is garbage.

    Only an explanation of consciousness in terms of unconscious events could explain consciousness. Anything less is religion.

    LLMs aren’t AGI because they’re just spicy autocomplete. They’re designed for plausibility - not correctness. If you ask one to write like a creationist troll, it’ll gladly do it. That’s part of its training data - that’s a vector it can maximize - that’s a label it can satisfy. Nowhere in that data was there a metatextual value judgement separating dishonest bullshit, our best efforts at truth, and the bizarre middle ground occupied by fiction. There’s text calling the bullshit, bullshit… but there’s also text calling the truth bullshit. An LLM is not the right shape of network to give a dang which is which. You can only tell it which angle it’s writing from today.
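    (A toy sketch of what “maximize plausibility” means here - the words and probabilities are made up, not any real model. Greedy next-word picking only ever asks which continuation scores highest; nothing in the loop ever checks whether it’s true.)

```python
# Toy illustration (hypothetical numbers, not a real model): greedy
# "autocomplete" over made-up next-word probabilities. The loop only asks
# "what's most plausible next?" - no truth label appears anywhere.

# Hypothetical toy distribution: P(next_word | previous_word).
NEXT_WORD_PROBS = {
    "the":   {"earth": 0.4, "moon": 0.3, "sky": 0.3},
    "earth": {"is": 0.9, "was": 0.1},
    "is":    {"flat": 0.55, "round": 0.45},   # plausible-sounding beats true
    "flat":  {"<end>": 1.0},
    "round": {"<end>": 1.0},
}

def autocomplete(prompt_word: str, max_tokens: int = 10) -> list[str]:
    """Greedily pick the highest-probability continuation at each step."""
    output = [prompt_word]
    word = prompt_word
    for _ in range(max_tokens):
        candidates = NEXT_WORD_PROBS.get(word)
        if not candidates:
            break
        # Maximize plausibility: take the argmax; correctness is never consulted.
        word = max(candidates, key=candidates.get)
        if word == "<end>":
            break
        output.append(word)
    return output

print(" ".join(autocomplete("the")))   # -> "the earth is flat"
```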

    … oh my god, this paper keeps going. Dude is grasping for ways a billion-dollar supercomputer could never ever ever do what’s possible in three pounds of wet meat powered by cheeseburgers. It’s not accidentally titled ‘AGI won’t happen.’ This dipshit with half a credit in Philosophy 101 is doing bigotry against robots. Like no matter how clearly any future system acts like it thinks and feels, it’s not Like Us, so it just won’t count.

      • mindbleach@sh.itjust.works · 6 days ago

        I’m just sick to death of John Searle’s forty-year-long refusal to understand computers. That prick thinks you can yank out your CPU to interrogate it, and if it doesn’t know what a web browser is, it cannot possibly have rendered this webpage. It’s the software, stupid. Software does the work. This was what Church & Turing were on about, back in nineteen-thirty-fuck-you.

        If your grand thought experiment begins with demonstrable understanding of Chinese, in a room containing a book and a dude, and it’s not the dude - understanding still occurred. There’s a Chinese guy outside the room, and as surely as if the room contained a second Chinese guy, the system within the room understands Chinese. But there’s just Jim and a book. Either Jim or the book understands Chinese… and it’s not Jim. Take all the time you need.
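        (Toy version of the room, with a hypothetical two-entry phrasebook standing in for the book - the point being that the Chinese-conversation behaviour belongs to the Jim-plus-book system, not to Jim.)

```python
# Toy sketch of the room (entirely hypothetical phrasebook entries): a rule
# book mapping Chinese input to Chinese output, and "Jim", who just matches
# symbols without understanding any of them.

RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",       # "How are you?" -> "I'm fine, thanks."
    "你会说中文吗？": "当然会。",        # "Do you speak Chinese?" -> "Of course."
}

def jim(note_slid_under_door: str) -> str:
    """Jim matches squiggles to squiggles. He understands none of it."""
    return RULE_BOOK.get(note_slid_under_door, "请再说一遍。")  # "Say that again."

# From outside, the room as a whole converses in Chinese. That behaviour
# doesn't live in Jim alone or in any single page - it lives in the system.
print(jim("你好吗？"))
```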

  • jrs100000@lemmy.world · 6 days ago

    This is like a reverse ontological argument. AGI can’t exist because only natural intelligence actually counts as intelligence.

  • originalucifer@moist.catsweat.com · 6 days ago

    this is just pointing out that llms are not agi. doesn’t change the fact that llms are going to be good enough to get a lot of us fired.

  • JeeBaiChow@lemmy.world · 6 days ago

    Didn’t stop the financial markets from hyping the crap out of it though. Such a waste. If only we spent money on improving the lives of the impoverished.