• N0t_5ure@lemmy.world
    2 days ago

    I like how confident it is. Now imagine that this is a topic you know nothing about and are relying on it to get information.

    • Victor@lemmy.world
      2 days ago

      I really wish people understood how it works, so that they wouldn’t rely on it for literally anything.

    • burgerchurgarr@lemmus.org
      2 days ago

      I tried putting together a research plan using an LLM. Nothing crazy, I just wanted it to help me structure my thoughts and write LaTeX for me. Horrible experience.

      I gave it a reference paper, said "copy that methodology exactly", and then spelled out exactly which steps I wanted included.

      It kept making bold claims, suggesting irrelevant methods, and taking just plain wrong approaches. If I had no idea about the topic I might have believed it, because that thing is so confident. But if you know what you're doing, it's obvious these are bullshit machines.

    • TrickDacy@lemmy.world
      2 days ago

      It only seems confident if you treat it like a person. If you realize it's a flawed machine, the language it uses shouldn't matter. The problem is that people treat it like a person, i.e., they assume its confident-sounding responses mean anything.