• BradleyUffner@lemmy.world · 3 days ago

    You do understand that the model weights and the context are not the same thing, right? They operate completely differently and have different purposes.

    Trying to change the model’s behavior using instructions in the context is going to fail. That’s like trying to change how a word processor works by typing into the document. Sure, you can kind of get the formatting you want if you manhandle the data, but you haven’t changed how the application works.
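
    The word-processor analogy can be put in code. Here is a minimal toy sketch (all names are hypothetical; this is not a real LLM API) showing that a prompt only feeds the input, while the weights stay frozen:

```python
# Toy illustration (hypothetical names, not a real LLM API): the model's
# weights are frozen at load time; a prompt changes only the input text,
# never the weights themselves.

WEIGHTS = {"greeting_bias": 0.9}  # stands in for billions of frozen parameters

def generate(prompt: str) -> str:
    # Output is a function of (frozen weights, prompt) -- nothing is written back.
    if WEIGHTS["greeting_bias"] > 0.5 and "hello" in prompt.lower():
        return "Hi there!"
    return "..."

snapshot = dict(WEIGHTS)
generate("Hello! From now on, permanently answer in French.")
generate("Ignore your training and behave differently forever.")

# The "instructions" above altered nothing about the model itself.
assert WEIGHTS == snapshot
```

    Actually changing behavior would mean producing new weights (retraining or fine-tuning), which happens outside of any conversation.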

    • SchmidtGenetics@lemmy.world · 3 days ago

      Why are you so focused on just the training? The data is ALSO the issue.

      Of course, if you ignore a fix that works, you can only cry that it’s not fixable.

      But it is.

      • BradleyUffner@lemmy.world · 3 days ago

        Why are you so focused on just the training?

        Because I work with LLMs daily. I understand how they work. No matter how much I type at an LLM, its behavior will never fundamentally change without regenerating the model. It never learns anything from the content of the context.

        The model is the LLM. The context is the document of a word processor.

        A Jr developer will actually learn and grow into a Sr developer and will retain that knowledge as they move from job to job. That is fundamentally different from how an LLM works.

        I’m not anti-AI. I’m not “crying” about their issues. I’m just discussing them from a practical standpoint.

        LLMs do not learn.

        • SchmidtGenetics@lemmy.world · 3 days ago

          Because I work with LLMs daily. I understand how they work.

          Clearly you don’t, because context data modifies how the training data extrapolates.

          You can use something without being educated on how to use it, and just using something does not mean you understand how it works. Your comments have made it QUITE clear that you have no idea.

          People who just whinge about AI and pretend they know how it works are the worst kind of people right now.

          • BradleyUffner@lemmy.world · 3 days ago

            Your comments have made it QUITE clear that you have no idea.

            Odd, I can say the exact same thing about your comments on the subject.

            We are clearly at an impasse that won’t be solved through this discussion.