• Shardikprime@lemmy.world
    3 days ago

    I think you people are vastly overestimating how much we actually know about the brain or severely underestimating how freaking complex it is.

    The “you” reading this right now is a fucking stack of six A4-sized sheets, each one millimeters thick, crumpled into something that, to an external observer, looks like an oversized walnut seed, cooled and maintained by a network of 400 miles of capillaries, and isolated from the world by the blood-brain barrier, which can only be described as a fucking miracle.

    No. No one is going to be implanting any memories anytime soon.

    • infinite_ass@leminal.space
      2 days ago

      Maybe memories are actually really simple. Like the words on a screen. An arrangement of symbols, then a boatload of meaning and interpretation and rationalization. So all you need to do to make memories is to insert a few words. The brain’s “memory interpreter” does the rest of the work.

      For example, we insert the words “brother appears”. Then, for the “new memory”, we reference your memories of your brother. His appearance and the sound of his voice. Then we contrive a narrative explaining why “brother” is at this place and time. Etc. Voila! You now have a memory of your brother standing there saying some stuff.

      So making a memory wouldn’t require some grand, delicate manipulation of brainstuff. Just a simple thing.
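
      A toy sketch of that “cue plus interpreter” idea — all names and data here are invented for illustration, and this is a cartoon of the proposed model, not actual neuroscience:

```python
# Hypothetical "memory interpreter": implant a one-word cue, and the
# interpreter fleshes it out from associations already stored.
# KNOWN_CONCEPTS and implant() are made-up names for illustration only.

KNOWN_CONCEPTS = {
    "brother": {
        "appearance": "tall, glasses",
        "voice": "low, fast talker",
    },
}

def implant(cue: str, place: str, time: str) -> str:
    """Expand a one-word cue into a full 'memory' using stored associations."""
    details = KNOWN_CONCEPTS[cue]
    traits = ", ".join(f"{k}: {v}" for k, v in details.items())
    # the "interpreter" contrives a narrative around the inserted cue
    return f"Your {cue} ({traits}) was at {place} on {time}, saying some stuff."

print(implant("brother", "the kitchen", "Tuesday"))
```

      The point of the sketch: the implanted part is tiny (one key), and all the richness comes from what’s already stored — which is exactly the claim being made above.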

      • Shardikprime@lemmy.world
        17 hours ago

        “Memory” and “simple” are words you only get to read together in the sentence “memory IS NOT simple.”

        For fuck’s sake, our body stores memories for preferences in our literal guts.

        Memory is a lot of things except simple

    • stinky@redlemmy.com
      3 days ago

      AI is better at recognizing patterns than we are. The brain may be unfathomable to us, but technology already exists that could recognize the signals in your brain that represent memories and reproduce or alter them.

      Neuralink and similar devices are being used right now, today, to record the thoughts of animals. The first Neuralink patient is alive and well, meaning it’s already being used on humans.

      Do you really think this technology won’t exist in our lifetime?

      • ✺roguetrick✺@lemmy.world
        2 days ago

        Do you really think this technology won’t exist in our lifetime?

        Yes, absolutely. What you’re describing is AGI. If an AI could untangle engrams from branched clusters of extremely plastic neurons, it could understand and improve it’s own thinking. It would actually be self-aware before it could untangle the mess that our brains are. And I don’t see AGI happening with our current material and resource constraints before I die. The gap between seeing brain regions light up and de-novo engram implantation is about as wide as the gap between an LLM and AGI.

        • Shardikprime@lemmy.world
          2 days ago

          It is as you say: the scale doesn’t even exist at this point.

          Even the recent fly-brain mapping, enhanced with AI, had to take a destructive approach to map a brain weighing half a milligram, and these people are already thinking Matrix Reloaded.

        • stinky@redlemmy.com
          2 days ago

          Respectfully, this sounds like opinion and doubt rather than a credible timeline. Other than rattling off industry terms, the only support you’ve given your argument is “I don’t see AGI happening.” You’ve collected an impressive shopping basket of buzzwords but done little to convince me, or the engineers developing this technology, that it won’t be ready within a lifetime. Stay tuned.

          Oh, and “its own thinking” not “it’s own thinking”. His, hers, its.

          • ✺roguetrick✺@lemmy.world
            2 days ago

            Your extrapolation has about as much support. I don’t really know what bothers you about the vocabulary I used but I can say I don’t play much attention to punctuation marks when inputting text with a swipe keyboard on my phone.

            • stinky@redlemmy.com
              2 days ago

              “pay much attention” not “play”. I’d be more careful with that keyboard if I were you. Wouldn’t want to lose any credibility.

                • stinky@redlemmy.com
                  2 days ago

                  But you expect us to care about your opinion? Be correct and be nice, or you won’t get to finish the discussion. It’s like a recipe: you have to do the work to get the product.

        • Naz@sh.itjust.works
          2 days ago

          Being 70–80 years old sucks. My condolences. We’ll mess around with AGI when you’re gone, and I’ll think about you.