For most use cases, web search engines are fine. But I am wondering if there are alternative ways of finding information. There is also the enshittification of Google, and tbh most (free) search engines just give Google search results anyway.

Obviously, the straight answer is just asking other people, in person or online, in general forums or specialised communities.

Libraries are a good source too, but for those of us who don’t have access to physical libraries, there are free online public libraries (I will post links to the ones I found below).

Books in general are worth a look too; a lot of them reference outside materials.

So, I’ve been experimenting with an AI chatbot (Le Chat), partly as a life coach of sorts and partly as a fine-tuned web search engine. To cut to the chase, it’s bad. When it’s not just listing Google’s top results, it lists tools that are long gone or just makes shit up. I was hoping it would work as a fine-tuned search engine, because with Google, if what you want isn’t in the top 10 websites, you’re on your own.

So yeah, those are all the routes I can think of for finding information, and probably all there is, but maybe I missed some.

  • mushroommunk@lemmy.today · ↑14 · 15 hours ago

    Chatbots can’t think or tell if anything is correct. They can generate fake data and there is literally no amount of “tuning” to fix that. For the love of the planet and your sanity, please never touch one again.

    I find DuckDuckGo works better than Google now. I also find perusing Wikipedia will often give me the information I need or point me in a better direction.

    • Perspectivist@feddit.uk · ↑9 ↓4 · 15 hours ago

      The chatbot isn’t the issue here. It’s the user treating it like a reliable source of information.

      It’s a large language model - not a large knowledge model. It gets plenty of stuff right, but that’s not because it actually “knows” anything - it’s just trained on a massive pile of correct information.

      People trash it for the times it gets things wrong, but it should be the other way around. It’s honestly amazing how much it gets right when you consider that’s not even what it’s built to do.

      It’s like cruise control that turns out to be a surprisingly decent driver too.

      • mushroommunk@lemmy.today · ↑12 ↓3 · 15 hours ago

        I’m fully aware of the technology behind it. I trash it because it’s resource-sucking, planet-burning trash that serves no real purpose.

        The technology is inherently flawed and fills no niche, because you can never trust anything it produces. Even if it’s right 9/10 times, that tenth time can, and has, killed people.

        • Perspectivist@feddit.uk · ↑2 ↓7 · 14 hours ago

          It’s a chatbot. You talk to it, and it responds in natural language. That’s exactly what it’s designed to do - and it does it exceptionally well, far better than any system we’ve had before.

          Faulting it for being untrustworthy just shows most people don’t actually understand this tech, even though they claim they do. Like I said before: it’s a large language model - not a large knowledge model.

          Expecting factual answers from it is like expecting cruise control to steer your car. When you end up in the ditch, it’s not because cruise control is some inherently flawed technology with no purpose. It’s because you misused the system for something it was never designed to do.

      • turboSnail@piefed.europe.pub · ↑2 · edited · 11 hours ago

        If you just Google something like “health effects of hibiscus,” you’ll find a mixture of information too. Most people probably can’t tell which claims are well researched and which ones aren’t.

        You’ll be left with a mixed bag, but reading all that takes more time than it takes to read an equally flawed summary you get from a gas powered AI. From a convenience perspective, I can understand why some people might prefer an LLM. From a reliability perspective, I can’t favour either option. Regardless, the difference in environmental impact should be clear to everyone.

        • mushroommunk@lemmy.today · ↑4 · 12 hours ago

          We should enact mandatory media literacy classes alongside killing LLMs. I know Finland has them, and from what I can tell they’ve been a resounding success.