• MudMan@fedia.io · 1 month ago

    I keep having to repeat this, but the conversation does keep going in a loop: LLMs aren’t entirely useless, but they’re not search engines. You shouldn’t ask them any questions you don’t already know the answer to (or at least have the tools to verify).

    • seven_phone@lemmy.world · 1 month ago

      That is exactly the point. LLMs aim to simulate the chaotic, best-guess flow of the human mind: to be conscious, or at least present the appearance of thinking, and from that to access and process facts rather than be a repository of facts in themselves. The accusation here that the model constructed a fact and then built on it misses the point; this is exactly how organic minds work. Human memory is constantly reworked and altered by fresh information and simple musings, and the new memory is taken as factual even though it is in large part fabricated, to an increasing extent over time. Many of our memories of past events bear only cursory fidelity to the actual details of the events themselves, to the point that they could be called imagined. We still take these imagined memories as real and act on them, exactly as the AI model has done here.

    • faythofdragons@slrpnk.net · 1 month ago

      I had to tell DDG not to give me an AI summary of my search, so it’s clearly intended to be used as a search engine.

        • MudMan@fedia.io · 1 month ago

        “Intended” is a weird choice there. Certainly the people selling them are selling them as search engines, even though they aren’t one.

        On DDG’s implementation, though, you’re just wrong. The search engine is still the search engine; they’re using an LLM to summarize the results. Which is also a bad implementation, because it does a bad job at something you can do by just… looking down. But, crucially, the LLM is neither doing the searching nor generating the results themselves.