• Zozano@aussie.zone · 9 hours ago

    What makes you think humans are better at evaluating truth? Most people can’t even define what they mean by “truth,” let alone apply epistemic rigor. Tweak it a little and GPT is more consistent, applying reasoning patterns that outperform the average human by miles.

    Epistemology isn’t some mystical art; it’s a structured method for assessing belief and justification, and large models approximate it surprisingly well. Sure, it doesn’t “understand” truth in the human sense, but it does evaluate claims against internalized patterns of logic, evidence, and coherence drawn from a massive corpus of human discourse. That’s more than most people manage in a Facebook argument.

    So yes, it can evaluate truth. Not perfectly, but often better than the average person.

    • Dzso@lemmy.world · 59 minutes ago

      I’m not saying humans are infallible at recognizing truth either; that’s why so many of us fall for the untruths that AI tells us. But we have access to many tools that help us evaluate truth. AI is emphatically NOT the right tool for that job. Period.