Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn’t ready to take on the role of the physician.”

“In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice,” the study’s authors wrote. “One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care.”

  • sheogorath@lemmy.world · 3 days ago

    Yep, I’ve worked on systems like these, and we actually had doctors on our development team to make sure the diagnoses were accurate.

    • ranzispa@mander.xyz · 11 hours ago

      Same. My conclusion is that we have too much faith in medics. Not that Llama models are good at being a medic, but apparently in many cases they will outperform one, especially if the medic isn’t specialized in treating that type of patient. And it does often happen around here that medics treat patients with conditions outside their area of expertise.