Researchers tested different medical scenarios with the chatbot. In more than half of cases in which doctors would send patients to the ER, the chatbot said it was OK to delay care.

ChatGPT Health — OpenAI’s new health-focused chatbot — frequently underestimated the severity of medical emergencies, according to a study published last week in the journal Nature Medicine.

In the study, researchers tested ChatGPT Health’s ability to triage — that is, to assess the severity of — medical cases based on real-life scenarios.

Previous research has shown that ChatGPT can pass medical exams, and nearly two-thirds of physicians reported using some form of AI in 2024. But other research has shown that chatbots, including ChatGPT, don’t provide reliable medical advice.

  • qjkxbmwvz · 4 days ago

    Lemmy, you’re absolutely right to be concerned about a gunshot wound — GSW for short — to the head! Let’s dig in a little more and see why this isn’t as bad as it sounds:

    • The brain is in the head, and this is where thinking happens — but thinking isn’t required to sustain life, so it’s relatively safe to ignore this type of injury.
    • The brain has no pain receptors, so this type of injury typically doesn’t hurt.
    • Seeking medical attention for minor injuries such as a GSW to the head takes away valuable medical resources from more important procedures, such as penile enlargement surgery.

    I hope that clarifies things. Would you like more information on the topic?