Need a Diagnosis? Ask Dr. Chatbot: Unveiling a Mystery in the E.R.

A 39-year-old woman presented to the emergency department at Beth Israel Deaconess Medical Center in Boston with left knee pain that had persisted for several days. She had experienced a fever of 102 degrees the day prior, which had since resolved, but she still reported chills. Upon examination, her knee was found to be red and swollen. The doctors in attendance were challenged to provide a diagnosis for the case.

Dr. Megan Landon, a medical resident at the center, posed this real case as part of an educational session aimed at teaching medical students and residents how to think like doctors. Dr. Adam Rodman, an internist, medical historian, and one of the organizers of the event, acknowledged the difficulty doctors face when trying to teach others how they think. To address this challenge, they enlisted the help of GPT-4, the latest version of a chatbot developed by OpenAI.

Artificial intelligence is revolutionizing medicine, and some medical professionals are leveraging these tools to aid in diagnosis. Doctors at Beth Israel Deaconess, an affiliate of Harvard Medical School, sought to explore the potential uses – and potential misuse – of chatbots in training future doctors. The goal was to provide medical students with a resource akin to a curbside consult, where they could turn to GPT-4 and other chatbots for opinions on difficult cases, just as doctors turn to their colleagues for advice.

Traditionally, doctors have been depicted as detectives, collecting clues to solve medical mysteries. In practice, however, experienced doctors rely on pattern recognition to identify the problem. They construct an "illness script" from a patient's signs, symptoms, and test results, which lets them fit the case into a coherent narrative drawn from similar cases they have encountered before.

Dr. Rodman explains that when the illness script fails to lead to a diagnosis, doctors resort to other strategies, such as assigning probabilities to various potential diagnoses. While computer programs have attempted to make medical diagnoses for over 50 years, physicians consider GPT-4 to be different. Dr. Rodman believes that GPT-4 can create an illness script similar to what doctors produce, making it fundamentally distinct from a regular search engine.

The doctors at Beth Israel Deaconess have utilized GPT-4 to obtain possible diagnoses for challenging cases. In a recent study published in JAMA, they found that GPT-4 outperformed most doctors in solving weekly diagnostic challenges from the New England Journal of Medicine. However, they also discovered that there are certain techniques and potential pitfalls that come with using the program.

Dr. Christopher Smith, director of the internal medicine residency program at the medical center, acknowledges that medical students and residents are using GPT-4 but questions whether they are truly learning from it. The concern is that they may rely on AI to make diagnoses without actively engaging in the thought process required for learning and retention.

During the educational session, the students and residents divided into groups to determine the cause of the patient's swollen knee, then turned to GPT-4 for assistance, employing different approaches. One group used GPT-4 to conduct an internet search, much as one would use Google. The chatbot produced a list of potential diagnoses, including trauma, but when asked to explain its reasoning, it fell short. Another group proposed possible hypotheses and asked GPT-4 to evaluate them. The chatbot's list largely mirrored the group's own, suggesting infections such as Lyme disease, arthritis (including gout), and trauma. GPT-4 also mentioned rheumatoid arthritis, although the group had not ranked it highly. Instructors later told the group that gout was unlikely given the patient's young age and sex, and that rheumatoid arthritis could be ruled out because only one joint was inflamed, and only briefly.

As a curbside consult, GPT-4 largely agreed with the students and residents in this exercise, but it offered no additional insights and no illness script. One reason may be that the students and residents approached GPT-4 more like a search engine than a consultant. To use the chatbot correctly, the instructors said, they would need to start by setting the context: telling it they were a doctor seeing a 39-year-old woman with knee pain, then listing her symptoms before asking for a diagnosis and probing the bot's reasoning, as they would in a conversation with a medical colleague. That approach draws out more of what GPT-4 can do, but it also demands awareness of the errors and "hallucinations" the chatbot can produce; using it well means knowing when its answers are wrong.
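For readers curious what that style of questioning looks like in practice, here is a minimal sketch written against the OpenAI Python client. The model name, prompt wording, and setup are illustrative assumptions, not the exact ones used in the Beth Israel Deaconess session.

```python
# A sketch of the "curbside consult" prompting style described above.
# Assumes the OpenAI Python client (pip install openai); the wording is
# illustrative, not the exact text used in the teaching session.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Set the context first, as with a colleague: who the patient is and what
# has been observed, then ask for a differential and probe the reasoning.
case_summary = (
    "I am a doctor seeing a 39-year-old woman who came to the emergency "
    "department with several days of left knee pain. She had a fever of "
    "102 degrees yesterday, now resolved, but she still has chills. On "
    "exam, the knee is red and swollen."
)
question = (
    "What is your differential diagnosis, and what is the reasoning "
    "behind each possibility?"
)

response = client.chat.completions.create(
    model="gpt-4",  # illustrative; any capable chat model would do
    messages=[
        {
            "role": "system",
            "content": "You are a physician colleague giving a curbside consult.",
        },
        {"role": "user", "content": f"{case_summary} {question}"},
    ],
)

# Treat the output as a starting point for discussion, not a verdict:
# the model can err or hallucinate, so every suggestion must be checked.
print(response.choices[0].message.content)
```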

Dr. Byron Crowe, an internal medicine physician at the hospital, stated that using tools like chatbots is acceptable as long as they are used correctly. He drew an analogy to pilots using GPS systems, highlighting the importance of upholding high standards for reliability. According to Dr. Crowe, the same should apply in medicine, as chatbots can serve as valuable thought partners but cannot replace deep expertise.

In the end, the correct diagnosis for the patient with the swollen knee was revealed. As it happened, every group, as well as GPT-4, had suggested it: Lyme disease.
