Doctors across Bahrain are warning that the rise of AI tools is complicating medical care, as more patients arrive at clinics armed with self-diagnoses drawn from digital assistants, having galloped away in distress with unfounded conclusions.
Physicians say the trend often leads to confusion and extra time spent correcting misconceptions and misinformation, rather than providing suitable care for the actual problem.
University Medical Centre at King Abdullah Medical City senior paediatric consultant Professor Mohammed Elbeltagi told the GDN that using tools like ChatGPT for medical advice is often a double-edged sword that can cause more harm than good.
“While it can be helpful, the risks of misuse are very real and in some cases could be fatal,” he said. “I have seen many contradictory examples in my clinic.
“One example was a mother who panicked over a simple swelling. She came to me in a state of extreme concern after asking her device about her child’s cervical lymph nodes. The chatbot incorrectly suggested the cause was lymphoma.”
After examining the child, Dr Elbeltagi found that the condition was not lymphoma, a group of blood and lymph cancers, but a common chronic inflammation requiring only simple follow-up treatment, and the child was fine.
He cited another instance in which a mother consulted ChatGPT about a rash on her child’s skin. “Due to errors in the data she entered, it suggested her child had Kaposi’s Sarcoma, a condition often linked to HIV,” he said, adding that this led to a ‘complete psychological breakdown’ for the mother.
“These stories highlight a crucial point that ChatGPT is a language model, not a medical professional,” he added.
Dr Elbeltagi also explained that ChatGPT provides information based on patterns in its training data, not on clinical experience, physical examination or an understanding of the patient’s full medical history.
“The reliability of its output is entirely dependent on the input it receives and the user’s ability to interpret it,” he added.
“Just like a powerful horse, it needs to be handled with expertise and caution.”
Awali Hospital consultant endocrinologist Dr Dalal Alromaihi acknowledged that while digital tools and online searches can be harmful, they also have benefits, such as empowering people to become more aware of and engaged in their own care.
However, as Dr Elbeltagi stated, these tools do not ‘replace a thorough medical history, physical examination or the wise judgement that comes from clinical training and experience’.
“I have encountered cases where patients relied heavily on AI-generated or Internet-sourced information,” she said, adding that it has led to anxiety, misinterpretations and resistance to evidence-based recommendations.
“My approach is to acknowledge their effort in seeking information, clarify any misconceptions and explain the reasoning behind my advice in simple language,” she added.
She pointed out that AI can be a valuable tool if used responsibly, with well-crafted prompts and an insistence on drawing information from highly reliable scientific sources.
“I have encouraged some of my patients to use AI to translate information from dietitians into personalised food menus,” she explained.
“What we do not want is for AI to fill the void when patients need more information. I also give my patients my contact details so they can reach out with questions between visits,” she said.
Meanwhile, American Mission Hospital primary care head and general practitioner Dr Babu Ramachandran told the GDN that in the past, patients often relied on Google and Yahoo search engines for self-diagnosis, usually receiving vague or incomplete results.
Today, with more advanced technology like ChatGPT, online tools provide far more detailed information, sometimes to the point of ‘overwhelming’ patients.
“The problem is it creates a sense of fear and panic and makes people rush to the hospital when there is no need,” he said, adding that AI will be ‘the future’ and that doctors and patients must therefore collaborate to adapt to new times. “However, in any medical profession, you need the human element and the personal touch that comes with a doctor and a patient in a consultation.”
julia@gdnmedia.bh