Dailyhunt
World Health Day 2026: Why AI chatbots won't make good doctors

Deccan Herald 1 week ago

Most people who fall sick no longer book a doctor's appointment right away; many first ask the AI chatbot on their phone about their symptoms.

This World Health Day 2026, the way Indians approach healthcare is quietly changing, with a growing dependence on artificial intelligence.

From puzzling over physical symptoms to verifying medicines, it is not uncommon for people to treat AI as their first-line physician.

However, experts say these chatbots are not ready to act as doctors yet.

While these large language model (LLM) chatbots could theoretically pass a medical exam with ease, they lack the nuance and intuition of a physician in the room.

An expert speaking to The Conversation said the role of artificial intelligence in medicine is real, but it is likely to be more supportive than revolutionary.

Specialised AI versus commercial bots

The healthcare industry is investing heavily in safely integrating artificial intelligence into clinical workflows, creating specialised bots that can process medical information with high speed and accuracy in clinical settings.

These chatbots, however, differ from commercial ones in the data they are trained on.

Specialised AI chatbots are trained on narrow, domain-specific datasets — for instance, a tool built to detect one specific kind of cancer — which lets them achieve high accuracy within that domain.

Commercial bots, by contrast, are built on broader, generic databases and can struggle with clinical accuracy. Yet these are the chatbots most people turn to for evaluating their symptoms, and they offer only generic interpretations.
Missing empathy and context

Diagnosing a health condition in a real clinic is not a matter of producing a straightforward answer; it is a messier process, with plenty of back and forth between the patient and the attending physician.

The white coat sitting across the table is not just answering a patient's questions but also drawing information out of them and placing it in a medical context.

Patients sometimes describe symptoms incompletely and ask questions in an unpredictable order; a doctor in the room often fills these gaps and is trained to read between the lines, said an article published in The Conversation.

The way a physician makes sense of symptoms is subtle, intuitive and far more advanced than simply recognising patterns and interpreting them with nuanced language.

Speaking to DH, Dr Vikram Jeet Singh, a senior consultant in internal medicine at Aakash Healthcare, Dwarka, said: "A real-room diagnosis involves taking a detailed clinical history, physically examining the patient, asking about specific risk factors, and in some cases relying on intuition gained through years of clinical experience. AI bots, such as ChatGPT, cannot conduct any physical examination or detect minor clinical features like changes in skin color, breathing rate, or neural reactions. Such constraints mean AI can hardly distinguish between conditions that might have similar symptoms yet completely different management strategies."

Because some AI chatbots exhibit a user bias, there is a risk of them reinforcing a user's false interpretation or, in other cases, downplaying the need for immediate care in risky situations, said Dr Singh.

More of a data organiser than a verdict

It is common for people to be confused by blood reports and the numbers on them: some values fall within limits, others are flagged. The wait between receiving a report and consulting a doctor is often emotionally taxing, and this is when many rush to AI chatbots to make sense of the report.

According to the expert, artificial intelligence can help translate the findings of a medical report into simple language; however, it cannot place them in context.

"A variation in blood parameters can be of little significance in one patient but a clinical concern in another based on age, comorbid conditions, drugs, and concurrent symptoms. Likewise, the imaging reports frequently need to be associated with the physical findings and the symptom progression, which cannot be properly analyzed by AI alone," said Dr Singh.

Postponing real care

The expert highlighted the risk of AI chatbots indirectly causing users to delay seeing a doctor and, in some cases, to ignore symptoms that need urgent attention.

Moreover, in a real-life diagnosis, a doctor takes responsibility for the decisions made inside the four walls of a clinic — but who will hold an AI chatbot accountable for a missed diagnosis or a misdiagnosis, asked Dr Singh.

"AI systems are unable to conduct a personalized follow-up or emergency intervention when needed," he further said.

The expert stressed that the use of AI chatbots should be limited to organising information, deepening one's understanding of medical conditions, and summarising text.

Disclaimer: This content has not been generated, created or edited by Dailyhunt. Publisher: Deccan Herald