Ubie Health, Docus AI, Ada: what UK doctors must know about the AI symptom checkers their patients are using

Introduction: "doctor, I googled my symptoms... and the AI said..."

It’s a scenario now familiar to almost every UK clinician: a patient arrives for their appointment, phone in hand, and begins the consultation with, "So, I used this app, and it said I might have..." The era of "Dr. Google" has evolved. Patients are increasingly turning to sophisticated AI symptom checker platforms like Ubie Health, Docus AI, and the well-established Ada Health before they even book an appointment.

While patients engaging more actively with their health is a positive development, the use of these tools raises important questions about accuracy, patient safety, and professional responsibility. As a clinician, you are not expected to be a software regulator, but understanding the landscape of these patient-facing tools is becoming crucial. This article provides a practical guide for UK clinicians on how these platforms work, how accurate they are, and the safety standards, overseen by the MHRA, that separate a helpful tool from a risky one.

The landscape of symptom checkers

A growing number of AI platforms are now available to the public. When a patient mentions an app they've used, it's likely to be one of these key players:

  • The Well-Known: Ada Health. Ada Health is one of the most established symptom checkers in the UK and EU. It has been available for years and is a good example of a company that has engaged with regulators. Many see it as a more advanced alternative to Babylon Health for patient-facing triage.

  • The New Wave: Ubie Health and Docus AI. Newer entrants are gaining popularity. Ubie Health, a platform originally from Japan, uses an extensive question-and-answer format to guide users towards possible conditions. Other platforms, such as Docus AI and Doctronic, are part of a broader trend in which patients use AI to get health insights, from analysing symptoms to generating reports from their medical records for a second opinion. The common feature is that they take user input and produce a list of possible conditions or suggested next steps.

The critical question: how accurate are they?

The core promise of any AI symptom checker is accuracy. Many companies, including Ubie and Ada, have published studies on their performance, often using metrics like "Top-1 accuracy" (the AI's first suggestion is correct) and "Top-5 accuracy" (the correct diagnosis is within the AI's top five suggestions).
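To make those metrics concrete, here is a minimal, purely illustrative sketch (in Python) of how Top-1 and Top-5 accuracy are typically calculated. The cases and condition names are hypothetical and are not drawn from Ubie's, Ada's, or any other vendor's published data.

```python
# Illustrative sketch of "Top-1" and "Top-5" accuracy.
# The cases and condition names below are hypothetical examples,
# not figures from any published symptom-checker study.

def top_k_accuracy(cases, k):
    """Fraction of cases where the reference diagnosis appears in the AI's top-k suggestions."""
    hits = sum(1 for suggestions, reference in cases if reference in suggestions[:k])
    return hits / len(cases)

# Each case: (AI's ranked suggestions, clinician's reference diagnosis)
cases = [
    (["migraine", "tension headache", "sinusitis"], "migraine"),
    (["GORD", "gastritis", "peptic ulcer", "angina"], "angina"),
    (["viral URTI", "influenza", "COVID-19"], "influenza"),
]

print(f"Top-1 accuracy: {top_k_accuracy(cases, 1):.0%}")  # first suggestion correct (1 of 3 here)
print(f"Top-5 accuracy: {top_k_accuracy(cases, 5):.0%}")  # correct answer anywhere in the top five (3 of 3 here)
```

As the toy example shows, a tool can report a high Top-5 figure while its first suggestion is correct far less often, which is why the headline metric a study uses matters.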

While some studies show impressive results, it is vital for clinicians to understand that:

  1. Accuracy varies widely between different tools and for different conditions.
  2. Statistical accuracy is not a clinical diagnosis. An AI identifies patterns and correlations in data; it does not conduct a physical examination, understand a patient's full context, or apply clinical judgment. It's a statistical guide, not a medical conclusion.

The UK safety standard: is it a medical device?

This is the most important question from a UK regulatory and safety perspective. The Medicines and Healthcare products Regulatory Agency (MHRA) has a clear position: if an app uses patient-specific data to suggest a diagnosis, triage the user to a specific level of care, or recommend a treatment, it is legally classified as a medical device.

This is not a minor detail. For an app to be legally marketed as a medical device in Great Britain, it must bear a UKCA mark (or a CE mark that remains accepted under the current transitional arrangements). This mark is a declaration by the manufacturer that they have:

  • Met rigorous safety and performance standards.
  • Implemented a quality management system.
  • Appointed a clinical safety officer.
  • Established processes for risk management and post-market surveillance.

As a clinician, the presence or absence of a UKCA mark on a diagnostic or triage app is a key indicator of its manufacturer's commitment to quality and safety under UK law.

How to counsel your patients

When a patient brings up an AI symptom checker, it presents an opportunity for education. Rather than dismissing their research, you can guide them on how to use these tools more safely.

  • Advise on purpose, not diagnosis: Encourage patients to use these tools to organise their thoughts and symptoms before an appointment. An AI-generated list of symptoms can be a helpful starting point for a consultation, but it should not be treated as a self-diagnosis.
  • Encourage transparency: Advise patients to be curious about the tools they use. Encourage them to choose apps from identifiable companies that are transparent about their regulatory status (i.e., whether they are UKCA marked) and the evidence behind their suggestions.

Conclusion: the clinician's role vs. the patient's tool

Patient-facing AI symptom checkers are here to stay, and their sophistication will only increase. They can empower patients to be more engaged in their health, but they are fundamentally different from the professional-grade tools used by clinicians.

These tools represent one side of the coin. While your patients use symptom checkers to explore 'what' they might have, iatroX is the tool designed for you, the clinician, to determine 'what to do next' based on trusted, traceable UK national guidance from NICE, CKS, and the BNF. It is the professional's co-pilot for evidence-based decision-making.