Diagnostic AI in the UK: Medical Device vs Educational Tool (A Clinician’s Boundary Guide)

If you ask an AI to "diagnose this patient" and you act on it, you have just turned that software into a Medical Device. If the software isn't regulated as one, you are technically using an unapproved device.

For the UK clinician in 2026, this distinction is not just semantic; it is medicolegal. With the MHRA's new Post-Market Surveillance rules now fully in force (since June 2025), the scrutiny on how software is used in clinical workflows has never been higher.

This guide explains the regulatory boundary between a "Medical Device" and an "Educational Tool," and how to keep your use of iatroX Brainstorm firmly in the safe zone.

Why this matters now

We are in a transitional moment. The National Commission into the Regulation of AI is currently sitting, and the regulatory "Wild West" of 2023 is coming to an end.

  • The Risk: Clinicians are adopting tools faster than practices can governance-check them.
  • The Reality: You remain professionally accountable for every decision, regardless of whether it came from a textbook, a colleague, or an algorithm.

MHRA in plain English

Under the UK Medical Devices Regulations 2002 (UK MDR 2002, as amended), software becomes a medical device based on its "intended purpose".

  • It IS a Medical Device if: The manufacturer claims it calculates, predicts, or provides a diagnosis for a specific individual to be used for a medical purpose (e.g., "This image shows melanoma").
  • It IS NOT a Medical Device if: It acts as a reference source, educational aid, or generic knowledge retrieval tool that does not perform an interpretive calculation on individual patient data (e.g., "Here are the criteria for melanoma").

The Loophole (and the Trap): Even if a tool labels itself "for education only," using it to make a direct diagnostic decision without independent verification means using it "off-label" — and the liability shifts onto you.

GMC in plain English

The GMC’s updated Good Medical Practice (2024) and learning materials on AI make the professional standard clear:

  1. You are the pilot: "You must be satisfied that the systems you use are safe."
  2. Verification is mandatory: You cannot blame the AI. If you prescribe a drug based on an AI recommendation and it is wrong, it is treated as if you made the error.
  3. Transparency: You must be honest about how you use tools.

The “safe zone” model for diagnostic-support AI

To practice safely, map your AI use to this traffic light system.

| Zone | Status | Usage |
| --- | --- | --- |
| GREEN | ✅ Educational / Support | **Reasoning support.** "List the differential diagnosis for [X]." "What questions should I ask?" "What are the red flags?" This is using AI as a super-powered textbook. |
| AMBER | ⚠️ Interpretive Support | **Pattern recognition.** "Does this photo look like ringworm?" Requires caution. You must verify against a validated atlas (like VisualDx). |
| RED | 🛑 Autonomous Mode | **Direct action.** "Diagnose this patient." "Write the prescription based on this history." Never do this with generic AI. Only use UKCA-marked medical devices for this step. |

Where iatroX Brainstorm sits

iatroX Brainstorm is explicitly engineered to sit in the Green Zone.

  • Educational Aid: We describe Brainstorm as a "clinical reasoning and differential diagnosis explorer."
  • No Definitive Answers: It does not say "The patient has [X]." It says "Consider [X] if [Y] is present."
  • Augmentation, Not Replacement: It is designed to broaden your thinking (the "Brainstorm") so that you can narrow the diagnosis using your clinical judgement and verified investigations.

By using it to generate questions and possibilities rather than answers, you remain the diagnostic authority.

Practical documentation language (copy-paste)

When you use AI support in a complex case, document your "Human-in-the-Loop" process to protect yourself.

Copy this into your notes: "Differential diagnosis broadened using iatroX Brainstorm checklist. Suggestions reviewed; red flags for [Condition A] and [Condition B] excluded via history and exam. Management plan verified against NICE CKS [Topic]."

This statement confirms:

  1. You used the tool as a checklist (Green Zone).
  2. You performed the exclusion (Clinical Judgement).
  3. You verified the plan (Source of Truth).

Want to reason safer, not just faster? Use the Brainstorm tool on iatroX as your compliant clinical wingman.