iatroX, DxGPT, Glass Health, Medwise AI: Responsible clinical AI for differentials & reasoning (a 2025 guide)


AI-driven differential diagnosis tools are here, but not all are safe or suitable for NHS practice. While UK clinicians are broadly positive about AI, adoption is rightly held back by concerns over safety, provenance, and clinical confidence.

A responsible approach must move beyond "AI magic" and focus on auditable, evidence-based tools that support, not replace, clinical judgment. This guide focuses on four platforms leading this charge, each with a different, responsible approach: iatroX (UK-gated knowledge), DxGPT (public-sector differential pilots), Glass Health (ambient decision support), and Medwise AI (NHS-deployed guideline search).

This is a practical guide for the frontline: junior doctors on the acute take, practice nurses in primary care, and AHPs like paramedics and pharmacists who need fast, cited, and safe answers.

The responsibility test: 5 criteria for a safe AI reasoning tool

Before using any AI tool for clinical reasoning in 2025, it must pass a 5-point test:

  1. Provenance-first answers: Does it show its work? The tool must be "gated" or "retrieval-augmented," meaning it draws answers from a trusted knowledge base (NICE, CKS, SIGN, BNF, or local NHS pathways) and provides clear citations.
  2. Structured outputs: It should provide a structured, auditable output, such as a ranked differential, a "don't-miss" red flag list, and suggested initial tests, rather than a single, confident, "black box" answer.
  3. Abstain or hand-off: The tool must be able to state when it is uncertain or when the query is outside its scope, handing full control back to the clinician.
  4. Auditability: You must be able to log the interaction. The platform should allow you to save the clinical prompt and the AI's output, ideally to a CPD log or clinical record.
  5. NHS-aligned assurance: The supplier must comply with NHS standards, including a completed DTAC (Digital Technology Assessment Criteria) and a DCB0129/0160 clinical safety case. Novel diagnostic tools should align with the MHRA's AI Airlock.
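To make criteria 1 and 3 concrete, a "gated", provenance-first lookup can be sketched in a few lines. This is a toy illustration with an invented mini-corpus, not any vendor's implementation: the key properties are that answers can only come from the vetted sources, citations are always returned, and the tool abstains when it finds nothing.

```python
# Toy sketch only: a gated, citation-first lookup that abstains when it
# cannot ground an answer. The corpus entries below are invented examples.

VETTED_CORPUS = {
    "NICE NG136": "Hypertension in adults: diagnosis and management.",
    "CKS Headache": "Assessment of headache, including red-flag features.",
}

def gated_answer(query: str) -> dict:
    """Answer only from the vetted corpus (criterion 1); if nothing
    matches, abstain and hand back to the clinician (criterion 3)."""
    hits = {ref: text for ref, text in VETTED_CORPUS.items()
            if any(word in text.lower() for word in query.lower().split())}
    if not hits:
        return {"answer": None, "citations": [], "abstained": True}
    return {
        "answer": "; ".join(hits.values()),
        "citations": sorted(hits),  # provenance shown with every answer
        "abstained": False,
    }
```

A real platform would use semantic retrieval over full guideline documents rather than keyword matching, but the contract is the same: no citation, no answer.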

Four tools to know in 2025: iatroX, DxGPT, Glass Health, and Medwise AI

iatroX (UK-centric, NHS-friendly)

  • What it is: An AI clinical reference and reasoning layer built specifically for UK practice. It operates over a gated knowledge base of NICE, CKS, SIGN, and BNF guidelines. It includes an "Ask" feature for direct questions, a "Brainstorm" feature for differential diagnosis, and an adaptive "Quiz" for learning.
  • Why it’s responsible: It uses an algorithmic, RAG-style search to find and synthesise answers from vetted UK guidance. Answers are citation-first, directly linking the user to the source material. Learning and queries can be logged to a professional CPD portfolio.
  • Best for: Junior doctors in UK primary or urgent care, advanced clinical practitioners (ACPs), advanced nurse practitioners (ANPs), and pharmacists who need answers grounded in UK policy.
  • Typical output: A structured brainstorm including a 3-5 point differential, "don't-miss" conditions, suggested investigations, and direct links to the UK sources it used.
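The structured, auditable output described above might look something like the following. This is a purely illustrative payload with invented field names and content, not iatroX's actual schema:

```python
# Hypothetical structured-differential output (invented example, not a
# real product schema). Note the citation-first "sources" field.
brainstorm_output = {
    "differential": [
        {"rank": 1, "condition": "Migraine"},
        {"rank": 2, "condition": "Tension-type headache"},
        {"rank": 3, "condition": "Medication-overuse headache"},
    ],
    "dont_miss": ["Subarachnoid haemorrhage", "Giant cell arteritis"],
    "suggested_investigations": ["ESR/CRP if over 50", "Fundoscopy"],
    "sources": ["NICE CKS: Headache - assessment"],
}
```

A ranked list plus an explicit "don't-miss" field is what makes the output auditable: you can see, and record, exactly what was and was not considered.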

DxGPT (Rare/complex, paeds/IM focus)

  • What it is: A GPT-4-based tool designed for a single, narrow task: generating a ranked, top-five differential diagnosis with a supporting rationale. It is being piloted at scale across Madrid’s public health system (SERMAS).
  • Why it’s responsible: It is designed to counter anchoring bias and premature closure by forcing clinicians to consider alternatives. Its use in a large, public-sector deployment provides a model for responsible evaluation. It excels with complex, multi-system cases or rare diseases.
  • Watch-outs: It is not a general-purpose query tool. Its performance can drop with vague or lay-person prompts, and it absolutely requires a clinician-in-the-loop to interpret its suggestions.

Glass Health (Encounter-aware CDS)

  • What it is: An ambient Clinical Decision Support (CDS) platform. It can listen to a patient encounter and evolve a differential diagnosis in real-time. It then uses this reasoning to help draft clinical documentation.
  • Why it’s responsible: It focuses on augmenting the clinician during the encounter, showing its "thinking" as the differential evolves with new information. It exposes the evidence for its suggestions, drawing from PubMed and FDA drug data.
  • Best for: Hospitalists, emergency or urgent care clinicians, and advanced practice nurses who have a high volume of documentation and complex encounters.

Medwise AI (NHS guideline lookup)

  • What it is: A UK-based medical information platform that functions as a powerful AI search layer for national and local guidance. It is already live in over 1,000 NHS organisations.
  • Why it’s responsible: It is an enterprise-grade, NHS-deployed tool. Its primary function is not to invent new reasoning but to find the correct, locally-approved guideline at the point of care. It answers "what does NICE/my trust say?" with high precision.
  • Best for: Nurses, pharmacists, and AHPs who need fast, definitive answers on guideline-specific pathways, drug formularies, or local referral criteria, rather than a wide-open differential.

Role-based playbooks: How to use these tools safely

Junior/resident doctors

  • Widen the net: Start with iatroX Brainstorm or Glass Health on the acute take to widen your differential and check for "don't-miss" diagnoses.
  • Verify first: Immediately click the cited NICE/CKS links from iatroX to verify the guideline steps before acting.
  • Challenge bias: For a puzzling or complex medical patient, run the anonymised summary through DxGPT to see if you have missed a rare metabolic, genetic, or infective possibility, then escalate to your senior.
  • Log everything: Save your prompt and the AI output to your portfolio as a CPD or learning log.

Nurses (community/primary care)

  • Guideline clarification: Use Medwise AI or iatroX Ask for fast, guideline-level answers (e.g., "stepwise management of Type 2 Diabetes," "red flags for new-onset headache").
  • Know when to hand off: If the AI output suggests a higher-acuity pathway or shows uncertainty, treat this as a clear trigger to hand off to a doctor or ACP. Avoid using differential-only tools for unstable patients.

AHPs (paramedics, pharmacists, physios)

  • Citation-first Q&A: Rely on Medwise AI and iatroX for high-stakes, specific questions (e.g., "drug interactions for co-codamol and sertraline," "cauda equina red flags for physios"). The citation is your safety net.
  • Operate within scope: Use differential tools only for conditions within your scope of practice (e.g., a physio brainstorming shoulder pain) and ideally with oversight from a supervising clinician.

Governance, assurance & the NHS view

You cannot adopt a tool simply because it looks good. NHS England's 2025 AI utilisation reports and the new National Commission for AI Regulation make it clear: governance is everything.

  • Why gating matters: Tools trained on the open internet are far more likely to "hallucinate" or invent plausible-sounding errors. Platforms like iatroX and Medwise AI, which only search a trusted database of UK guidelines, materially reduce this risk.
  • What NHSE wants: The NHS is looking for patient-safety cases, data security (DTAC), and clinical safety compliance (DCB0129/0160). Novel tools will be expected to engage with the MHRA's AI Airlock.
  • What you must document: To be safe, your note should include the tool's name/version, your prompt, the AI's output summary, any source links you verified, and your final human decision.
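The documentation fields listed above can be captured as a simple, repeatable record. The sketch below is illustrative only; the field names are hypothetical, and the entry would be saved to whatever CPD log or clinical record your organisation uses:

```python
# Illustrative sketch: a minimal audit-log entry for an AI consultation.
# Field names are hypothetical; adapt them to your local record-keeping.
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone

@dataclass
class AIConsultLog:
    tool: str                # tool name and version
    prompt: str              # the clinical question you asked
    output_summary: str      # short summary of the AI's response
    sources_verified: list   # guideline links you actually checked
    final_decision: str      # your human decision, which may differ
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

entry = AIConsultLog(
    tool="ExampleTool v1.2",
    prompt="Red flags for new-onset headache in a 55-year-old?",
    output_summary="Listed thunderclap onset, focal deficit, papilloedema.",
    sources_verified=["NICE CKS: Headache - assessment"],
    final_decision="Same-day assessment arranged; senior informed.",
)
record = asdict(entry)  # dict ready to save to a CPD log or record
```

The point of the `final_decision` field is that it is yours, not the AI's: the log should show where human judgment confirmed, modified, or overruled the suggestion.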

Conclusion

Used responsibly, AI reasoning tools can enhance clinical judgment, reduce cognitive load, and counter diagnostic bias. Used irresponsibly, they are a significant clinical risk.

The future of safe clinical AI is not a "black box" that gives you the answer. The future is a citation-first, auditable co-pilot that is "gated" to trusted knowledge: a tool that forces you to check the source, enhances your reasoning, and keeps you, the clinician, firmly in the loop.

Calls to action

  • For clinicians: Pick one gated tool (iatroX or Medwise AI) for guideline queries and one differential expander (DxGPT or Glass Health) for reasoning. Trial them on 10 anonymised cases and log your reflections.
  • For NHS leaders: Require "provenance-first" retrieval and full audit logs in every AI procurement. Mirror the public evaluation model from Madrid’s DxGPT pilot to build a UK-specific evidence base.
