Executive summary
Glass Health has captured the attention of the clinical world with its promise of "ambient clinical intelligence." It is not just a dictation tool; it is an AI platform that claims to listen to your consultation and generate a structured differential diagnosis (DDx) and management plan in real time, pulling data from the EHR to inform its reasoning.
This "evolving differential" capability is powerful, but for UK clinicians and digital leaders, it raises critical questions about safety and governance. Does it ingest identifiable patient data? Where is that data stored? And how does its advice align with UK-specific pathways like NICE? This review unpacks the features of Glass Health, the essential governance questions you must ask before using it in the NHS, and how it compares to UK-centric, reference-first tools like iatroX.
What it is / isn’t
- What it is: An AI clinical decision support (CDS) platform that offers documentation assistance and diagnostic support. It can draft assessment and plan sections, suggest differentials, and integrate with EHRs to pull patient context.
- Not a replacement: It is not a substitute for clinician judgement, robust clinical governance, or adherence to mandatory UK referral pathways.
The concept clinicians are actually searching for
The term "ambient clinical intelligence" describes a system that does more than transcribe. It "listens" to the clinical narrative as it unfolds—understanding the history, medication list, and exam findings—to construct a reasoning model in the background. It is reasoning support over time, not just speech-to-text.
What Glass says it does
- Differential drafting: It generates a structured differential diagnosis and suggests next steps for investigation and management (Feature Page).
- "Evolving differential": As you take a history or the system ingests new data, the differential updates dynamically (Ambient Page).
- EHR integration: It can pull demographics, history, medications, labs, vitals, and imaging reports directly into the AI platform to inform its outputs.
- API capability: It offers an API for developers to build differential diagnosis and treatment plan generation into other applications.
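Glass's public materials do not document the API's endpoints or schema, so any integration sketch is speculative. As a minimal illustration of the shape such a call might take — the URL, auth scheme, and field names below are all assumptions, not Glass Health's actual API — a request builder could look like:

```python
import json

# Hypothetical sketch: the endpoint, auth scheme, and field names are
# illustrative assumptions, not Glass Health's documented API.
def build_ddx_request(clinical_summary: str, api_key: str) -> dict:
    """Assemble (but do not send) a differential-diagnosis request.

    Only de-identified text should ever leave your environment -- see the
    governance questions in the next section.
    """
    return {
        "url": "https://api.example-cds.test/v1/differential",  # placeholder
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "clinical_summary": clinical_summary,
            "outputs": ["differential", "management_plan"],
        }),
    }

req = build_ddx_request("54M, 2h central chest pain, diaphoretic", "demo-key")
print(req["url"])
```

The point of separating "build" from "send" is governance, not engineering: it makes the exact payload leaving your network inspectable and auditable before transmission.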
The UK clinician’s first question: “Where does patient data go?”
If a tool ingests patient-specific EHR data, your governance bar must be significantly higher than for a simple "reference-only" search tool.
You must ask:
- Storage & Retention: Is patient data processed in the UK/EU? How long is it retained? Is it used to train their models?
- Access Controls: Who has access to this data?
- Clinical Safety: Is there a manufacturer clinical safety case (DCB0129)? Has the Trust completed its own deployment safety assessment (DCB0160)? A Data Protection Impact Assessment (DPIA) is mandatory before any patient data is processed.
- NHS Expectation: Decision-support tools that influence care generally require strong assurance. If it generates a personalised treatment plan, it likely qualifies as Software as a Medical Device (SaMD) under MHRA regulation.
Where Glass could be valuable
- Rapid A&P drafts: For complex medical admissions, having an AI draft a structured Assessment & Plan can be a significant time-saver.
- Prompting “can’t-miss” considerations: The "cognitive forcing function" of seeing a differential list can help prevent premature closure.
- Consistency: It can help standardise the structure and detail of documentation across a team.
- Education: It is a powerful tool for trainees to reflect on their reasoning steps after a case.
Failure modes to discuss frankly
- Automation bias: Real-time suggestions can be persuasive. There is a risk that a tired clinician might accept a plausible-sounding but incorrect plan without sufficient critique.
- Over-smoothing uncertainty: AI can make a messy, complex clinical picture look tidier than it is, potentially masking diagnostic uncertainty.
- EHR context pitfalls: If the problem list or medication history in the EHR is outdated or inaccurate, the AI's reasoning will be flawed ("garbage in, garbage out").
- Governance drift: A tool procured as a "note-taking aid" can easily drift into being used as a diagnostic device, changing its risk profile.
The UK pathway reality check
NICE CKS and NICE Guidelines remain the default reference backbone for UK decisions. An ambient tool might suggest a treatment plan based on US or global evidence that contradicts local funding or antimicrobial stewardship policies. Outputs must always be auditable and verifiable against UK standards.
Glass vs DxGPT vs iatroX (category table)
| Feature | Glass Health | DxGPT | iatroX |
|---|---|---|---|
| Primary Job | Ambient Reasoning + Documentation | DDx Expansion | UK Citation-First Retrieval + Learning |
| Typical Use | Live clinical workflow | Rare-case thinking | "What does UK guidance say?" |
| Data Ingestion | High (EHR integration) | Vignette-based | None (Reference only) |
| UK Readiness | Global Tool | Research/Pilot | Explicit Alignment & UKCA/MHRA Class I |
| Best User | Acute / Hospital | Specialist / Academic | GP / Trainee / Student |
Where iatroX fits (and why it’s different)
iatroX is positioned as a reference-first tool rather than an "EHR-ingesting ambient" platform. This offers a safer, lower-friction adoption path in many UK settings because it does not require processing live patient data.
- Regulatory Status: As per its store listing and website, iatroX is a UKCA-marked, MHRA-registered Class I medical device for informational and educational use.
- Independence: It is a no-ads, independent, clinician-built platform.
- Workflow:
- Use VisualDx or Glass for the open question ("what could it be?").
- Use iatroX Ask for the closed question ("what does the UK guidance say to do next, and show me the citations?").
- Use iatroX Brainstorm to create a structured reflection note for your portfolio.
If your practice/Trust is evaluating ambient AI: the 12-question checklist
- What is the intended purpose? (Is it a scribe or a diagnostic aid?)
- Does it ingest identifiable patient data?
- Is there a clinical safety case (DCB0129)?
- How does it cite sources for its treatment plans?
- How are hallucinations mitigated?
- Where is data processed and stored?
- What data is retained and for how long?
- Is there audit logging of every AI interaction?
- Does it integrate into the system of record?
- How is model drift monitored?
- What is the incident reporting route?
- Who is accountable for the output?
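Digital teams comparing several vendors against these questions may find it useful to track answers as a structured record. The sketch below is illustrative only — the question keys paraphrase the checklist above and are not an official NHS assurance template (e.g. DTAC):

```python
from dataclasses import dataclass, field

# Illustrative only: keys paraphrase the 12-question checklist above;
# this is not an official NHS assurance template.
CHECKLIST = [
    "intended_purpose_defined",        # scribe vs diagnostic aid
    "identifiable_data_ingestion_known",
    "dcb0129_safety_case",
    "sources_cited",
    "hallucination_mitigation",
    "data_residency_known",
    "retention_policy_known",
    "audit_logging",
    "integrates_with_system_of_record",
    "model_drift_monitoring",
    "incident_reporting_route",
    "output_accountability",
]

@dataclass
class VendorAssessment:
    vendor: str
    answers: dict = field(default_factory=dict)  # key -> True/False/None

    def outstanding(self) -> list:
        """Questions not yet answered 'yes' -- each one blocks sign-off."""
        return [q for q in CHECKLIST if self.answers.get(q) is not True]

assessment = VendorAssessment("Example Ambient Tool")
assessment.answers["dcb0129_safety_case"] = True
print(len(assessment.outstanding()))  # prints 11
```

Treating every unanswered question as a blocker (rather than scoring an average) mirrors how clinical safety sign-off actually works: one missing safety case is enough to stop deployment.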
FAQs
- Is Glass Health safe to use in the NHS?
- Only if it has been formally approved by your organisation, with a completed DPIA and clinical safety case. Using it with patient data without approval is a governance breach.
- Does Glass integrate with EHRs?
- Yes, it advertises integration capabilities to pull demographics, history, meds, and labs, but this requires enterprise-level setup.
- Is ambient AI the same as dictation?
- No. Dictation transcribes what you say. Ambient AI "listens" to the conversation between you and the patient and synthesises it.
- What is the UKCA/MHRA bar for clinical software?
- Any software that uses patient data to influence a clinical decision (diagnosis or treatment) is likely a medical device and requires UKCA marking and MHRA registration.
