NHS England has published and updated specific guidance for AI-enabled ambient scribing products used across health and care settings in England. Originally published in April 2025 and updated to Version 2 in April 2026, the guidance covers implementation, clinical safety, information governance, and integration. A separate IG guidance — reviewed by the ICO and National Data Guardian — was published with a template DPIA in March 2026. This is a nationally guided category with a supplier registry, structured governance expectations, and large-scale deployment support.
For clinicians using or considering ambient voice technology (AVT), understanding the governance framework is as important as understanding the technology itself.
What Is Ambient Voice Technology?
Ambient voice technology is a system that listens to a clinical consultation in the background — without requiring the clinician to dictate or type — processes relevant parts of the conversation, and produces structured outputs. Those outputs may include a consultation summary, clinical note, referral letter, patient message, SNOMED CT coding suggestions, or task lists.
NHS England defines these tools as AI-enabled ambient scribing products used for clinical or patient documentation and workflow support, including advanced ambient voice technologies (AVTs).
The technology is operational at scale. Tandem Health powers Accurx Scribe for 200,000+ NHS staff. Tortus AI integrates with EMIS and SystmOne across 3,500+ practices. Heidi Health is widely used in UK and Australian primary care. Nuance DAX/Dragon Copilot operates within the Microsoft ecosystem. Multiple suppliers are listed on the NHS England AVT Supplier Registry.
A major NHS England-sponsored study demonstrated a 23.5% increase in direct patient interaction time and an 8.2% reduction in overall appointment length. East Lancashire Hospitals NHS Trust is hosting a national programme to deploy AVT across the NHS in England, reflecting system-level ambition.
Consent and Transparency
The governance question clinicians ask most frequently: "Do I need consent?"
NHS England's IG guidance provides a nuanced answer. Explicit consent is not legally required for using ambient scribes in individual care, provided the processing has an appropriate legal basis under UK GDPR — typically Article 6(1)(e) (public task) and Article 9(2)(h) (health or social care purposes). However, transparency is essential and non-negotiable.
Clinicians must inform patients at the start of any session that an ambient scribe is in use. NHS England states this is necessary because "ambient scribes are a new technology, and it is not reasonable to assume that people will know they are being used without being informed." Suggested wording is provided: "During your appointment today I will be using an ambient scribe to help me to take notes. It's a tool that will record our conversation and then automatically take notes about what we have talked about."
Patients should be told what the tool does, what happens to the recording, and that they can ask questions. Information must also be included in the organisation's privacy notice. If a patient objects, the clinician should proceed without the scribe — this should be logistically straightforward and must not create pressure to accept.
The transparency requirement is both a legal obligation and a trust-building measure. Patients who discover retrospectively that AI was listening during their consultation will understandably feel their privacy was violated — even if the processing was technically lawful. Proactive disclosure builds the trust that sustainable AVT adoption requires. This is particularly important for sensitive consultations — mental health, safeguarding, sexual health, and domestic abuse contexts — where patients may be especially wary of being recorded.
Information Governance
Data processing agreements. Organisations must document arrangements with all parties involved in sending, receiving, or using data. NHS England provides a template data processing agreement.
Data minimisation. Audio recordings and transcripts should typically be deleted once a verified summary has been produced. Retention requires documented justification — such as quality assurance — and must comply with organisational retention policies.
DPIA. Required before deployment. NHS England published a template DPIA specifically for AI-enabled ambient scribes in March 2026. The ICO provides an additional AI and data protection risk toolkit.
Staff awareness. Staff must understand permitted uses, verification requirements, individual rights compliance, and escalation procedures.
Data processing roles. Organisations must establish whether the AVT supplier acts as data processor or data controller — and ensure contracts reflect this accurately.
Privacy notices. AVT use must be included in the patient-facing privacy notice — not just communicated verbally during individual consultations.
Clinical Safety
Clinical safety case. NHS England references DCB 0129 (manufacturer) and DCB 0160 (deploying organisation) standards. Organisations should understand clinical hazards specific to their deployment and maintain a documented safety case.
Human review is mandatory. Healthcare professionals retain full responsibility for accuracy. Every AI-generated output must be checked and corrected before being saved. Additional validation is recommended in complex scenarios — translation, multi-party consultations, safeguarding contexts, and any situation where nuance or sensitivity is heightened.
Error reporting. Organisations need clear processes for clinicians to report errors. These should feed into the clinical safety case and inform ongoing hazard management.
Audit trails. Tracking what the AI generated, what the clinician modified, and what was saved supports clinical governance, quality improvement, and medico-legal defensibility.
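An audit trail of the kind described needs three things per entry: the AI-generated draft, the version the clinician saved, and enough metadata to attribute and date the change. A minimal sketch using the standard library (field names are illustrative):

```python
import difflib
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ScribeAuditEntry:
    """One audited write-back: AI draft vs. what was saved (illustrative)."""
    clinician_id: str
    ai_generated: str   # output as produced by the scribe
    saved: str          # version committed to the clinical record
    timestamp: str

    def modifications(self) -> list[str]:
        """Unified diff between the AI draft and the saved note."""
        return list(difflib.unified_diff(
            self.ai_generated.splitlines(),
            self.saved.splitlines(),
            lineterm="",
        ))

entry = ScribeAuditEntry(
    clinician_id="gmc-1234567",
    ai_generated="Impression: asthma.",
    saved="Impression: possible asthma; review in 2 weeks.",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
assert entry.modifications()  # non-empty diff records the clinician's correction
```

Storing the diff (or both full versions) is what makes the trail useful medico-legally: it shows not only that human review happened, but what it changed.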
Integration with EHR Systems
When an AI scribe writes back into EMIS or SystmOne, it creates entries in the permanent clinical record. NHS England's updated April 2026 guidance specifically emphasises that safe integration with existing clinical systems is critical.
The risks of autopopulating the record are specific and consequential. A coding suggestion mapping "possible asthma" to confirmed "asthma" creates a false diagnostic entry affecting disease registers, recalls, prescribing, and insurance reports. A note including examination findings not actually performed creates medico-legal liability. A safety-netting message omitting key red flags creates clinical risk even if the note is professionally formatted.
Integration is where AVT moves from convenient to consequential — and where verification becomes a patient safety requirement.
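The "possible asthma" to confirmed "asthma" failure mode described above can be partially caught mechanically: before a suggested code is written back, flag cases where the source text hedges a diagnosis that the coded term asserts without qualification. A crude sketch — the keyword list and function are illustrative heuristics, not a clinical-grade safeguard, and no such check substitutes for human review:

```python
UNCERTAINTY_MARKERS = ("possible", "probable", "suspected", "query", "likely")

def flag_uncertain_diagnosis(source_text: str, suggested_term: str) -> bool:
    """Return True if the consultation text hedges the diagnosis but the
    suggested coded term states it as confirmed (illustrative heuristic)."""
    text = source_text.lower()
    term = suggested_term.lower()
    if term not in text:
        return False  # term never mentioned: needs human review regardless
    hedged_in_source = any(f"{m} {term}" in text for m in UNCERTAINTY_MARKERS)
    confirmed_in_code = not any(m in term for m in UNCERTAINTY_MARKERS)
    return hedged_in_source and confirmed_in_code

assert flag_uncertain_diagnosis(
    "Impression: possible asthma, trial of salbutamol.", "asthma"
)
assert not flag_uncertain_diagnosis("Confirmed asthma on spirometry.", "asthma")
```

A flag like this would route the entry back to the clinician rather than block the write-back: the aim is to surface the discrepancy, not to automate the judgement.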
The Unresolved Question: Documentation or Decision Support?
As scribes evolve to include coding suggestions, referral drafting, investigation prompts, and safety-netting generation, the boundary between documentation and clinical decision support blurs. A coding suggestion is a recommendation. A referral draft includes clinical prioritisation. A safety-netting prompt is clinical advice. The regulatory classification may need to evolve alongside the technology.
For clinicians: treat every AI-generated output as a draft requiring verification, regardless of how polished or confident it appears.
Where iatroX Fits
Documentation tools reduce admin. Clinicians still need a trusted place to check guidance, calculate risk, and record learning. AVT captures the consultation. iatroX helps answer the clinical question. CPD captures the learning. Calculators support verification.
