Key takeaways
- AI-enabled devices in the consultation room — scribe wearables (Heidi Remote, Plaud NotePin), AI stethoscopes (Eko CORE 500), AI ECG readers (PMcardio) — all process patient data in ways that create consent obligations, and in most cases require explicit, informed consent.
- Consent requirements differ by jurisdiction and by device type. Recording a conversation (ambient scribing) has different consent implications from performing an AI-augmented physical examination (AI stethoscope) or photographing an ECG (PMcardio).
- In the UK, the NHS England guidance on ambient scribes (April 2025) mandates explicit patient consent, human verification of all AI-generated outputs, and clear privacy documentation. The GMC's confidentiality guidance and the ICO's GDPR framework underpin these requirements.
- In the US, HIPAA governs the handling of protected health information (PHI), and the AMA's AI guidance emphasises transparency and patient autonomy. State-level recording consent laws (one-party vs two-party) add complexity.
- The most important principle is that consent must be meaningful, not performative. A laminated sign in the waiting room may help, but it is not sufficient. The patient must understand what the device does, what happens to their data, and how to decline.
- For clinicians who want AI-powered clinical support without any patient data capture — no recording, no audio processing, no device-mediated data collection — iatroX provides free, guideline-grounded clinical reference and reasoning support that involves no patient-identifiable information in normal use.
Why consent matters more now
The arrival of AI-enabled physical devices in the consultation room changes the consent landscape in ways that software-only tools did not.
When a clinician uses iatroX to check a guideline, or looks up a drug interaction on the BNF via the iatroX Knowledge Centre, no patient data is captured or processed. The interaction is between the clinician and the knowledge base. No consent is required because no patient information enters the system.
When a clinician uses an AI scribe wearable, the dynamic is fundamentally different. The device is recording the patient's voice, processing their words, and generating a clinical document from the encounter. The patient's health information — often highly sensitive — is captured, transcribed, transmitted, and stored by a third-party technology provider.
Similarly, an AI stethoscope captures and analyses heart sounds and ECG data. An AI ECG app photographs and interprets a patient's electrocardiogram. A handheld ultrasound with AI captures and stores medical images.
Each of these involves processing special category data under GDPR (health data, Article 9) or protected health information under HIPAA. The resulting transparency and consent requirements are not optional; they are legal obligations.
UK: what the guidance says
NHS England guidance on ambient scribes (April 2025)
The NHS England guidance on ambient voice technology (AVT) is the most specific UK framework for AI-enabled recording in clinical settings. Its key requirements:
- Explicit patient consent must be obtained before any recording begins. The patient must be informed that an AI tool will be listening to and processing the consultation.
- The right to decline must be clearly communicated and respected without any detriment to the patient's care. If a patient declines, the consultation proceeds without the AI tool.
- Human verification is mandatory. Every AI-generated clinical note must be reviewed and approved by the clinician before it enters the patient record.
- A Data Protection Impact Assessment (DPIA) must be completed by the deploying organisation.
- The tool must be registered — the guidance references the need for MHRA registration and compliance with the Digital Technology Assessment Criteria (DTAC).
GMC confidentiality guidance
The GMC's Confidentiality: good practice in handling patient information (updated 2024) establishes the foundational principles:
- Patients have a right to expect that information about them will be held in confidence.
- Where information is shared with third parties (including technology providers), patients should be informed and, where practicable, their consent obtained.
- Clinicians must be satisfied that the security arrangements for any technology they use are adequate to protect patient information.
The GMC does not yet have AI-specific guidance for clinical devices, but the existing confidentiality framework applies directly. A scribe wearable that transmits patient audio to a third-party cloud platform is sharing patient information with a third party — this must be disclosed to the patient.
ICO and GDPR
Under UK GDPR:
- Health data is special category data (Article 9). Processing requires both an Article 6 lawful basis and an Article 9 condition, typically Article 9(2)(h) (provision of healthcare) for clinical use.
- Patients must be informed about the processing through a privacy notice that covers: what data is collected, the purpose of processing, who the data is shared with (including the AI vendor), where the data is stored, how long it is retained, and the patient's rights (access, erasure, objection). These fields are sketched as a simple checklist after this list.
- A DPIA is required for any processing that is likely to result in a high risk to individuals — which AI processing of health consultation audio almost certainly does.
- Data minimisation applies: only the minimum data necessary for the purpose should be collected and retained.
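For practices that track privacy documentation in a structured way, those transparency items can be expressed as a simple checklist that flags incomplete notices. A minimal sketch in Python; the field names are illustrative, not an official ICO schema:

```python
from dataclasses import dataclass, fields

@dataclass
class PrivacyNoticeEntry:
    """One AI tool's entry in a practice privacy notice (illustrative fields)."""
    data_collected: str      # e.g. "consultation audio and transcript"
    purpose: str             # e.g. "AI-assisted clinical note drafting"
    recipients: str          # who the data is shared with, including the AI vendor
    storage_location: str    # e.g. "UK/EU data centre"
    retention_period: str    # e.g. "audio deleted after note generation"
    patient_rights: str      # access, erasure, objection

def missing_fields(entry: PrivacyNoticeEntry) -> list[str]:
    """Return any transparency fields left blank, so the notice can be flagged."""
    return [f.name for f in fields(entry) if not getattr(entry, f.name).strip()]

notice = PrivacyNoticeEntry(
    data_collected="consultation audio and transcript",
    purpose="AI-assisted clinical note drafting",
    recipients="clinical team; scribe vendor (data processor)",
    storage_location="",  # not yet confirmed with the vendor
    retention_period="audio deleted after note generation",
    patient_rights="access, erasure, objection via the practice",
)
print(missing_fields(notice))  # ['storage_location'] -> notice is not ready to publish
```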
US: what the guidance says
HIPAA
Under HIPAA:
- The AI vendor is typically a Business Associate and must have a signed Business Associate Agreement (BAA) with the covered entity (the practice or health system).
- Patient audio, transcripts, and AI-generated notes constitute protected health information (PHI).
- The HIPAA Privacy Rule requires that patients receive a Notice of Privacy Practices (NPP) that describes how their PHI is used and shared. The use of AI scribe tools should be reflected in this notice.
- The HIPAA Security Rule mandates administrative, physical, and technical safeguards for electronic PHI — including encryption, access controls, and audit trails.
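At its core, the audit-trail safeguard is a log of who touched which PHI record, when, and how. A minimal sketch of one such entry; the helper and field names are hypothetical, not a HIPAA-mandated schema, and a real system would pair this with access controls and encryption at rest and in transit:

```python
import json
from datetime import datetime, timezone

def phi_audit_event(user_id: str, action: str, record_id: str) -> str:
    """Serialise one audit-trail entry for access to electronic PHI (illustrative)."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,      # who accessed the record
        "action": action,        # e.g. "view", "edit", "export"
        "record_id": record_id,  # which PHI record was touched
    }
    return json.dumps(event)

print(phi_audit_event("dr_smith", "view", "ecg-2026-0042"))
```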
AMA guidance on AI
The AMA's Augmented Intelligence in Health Care policy (updated 2023) emphasises:
- Transparency: Patients should be informed when AI is being used in their care.
- Patient autonomy: Patients should have the ability to consent to or decline the use of AI tools.
- Physician oversight: AI outputs should be subject to human review and clinical judgement.
- Data privacy: AI tools should adhere to existing privacy and security frameworks.
State recording consent laws
This is a critical and often overlooked dimension for US clinicians using ambient scribe wearables.
US states are divided into one-party consent and two-party (all-party) consent jurisdictions for recording conversations:
- One-party consent states (e.g., New York, Texas, Ohio): Only one party to the conversation needs to consent to the recording. The clinician's consent is sufficient.
- Two-party consent states (e.g., California, Florida, Illinois, Washington, Maryland, Massachusetts, and others): All parties must consent to the recording. This means the patient must affirmatively consent before any ambient recording begins.
In two-party consent states, recording a patient consultation without explicit consent is not just a HIPAA concern — it is potentially a criminal offence under state wiretapping laws. Clinicians in these states must obtain and document patient consent before activating any ambient recording device.
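As a thought experiment, the gate a scribe app might apply before activating its microphone can be sketched in a few lines of Python. The state set below is partial and illustrative (drawn from the examples above), not legal advice; a cautious deployment would simply require documented patient consent everywhere:

```python
# Partial, illustrative set of all-party consent states; verify current law
# for your own jurisdiction before relying on any such list.
TWO_PARTY_CONSENT_STATES = {"CA", "FL", "IL", "WA", "MD", "MA"}

def may_start_recording(state: str, patient_consented: bool) -> bool:
    """Gate ambient recording on state recording law.

    In all-party consent states the patient must have affirmatively consented;
    elsewhere the clinician's own consent satisfies the wiretap statute, though
    HIPAA and good practice still call for informing the patient.
    """
    if state.upper() in TWO_PARTY_CONSENT_STATES:
        return patient_consented
    return True

assert may_start_recording("TX", patient_consented=False)      # one-party state
assert not may_start_recording("CA", patient_consented=False)  # all-party state
```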
Canada and Australia
Canada (PIPEDA + Provincial legislation)
Under PIPEDA and applicable provincial health privacy legislation (e.g., Ontario's PHIPA, Alberta's HIA, British Columbia's PIPA):
- Consent for the collection, use, and disclosure of personal health information must be informed and meaningful.
- The Canadian Medical Association (CMA) has issued guidance on AI in healthcare emphasising transparency, accountability, and patient engagement.
- Some provinces require that health information remain within Canada. If an AI device syncs data to servers outside the country, this may create a compliance issue.
Australia (Privacy Act + Australian Privacy Principles)
Under the Australian Privacy Principles (APPs):
- APP 3 (Collection): Health information should only be collected with the individual's consent, unless an exception applies.
- APP 5 (Notification): The individual must be notified about the collection, including its purpose, who it may be shared with, and how to access or correct it.
- AHPRA (the Australian Health Practitioner Regulation Agency) has not issued AI-specific guidance for clinical devices, but existing professional standards of transparency and informed consent apply.
Practical consent scripts for clinicians
For ambient scribe wearables (Heidi Remote, Plaud, Accurx Scribe, etc.)
Before the consultation begins:
"Before we start, I'd like to let you know that I use an AI-assisted tool to help with my clinical notes. It listens to our conversation and creates a draft of the consultation notes, which I then review and check before saving to your record. No audio is stored permanently — only the written notes, which I verify personally. You're completely free to say no, and it won't affect your care in any way. Would you be comfortable with that?"
If the patient declines:
"Absolutely, no problem at all. I'll take notes the usual way. Let's carry on."
Key principles: Name the tool's function (not the brand). Emphasise human review. Make the opt-out frictionless and consequence-free.
For AI stethoscopes (Eko CORE 500)
The consent dynamic here is different. An AI stethoscope performs a physical examination — something patients already expect. The AI analysis is an enhancement of the standard examination, not a new type of data capture.
During the examination:
"I'm going to listen to your heart and lungs with a digital stethoscope. It has some built-in technology that helps me pick up sounds I might otherwise miss. Is that OK with you?"
Most patients will not distinguish between an AI-enhanced stethoscope and a traditional one during a routine exam. However, if the device records and stores audio, or transmits data to a cloud service, the privacy notice should reflect this.
For AI ECG interpretation (PMcardio)
"I'm going to take a photo of your ECG and run it through an AI tool that helps me check the interpretation. The image is encrypted and processed securely. Is that alright?"
For handheld ultrasound (Butterfly iQ3)
Standard ultrasound consent practices apply. If the images are stored in a cloud service (as Butterfly's platform enables), the privacy notice should reflect this.
Common patient concerns and how to address them
"Is it recording everything?"
"The tool listens during our consultation to help me create accurate notes. It doesn't record outside of our appointment, and I review everything before it goes into your record."
"Where does my data go?"
"The audio is processed by a secure, healthcare-certified system that complies with [UK data protection law / HIPAA / Australian Privacy Principles]. Your data is encrypted and is not shared with anyone outside your care team."
"Can I see what it wrote?"
"Absolutely. You have the right to see your medical records, including any notes generated with AI assistance. Just ask at reception."
"What if I don't want it?"
"That's completely fine. It won't change anything about your appointment. I'll just take notes the traditional way."
"Is the AI making decisions about my care?"
"No. The AI helps me with documentation and can flag things for me to look at more closely, but every clinical decision is mine. I always review and check everything."
Updating your practice's privacy documentation
If you introduce any AI-enabled device that captures, processes, or stores patient data, the following documents need updating:
1. Privacy notice / Fair processing notice
Add a clear statement about the use of AI tools. Example:
"We may use AI-assisted technology during consultations to help clinicians create clinical notes, enhance diagnostic examinations, or interpret test results. These tools process your health information securely and in accordance with [UK GDPR / HIPAA]. You will always be asked for your consent before any AI tool is used, and you may decline without any impact on your care. All AI-generated outputs are reviewed by a clinician before being added to your record."
2. Data Protection Impact Assessment (DPIA) — UK
Required under UK GDPR for processing likely to result in a high risk to individuals. The DPIA should cover the points below; a simple completeness check is sketched after the list.
- Description of the processing (what data, what device, what vendor)
- Necessity and proportionality assessment
- Risks to individuals (data breach, inaccurate AI outputs, device loss)
- Mitigation measures (encryption, consent process, human verification, vendor BAA/DPA)
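Treating the DPIA's risk register as data makes it easy to surface unmitigated risks before sign-off. A minimal sketch with illustrative entries; this is not an ICO template:

```python
# Each identified risk maps to its agreed mitigation; None marks an open item.
risks = {
    "data breach in transit": "end-to-end encryption",
    "inaccurate AI output entering the record": "mandatory clinician review",
    "device loss or theft": "device encryption and remote wipe",
    "vendor processing beyond the agreed purpose": None,  # no mitigation agreed yet
}

unmitigated = [risk for risk, mitigation in risks.items() if mitigation is None]
if unmitigated:
    print("DPIA incomplete; unmitigated risks:", unmitigated)
```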
3. Notice of Privacy Practices (NPP) — US
Update to reflect the use of AI tools and the involvement of any new Business Associates.
4. Patient-facing communications
Consider adding a brief, plain-English notice in:
- The waiting room (poster or digital screen)
- The practice website
- New patient registration materials
- Appointment confirmation messages
A framework for evaluating consent requirements by device type
| Device type | Data captured | Consent model | Key regulatory requirement |
|---|---|---|---|
| Ambient scribe wearable (Heidi Remote, Plaud, Accurx Scribe) | Patient voice, conversation content | Explicit verbal consent before each consultation (or session-level consent with clear opt-out) | UK: NHS England AVT guidance, DPIA. US: HIPAA + state recording laws. |
| AI stethoscope (Eko CORE 500) | Heart/lung sounds, ECG data | Standard examination consent (enhanced by privacy notice if data is stored/transmitted) | UK: MHRA status, privacy notice. US: FDA clearance, HIPAA. |
| AI ECG interpretation (PMcardio) | ECG image (photographed) | Informed consent for photography and AI processing | UK/EU: CE certification (Class IIb). US: FDA status. |
| Handheld ultrasound (Butterfly iQ3) | Ultrasound images | Standard imaging consent (enhanced by privacy notice if cloud-stored) | UK: MHRA status. US: FDA clearance, HIPAA. |
| Clinical reference tool (iatroX) | None (no patient data in normal use) | No patient consent required | UK: MHRA-registered. |
This framework illustrates a critical distinction: not all clinical AI tools require patient consent. Tools that operate on clinical knowledge (guidelines, evidence, educational content) rather than patient data — like iatroX, Ask iatroX, and the iatroX Knowledge Centre — involve no patient-identifiable information and therefore do not trigger consent requirements.
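That distinction reduces to a single triage question: does the tool capture patient data at all? A minimal sketch of the triage, with simplified, illustrative device categories from the table above:

```python
# Maps device categories to the patient data they capture; None means the tool
# operates on clinical knowledge only. Simplified illustration, not a
# compliance determination for any specific product.
DEVICE_DATA_CAPTURE = {
    "ambient_scribe": "voice and conversation content",
    "ai_stethoscope": "heart/lung sounds and ECG data",
    "ai_ecg_app": "photographed ECG image",
    "handheld_ultrasound": "ultrasound images",
    "clinical_reference": None,  # e.g. guideline lookup: no patient data in normal use
}

def consent_required(device: str) -> bool:
    """Patient consent is triggered whenever the device captures patient data."""
    return DEVICE_DATA_CAPTURE[device] is not None

for device in DEVICE_DATA_CAPTURE:
    status = "consent required" if consent_required(device) else "no patient consent needed"
    print(f"{device}: {status}")
```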
Special considerations
Paediatrics
For children under 16 (UK) or under 18 (US), consent for AI device use must be obtained from a parent or legal guardian — unless the child is assessed as Gillick competent (UK) or meets mature minor doctrine criteria (US). The additional complexity of explaining AI tools to parents, particularly in time-pressured consultations, should not be underestimated.
Mental health and sensitive specialties
In psychiatry, psychotherapy, substance misuse, and sexual health settings, the sensitivity of the conversation content is especially high. Ambient recording may be perceived as particularly intrusive, and refusal rates may be higher. Clinicians in these specialties should consider whether the efficiency gain from ambient scribing justifies the potential impact on therapeutic rapport.
Patients who lack capacity
For patients who lack capacity to consent (e.g., due to cognitive impairment, acute confusion, or learning disability), the decision to use AI recording devices must be made in the patient's best interests, in accordance with the Mental Capacity Act 2005 (UK) or equivalent legislation. A best-interests assessment should be documented.
Multilingual consultations
AI scribe tools vary in their multilingual capability. Heidi claims support for 110+ languages; others may be less comprehensive. If a consultation is conducted through an interpreter, consent must be obtained from the patient via the interpreter. Clinicians should also weigh the privacy implications of a third-party AI tool processing a consultation in a language they may not fully understand.
Our view
Consent for AI devices in clinical practice is not a box-ticking exercise. It is a conversation — one that must be honest, accessible, and genuinely voluntary.
The good news is that most patients, when properly informed, are comfortable with AI tools that improve their care. The evidence from NHS scribe pilots, Heidi's UK deployments, and US primary care studies consistently shows that patient acceptance is high when the consent process is transparent and the opt-out is genuine.
The bad news is that the consent landscape is fragmented, jurisdiction-specific, and rapidly evolving. Clinicians are expected to navigate NHS England guidance, GMC principles, ICO requirements, MHRA classifications, and (in the US) a patchwork of federal and state laws — all while running a busy clinic.
The practical advice is straightforward:
- Tell the patient what you are doing, in plain language, before you do it.
- Make the opt-out easy and consequence-free.
- Update your privacy documentation.
- Complete a DPIA (UK) or update your NPP (US).
- Choose tools with transparent data handling — and verify their claims independently.
And for the parts of your AI toolkit that do not involve patient data at all — clinical reference, guideline lookup, exam preparation, differential diagnosis reasoning — tools like iatroX let you harness the power of AI without any of the consent complexity.
Related reading on iatroX
- When AI leaves the screen: how physical devices are changing the clinical consultation in 2026
- Heidi Remote: what a dedicated AI scribe hardware device means for clinical documentation
- On-device clinical AI: why Heidi Remote and offline-first scribes matter for data privacy
- NHS AI scribes 2025: a compliant buyer's guide (MHRA, DTAC, DSPT)
- AI in UK healthcare: understanding trust, transparency, and tools like iatroX
- MHRA regulations on medical device apps: what UK clinicians need to know
- Risks of general AI for medical advice in the UK
