The best use of AI in a consultation may be the one that makes the clinician feel more present, not less. That simple idea — AI that increases human connection rather than replacing it — is the most compelling argument in Heidi's impact data. It also challenges the assumption that "more technology in the consultation" necessarily means "less humanity in the consultation."
The Screen as the Hidden Third Person
Every clinician knows the experience: the patient is talking about their symptoms, their worries, their pain — and the clinician is looking at the screen, typing, coding, navigating menus, and trying to listen at the same time. The screen becomes the hidden third person in the consultation — consuming the clinician's visual attention, fragmenting their cognitive focus, and creating a physical barrier between clinician and patient.
Patients notice. Research consistently shows that patients feel more heard, more respected, and more satisfied when the clinician maintains eye contact. Trust, rapport, and therapeutic alliance — the foundations of effective clinical communication — all depend on the clinician being present with the patient, not present with the keyboard.
The documentation is necessary. The eye contact is necessary. In the traditional consultation, they compete for the same resource: the clinician's attention.
How Ambient Scribes Change the Consultation Dynamic
Ambient scribes like Heidi resolve this competition by removing the typing task from the consultation. The AI listens in the background; the clinician listens to the patient. Documentation happens simultaneously rather than competitively, and after the consultation the clinician reviews and approves the AI-generated draft.
Heidi's data from the Modality Partnership deployment bears this out: over 75% of GPs reported feeling a stronger connection with patients, and patients cited improved eye contact and more personable consultations. One hundred per cent of patients in the Modality evaluation accepted the technology — a notably high acceptance rate for any new tool in a healthcare setting.
The implication is powerful: the AI is not replacing human interaction. It is creating the conditions for more of it. The clinician who is not typing can listen more deeply, observe non-verbal cues more carefully, respond more naturally, and be more fully present during the clinical encounter.
Why Consent and Transparency Remain Central
Patient acceptance of ambient scribing — however high — should not be assumed. NHS England's guidance requires clinicians to inform patients at the start of any session that an ambient scribe is in use. The patient should understand what the tool does, what happens to the recording, and that they can ask questions.
This transparency is particularly important for sensitive consultations — mental health, safeguarding, sexual health, domestic abuse — where patients may be especially wary of being recorded. The clinician should be prepared to proceed without the scribe if the patient objects or if the clinical situation makes recording inappropriate.
Consent and transparency are not barriers to adoption — they are conditions for sustainable adoption. Patients who feel informed and in control are more likely to accept and trust the technology over time.
Why Clinical AI Should Support, Not Replace, the Clinician-Patient Relationship
The Heidi data illustrates a principle that applies to all clinical AI: the best tools are those that make the clinician more effective at being a clinician — not those that make the clinician redundant. A scribe that gives back eye contact makes consultations more human. A clinical knowledge tool that provides a fast, cited answer makes clinical reasoning faster without replacing it. A calculator that quantifies risk supports the clinical decision without making it.
Whether the tool is an ambient scribe like Heidi or a clinical knowledge platform like iatroX, the test is the same: does it help the clinician be more present, more accurate, and more useful to the patient?
Try iatroX — clinical AI that supports the clinician's reasoning, not replaces it →
