On-device clinical AI: why Heidi Remote and offline-first scribe tools matter for data privacy

Key takeaways

  • On-device clinical AI means that speech-to-text processing (and potentially more) happens on the physical hardware the clinician carries, rather than being streamed to a cloud server during the consultation. Heidi Remote, launched today, is the most prominent example in the clinical AI scribe market.
  • The privacy benefit is real but bounded. On-device processing eliminates the risk of audio data being intercepted during transmission or residing on third-party cloud servers during the most sensitive phase — the live consultation. But once the transcript is synced to the cloud for AI note generation, the downstream privacy model is identical to any cloud-based scribe.
  • Device loss is the new data breach vector. A cloud-only scribe cannot be "lost in a corridor." A physical device containing a day's worth of encrypted patient consultations can be. Organisations must plan for this.
  • Regulatory frameworks (GDPR, HIPAA, APP, PIPEDA) do not yet specifically address dedicated on-device clinical AI hardware. Existing principles apply — but the novelty of the form factor means Data Protection Impact Assessments and clinical safety cases need updating.
  • Clinicians should assess the entire data lifecycle, not just the on-device phase. Where does data go after sync? Who has access? Is it used for model training? What are the retention and deletion policies? These questions matter more than the hardware's marketing pitch.

The shift to on-device: what has changed?

For the past three years, the dominant architecture for clinical AI scribes has been cloud-first. The clinician's device (usually a laptop, phone, or tablet) captures audio, either streams it in real time or uploads it shortly after the session, and a cloud-based pipeline — typically involving an automatic speech recognition (ASR) model followed by a large language model (LLM) — produces the clinical note.

This architecture has driven the rapid adoption of tools like Heidi Health, Tortus, Accurx Scribe (powered by Tandem Health), Nabla, Abridge, and Nuance DAX Copilot. Cloud processing offers access to the largest, most capable AI models (GPT-4, Claude, Gemini), virtually unlimited compute, and continuous model improvement without requiring the end device to be upgraded.

But it has also created a persistent tension — one that clinicians, information governance teams, and patients have all felt:

The audio of a private medical consultation is leaving the room in real time.

For many clinicians, this has been an acceptable trade-off. The note quality is good, the time savings are substantial (HSJ has modelled potential NHS savings of approximately £834 million annually if scaled nationally), and the consent process is manageable. But for others — particularly those working in psychiatry, sexual health, paediatrics, substance misuse, military medicine, or occupational health — the idea of streaming patient audio to a third-party cloud server has been a hard stop.

What Heidi Remote changes

On 19 March 2026, Heidi Health unveiled Heidi Remote, a dedicated wearable device that records consultations with a single button press and runs speech-to-text transcription entirely on the device itself, using on-device AI models deployed in partnership with Argmax. The device stores encrypted audio and transcripts locally, with no dependency on a phone, browser, or internet connection during the encounter.

This is not the first time on-device transcription has been attempted in healthcare. Apple's on-device speech recognition (used by some clinicians for dictation) and certain configurations of Dragon Medical have offered local processing options. But Heidi Remote is the first purpose-built, clinician-specific hardware device that combines ambient multi-speaker consultation capture with on-device AI transcription in a wearable form factor.

It is a genuine architectural shift. And its implications for data privacy deserve serious examination.


The data privacy case: where on-device processing helps

1. Eliminating in-transit risk

The single clearest privacy advantage of on-device processing is the elimination of data transmission during the consultation. In a cloud-first architecture, audio is either streamed via WebSocket/HTTPS in real time or uploaded shortly after the session. Both create a window during which the data is in transit — and therefore theoretically vulnerable to interception, man-in-the-middle attacks, or routing through jurisdictions with weaker data protection regimes.

With on-device processing, the audio never leaves the device during the encounter. There is no transmission to intercept. For environments with strict data sovereignty requirements — such as NHS Trusts subject to GDPR Article 44 transfer restrictions, or US military and VA healthcare systems — this is a meaningful compliance benefit.

2. Reducing third-party cloud exposure

Every cloud-based scribe relies on infrastructure provided by a hyperscaler (AWS, GCP, Azure) or a specialist AI provider. Even with robust contractual safeguards (Data Processing Agreements, BAAs under HIPAA, Standard Contractual Clauses under GDPR), the data controller (the healthcare organisation) is entrusting patient audio to a chain of processors and sub-processors.

On-device processing shortens this chain. During the transcription phase, the data controller's data stays on a device physically within their premises. No sub-processor is involved. For information governance teams conducting Data Protection Impact Assessments (DPIAs), this simplifies the risk calculus for the capture-and-transcription step.

3. Connectivity independence

This is a practical privacy benefit that is easy to overlook. In a cloud-first model, poor connectivity does not just degrade performance — it can force clinicians to adopt workarounds. Some buffer audio on the phone's local storage and upload later, creating an unencrypted local cache. Others switch to a personal hotspot, routing clinical audio through a consumer mobile network. Still others simply give up and take manual notes, losing the scribe's benefit entirely.

Heidi Remote's offline-first design means the tool works identically whether the clinician is in a well-connected urban GP surgery or a remote rural clinic with intermittent 4G. The privacy model is consistent regardless of infrastructure. This is particularly relevant for Australian rural and remote practice, Canadian northern territories, and NHS sites with notoriously poor internal Wi-Fi.
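The offline-first pattern described above can be sketched in a few lines. This is an illustrative model only, not Heidi's implementation: the class and method names (`OfflineFirstSyncQueue`, `capture`, `try_sync`) are hypothetical. The point is the invariant — local persistence always succeeds, and cloud sync is opportunistic rather than required.

```python
import queue

class OfflineFirstSyncQueue:
    """Sketch of an offline-first sync model: every encrypted transcript
    is persisted locally first; cloud sync happens only when possible."""

    def __init__(self, has_connectivity):
        self._pending = queue.Queue()              # encrypted transcripts awaiting sync
        self._has_connectivity = has_connectivity  # callable returning True/False

    def capture(self, encrypted_transcript: bytes) -> None:
        # Local persistence always succeeds, regardless of network state,
        # so the privacy model is identical online and offline.
        self._pending.put(encrypted_transcript)

    def try_sync(self, upload) -> int:
        """Upload queued transcripts only while connectivity exists.
        Returns the number of transcripts synced."""
        synced = 0
        while self._has_connectivity() and not self._pending.empty():
            upload(self._pending.get())
            synced += 1
        return synced
```

In this model, a clinic with intermittent 4G and a well-connected surgery behave identically at capture time; only the timing of `try_sync` differs.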


The data privacy case: where on-device processing does not help

This is where the analysis needs to be more careful than the marketing copy.

1. The cloud still processes your notes

On-device transcription gets you a raw transcript of the consultation. But the value of an AI scribe — the structured SOAP note, the coded diagnosis, the referral letter, the billing codes — is generated by a large language model. As of today, no wearable device has the compute power to run a model of the size and sophistication required for high-quality clinical note generation. Heidi's own documentation confirms that Heidi Evidence is "built in part on Claude, Anthropic's AI models" and that the platform uses cloud-based AI for note generation.

This means the data flow is:

  1. On-device (privacy-protected phase): Audio capture → on-device ASR → encrypted local transcript
  2. Cloud (standard cloud-scribe privacy model): Transcript syncs to Heidi's servers → LLM processes transcript into structured note → note is available in Heidi's platform for clinician review

The on-device phase protects the raw audio. But the transcript — which contains the same clinical information, often in more structured and searchable form — still enters the cloud. All the standard privacy questions therefore still apply to the transcript: Where are Heidi's servers? Who has access? Is the transcript used for model fine-tuning? What is the retention period? Can it be deleted on request?

The on-device processing is a genuine improvement to step 1 of the pipeline, but it does not alter the privacy model of steps 2 onwards. Clinicians and IG teams who focus only on the hardware's marketing and neglect the downstream cloud processing would be making a category error.
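The two-phase pipeline can be made concrete with a short sketch. All names here (`Consultation`, `on_device_phase`, `cloud_phase`) are hypothetical illustrations of the data flow described above, not Heidi's actual API: the raw audio is only ever touched locally, while the transcript alone crosses into the cloud.

```python
from dataclasses import dataclass

@dataclass
class Consultation:
    audio: bytes          # captured on-device; never uploaded
    transcript: str = ""  # produced by on-device ASR
    note: str = ""        # produced by a cloud LLM

def on_device_phase(c: Consultation, asr) -> None:
    # Step 1 (privacy-protected phase): audio -> transcript happens
    # locally, so the audio never leaves the device.
    c.transcript = asr(c.audio)

def cloud_phase(c: Consultation, llm) -> None:
    # Step 2 (standard cloud-scribe model): only the transcript syncs;
    # the usual cloud privacy questions apply from this point onwards.
    c.note = llm(c.transcript)
```

Note that `cloud_phase` takes no audio at all — that is the whole architectural claim — but it still receives the full clinical content of the encounter via the transcript.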

2. Device loss: a new attack vector

Cloud-based scribes have many risks, but "being left on a bus" is not one of them. A physical device that stores encrypted audio from an entire day of consultations introduces a risk that simply does not exist in a software-only model.

Let us be concrete. A GP doing home visits clips Heidi Remote to their lapel at 8am, records 15 consultations across three sites, and loses the device from their coat pocket somewhere between site two and site three. The device now contains encrypted audio from up to 15 patient encounters.

Under GDPR Article 33, the data controller must assess whether this constitutes a personal data breach and, if there is a risk to individuals, report it to the ICO within 72 hours. Under HIPAA, the loss of a device containing encrypted PHI must be assessed against the Breach Notification Rule's risk assessment factors. Under Australia's Notifiable Data Breaches scheme, a similar assessment is required under the Australian Privacy Principles.
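The GDPR clock is worth making explicit, because it runs from the moment the controller becomes aware of the breach, not from the moment the device was lost. A minimal helper (illustrative only — the legal assessment itself cannot be automated):

```python
from datetime import datetime, timedelta, timezone

def gdpr_notification_deadline(became_aware: datetime) -> datetime:
    """GDPR Article 33: notify the supervisory authority without undue
    delay and, where feasible, within 72 hours of becoming aware of a
    reportable personal data breach."""
    return became_aware + timedelta(hours=72)

# A device loss discovered at 14:30 UTC on 19 March must be assessed
# and, if reportable, notified by 14:30 UTC on 22 March.
aware = datetime(2026, 3, 19, 14, 30, tzinfo=timezone.utc)
deadline = gdpr_notification_deadline(aware)
```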

Encryption is the primary mitigation. If the device uses strong, hardware-backed encryption (e.g., AES-256 with a secure element), and the encryption keys are not stored on the device itself, then the data is unreadable without the clinician's credentials. In that scenario, the loss is a security incident but may not meet the threshold for a reportable breach. But "may not" is doing a lot of work in that sentence. The assessment still needs to happen, the incident still needs to be logged, and the organisation still needs a policy for managing it.

Questions clinicians and IG teams should ask:

  • What encryption standard does Heidi Remote use? Is it hardware-backed?
  • Are encryption keys stored on the device or derived from the user's credentials?
  • Is there a remote-wipe capability?
  • What is the maximum amount of data (hours of audio, number of consultations) that can accumulate on a single device before it must sync?
  • Does Heidi provide a template incident-response protocol for device loss?
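The second question in that list — are keys stored on the device or derived from credentials? — is the crux of the encryption safe-harbour argument. A minimal sketch of the credential-derived approach, using standard PBKDF2 (this is a generic pattern, not a description of Heidi Remote's actual key management): the device at rest holds only ciphertext and a non-secret random salt, while the key exists only transiently, at unlock time.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive the encryption key from the clinician's credentials at
    unlock time, so the key itself never rests on the device --
    only a random, non-secret salt does."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

# What a lost device contains: ciphertext + salt, but no key.
salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
```

Under this pattern, losing the hardware loses nothing readable without the clinician's credentials — which is precisely the scenario in which a loss may be a security incident rather than a reportable breach.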

3. Patient perception is not solved

The privacy argument for on-device processing is fundamentally a technical argument. It addresses where data is processed and stored. But patient privacy concerns are not purely technical — they are also perceptual and relational.

A patient's comfort with being recorded during a consultation depends on trust, understanding, and context. A small device clipped to a clinician's collar — one the patient may not immediately recognise or understand — raises its own perceptual challenges. Is it always recording? Can the patient verify that it has been turned off? Does it look like a surveillance device?

These are not hypothetical concerns. The NHS England guidance on ambient scribes was written precisely because the perceptual and consent dimensions of ambient recording are as important as the technical ones. A dedicated hardware device does not eliminate the need for:

  • Clear, accessible patient information (ideally in the waiting room and at the start of the consultation)
  • An explicit consent mechanism that the patient can easily decline
  • A visible indicator that the device is recording (e.g., an LED)
  • The ability for the patient to request that a recording be stopped and deleted

Comparing data architectures: how the leading scribes handle your data

To put Heidi Remote in context, it is worth comparing the data architectures of the main clinical AI scribes available to UK, US, and Australian clinicians.

| Feature | Heidi Remote (on-device) | Heidi (app/browser) | Nabla | Tortus | Accurx Scribe | Nuance DAX |
|---|---|---|---|---|---|---|
| Audio capture | Dedicated wearable | Phone/laptop mic | Phone/laptop mic | Phone/laptop mic | Laptop (via Accurx) | Room mic / laptop |
| Transcription location | On-device (Argmax) | Cloud | Cloud | Cloud | Cloud | Cloud (Azure) |
| Audio during consult | Encrypted, local only | Streamed/buffered | Streamed | Streamed/buffered | Streamed | Streamed |
| Audio retention policy | TBC | Configurable | Zero retention (claimed) | Configurable | Configurable | Configurable |
| Note generation | Cloud (Claude-based) | Cloud (Claude-based) | Cloud | Cloud | Cloud | Cloud (Azure/GPT) |
| MHRA registered | TBC | TBC | TBC | Class I (IIa pending) | Class I | TBC for UK |
| Offline capable | Yes (primary mode) | Partial (mobile) | No | No | No | No |

Nabla's "zero retention" model is worth highlighting as a contrasting approach to the same problem. Rather than solving privacy by keeping data on-device, Nabla's architecture processes the audio in the cloud but claims to delete it immediately after the note is generated — retaining nothing. This is a different philosophical answer to the same question: rather than "keep the data local," it says "process and destroy." Both approaches have trade-offs; neither is a complete solution.


Regulatory implications across jurisdictions

United Kingdom (GDPR + MHRA)

Under GDPR, the key questions for on-device processing are:

  • Data controller obligations: The healthcare organisation remains the data controller even if data is processed on a vendor-supplied device. The DPIA must be updated to reflect the new hardware and data flow.
  • Lawful basis: The lawful basis for processing (likely GDPR Article 6(1)(e) — public task, or Article 9(2)(h) — healthcare) does not change with on-device processing.
  • MHRA classification: If Heidi Remote's on-device transcription is considered to inform clinical decisions (as opposed to merely facilitating documentation), it may fall within the scope of the Medical Devices Regulations 2002 and require appropriate classification and registration.

United States (HIPAA)

Under HIPAA:

  • Business Associate Agreement (BAA): Heidi would likely need to be a Business Associate. On-device processing does not eliminate this requirement if the data ultimately reaches Heidi's cloud.
  • Encryption Safe Harbor: If the device uses encryption that meets NIST standards, and the encryption key is not stored with the device, then a lost device may qualify for the HIPAA Breach Notification Rule's encryption safe harbor — meaning no breach notification would be required.
  • State-level privacy laws: Some US states (e.g., California under CCPA/CPRA, New York under SHIELD Act) have additional requirements that may apply.

Australia (Privacy Act + APP)

Under the Australian Privacy Principles:

  • APP 11 (Security): The healthcare organisation must take reasonable steps to protect personal information from loss. A dedicated device containing patient audio must be managed as a security-sensitive asset.
  • Notifiable Data Breaches scheme: Loss of a device containing health information triggers an assessment under the NDB scheme, even if the data is encrypted.

Canada (PIPEDA + Provincial legislation)

Under PIPEDA and applicable provincial health privacy legislation (e.g., Ontario's PHIPA, Alberta's HIA):

  • Consent: Explicit, informed consent for the recording and AI processing of consultations is required.
  • Data residency: Some provinces require health information to remain within Canada. If on-device processing occurs in Canada but cloud sync routes through US or Australian servers, this could create a compliance issue.

Where iatroX fits: privacy by design without hardware dependency

At iatroX, we have taken a fundamentally different approach to clinical AI privacy — one that does not require clinicians to carry a dedicated device, manage hardware, or worry about device loss.

iatroX is a clinical knowledge and reference tool, not an ambient scribe. It does not record consultations, capture audio, or process patient-identifiable data in the course of normal use. When a clinician uses Ask iatroX to check a guideline, Brainstorm to explore a differential diagnosis, or the iatroX Knowledge Centre to navigate NICE, CKS, SIGN, and BNF resources, the interaction is between the clinician and the evidence — not between the clinician's patient and a recording device.

This is not a criticism of AI scribes. The time savings are real and well-evidenced. But it is a reminder that the clinical evidence and reasoning component of a clinician's AI toolkit can and should be evaluated independently from the documentation component. You do not need to accept a single vendor's entire stack — scribe, evidence, communications, and hardware — to benefit from AI in your clinical workflow.

iatroX is MHRA-registered, free for all clinicians and students, and designed to be used alongside whatever documentation tool you choose — whether that is Heidi (with or without Remote), Tortus, Accurx Scribe, Nabla, or pen and paper.

For clinicians preparing for exams, the iatroX Quiz provides adaptive, curriculum-mapped question banks for UKMLA, MRCP, MRCGP AKT, USMLE Step 2 CK, MCCQE, AMC, and more — all without any ambient recording, patient audio, or hardware dependency.


A practical privacy checklist for clinicians evaluating any AI scribe

Whether you are considering Heidi Remote, a cloud-based scribe, or any other clinical AI tool, here is what to assess:

During the consultation:

  • Where is the audio processed? On-device or cloud?
  • Is it streamed in real time or buffered?
  • What happens if connectivity drops — is data lost, cached locally, or queued?

After the consultation:

  • Where is the transcript stored? Which country, which provider?
  • Is the audio retained after the note is generated? For how long?
  • Can you request deletion of a specific consultation's data?

For model training:

  • Is any patient data (audio, transcript, or notes) used to train or fine-tune the vendor's AI models?
  • If so, is this with explicit consent, or is it buried in the terms of service?
  • Can you opt out of model training while still using the service?

For device security (hardware-specific):

  • What encryption standard is used?
  • Is there remote-wipe capability?
  • What is the maximum data accumulation before sync?
  • What is the incident-response protocol for a lost device?

For regulatory compliance:

  • Does the tool have the required MHRA registration for the UK market?
  • Is it compliant with NHS DTAC?
  • Does the vendor provide a BAA (for US HIPAA) or a DPA (for UK GDPR)?
  • Has the vendor completed the DCB0129 clinical safety case, and will they support you with DCB0160?
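For teams evaluating more than one vendor, the checklist above lends itself to being tracked as structured data. A minimal sketch (the structure and scoring are illustrative; the question wording mirrors the checklist, abbreviated to the first three categories):

```python
# Checklist as structured data, so an IG team can record vendor answers
# and surface the gaps. Wording mirrors the article; logic is illustrative.
CHECKLIST = {
    "during_consultation": [
        "Where is the audio processed: on-device or cloud?",
        "Is it streamed in real time or buffered?",
        "What happens if connectivity drops: lost, cached locally, or queued?",
    ],
    "after_consultation": [
        "Where is the transcript stored (country, provider)?",
        "Is audio retained after note generation, and for how long?",
        "Can a specific consultation's data be deleted on request?",
    ],
    "model_training": [
        "Is patient data used to train or fine-tune vendor models?",
        "Is training use under explicit consent, or buried in the ToS?",
        "Can you opt out of training while still using the service?",
    ],
}

def unanswered(answers: dict) -> list:
    """Return every checklist question the vendor has not yet answered."""
    return [q for qs in CHECKLIST.values() for q in qs if q not in answers]
```

A vendor comparison then becomes a count of open questions per product rather than an impression formed from marketing material.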

Conclusion: on-device is an important evolution, not a destination

Heidi Remote represents a meaningful step forward in the privacy architecture of clinical AI. On-device transcription eliminates in-transit risk during the consultation, reduces reliance on cloud infrastructure during the most sensitive phase, and ensures the tool works in any connectivity environment. These are real, measurable benefits.

But on-device processing is not a privacy panacea. The transcript still enters the cloud. The device can be lost. The patient still needs to understand and consent to being recorded. And the regulatory frameworks — GDPR, HIPAA, APP, PIPEDA — still apply in full, with the added complexity of managing a physical device fleet.

The most important thing clinicians can do is assess the entire data lifecycle of any clinical AI tool — not just the phase the vendor's marketing highlights. On-device processing is one piece of a much larger privacy puzzle. The other pieces — data residency, retention policies, model training practices, consent workflows, incident response, and regulatory compliance — matter just as much, and they are not solved by a wearable device, no matter how elegantly designed.

