Information Governance for GP Trainees Using AI: GDPR, Patient Data, and What Not to Enter


The golden rule: if a tool is not approved for patient data within your practice's or trust's information governance (IG) framework, do not enter identifiable patient information. That one sentence is the entire framework.

Consumer AI Tools (Never Enter Patient Data)

This category covers ChatGPT, Gemini, Copilot, the consumer version of Claude, and any other general-purpose AI tool. These are not medical devices and carry no clinical data assurances. Anything you enter may be stored, processed internationally, and used for model training. Even "anonymised" data can be re-identifiable.

Common trainee mistakes: creating SCA practice cases from real patients in ChatGPT; entering patient scenarios for clinical advice; sharing consultation details for reflection prompts; asking clinical questions with identifiable context.

Approved Clinical AI (Check Governance Status)

Heidi Health, Accurx Scribe, and Tortus AI are approved for consultation data within their specific DTAC-compliant governance frameworks. Before using one, check that your practice has completed the required data protection impact assessment (DPIA) and governance assessment.

Regulated Clinical AI

iatroX is a UKCA-marked, MHRA-registered Class I medical device, designed for clinical data within its own governance framework.

What to Do

Discuss AI use with your practice's Caldicott Guardian and IG lead. Check your trust/practice IG policy. Ask before you act. When in doubt, do not enter patient data.

Consequences

IG breaches are taken seriously by the GMC, the ICO, and your employing trust, and can carry fitness to practise implications. A ChatGPT query containing patient details is a data breach, regardless of your good intentions.

Where iatroX Fits

iatroX is UKCA-marked and MHRA-registered, built with clinical data governance at its core. When you need guideline-grounded clinical answers, choose a tool designed for clinical use.
