AI tools are everywhere in GP training — scribes, Q-banks, reflection scaffolding, clinical reference. The question is no longer whether to use them but how to use them without undermining the clinical skill development that training exists to produce.
Safe Uses
- Revision and knowledge consolidation — iatroX adaptive quiz, Passmedicine, Pastest.
- Communication rehearsal — MedTutor AI, SCA Prep.
- Topic exploration from guideline-based tools — Ask iatroX, CKS.
- Structuring reflective notes — Learner+, iatroX CPD.
- Quiz-based self-assessment.
Proceed-With-Caution Uses
- Diagnostic support in ambiguous cases — only after generating your own differential first.
- Summarising guidelines — verify against the primary source.
- Drafting clinical letters — every word needs review before sending.
Unsafe Uses
- Entering patient-identifiable data into consumer AI (ChatGPT, Gemini, Copilot).
- Using AI-generated clinical notes without review.
- Submitting AI-generated reflections as genuine.
- Using AI as a substitute for clinical supervision.
The IG Framework
Is the tool approved for patient data? If not, no patient information enters it. Full stop. MHRA-registered tools (iatroX, Tortus) and DTAC-compliant tools (Heidi, Accurx, Tortus) have documented governance. Consumer LLMs do not.
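The rule above is effectively a default-deny gate: unless a tool's governance is documented, patient data never enters it. A minimal sketch of that logic in Python — the lookup table, flag values, and function name are illustrative assumptions, not a real registry or API:

```python
# Illustrative sketch of a default-deny information-governance gate.
# The table below is a hypothetical example, NOT an authoritative registry;
# approval status must always be checked against actual MHRA/DTAC documentation.

APPROVED_FOR_PATIENT_DATA = {
    "iatrox": True,    # MHRA-registered (per the article)
    "tortus": True,    # MHRA-registered / DTAC-compliant
    "heidi": True,     # DTAC-compliant
    "accurx": True,    # DTAC-compliant
    "chatgpt": False,  # consumer LLM: no documented governance
    "gemini": False,
    "copilot": False,
}

def may_enter_patient_data(tool: str) -> bool:
    """Return True only if the tool has documented governance.

    Unknown tools default to False: if approval is not documented,
    no patient-identifiable information enters the tool.
    """
    return APPROVED_FOR_PATIENT_DATA.get(tool.lower(), False)

print(may_enter_patient_data("iatroX"))   # True
print(may_enter_patient_data("ChatGPT"))  # False
print(may_enter_patient_data("NewTool"))  # False: default-deny
```

The key design choice is the default: an unlisted tool returns False rather than raising a question, mirroring "if not approved, no patient information enters it. Full stop."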
The Skill Development Question
If an AI scribe writes your notes, do you develop note-writing skills? If AI generates your differential, do you develop clinical reasoning? If AI writes your reflections, do you develop reflective skills? The answer in each case: not fully. Use AI to augment, not replace, the thinking process.
The Framework
Think first. Verify with AI. Own the decision. This applies to clinical decisions, prescribing, documentation, and reflection. AI is the verification layer, not the decision layer.
Where iatroX Fits
iatroX is UKCA-marked and MHRA-registered — designed for clinical use with appropriate governance. Use it for guideline-grounded Q&A and adaptive revision with confidence in the regulatory framework.
