In aviation, they say: "The superior pilot uses his superior judgment to avoid situations which require the use of his superior skill."
In medicine, we tend to rely almost entirely on "superior skill." We pride ourselves on the heroic rescue, the difficult diagnosis made at 5 PM on a Friday. But reliance on heroism is a sign of a fragile system.
Most clinical errors in 2026 are not "knowledge deficits." You didn't miss the sepsis because you forgot what sepsis is; you missed it because you were interrupted three times, the printer was jammed, and your cognitive bandwidth was saturated.
This is Human Factors: the science of how humans perform under pressure. Here is how to apply the hard-won lessons of aviation to the messy reality of the NHS.
Why most clinical risk is system + human factors
We often view error as a "moral failing" (I wasn't careful enough). Human Factors views error as a "system output."
- Task Saturation: Working memory holds roughly 5–9 chunks of information at once. In a busy clinic, you are juggling 20.
- Ambiguity: Aviation crashes often happen when instruments give conflicting data. Medicine is made of conflicting data.
- Fatigue: A tired pilot is grounded. A tired doctor is often the senior decision-maker.
The aviation concepts that translate cleanly
To design a safer workflow, you need to name the threats.
Situational Awareness (SA)
SA is not "knowing what is happening." It is knowing what will happen next. Endsley's classic model describes three levels:
- Level 1 (Perception): "The BP is 90/60."
- Level 2 (Comprehension): "The patient is hypotensive."
- Level 3 (Projection): "If I don't give fluids now, they will arrest in 10 minutes."
- Clinical failure: We often get stuck at Level 1 (collecting data) without moving to Level 3 (anticipating the crash).
Task Saturation (The Funnel)
As workload increases, your "attentional funnel" narrows. You stop hearing the nurse; you stop seeing the peripheral data. This is why fixation errors happen—you focus intensely on the difficult intubation while the patient's oxygen saturation drops unnoticed.
The “Startle Effect”
When something unexpected happens (e.g., a patient collapses in the waiting room), the brain dumps adrenaline. For the first 10–30 seconds, your higher reasoning effectively goes offline.
- Countermeasure: Procedural memory. This is why we drill "ABCDE"—so your hands know what to do while your brain reboots.
Crew Resource Management (CRM) — the core behaviours
CRM is the set of non-technical skills used to catch errors.
- Graded Assertiveness (PACE): How a junior speaks to a senior without being rude.
- Probe: "I noticed the BP is low."
- Alert: "I am worried about the BP."
- Challenge: "We need to address the BP now."
- Emergency: "Stop! The patient is unsafe."
- Closed-Loop Communication: Never toss "Give 1mg adrenaline" into the room at nobody in particular. Direct it to a named person and wait for the loop to close: "1mg adrenaline given." (See the sketch after this list.)
- Briefs/Debriefs: The pre-flight check ensures everyone knows the plan. The post-flight debrief ensures we learn from the near-misses.
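For the digitally minded, closed-loop communication is simply an acknowledgement protocol: an instruction does not count as done until it has been read back and completion reported. Here is a minimal TypeScript sketch of the pattern; every name in it (issueOrder, readBack, "Nurse Patel") is an illustrative invention, not part of any real clinical system.

```typescript
// Closed-loop communication modelled as an acknowledgement protocol.
// All types and names are illustrative, not a real clinical API.

type OrderStatus = "issued" | "read_back" | "completed";

interface Order {
  instruction: string; // e.g. "Give 1mg adrenaline IV"
  assignee: string;    // a named person, never "someone"
  status: OrderStatus;
}

function issueOrder(instruction: string, assignee: string): Order {
  return { instruction, assignee, status: "issued" };
}

// The receiver must repeat the instruction back verbatim before acting.
function readBack(order: Order, echoed: string): Order {
  if (echoed !== order.instruction) {
    throw new Error(`Read-back mismatch: heard "${echoed}"`);
  }
  return { ...order, status: "read_back" };
}

// The loop closes only when completion is reported.
function reportDone(order: Order): Order {
  if (order.status !== "read_back") {
    throw new Error("Order was never read back");
  }
  return { ...order, status: "completed" };
}

// Issue, read back, report done: "1mg adrenaline given."
let order = issueOrder("Give 1mg adrenaline IV", "Nurse Patel");
order = readBack(order, "Give 1mg adrenaline IV"); // catches mishearing immediately
order = reportDone(order);
```

The point of the mismatch check is the same as in the resus room: the error is caught at the moment of transmission, not after the wrong drug is drawn up.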
Practical “clinic-ready” safety redesigns
You don't need a simulator to use this. Implement these tomorrow.
1. The 90-second Pre-Clinic Brief (The Huddle). Before the doors open, grab the reception/nursing team:
- "Who is short-staffed today?"
- "Which patients are 'watch-list' (e.g., palliative, aggressive)?"
- "When are we taking breaks?"
- Result: Shared mental model.
2. The 3-point Handover. When passing a patient or a task to a colleague, cut the waffle down to three points (sketched as a structured template after this list):
- What changed: "Since admission, his creatinine has risen."
- What worries me: "He looks drier than the numbers suggest."
- What I need you to do: "Review fluid status at 4 PM." (Specific action, specific time).
3. The “Sterile Cockpit” Rule. In aviation, non-essential conversation is banned below 10,000 feet.
- Clinical application: During "high-risk" tasks (e.g., calculating a paediatric dose, writing a controlled drug script), enforce a No Interruption Zone. Put on a "Do Not Disturb" vest or establish a "hand up" signal.
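If you want to bake the 3-point handover into a template or tasking tool, it maps onto a small schema. A hedged TypeScript sketch; the field names are my own invention, not an NHS or iatroX standard.

```typescript
// The 3-point handover as a structured template.
// Field names are illustrative, not an NHS or iatroX schema.

interface Handover {
  whatChanged: string;   // "Since admission, his creatinine has risen."
  whatWorriesMe: string; // "He looks drier than the numbers suggest."
  action: {
    task: string;        // specific action: "Review fluid status"
    time: string;        // specific time: "16:00"
    owner: string;       // a named colleague, never "the team"
  };
}

// Reject waffle: every field must be present and the action concrete.
function validate(h: Handover): string[] {
  const missing: string[] = [];
  if (!h.whatChanged.trim())   missing.push("what changed");
  if (!h.whatWorriesMe.trim()) missing.push("what worries me");
  if (!h.action.task.trim())   missing.push("specific action");
  if (!h.action.time.trim())   missing.push("specific time");
  if (!h.action.owner.trim())  missing.push("named owner");
  return missing; // empty array = safe to hand over
}
```

The design choice mirrors the verbal rule: a handover with a vague action ("keep an eye on him") fails validation just as it should fail at the bedside.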
A GP workflow lens (where it fails in real life)
General Practice is an interruption factory.
- Inbox Overload: We process hundreds of results rapidly (Level 1 SA). The risk is "click-through fatigue," missing the abnormal result because it was buried in 50 normals.
- Fix: Batch process. Do not multitask results while on the phone.
- Repeat Prescribing: One of the most common sources of error in general practice.
- Fix: Use the "Two-Challenge Rule." If something feels "off" (e.g., an unexplained dose change), challenge it twice: check the record, then check with a colleague. Never guess.
Where AI helps (and where it worsens risk)
AI is a double-edged sword for Human Factors.
- Helps: It reduces Cognitive Load. Summarisation tools (like Brainstorm) can strip away the noise and present the key data, freeing up your brain for decision-making.
- Harms: Automation Bias. If the AI says "Normal," you are statistically less likely to check the raw data. This creates a false sense of safety.
Where iatroX fits
We designed iatroX around cognitive forcing: building deliberate pause-points into the workflow so the safety checks happen even when you are on autopilot.
When you are tired, you skip steps. You forget to check the interaction; you forget the rare differential.
- Brainstorm: Acts as your "Co-Pilot Checklist." It forces you to pause and ask: "Have I considered the Red Flags?" It is a structured workflow that prevents skipping the safety steps.
- Knowledge Centre: Reduces "Tab Chaos." Instead of opening 15 Google tabs (which destroys situational awareness), you have one trusted, canonical index for your guidelines.
Summary
You cannot "try harder" to be safe. You must design safer systems. Use the Pre-Clinic Brief, the Sterile Cockpit for high-risk tasks, and tools like iatroX as a cognitive forcing function to catch the errors your tired brain might miss.
Ready to add a safety layer to your workflow? Use Brainstorm as your cognitive checklist for complex cases.
