Clinical Audit vs Quality Improvement vs Research: What's the Difference and Which Do You Need?

The question "is this an audit, QI, or research?" seems pedantic until you realise the answer determines whether you need ethics approval, how you describe it on your portfolio, and whether it scores points on a specialty training application.

Many junior doctors complete a project, label it incorrectly, and either overclaim (calling an audit "research" when it has no ethics approval) or underclaim (calling a genuine QI project "just an audit" and missing out on portfolio points). Here's the distinction that matters.

The definitions

Clinical audit: Measuring current practice against a defined standard, identifying gaps, implementing change, and re-measuring. The standard already exists (a guideline, a protocol, a policy). You're asking: "Are we doing what we said we'd do?" You are not generating new knowledge — you're checking compliance. Example: "Are 90% of our diabetic patients receiving an annual foot check as per NICE guidelines?"

Quality improvement (QI): Implementing a change to improve a process or outcome, then measuring whether the change worked. QI uses iterative cycles (PDSA — Plan, Do, Study, Act). You're asking: "Can we make this better?" You're not testing a hypothesis — you're improving a system. Example: "Can we reduce our A&E door-to-antibiotic time for sepsis by introducing a nurse-initiated pathway?"

Research: Generating new knowledge through systematic investigation. You're asking a question whose answer isn't known. This requires a hypothesis, a methodology, and — crucially — ethics committee approval (HRA/REC in the UK). Example: "Does early mobilisation after hip fracture reduce 30-day mortality?"

The decision flowchart

Am I testing a new treatment, intervention, or hypothesis? → Yes = Research. Get ethics approval.

Am I measuring practice against an existing standard? → Yes = Audit.

Am I changing a process and measuring the effect using iterative cycles? → Yes = QI.

Am I doing something that could be described as "let's try this and see if it works"? → Probably QI if it's a process change; possibly research if it involves patient-level interventions. When in doubt, ask your trust's R&D department, or use the HRA's online "Is my study research?" decision tool.

Why it matters for your portfolio

Specialty training applications score audit and QI differently — and both differently from research publications.

Audit typically scores under "Clinical Governance" or "Audit." A completed audit cycle (audit → change → re-audit showing improvement) scores more than a single audit. Most person specifications require evidence of two completed audit cycles for maximum points. Crucially: the re-audit is what scores — a single audit without a re-audit is an incomplete cycle.

QI may score under "Quality Improvement" (if the person specification has a separate category) or under "Audit" (if it doesn't). QI projects with measurable outcomes and documented PDSA cycles are increasingly valued. Some specialties now explicitly differentiate QI from audit and award separate points.

Research scores under "Publications" and "Presentations" — but only if it's been published, presented, or has documented ethics approval. A research project that was never published scores zero on most person specifications. An audit that was completed and re-audited scores more than an unpublished research project.

The practical implication: For portfolio building, two completed audit cycles and one QI project are more achievable and more reliably scored than one research project. Research is valuable if you're targeting an academic career or a highly competitive specialty that weights publications heavily — but for most trainees, audit and QI provide better return on time invested.

How to choose

If you have 3 months and need portfolio points: Do a clinical audit with a re-audit. Choose a topic where the standard is clear (NICE guideline adherence) and the data is accessible (electronic records). Complete the cycle: measure, implement a change (even a simple one like a poster or protocol reminder), re-measure.

If you have 6 months and want to demonstrate improvement skills: Do a QI project with documented PDSA cycles. Choose a process problem that annoys the team (long waiting times, incomplete handovers, missed observations) — you'll get better engagement and the results will be visible.

If you want a publication: Do research — but be realistic about the timeline and commitment. A systematic review is the most accessible research format for juniors (no ethics approval, no primary data collection, achievable in 3–6 months with discipline).


iatroX supports clinical governance with NICE-aligned guidelines for setting audit standards and CPD tracking for documenting your learning.
