The Multi-Specialty Recruitment Assessment (MSRA) is used for national recruitment into GP training, a number of specialty training programmes, and core training posts. It tests two domains: Clinical Problem Solving (CPS) — applied clinical knowledge across a broad curriculum — and Situational Judgement (SJT) — professional dilemmas requiring ranking or selection of appropriate actions.
The MSRA is competitive — scores determine ranking for specialty allocation, making every mark consequential. The time pressure is significant, and the SJT component requires a specific approach that differs from standard clinical SBA preparation.
What Makes a Good MSRA Revision App?
Dual-domain coverage. The MSRA tests both CPS and SJT. A revision tool must cover both — clinical reasoning for CPS and professional judgement scenarios for SJT. A Q-bank that covers only clinical medicine misses half the assessment.
CPS breadth. The CPS component covers medicine, surgery, paediatrics, O&G, psychiatry, pharmacology, and general practice — similar in breadth to UKMLA but at a slightly more advanced applied level. Coverage must be comprehensive.
SJT-specific practice. SJT questions require ranking actions or selecting the most and least appropriate responses — a format that demands dedicated practice to develop the reasoning framework. Generic clinical questions do not prepare for SJT.
Speed and timing. The MSRA is time-pressured. Candidates need timed practice to develop the pacing required to complete both papers within the allotted time. Untimed revision does not prepare for the speed challenge.
Mock exams. Full-length timed mocks that combine CPS and SJT in realistic proportions build the pacing and stamina the real exam demands.
Comparison
| App | MSRA CPS | MSRA SJT | Mocks | Spaced repetition | Adaptive |
|---|---|---|---|---|---|
| PassMedicine | Yes | Yes | Yes | No | No |
| Pastest | Yes | Yes | Yes | No | No |
| iatroX | Yes | Yes | Yes | Yes | Semantic adaptive |
Where iatroX Fits
iatroX covers both CPS and SJT domains of the MSRA, with questions mapped to the exam curriculum and format. Mock exam mode reproduces exam-day timing. Spaced repetition resurfaces missed clinical and professional concepts. Semantic adaptive learning identifies related weaknesses — recognising that errors across acute management, prescribing safety, and guideline application may share a common clinical reasoning gap.
For GP trainees, the MSRA Q-bank connects naturally to MRCGP AKT preparation — both test primary care-relevant clinical knowledge, and revision for one reinforces the other.
How to Use iatroX for MSRA Revision
- Start with separate diagnostic blocks for CPS and SJT to identify baseline performance in each domain.
- Focus CPS revision on the broadest clinical areas first — medicine, prescribing, paediatrics, O&G.
- Practise SJT separately — develop the ranking framework before integrating with CPS.
- Use timed mocks combining both domains from 6 weeks before the exam.
- Use spaced repetition throughout to prevent knowledge decay across the broad curriculum.
Start MSRA revision with iatroX →
MSRA Exam Format and Key Facts
The MSRA consists of two papers: Clinical Problem Solving (97 questions, 2h15m) and Professional Dilemmas (50 questions, 1h35m), with a mixture of SBA and EMQ formats. The exam fee is ~£250. Unlike pass/fail exams, the MSRA produces a competitive ranking score — scores determine training programme allocation, making every mark potentially decisive for career trajectory.
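The timing above implies a demanding per-question pace. A quick sketch of the arithmetic, using the paper figures stated in this article:

```python
# Per-question pacing implied by the MSRA format described above.
# Figures mirror the paper structure stated in this article.

def seconds_per_question(num_questions: int, hours: int, minutes: int) -> float:
    """Average time available per question, in seconds."""
    total_seconds = (hours * 60 + minutes) * 60
    return total_seconds / num_questions

cps = seconds_per_question(97, 2, 15)   # Clinical Problem Solving: 97 Qs in 2h15m
pd = seconds_per_question(50, 1, 35)    # Professional Dilemmas: 50 Qs in 1h35m

print(f"CPS: ~{cps:.0f} s per question")  # roughly 84 s
print(f"PD:  ~{pd:.0f} s per question")   # roughly 114 s
```

Under 90 seconds per CPS question leaves no slack for dwelling — which is why timed practice matters as much as content coverage.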
Clinical Problem Solving Paper
The CPS paper tests clinical decision-making across general medicine at specialty training entry level. High-yield topics include: acute presentations (ACS, PE, acute abdomen — investigation and initial management), chronic disease management (diabetes, COPD, heart failure — NICE-concordant stepwise management), prescribing safety (interactions, contraindications, monitoring), and clinical investigations (interpreting bloods, ECGs, imaging).
Professional Dilemmas Paper
Professional Dilemmas is where many candidates lose marks. Questions present workplace scenarios requiring judgement about appropriate professional behaviour — patient safety, teamwork, leadership, integrity, and NHS values. The key is understanding the GMC's Good Medical Practice framework and Duties of a Doctor. Candidates who approach PD questions with clinical reasoning rather than professional values reasoning consistently underperform.
MSRA Competitor Landscape
PassMedicine and Pastest offer MSRA-specific question banks. BMJ OnExam provides blueprint-mapped content. Dedicated MSRA SJT courses cover the Professional Dilemmas paper. iatroX provides adaptive question selection across both CPS and PD domains, with analytics that separately track clinical and professional dilemma performance.
Building an Effective MSRA Study Strategy
Effective MSRA preparation follows a structured progression from broad coverage to targeted consolidation. Because scores are used for competitive ranking, every additional mark matters — this is not a pass/fail exam where 'good enough' suffices.
Phase 1 — Foundation building (weeks 1-4 of an 8-12-week plan). Work through questions by topic area in untimed mode. The goal is broad coverage, not speed. Read every explanation thoroughly, including why incorrect options are wrong. Flag topics where understanding feels superficial rather than confident. Use iatroX's topic filters to ensure systematic coverage rather than gravitating toward comfortable subjects.
Phase 2 — Gap identification and targeted revision (weeks 5-8). Review analytics to identify persistent weak areas. Shift from broad coverage to targeted work on the topics where performance lags. iatroX's adaptive algorithm prioritises questions from areas where the candidate has demonstrated uncertainty, ensuring revision time is spent where it will have the greatest impact. Spaced repetition scheduling resurfaces previously answered questions at intervals optimised for long-term retention.
Phase 3 — Exam simulation and consolidation (final 4+ weeks). Transition to timed practice and full mock exams. Mock exams should replicate exam conditions as closely as possible — full-length, timed, with no interruptions. Review mock performance not just for content gaps but for pacing, question interpretation, and decision-making under time pressure. iatroX's mock exam mode generates exam-length papers that mirror the real assessment format.
Active recall vs passive reading. The evidence for active recall in medical education is robust. Answering questions, retrieving information from memory, and testing oneself are consistently more effective than re-reading notes or textbooks. A well-structured Q-bank provides the scaffolding for active recall — each question is a retrieval opportunity, each explanation is a learning event. Combined with spaced repetition, this produces durable knowledge that persists to exam day and beyond.
Analytics-driven adjustment. Static study plans assume every candidate starts from the same baseline and progresses at the same rate. Analytics-driven preparation — where study allocation adjusts based on actual performance data — is significantly more efficient. iatroX's dashboard shows per-topic accuracy, trend data, and comparison between areas, enabling candidates to make evidence-based decisions about where to spend their limited revision time.
How iatroX Supports MSRA Preparation
iatroX provides several features specifically relevant to MSRA candidates:
Adaptive question selection. Rather than presenting questions randomly, iatroX's adaptive algorithm analyses performance patterns and selects questions that target demonstrated weak areas. Revision time is spent where it will have the greatest impact on exam readiness, not reinforcing already-strong topics.
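The general idea behind this kind of selection can be sketched simply: weight each topic by its observed error rate, then sample accordingly. This is an illustrative sketch only — iatroX's actual algorithm is not public, and the topic names and figures below are invented:

```python
import random

# Illustrative sketch: weight topics by observed error rate so weak areas
# are served more often. NOT iatroX's actual algorithm; data is invented.

performance = {
    "prescribing": {"attempted": 40, "correct": 22},
    "paediatrics": {"attempted": 30, "correct": 26},
    "cardiology":  {"attempted": 50, "correct": 41},
}

def pick_topic(stats, rng=random):
    """Sample a topic, weighted toward higher error rates."""
    topics = list(stats)
    weights = [
        1 - s["correct"] / s["attempted"] + 0.05  # +0.05 keeps strong topics in rotation
        for s in stats.values()
    ]
    return rng.choices(topics, weights=weights, k=1)[0]

chosen = pick_topic(performance, random.Random(0))
print(chosen)
```

The smoothing term matters: without it, a topic answered perfectly would never reappear, and early fluency would silently decay.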
Spaced repetition scheduling. Previously answered questions are re-presented at intervals calibrated to the spacing effect. Incorrectly answered questions return sooner; correctly answered questions are spaced further apart. This produces durable long-term retention rather than fragile short-term recall.
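The scheduling principle described above — correct answers spaced further apart, misses brought back sooner — can be sketched with a minimal interval rule. The doubling factor and reset value here are illustrative assumptions, not iatroX's actual parameters:

```python
from datetime import date, timedelta

# Minimal sketch of spacing-effect scheduling as described above.
# The x2 multiplier and 1-day reset are illustrative, not real parameters.

def next_interval(interval_days: int, was_correct: bool) -> int:
    """Lengthen the interval after a correct answer; reset it after a miss."""
    if was_correct:
        return interval_days * 2   # space correct answers further apart
    return 1                       # missed questions return sooner

interval = 1
for outcome in [True, True, False, True]:
    interval = next_interval(interval, outcome)
# interval evolves 1 -> 2 -> 4 -> 1 -> 2

next_due = date.today() + timedelta(days=interval)
print(interval, next_due)
```

Even this crude rule captures the essential behaviour: a question missed in week 2 resurfaces quickly, while mastered material drifts out to long intervals without disappearing.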
Mock exam mode. Full-length, timed mock exams replicate the structure and time constraints of the real assessment. Mock analytics show per-topic performance, pacing data, and score trends across multiple attempts — enabling candidates to track improvement and identify persistent gaps.
Study planning. Personalised study plans based on exam date, available study time, and current performance level. Plans adapt as the candidate progresses, shifting emphasis toward areas where improvement is most needed.
Multi-platform access. Available on web, iOS, and Android — enabling revision during commutes, placements, and breaks without losing progress or analytics data. Progress syncs across all devices automatically.
Clinical AI integration. Ask iatroX answers clinical queries with guideline-grounded responses, powered by retrieval-augmented generation (RAG) over NICE, CKS, BNF, EMC, and NHS content — enabling candidates to verify management approaches against current UK guidance during revision. Over 80 clinical calculators cover scoring systems and decision tools used in daily practice. CPD tracking with FourteenFish integration means the platform serves beyond exam preparation into ongoing professional development.
MHRA-registered platform. iatroX holds UKCA marking and MHRA Class I registration — a regulatory standard that most revision platforms do not hold, reflecting the platform's clinical decision support capabilities alongside exam preparation.
2026 Revision Strategy and Resource Checklist
Candidates should treat every revision resource as an exam-performance tool, not simply as a content library. The strongest platforms make the candidate practise the same cognitive task the real exam demands: reading a vignette, identifying the discriminating clinical clue, choosing the safest answer, and learning from the distractors. For this reason, the most useful comparison is not "which app has the most questions?" but "which app produces the most improvement per hour of revision?"
The key capability is ranked recruitment performance across Clinical Problem Solving and Professional Dilemmas. That means a revision app should provide more than topic filters. It should let candidates build a representative exam mix, practise in timed mode, revisit missed concepts, and see whether performance is improving across the domains that actually matter. The official MSRA structure published via the NHS Medical Hub is the essential reference here, because preparation has to respect the two independently timed components: Professional Dilemmas and Clinical Problem Solving.
A practical way to evaluate a question bank is to inspect ten explanations before committing. Strong explanations usually do four things: they identify the diagnosis or principle being tested, explain why the correct answer is safer or more appropriate than the alternatives, show why the distractors are tempting but wrong, and link the point back to a repeatable exam rule. Weak explanations simply restate the answer. In high-stakes medical exams, that difference matters because candidates lose marks at the margin: two options may look plausible, but only one is most appropriate in that clinical context.
A Practical 12-16-Week Study Workflow
A sensible MSRA plan should begin with a mixed diagnostic block rather than a favourite topic. The purpose is not to score highly on day one; it is to expose the initial pattern of weakness. Once the baseline is clear, the first phase should focus on broad curriculum coverage. Candidates should work in untimed mode, read explanations carefully, and convert recurrent errors into a small number of revision rules: "what did I miss?", "what clue should have changed my answer?", and "what will I do next time I see this pattern?"
The second phase should become more selective. This is where iatroX's adaptive learning and semantic similarity approach become useful. Instead of merely showing that a candidate is weak in a large topic such as cardiology, respiratory medicine, paediatrics or prescribing, the platform can identify clusters of related errors across apparently separate labels. A candidate who repeatedly misses questions involving breathlessness, anticoagulation, heart failure and renal dosing may not have four unrelated weaknesses; they may have one underlying weakness in integrated cardiorenal decision-making. Targeting that root gap is more efficient than simply serving another random block from the same broad category.
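The clustering idea above can be illustrated with a toy sketch. A real semantic approach would use text embeddings; here, simple tag overlap (Jaccard similarity) stands in, and the question IDs and tags are entirely invented for illustration:

```python
# Hypothetical sketch: group missed questions by shared clinical concepts.
# Real semantic systems use embeddings; Jaccard overlap on tags stands in.
# All question IDs and tags below are invented.

missed = {
    "q1": {"breathlessness", "heart failure", "diuretics"},
    "q2": {"anticoagulation", "renal dosing"},
    "q3": {"heart failure", "renal dosing", "prescribing"},
    "q4": {"asthma", "inhalers"},
}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

def cluster(questions, threshold=0.2):
    """Greedy single-link clustering on tag overlap."""
    clusters = []
    for qid, tags in questions.items():
        for group in clusters:
            if any(jaccard(tags, questions[other]) >= threshold for other in group):
                group.append(qid)
                break
        else:
            clusters.append([qid])
    return clusters

groups = cluster(missed)
print(groups)  # q1 and q3 cluster together via "heart failure"
```

In this toy example, two superficially different misses (an acute breathlessness question and a renal prescribing question) land in the same cluster — the kind of signal that points at one underlying cardiorenal reasoning gap rather than two unrelated weaknesses.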
The final phase should be dominated by timed work and mocks. Untimed practice builds knowledge, but timed practice builds the exam behaviour: reading stems efficiently, resisting overthinking, managing uncertainty and recovering after difficult questions. Candidates should deliberately practise curriculum coverage, question interpretation, time management, weak-area correction and durable recall. These are the areas where a good app should force active recall rather than passive recognition.
What iatroX Adds Beyond a Traditional Q-Bank
iatroX is positioned as a revision layer and a clinical reasoning layer. The question bank provides curriculum-mapped practice, mocks, spaced repetition and adaptive recommendations. Ask iatroX, calculators and CPD logging then connect that revision to clinical practice. This matters because most candidates are not revising in isolation; they are revising while working, on placement, preparing for another exam, or moving between health systems.
The practical advantage is continuity. A candidate can use iatroX for focused practice, switch to a mock, clarify a guideline-linked point, return to missed concepts through spaced repetition, and then use the same broader platform in clinical work. For candidates preparing for more than one assessment, multi-exam access also reduces duplication. Knowledge built for one exam often supports another, but only if the platform is organised around reusable clinical concepts rather than isolated exam silos.
Candidate Checklist Before Subscribing
Before choosing a revision resource, candidates should check:
Does it match the exam format? SBA, MCQ, EMQ, calculation, written response and case-simulation exams require different practice behaviours.
Does it map to the curriculum or blueprint? Large question volume is less useful if the distribution does not reflect the real assessment.
Does it support timed mocks? Exam performance depends on pacing and endurance, not knowledge alone.
Does it resurface missed concepts? Without spaced repetition, early revision decays while later topics are being covered.
Does it show actionable analytics? Topic percentages are useful, but the best systems identify the clinical reasoning pattern behind repeated errors.
Does it fit real working life? Mobile access, short practice blocks and continuity across devices are not luxuries for clinicians; they are what make consistent revision possible.
