Medical specialty certification exams studied according to the Ottawa Quality Criteria: a systematic review.

Staudenmann, Daniel; Waldner, Noemi; Lörwald, Andrea; Huwendiek, Sören (2023). Medical specialty certification exams studied according to the Ottawa Quality Criteria: a systematic review. BMC Medical Education, 23(1), 619. BioMed Central. doi:10.1186/s12909-023-04600-x

s12909-023-04600-x.pdf - Published Version. Available under License Creative Commons: Attribution (CC-BY).
BACKGROUND

Medical specialty certification exams are high-stakes summative assessments used to determine which doctors have the necessary skills, knowledge, and attitudes to treat patients independently. Such exams are crucial for patient safety, candidates' career progression, and accountability to the public, yet they vary significantly among medical specialties and countries. It is therefore of paramount importance that the quality of specialty certification exams is studied in the scientific literature.

METHODS

In this systematic literature review, we used the PICOS framework and searched seven databases, using a diverse set of search term variations, for papers concerning medical specialty certification exams published in English between 2000 and 2020. Papers were screened by two researchers independently and scored for their methodological quality and relevance to this review. Finally, they were categorized by country, medical specialty, and the following seven Ottawa Criteria of good assessment: validity, reliability, equivalence, feasibility, acceptability, and catalytic and educational effect.

RESULTS

After removal of duplicates, 2852 papers were screened for inclusion, of which 66 met all relevant criteria. Over 43 different exams and more than 28 different specialties from 18 jurisdictions were studied. Around 77% of all eligible papers were based in English-speaking countries, with 55% of publications centered on just the UK and USA. General Practice was the most frequently studied specialty among certification exams with the UK General Practice exam having been particularly broadly analyzed. Papers received an average of 4.2/6 points on the quality score. Eligible studies analyzed 2.1/7 Ottawa Criteria on average, with the most frequently studied criteria being reliability, validity, and acceptability.

CONCLUSIONS

The present systematic review shows a growing number of studies analyzing medical specialty certification exams over time, encompassing a wider range of medical specialties, countries, and Ottawa Criteria. Because programmatic assessment relies on multiple assessment methods and data points, it suggests a promising way forward for developing medical specialty certification exams that fulfill all seven Ottawa Criteria. Further research is needed to confirm these results, particularly analyses of examinations held outside the Anglosphere, as well as studies analyzing entire certification exams or comparing multiple examination methods.

Item Type:

Journal Article (Review Article)

Division/Institute:

04 Faculty of Medicine > Medical Education > Institute for Medical Education
04 Faculty of Medicine > Medical Education > Institute for Medical Education > Assessment and Evaluation Unit (AAE)

UniBE Contributor:

Staudenmann, Daniel Lukas, Lörwald, Andrea Carolin, Huwendiek, Sören

Subjects:

300 Social sciences, sociology & anthropology > 370 Education
600 Technology > 610 Medicine & health

ISSN:

1472-6920

Publisher:

BioMed Central

Language:

English

Submitter:

Pubmed Import

Date Deposited:

31 Aug 2023 10:51

Last Modified:

25 Mar 2024 09:48

Publisher DOI:

10.1186/s12909-023-04600-x

PubMed ID:

37649019

Uncontrolled Keywords:

Medical education; Reliability; Specialty certification examination; Validity

BORIS DOI:

10.48350/185929

URI:

https://boris.unibe.ch/id/eprint/185929
