Development of a Competency-based Multisource Feedback Instrument for Residents

Hennel, Eva Kathrin; Subotic, Ulrike; Berendonk, Christoph; Stricker, Daniel; Harendza, Sigrid; Huwendiek, Sören (2021). Development of a Competency-based Multisource Feedback Instrument for Residents. In: AMEE 2021. virtual conference, Dundee. 27. - 30. August 2021.

Introduction: Multisource Feedback (MSF) is one form of workplace-based assessment which can support
medical training by providing regular feedback from various perspectives [1]. Typically, MSF consists of
feedback given by several raters via structured MSF questionnaires which list the expected competencies.
This feedback is collected and delivered in written form or by a supervisor. A variety of
instruments exist for different settings: undergraduate and postgraduate training, with formative or
summative purposes. However, no MSF instrument for residency training is publicly available in German. By
developing such an instrument, we could encourage the use of MSF and thus support residency training.
Hence, we aimed to develop a German-language MSF questionnaire based on the CanMEDS roles and to
show its validity for residency training.
Methods: The study took place at a Swiss university children's hospital with residents training in
paediatric surgery or paediatrics. Mini-CEX and DOPS were already offered, but no further training on
structured feedback was provided. Following the criteria of validity as described by Cook [2], we aimed for four sources
of validity evidence: (i) Content: We based the content on the MSF literature, blueprints of competency,
and expert-team discussions. Reviewing international MSF instruments, the mini-PAT [3] was chosen as a
base and adapted to local needs with the formative use and the CanMEDS roles as the underlying
framework. (ii) Response Process: The clarity of items was examined in a think-aloud study. Residents,
raters and supervisors were trained on MSF and feedback. The response process was assessed by the
analysis of narrative comments, “unable to comment” ratings and evaluation data. (iii) Internal structure:
The internal structure was assessed by exploratory factor analysis, and inter-rater reliability by
generalisability analysis. Data were collected during two MSF runs, comprising 81 MSF occasions. (iv)
Consequences: We analysed the residents’ learning goals and their progress as reported via MSF.
Results: The MSF questionnaire consists of 15 items and one global rating, each rated on a scale and
offering space for narrative comments. Additionally, open questions invite further suggestions.
Investigation of validity evidence revealed: (i) Clinical competence is comprehensively addressed; (ii) The
items are understood as intended and evaluation showed good acceptance and usability; (iii) Factor
analysis showed a one-factor solution, a Cronbach’s alpha of 0.951 and an inter-rater reliability of 0.797 for
the second run with 12 raters; (iv) Residents formulated individual learning goals and MSF ratings hinted at
development.
Discussion and Conclusions: We developed a German-language, competency-based MSF questionnaire for
residency training and described evidence of validity in terms of content, response processes, internal
structure and consequences [4]. Based on the CanMEDS roles and assessed with surgical and non-surgical
residents, this MSF questionnaire should be of use for other specialties, too. As the lack of instruments in
the local language can be a hurdle for medical education, this description of the development based on an
international instrument, adapting both language and content to the local setting, might be exemplary
and helpful for others. The strengths of this study include the description of response process and
consequences, which address a gap in the literature. The results are limited to the observations from one
clinic, which might limit generalisability. Future studies should examine further consequences of
MSF beyond the residents’ competencies.
References: 1. Boursicot, K., et al. (2020). "Performance assessment: Consensus statement and
recommendations from the 2020 Ottawa Conference." Med Teach: 1-10.
2. Cook, D. A. and T. J. Beckman (2006). "Current concepts in validity and reliability for psychometric
instruments: theory and application." The American Journal of Medicine 119(2): 166.e7-166.e16.
3. Archer, J., et al. (2008). "mini-PAT (Peer Assessment Tool): a valid component of a national assessment
programme in the UK?" Adv Health Sci Educ Theory Pract 13(2): 181-192.
4. Hennel, E. K., et al. (2020). "A German-language competency-based multisource feedback instrument for
residents: development and validity evidence."

Item Type:

Conference or Workshop Item (Abstract)

Division/Institute:

04 Faculty of Medicine > Medical Education > Institute for Medical Education > Assessment and Evaluation Unit (AAE)

UniBE Contributor:

Hennel, Eva Kathrin, Berendonk, Christoph, Stricker, Daniel, Huwendiek, Sören

Language:

English

Submitter:

Susanne Yvonne Moser-Eichenberger

Date Deposited:

24 Jan 2023 16:11

Last Modified:

24 Jan 2023 23:26

URI:

https://boris.unibe.ch/id/eprint/163139
