Gianinazzi, Micòl E.; Rueegg, Corina S.; Zimmerman, Karin; Kuehni, Claudia E.; Michel, Gisela (2015). Intra-Rater and Inter-Rater Reliability of a Medical Record Abstraction Study on Transition of Care after Childhood Cancer. PLoS ONE, 10(5), e0124290. Public Library of Science. doi:10.1371/journal.pone.0124290
BACKGROUND
The abstraction of data from medical records is a widespread practice in epidemiological research. However, studies using this means of data collection rarely report reliability. Within the Transition after Childhood Cancer Study (TaCC), which is based on medical record abstraction, we conducted a second independent abstraction of data with the aim of assessing a) the intra-rater reliability of one rater at two time points; b) possible learning effects between these two time points compared to a gold standard; and c) inter-rater reliability.
METHOD
Within the TaCC study we conducted a systematic medical record abstraction in the 9 Swiss clinics with pediatric oncology wards. In a second phase, we selected a subsample of medical records in 3 clinics and conducted a second independent abstraction. We then assessed intra-rater reliability at two time points, the learning effect over time (comparing each rater at two time points with a gold standard), and the inter-rater reliability of a selected number of variables. We calculated percentage agreement and Cohen's kappa.
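For reference, Cohen's kappa corrects the raw agreement between two abstractions for the agreement expected by chance. The formula below is the standard definition of the statistic, not a quotation from the article itself:

```latex
% Standard definition of Cohen's kappa (chance-corrected agreement):
% p_o = observed proportion of agreement between the two abstractions,
% p_e = proportion of agreement expected by chance, derived from the
%       raters' marginal category frequencies.
\kappa = \frac{p_o - p_e}{1 - p_e}
```

A kappa near 0 indicates chance-level agreement; a kappa of 1 indicates perfect agreement.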
FINDINGS
For the assessment of intra-rater reliability we included 154 records (80 for rater 1; 74 for rater 2). For inter-rater reliability we could include 70 records. Intra-rater reliability was substantial to excellent (Cohen's kappa 0.6-0.8), with an observed percentage agreement of 75%-95%. Learning effects were observed for all variables. Inter-rater reliability was substantial to excellent (Cohen's kappa 0.70-0.83), with high agreement ranging from 86% to 100%.
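As an illustration of how figures like these are computed, the sketch below calculates percentage agreement and Cohen's kappa for two raters on one categorical variable. The data and the use of scikit-learn are assumptions made for the example; this is not the study's actual analysis pipeline.

```python
# Illustrative sketch only: hypothetical ratings, not the TaCC data.
# Computes percentage agreement and Cohen's kappa for two raters.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical abstractions of one categorical variable by two raters
rater1 = np.array(["yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes"])
rater2 = np.array(["yes", "no",  "no", "no", "yes", "no", "yes", "yes", "yes", "yes"])

percent_agreement = 100 * np.mean(rater1 == rater2)   # raw agreement
kappa = cohen_kappa_score(rater1, rater2)             # chance-corrected agreement

print(f"Percentage agreement: {percent_agreement:.0f}%")
print(f"Cohen's kappa: {kappa:.2f}")
```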
CONCLUSIONS
Our study showed that data abstracted from medical records are reliable. Investigating intra-rater and inter-rater reliability can give confidence in the conclusions drawn from abstracted data and can increase data quality by minimizing systematic errors.
| Item Type: | Journal Article (Original Article) |
|---|---|
| Division/Institute: | 04 Faculty of Medicine > Pre-clinic Human Medicine > Institute of Social and Preventive Medicine (ISPM) |
| UniBE Contributor: | Kühni, Claudia; Michel, Gisela |
| Subjects: | 600 Technology > 610 Medicine & health; 300 Social sciences, sociology & anthropology > 360 Social problems & social services |
| ISSN: | 1932-6203 |
| Publisher: | Public Library of Science |
| Language: | English |
| Submitter: | Doris Kopp Heim |
| Date Deposited: | 27 May 2015 09:13 |
| Last Modified: | 05 Dec 2022 14:47 |
| Publisher DOI: | 10.1371/journal.pone.0124290 |
| PubMed ID: | 26001046 |
| BORIS DOI: | 10.7892/boris.69175 |
| URI: | https://boris.unibe.ch/id/eprint/69175 |