Publication: Reliability of medical record abstraction by non-physicians for orthopedic research
Date
2013
Publisher
BioMed Central
Citation
Mi, Michael Y, Jamie E Collins, Vladislav Lerner, Elena Losina, and Jeffrey N Katz. 2013. “Reliability of medical record abstraction by non-physicians for orthopedic research.” BMC Musculoskeletal Disorders 14 (1): 181. doi:10.1186/1471-2474-14-181. http://dx.doi.org/10.1186/1471-2474-14-181.
Abstract
Background: Medical record review (MRR) is one of the most commonly used research methods in clinical studies because it provides rich clinical detail. However, because MRR involves subjective interpretation of information found in the medical record, it is critical to understand the reproducibility of data obtained from MRR. Furthermore, because MRR is both technically demanding and time intensive, it is important to establish whether trained research staff without clinical backgrounds can abstract medical records reliably.

Methods: We assessed the reliability of abstraction of medical record information in a sample of patients who underwent total knee replacement (TKR) at a referral center. An orthopedic surgeon instructed two research coordinators (RCs) in the abstraction of inpatient medical records and operative notes for patients undergoing primary TKR. The two RCs and the surgeon each independently reviewed 75 patients' records, and one RC reviewed the records twice. Agreement was assessed using the proportion of items on which reviewers agreed and the kappa statistic.

Results: The kappa for agreement between the surgeon and each RC ranged from 0.59 to 1 for one RC and from 0.49 to 1 for the other; percent agreement ranged from 82% to 100% for one RC and from 70% to 100% for the other. Repeated abstraction by the same RC showed high intra-rater agreement, with kappas ranging from 0.66 to 1 and percent agreement ranging from 97% to 100%. Inter-rater agreement between the two RCs was moderate, with kappa ranging from 0.49 to 1 and percent agreement ranging from 76% to 100%.

Conclusion: The MRR method used in this study showed excellent reliability for abstraction of information of low technical complexity, and moderate to good reliability for information of greater complexity. Overall, these findings support the use of non-surgeons to abstract surgical data from operative notes.
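The abstract reports agreement in two ways: the proportion of items on which reviewers agreed (percent agreement) and the kappa statistic, which corrects observed agreement for the agreement expected by chance. The following minimal Python sketch computes both measures for two raters on a categorical item; the function names and the example ratings are hypothetical and are not drawn from the study's data.

# Minimal sketch: percent agreement and Cohen's kappa for two raters
# rating the same items. All data below are hypothetical; the paper
# does not publish item-level ratings.
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of items on which the two raters agree."""
    assert len(r1) == len(r2)
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance, computed
    from each rater's marginal category frequencies.
    """
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))
    if p_e == 1:  # both raters used a single identical category
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: surgeon vs. research coordinator on one
# binary item for 10 records.
surgeon = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
rc      = ["yes", "yes", "no", "yes", "yes", "no", "yes", "yes", "no", "no"]

print(f"percent agreement: {percent_agreement(surgeon, rc):.2f}")  # 0.80
print(f"kappa: {cohens_kappa(surgeon, rc):.2f}")                   # ~0.58

For these hypothetical ratings, percent agreement is 0.80 while kappa is about 0.58, illustrating why the study reports both measures: kappa discounts the agreement two raters would reach by chance alone.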
Keywords
Medical record review, Reliability, Kappa statistic, Total knee replacement
Terms of Use
This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth in the repository's Terms of Service.