Reliability of medical record abstraction by non-physicians for orthopedic research

Title: Reliability of medical record abstraction by non-physicians for orthopedic research
Author: Mi, Michael Y; Collins, Jamie E; Lerner, Vladislav; Losina, Elena; Katz, Jeffrey N

Note: Order does not necessarily reflect citation order of authors.

Citation: Mi, Michael Y, Jamie E Collins, Vladislav Lerner, Elena Losina, and Jeffrey N Katz. 2013. “Reliability of medical record abstraction by non-physicians for orthopedic research.” BMC Musculoskeletal Disorders 14 (1): 181. doi:10.1186/1471-2474-14-181. http://dx.doi.org/10.1186/1471-2474-14-181.
Abstract:
Background: Medical record review (MRR) is one of the most commonly used research methods in clinical studies because it provides rich clinical detail. However, because MRR involves subjective interpretation of information found in the medical record, it is critically important to understand the reproducibility of data obtained from MRR. Furthermore, because medical record review is both technically demanding and time intensive, it is important to establish whether trained research staff with no clinical training can abstract medical records reliably.
Methods: We assessed the reliability of abstraction of medical record information in a sample of patients who underwent total knee replacement (TKR) at a referral center. An orthopedic surgeon instructed two research coordinators (RCs) in the abstraction of inpatient medical records and operative notes for patients undergoing primary TKR. The two RCs and the surgeon each independently reviewed 75 patients’ records, and one RC reviewed the records twice. Agreement was assessed using the proportion of items on which reviewers agreed and the kappa statistic.
Results: The kappa for agreement between the surgeon and each RC ranged from 0.59 to 1 for one RC and 0.49 to 1 for the other; the percent agreement ranged from 82% to 100% for one RC and 70% to 100% for the other. The repeated abstractions by the same RC showed high intra-rater agreement, with kappas ranging from 0.66 to 1 and percent agreement ranging from 97% to 100%. Inter-rater agreement between the two RCs was moderate, with kappa ranging from 0.49 to 1 and percent agreement ranging from 76% to 100%.
Conclusion: The MRR method used in this study showed excellent reliability for abstraction of information with low technical complexity and moderate to good reliability for information with greater complexity. Overall, these findings support the use of non-surgeons to abstract surgical data from operative notes.
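For readers unfamiliar with the two agreement measures named in the abstract, the following is a minimal sketch (not the authors' code) of how percent agreement and Cohen's kappa are typically computed for a single abstracted item rated by two reviewers. The variable names and the example ratings are hypothetical and for illustration only.

    # Percent agreement and Cohen's kappa for two raters on one categorical item.
    from collections import Counter

    def percent_agreement(rater_a, rater_b):
        """Proportion of records on which the two reviewers gave the same value."""
        matches = sum(a == b for a, b in zip(rater_a, rater_b))
        return matches / len(rater_a)

    def cohens_kappa(rater_a, rater_b):
        """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
        n = len(rater_a)
        p_o = percent_agreement(rater_a, rater_b)
        # Expected agreement by chance, from each reviewer's marginal frequencies.
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        categories = set(rater_a) | set(rater_b)
        p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical example: surgeon vs. research coordinator on a binary item.
    surgeon = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes"]
    coordinator = ["yes", "no", "no", "no", "yes", "no", "yes", "yes"]
    print(f"percent agreement: {percent_agreement(surgeon, coordinator):.2f}")  # 0.88
    print(f"kappa: {cohens_kappa(surgeon, coordinator):.2f}")                   # 0.75

Kappa corrects the raw percent agreement for the agreement expected by chance given each reviewer's marginal frequencies, which is why the two measures can diverge (e.g., 70% agreement alongside a kappa of 0.49 in the results above).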
Published Version: doi:10.1186/1471-2474-14-181
Other Sources: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3684546/pdf/
Terms of Use: This article is made available under the terms and conditions applicable to Other Posted Material, as set forth at http://nrs.harvard.edu/urn-3:HUL.InstRepos:dash.current.terms-of-use#LAA
Citable link to this page: http://nrs.harvard.edu/urn-3:HUL.InstRepos:11708556