Research article

The University of Texas Houston Stroke Registry (UTHSR): implementation of enhanced data quality assurance procedures improves data quality

Mohammad H Rahbar123*, Nicole R Gonzales4, Manouchehr Ardjomand-Hessabi3, Amirali Tahanan3, Melvin R Sline4, Hui Peng3, Renganayaki Pandurengan3, Farhaan S Vahidy4, Jessica D Tanksley5, Ayodeji A Delano4, Rene M Malazarte4, Ellie E Choi4, Sean I Savitz4 and James C Grotta4

Author Affiliations

1 Division of Epidemiology, Human Genetics, and Environmental Sciences, The University of Texas School of Public Health at Houston, Houston, TX 77030, USA

2 Division of Clinical and Translational Sciences, Department of Internal Medicine, The University of Texas-Houston Medical School, Houston, TX 77030, USA

3 Biostatistics/Epidemiology/Research Design Core, Center for Clinical and Translational Sciences, The University of Texas Health Science Center at Houston, Houston, TX 77030, USA

4 Department of Neurology, Stroke Program, The University of Texas-Houston Medical School, Houston, TX 77030, USA

5 The University of Texas Medical Branch at Galveston, Houston, TX 77030, USA


BMC Neurology 2013, 13:61  doi:10.1186/1471-2377-13-61

Published: 15 June 2013



Background

Limited information has been published regarding standard quality assurance (QA) procedures for stroke registries. We share our experience establishing enhanced QA procedures for the University of Texas Houston Stroke Registry (UTHSR) and evaluate whether these procedures have improved data quality in UTHSR.


Methods

All 5093 patient records abstracted and entered in UTHSR between January 1, 2008 and December 31, 2011 were considered in this study. We conducted reliability and validity studies. To assess the reliability and validity of data captured by abstractors, a random subset of 30 records was re-abstracted for select key variables by two abstractors. The same 30 records were also re-abstracted by a team of experts that included a vascular neurologist, whose abstraction served as the "gold standard". We assessed inter-rater reliability (IRR) between the two abstractors, as well as the validity of each abstractor's data against the "gold standard". Depending on the scale of each variable, IRR was assessed with Kappa statistics or intra-class correlations (ICC) based on a 2-way, random effects ANOVA. To assess the validity of data in UTHSR, we re-abstracted another set of 85 patient records, for which all discrepant entries were adjudicated by a vascular neurology fellow and added to our "gold standard" set. We assessed the level of agreement between the registry data and the "gold standard", as well as sensitivity and specificity. We used logistic regression to compare error rates across years and assess whether data quality improved significantly during 2008–2011.
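The two agreement statistics named above can be sketched as follows. This is an illustrative implementation only, not the registry's analysis code: Cohen's kappa for categorical variables and ICC(2,1) from a 2-way random effects ANOVA for continuous variables, with hypothetical rating data.

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters on one categorical variable."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    po = np.mean(r1 == r2)  # observed proportion of agreement
    # chance agreement: product of each rater's marginal category frequencies
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (po - pe) / (1 - pe)

def icc_2way_random(ratings):
    """ICC(2,1): two-way random effects ANOVA, absolute agreement,
    single rater. `ratings` is an (n subjects x k raters) array."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    MSR = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    MSC = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    SSE = ((Y - Y.mean(axis=1, keepdims=True)
              - Y.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    MSE = SSE / ((n - 1) * (k - 1))
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)
```

By the study's criterion, a Kappa or ICC of at least 0.75 from functions like these would count as excellent inter-rater reliability.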


Results

The error rate dropped significantly, from 4.8% in 2008 to 2.2% in 2011 (P < 0.001). The two abstractors had excellent IRR (Kappa or ICC ≥ 0.75) on almost all key variables checked. Agreement between data in UTHSR and the "gold standard" was excellent for almost all categorical and continuous variables.
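A between-year comparison of error rates like the one reported above can be sketched with a Wald test on the log odds ratio, which is algebraically the slope test in a logistic regression of error status (0/1) on a period indicator. The field counts below are hypothetical denominators chosen only to reproduce 4.8% and 2.2% rates, not the registry's actual data.

```python
import math

def error_rate_wald_test(err_a, n_a, err_b, n_b):
    """Wald test for a change in error rate between two periods.
    Equivalent to testing the slope in a logistic regression of
    error (0/1) on a binary period indicator."""
    log_or = math.log((err_b / (n_b - err_b)) / (err_a / (n_a - err_a)))
    se = math.sqrt(1 / err_a + 1 / (n_a - err_a)
                   + 1 / err_b + 1 / (n_b - err_b))
    z = log_or / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p from the normal tail
    return log_or, z, p

# Hypothetical counts giving 4.8% (period A) vs 2.2% (period B) error rates
log_or, z, p = error_rate_wald_test(480, 10000, 220, 10000)
```

A negative log odds ratio with a small p-value indicates a significant drop in the error rate from period A to period B.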


Conclusions

Establishment of rigorous data quality assurance procedures for UTHSR has helped improve the validity of its data. We observed excellent IRR between the two abstractors. We recommend training chart abstractors and systematically assessing both IRR between abstractors and the validity of abstracted data in stroke registries.

Keywords: Stroke; Registry; Quality assurance procedures; Inter-rater reliability; Validity; Quality control; Error rate; Gold standard