
Open Access Research article

Development and validation of the ACE tool: assessing medical trainees’ competency in evidence based medicine

Dragan Ilic1*, Rusli Bin Nordin2, Paul Glasziou3, Julie K Tilson4 and Elmer Villanueva5

Author Affiliations

1 Department of Epidemiology & Preventive Medicine, School of Public Health & Preventive Medicine, Monash University, Level 6, The Alfred Centre, 99 Commercial Rd, Melbourne VIC 3004, Australia

2 Jeffrey Cheah School of Medicine and Health Sciences, Monash University, Johor Bahru, Malaysia

3 Faculty of Health Sciences and Medicine, Bond University, Robina, Australia

4 Division of Biokinesiology and Physical Therapy, University of Southern California, Los Angeles, USA

5 Gippsland Medical School, Monash University, Churchill, Australia


BMC Medical Education 2014, 14:114  doi:10.1186/1472-6920-14-114

Published: 9 June 2014

Abstract

Background

While a variety of instruments have been developed to assess knowledge and skills in evidence based medicine (EBM), few assess all aspects of EBM - including knowledge, skills, attitudes and behaviour - or have been psychometrically evaluated. The aim of this study was to develop and validate an instrument that evaluates medical trainees’ competency in EBM across knowledge, skills and attitude.

Methods

The ‘Assessing Competency in EBM’ (ACE) tool was developed by the authors, with content and face validity assessed by expert opinion. A cross-sectional sample of 342 medical trainees representing ‘novice’, ‘intermediate’ and ‘advanced’ EBM trainees was recruited to complete the ACE tool. Construct validity, item difficulty, internal reliability and item discrimination were analysed.

Results

We recruited 98 EBM-novice, 108 EBM-intermediate and 136 EBM-advanced participants. A statistically significant difference in total ACE score was observed, corresponding to the level of training: on a 0-15-point test, the mean ACE scores were 8.6 for EBM-novice, 9.5 for EBM-intermediate and 10.4 for EBM-advanced participants (p < 0.0001). Individual item discrimination was excellent (Item Discrimination Index ranging from 0.37 to 0.84), with internal reliability consistent across all but three items (Item Total Correlations were all positive, ranging from 0.14 to 0.20).

Conclusion

The 15-item ACE tool is a reliable and valid instrument for assessing medical trainees’ competency in EBM. The ACE tool provides a novel assessment that measures user performance across the four main steps of EBM. To provide a complete suite of instruments for assessing EBM competency across various patient scenarios, future refinement of the ACE instrument should include further scenarios covering harm, diagnosis and prognosis.

Keywords:
Evidence based medicine; Assessment; Medical students