Open Access Research article

A controlled trial of the effectiveness of internet continuing medical education

Linda Casebeer1*, Sally Engler1, Nancy Bennett2, Martin Irvine3, Destry Sulkes3, Marc DesLauriers1 and Sijian Zhang4

Author Affiliations

1 Outcomes, Inc., 300 Riverchase Pkwy E, Birmingham, AL 35244, USA

2 Consultant, Outcomes, Inc., 300 Riverchase Pkwy E, Birmingham, AL 35244, USA

3 Medscape, LLC, 76 Ninth Avenue, New York, NY 10011, USA

4 University of Alabama at Birmingham, Birmingham, AL, USA


BMC Medicine 2008, 6:37  doi:10.1186/1741-7015-6-37

Published: 4 December 2008

Abstract

Background

The internet has had a strong impact on how physicians access information and on the development of continuing medical education activities. Evaluation of the effectiveness of these activities has lagged behind their development.

Methods

To determine the effectiveness of a group of 48 internet continuing medical education (CME) activities, case vignette surveys were administered to US physicians immediately following participation, and to a representative control group of non-participant physicians. Responses to case vignettes were analyzed against the evidence presented in the content of the CME activities. An effect size for each activity was calculated using Cohen's d to quantify the difference between the two groups in the likelihood of making evidence-based clinical decisions, expressed as the percentage of non-overlap between the two groups' distributions. Two formats were compared.
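The conversion from an effect size to a "percentage of non-overlap" described above corresponds to Cohen's U1 statistic, which assumes two normal distributions of equal variance. The following sketch (not the authors' code; the function names and inputs are illustrative) shows a pooled-SD Cohen's d and the U1 conversion using only the Python standard library:

```python
from math import erf, sqrt

def cohens_d(mean_a, mean_b, sd_a, sd_b, n_a, n_b):
    """Cohen's d using the pooled standard deviation of two groups."""
    pooled_sd = sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                     / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

def norm_cdf(x):
    """Standard normal cumulative distribution function via erf."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def percent_nonoverlap(d):
    """Cohen's U1: proportion of the two normal distributions
    (equal variance) that does not overlap, given effect size d."""
    u2 = norm_cdf(abs(d) / 2.0)
    return (2.0 * u2 - 1.0) / u2

# Effect sizes reported in this study, converted to % non-overlap:
print(round(percent_nonoverlap(0.75) * 100))  # 45 (overall)
print(round(percent_nonoverlap(0.89) * 100))  # 51 (interactive case-based)
print(round(percent_nonoverlap(0.63) * 100))  # 40 (text-based updates)
```

Under this interpretation, the effect sizes of 0.75, 0.89 and 0.63 reported in the Results convert to 45%, 51% and 40% non-overlap, matching the likelihood figures given there.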

Results

In a sample of 5621 US physicians, drawn from the more than 100,000 physicians who participated in 48 internet CME activities, the average effect size was 0.75, corresponding to a 45% increased likelihood that participants made evidence-based choices in response to clinical case vignettes. This likelihood was higher for interactive case-based activities (51%; effect size 0.89) than for text-based clinical updates (40%; effect size 0.63). Effectiveness was also higher among primary care physicians than among specialists.

Conclusion

Physicians who participated in the selected internet CME activities were more likely than non-participants to make evidence-based clinical choices in response to clinical case vignettes. Internet CME activities show promise as a searchable, credible, on-demand, high-impact source of CME for physicians.