Research article · Open access

Introducing evidence based medicine to the journal club, using a structured pre and post test: a cohort study

Abstract

Background

The Journal Club at a University-based residency program was restructured to introduce, reinforce and evaluate residents' understanding of the concepts of Evidence Based Medicine.

Methods

Over the course of a year, structured pre- and post-tests were developed for use during each Journal Club. Questions were derived from the articles being reviewed. Performance on the key concepts of Evidence Based Medicine was assessed. Study subjects were 35 PGY2 and PGY3 residents in a University-based Family Practice Program.

Results

Performance on the pre-test demonstrated a significant improvement from a median of 54.5% to 78.9% over the course of the year (F = 89.17, p < .001). The post-test results also exhibited a significant increase, from 63.6% to 81.6% (F = 85.84, p < .001).

Conclusions

Following organizational revision, the introduction of a pre-test/post-test instrument supported achievement of the learning objectives, yielding better understanding and utilization of the concepts of Evidence Based Medicine.


Background

The Residency Review Committee guidelines for Family Medicine require teaching the skills necessary for the critical appraisal of the medical literature. Most residency training programs accomplish this in the setting of a monthly journal club. At our program, faculty involved with organizing these sessions noted an ongoing frustration with the attendance, preparation, participation and focus of the residents.

Within the framework of a needs assessment, residents were surveyed to identify issues directly related to the utility of the Journal Club. This opportunity to reorganize also allowed reexamination of the goals and objectives for the Journal Club in line with those suggested by others [1]. A formal curriculum review with new goals and objectives was put in place, acknowledged by renaming the sessions as "Evidence Based Medicine/Journal Club". Paramount was a focus on critical appraisal and clinical epidemiology from the Evidence-based perspective.

A syllabus of material on the approach to Evidence-based Medicine was chosen to provide the skills necessary to allow the residents to become comfortable with identifying valid articles that could potentially impact their practice of medicine [2]. For this component of the objectives we combined the search for POEMs (Patient Oriented Evidence that Matters) from Slawson and Shaughnessy [3, 4] with Sackett's emphasis on clinical epidemiology and the rigorous assessment of validity [5]. The importance of this knowledge base was reinforced by using the first didactic session of the academic year to provide a formal review of the materials. At this session, incoming residents began with a formal test of their skills. They were then presented with the syllabus of selected articles from the JAMA series of Users' Guides to the Medical Literature [6], Slawson and Shaughnessy's work on becoming information masters [3, 4], web site references and access to relevant texts.

The balance of this article describes our approach to evaluating the impact of these changes.

Methods

Participants were family medicine residents in a University-based residency program. PGY1 data were collected but are not included in this report, as these residents were not available at the initiation of the revised Journal Club in Module 10 of our 13-Module academic year.

In order to measure progress in meeting the objectives of the revised Journal Club, a written pre-test and post-test were chosen as the primary method of evaluating each Journal Club session. Each consisted of 10 to 12 questions focused on the principles of Evidence Based Medicine and clinical epidemiology, as well as the key content of the articles. The items were prepared by the Director for Journal Club (JSC) and appeared on both the Pre- and Post-tests. Keeping in mind the subjective nature of creating appropriate questions, it was felt that well-constructed questions could still demonstrate both reliability and validity. Reliability has been equated to the repeatability or stability of responses on repeated administration. Face validity was demonstrated via clear, unambiguous questions. Content validity was achieved through inclusion of questions incorporating the key concepts of Evidence Based Medicine and clinical epidemiology. This can be seen in a representative quiz presented initially to faculty and annually to incoming residents to assess their EBM skills (Appendix 1).

Analysis of results was accomplished using SPSS version 6.1 for Windows [7] on a microcomputer. Where appropriate, ANOVA and two-tailed t-tests for paired samples were used.
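As an illustration only (the study's analysis was performed in SPSS 6.1; the scores and module labels below are hypothetical placeholders rather than study data), a comparable analysis could be run in Python as follows:

import numpy as np
from scipy import stats

# Hypothetical per-resident percentage scores for one module's quiz.
pre_scores = np.array([55, 60, 50, 64, 58, 45, 70, 62])
post_scores = np.array([64, 68, 55, 72, 66, 50, 78, 70])

# Two-tailed t-test for paired samples (Pre- vs Post-test), as named in the Methods.
t_stat, p_paired = stats.ttest_rel(pre_scores, post_scores)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_paired:.3f}")

# One-way ANOVA across modules, a simple stand-in for the between-module
# trend testing reported in the Results (module-level lists are again hypothetical).
module_a = [54, 58, 52, 60, 57]
module_b = [65, 70, 62, 68, 66]
module_c = [78, 80, 75, 82, 79]
f_stat, p_anova = stats.f_oneway(module_a, module_b, module_c)
print(f"ANOVA across modules: F = {f_stat:.2f}, p = {p_anova:.3f}")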

Results

The new format for the Journal Club was implemented in the 10th module of the 13-module academic year; this first module was designated Module A for the study. For consistency, only those residents present for the entire year were included in the analysis. This amounted to 35 residents in their second and third year of the residency (Table 1). Due to scheduling conflicts, no Journal Clubs were completed during 4 separate modules (quarterly Grand Rounds × 2, Thanksgiving and Christmas).

Table 1 Pre-test and Post-test Results for the Median and Mean

Analysis of the results indicated a non-normal distribution of scores; therefore, results are presented using the median rather than the mean for each module. Median group performance on the Pre-test improved from 54.5% to 78.9% over the course of the year (Figure 1). A significant linear trend for Pre-test scores was shown between modules (F = 89.17, p < .001). Median Post-test results moved from 63.6% to 81.6%, with a significant linear trend (F = 85.84, p < .001) (Figure 2). The overall Pre- to Post-test difference reached significance (F = 2.04, p = .046).
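For readers who wish to reproduce the module-by-module summary behind Figures 1 and 2, a minimal sketch of the best-fit (least-squares) trend line is shown below; the median values used here are illustrative, not the study's data, and the number of modules is assumed.

import numpy as np

modules = np.arange(1, 10)  # sequence of completed modules (assumed count)
pre_medians = np.array([54.5, 58.0, 61.0, 64.0, 66.5, 70.0, 73.0, 76.0, 78.9])  # illustrative %

# Fit the best-fit trend line of median score against module, as plotted in the figures.
slope, intercept = np.polyfit(modules, pre_medians, deg=1)
print(f"Trend: +{slope:.1f} percentage points per module (intercept {intercept:.1f})")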

Figure 1. Median Pre-test Scores by Module with Best-fit Trend Line

Figure 2. Median Post-test Scores by Module with Best-fit Trend Line

Discussion

The discussions during the monthly Journal Club began with a clear focus derived from the content of the Pre-test. It was expected that the Pre-test would help to identify participants who had not read or understood the impact of the articles and help to focus the discussion. The Pre-test also allowed the department to begin to track, in a sequential fashion, objective measures of performance for individual residents. The Post-test captured the group educational dynamic.

Identifying residents with problems on the Pre-test, or a lack of response on the Post-test, was straightforward and allowed timely information to be forwarded to their academic advisors for remediation. As a bonus, tracking attendance via turned-in quizzes was facilitated, and this information was also forwarded to each resident's academic advisor.

When critical appraisal skills among resident physicians have been formally assessed, correct responses have varied between 33% and 42%, suggesting limited integration of these concepts [8]. This was confirmed at our institution, with an overall average of 32% for incoming PGY1 residents over the preceding three years. Even after specific targeted interventions, correct responses have increased only to 67% [9]. The attainment of a Pre-test median score of 79.0% at the end of the first year of our intervention compares favorably with, and clearly exceeds, those results [9].

Studies have also indicated that there is no correlation between self-assessed competence and actual ability in these skills [10]. With the Pre-test, there is direct feedback regarding comprehension and competence. This gives us a quantitative assessment of individual residents' facility with these critical concepts as they evolve over time.

There also appears to be little evidence that didactic Continuing Medical Education yields any effect on performance [11, 12]. According to these authors, successful educational programs emphasize a pre-course assessment of needs, opportunities to practice relevant skills, and after-course activities that reinforce and facilitate change. Our Pre-test assesses needs, the discussion offers opportunities to practice skills, and our Post-test reinforces and facilitates change [11].

Conclusion

The introduction of a pre-test and post-test helped to provide uniformity of focus for the Journal Club. More importantly, this effort yielded improvements in objective measures of preparation and comprehension of the key concepts of Evidence-based Medicine. Despite the additional faculty workload, the ability to meet structured learning goals made the continuing effort worthwhile.

Additional files

Appendix 1: Evidence Based Medicine / Journal Club Example Quiz. Word document.

References

1. Valentini RP, Daniels SR: The Journal Club. Postgrad Med J. 1997, 73: 81-85.

2. Haynes RB: Where's the Meat in Clinical Journals? ACP Journal Club. 1993, 119: A22-A23.

3. Slawson DC, Shaughnessy AF, Bennett JH: Becoming a Medical Information Master: Feeling Good About Not Knowing Everything. JFP. 1994, 38: 505-513.

4. Shaughnessy AF, Slawson DC, Bennett JH: Becoming an Information Master: A Guidebook to the Medical Information Jungle. JFP. 1994, 39: 489-499.

5. Sackett DL: How to Read Clinical Journals: V: To Distinguish Useful from Useless or Even Harmful Therapy. CMAJ. 1981, 124: 1156-1162.

6. Evidence Based Medicine Working Group: Users' Guides to the Medical Literature. JAMA. 1993, 270: 2093-2095 (I), 2598-2601 (II a); 1994, 271: 55-63 (II b), 703-707 (III), 1615-1619 (IV); 1994, 272: 234-237 (V), 1367-1371 (VI); 1995, 273: 1292-1295 (VII a), 1630-1632 (VII b); 1995, 274: 570-574 (VIII a), 1630-1632 (VIII b), 1800-1804 (IX); 1996, 275: 554-558 (X), 1435-1439 (XI); 1997, 277: 1232-1237 (XII), 1552-1557 (XIII a), 1802-1806 (XIII b).

7. SPSS Version 6.1 for Windows. SPSS Inc., Chicago. 1993.

8. Linzer M, Brown JT, Frazier CM, et al: Impact of a Medical Journal Club on House-Staff Reading Habits, Knowledge and Critical Appraisal Skills: A Randomized Control Trial. JAMA. 1988, 260: 2537-2541. doi:10.1001/jama.260.17.2537.

9. Stern DT, Linzer M, O'Sullivan PS, Weld L: Evaluating Medical Residents' Literature-appraisal Skills. Acad Med. 1995, 70: 152-154.

10. Emerson JS: Use of Statistical Analysis in the New England Journal of Medicine. NEJM. 1983, 309: 709-713.

11. Davis DA: The Science and Practice of Continuing Medical Education: A Study in Dissonance. ACP Journal Club. 1993, 118: A-18.

12. Linzer M, DeLong ER, Hupart KH: A Comparison of Two Formats for Teaching Critical Reading Skills in a Medical Journal Club. J Med Educ. 1987, 62: 690-692.


Author information

Corresponding author

Correspondence to J Steven Cramer.

Additional information

Competing interests

None declared.



Cite this article

Cramer, J.S., Mahoney, M.C. Introducing evidence based medicine to the journal club, using a structured pre and post test: a cohort study. BMC Med Educ 1, 6 (2001). https://doi.org/10.1186/1472-6920-1-6
