The relationship between organisational characteristics and the effects of clinical guidelines on medical performance in hospitals, a meta-analysis

Abstract

Objective

To measure the effectiveness of strategies to implement clinical guidelines and the influence of organisational characteristics on hospital care.

Methods

Systematic review and meta-regression analysis including randomised controlled trials, controlled clinical trials and controlled before-and-after studies.

Results

53 studies were identified, including 81 comparisons. The total effect of all intervention strategies was an odds ratio of 2.13 (95% CI 1.72-2.65). Intervention strategies such as educational material, reminders and feedback, as well as other professional interventions that mostly comprised revisions of professional roles, were found to be relatively strong components of multifaceted interventions. Outcomes of organisational effect modifiers were better in a learning environment in inpatient studies than in outpatient studies. Interventions developed outside hospitals yielded better outcomes: OR 4.62 (95% CI 2.82-7.57) versus OR 1.78 (95% CI 1.36-2.23).

Conclusion

Both single and multifaceted interventions seemed to be effective in hospital settings. Evidence for the effects of organisational determinants remained limited.

Background

Systematic reviews in various health care settings have demonstrated that different implementation interventions have varying effects [1, 2]. Most interventions to implement clinical guidelines have focused on changing professional behaviour, but there is increasing awareness that factors related to the social, organisational and economic context can also be important determinants of guideline implementation [3]. For instance, a recent study on the implementation of screening guidelines in ambulatory settings confirmed the influence of a number of organisational factors, such as mission, capacity and professionalism [4]. Despite increasing attention to organisational determinants of guideline implementation, research evidence on the relevance of specific factors is still limited. Insight into these factors is important because it can improve the effectiveness of implementation interventions by allowing them to be tailored to local circumstances. For example, different interventions may be more effective at academic hospitals than at community hospitals.

Most reviews on guideline implementation were conducted on implementation across settings, or on implementation in primary care settings [5]. The literature on guideline implementation in hospital settings has not yet been reviewed separately. Therefore, we reviewed the effect of different intervention strategies to implement clinical guidelines at hospitals, and explored the impact of specific organisational factors on the effectiveness of these interventions. Hospitals are complex organisational systems whose primary aim is to deliver clinical care to individual patients. Management theories on change and innovation were analysed to derive specific factors for this explorative study. We identified the following factors that might modify the effects of interventions: sufficient management support, an appropriate learning environment, functional differentiation and local consensus on the intended changes (figure 1).

Figure 1 Conceptual framework to assess the relationship between organisational or implementation aspects and clinical outcomes.

Theories on leadership and on quality management have suggested that support for an innovation from hospital management has a positive impact on its adoption [7–9]. The impact of management support may be based on power, incentives or facilitation. Hospital managers may also act as role models by implementing the innovation. Thus we hypothesized that implementation interventions are more effective if the effort is clearly supported by the local leaders.

The learning environment comprises a second set of factors. The underlying mechanism is that the availability of knowledge in the organisation enhances the adoption of innovations. This is consistent with existing theory on organisational learning, which suggests that an organisation's capacity to learn is a crucial feature[9]. Teaching hospitals create a specific learning environment for trainers and trainees. Therefore, we expected that implementation interventions are more effective in teaching hospitals than in non-teaching hospitals.

Functional differentiation is another factor that is expected to influence the uptake of new information or procedures in practice[10]. A higher level of specialisation and a higher level of technical expertise in the organisation may enhance implementation. The level and diversity of knowledge may be larger in settings with a range of medical disciplines, in which there is involvement of consultants, other physicians and non-physician practitioners. We therefore hypothesized that higher functional differentiation is positively associated with the effectiveness of implementation interventions.

Finally, we expected that promoting ownership through local consensus about clinical guideline recommendations and implementation strategies may also be associated with better uptake[11]. Organisational learning theory suggests that information gathering, shared perceptions of performance gaps and an experimental mind-set are important factors for learning in organisations[9]. Specific group cultures at hospitals appear to be associated with patient outcomes[12]. Theory on complex adaptive systems suggests that innovations should not be specified in detail in order to promote ownership and that 'muddling-through' should steer the guideline implementation process[13], while theory on adult learning adds that implementation should be tailored to each individual's learning needs[14]. We hypothesized that guideline implementation interventions would be most effective when developed within a hospital rather than derived from sources outside a hospital.

This systematic literature review aimed to assess the effectiveness of implementation and quality improvement interventions in hospital settings and to test our hypotheses on the impact of organisational factors.

Methods

Inclusion/exclusion

Only studies with a concurrent control group of the following designs were included:

  • Randomised controlled trials (RCTs), involving individual randomisation or cluster randomisation at the level of the hospital, ward or professional.

  • Controlled clinical trials or controlled before-and-after studies.

Participants: the studies described the performance of medical health care professionals working in hospitals. Medical centres, health centres or clinics without an inpatient department were excluded. Ambulatory departments and clinics that fell directly under hospital management were included.

Intervention: studies that evaluated interventions to implement guidelines were included. If the guidelines were aimed at multi-professional groups or other health care professionals, studies were only included if the results on medical health care professionals were reported separately, or medical health care professionals represented more than 50% of the target population. Studies that evaluated the introduction of guidelines targeted at undergraduate medical students were excluded.

Outcome: objective measures of provider behaviour, such as the proportion of patients treated in accordance with guidelines. Only studies reporting dichotomous measures were included.

Literature search

Studies were identified from a systematic review of guideline dissemination and implementation strategies across all settings[2]. Details of the search strategies and their development are described elsewhere[2]. Briefly, electronic searches were made of the following databases: Medline (1966–1998), HEALTHSTAR (1975–1998), Cochrane controlled trial register (4th edition 1998), EMBASE (1980–1998), SIGLE (1980–1988) and the Cochrane Effective Practice and Organisation of Care group specialised register. For the review of interventions in all settings, over 150,000 hits were screened: 5000 were considered potentially relevant and full-text articles of 863 were retrieved for assessment. In total, 235 studies were included in the systematic review of strategies across all settings. These studies were screened to identify potentially relevant studies for the hospital-based review; we identified 108 studies conducted in hospital settings, of which 23 did not have a concurrent control group (interrupted time series designs) and 32 others had continuous measures. Therefore, 53 of the 108 studies met our inclusion criteria.

Data-extraction

The study followed the methods proposed by the Cochrane Effective Practice and Organisation of Care (EPOC) group[14]. Two independent reviewers extracted data on study design, methodological quality, participants, study settings, target behaviours, characteristics of interventions and study results, according to the EPOC checklist[15]. A second data extraction was done to assess potential organisational effect modifiers in hospital studies. Management support was regarded as positive if the manuscript gave information on direct support from the hospital management for the intervention, such as funding, or when the project was initiated by the hospital management or was set up as a result of hospital quality improvement strategies. "Academic hospital" was taken as the proxy for learning environment. Functional differentiation was operationalized by noting whether more than one specialty had been involved in the intervention (e.g. internal medicine and gynaecology), whether more than one type of physician had been involved (e.g. specialists and residents), or whether other professions (e.g. trained nurses) had been directly involved in the implementation process. Local consensus was regarded as present when explicit information was given that the guidelines had been developed at the hospital or when major adaptations had been made to external guidelines before introduction at that hospital. Local consensus was also considered to be present when the implementation strategies had been developed at the hospital.

Analysis

Analysis was based on the theoretical framework depicted in figure 1. The effect of the different intervention strategies on clinical outcomes was expected to be influenced by the organisational effect modifiers listed under the headings leadership, learning environment, functional differentiation and local consensus. Effects and modifiers may have different influences on clinical outcomes in inpatient or outpatient settings (figure 1).

In each comparison, the primary process of care measure was extracted, as defined by the authors. If multiple process of care measures were reported and none of them were defined as being the primary variable, effect sizes were ranked and the median value was taken. Effect sizes were constructed so that treatment benefits were denoted positively.

All statistical analyses were performed with the PROC MIXED procedure in SAS version 6.12. First, we estimated the treatment effect (log odds ratio) and its variance for each comparison, weighted for variance within the study and between studies. These estimated effects were used as responses in a random-effects meta-regression model, in which we corrected for multiple comparisons within a single study. In most studies a unit-of-analysis error was found and insufficient data were presented to calculate cluster sizes. We first ignored the unit-of-analysis error in order to analyse all the included studies. Then, for the studies that reported sufficient data on the numbers of participants and professionals, a sensitivity analysis was performed. We did this by calculating the design effect from the cluster sizes in each study, assuming a constant and conservative intracluster correlation of 0.20 [16], after which we re-ran the meta-regression model with and without correction for the unit-of-analysis error.
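
To make these quantities concrete, here is a minimal sketch, assuming hypothetical counts and a hypothetical mean cluster size (the authors used SAS PROC MIXED; the Python below is only an illustration): it derives the log odds ratio and its variance for one comparison from a 2 x 2 table, and inflates the variance by the design effect 1 + (m - 1) x 0.20 used in the sensitivity analysis.

```python
# Illustrative sketch only (not the authors' SAS PROC MIXED analysis).
# Counts and the mean cluster size below are hypothetical.
import math

def log_odds_ratio(events_i, total_i, events_c, total_c):
    """Log odds ratio of guideline-consistent care (intervention vs control)
    and its variance from the usual 2x2-table approximation."""
    a, b = events_i, total_i - events_i   # intervention group: yes / no
    c, d = events_c, total_c - events_c   # control group: yes / no
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return log_or, var

def design_effect(mean_cluster_size, icc=0.20):
    """Variance inflation for cluster allocation: 1 + (m - 1) * rho,
    with the conservative intracluster correlation of 0.20 from the text."""
    return 1 + (mean_cluster_size - 1) * icc

log_or, var = log_odds_ratio(120, 200, 80, 200)         # hypothetical counts
var_adjusted = var * design_effect(mean_cluster_size=15)
print(round(math.exp(log_or), 2), round(var, 3), round(var_adjusted, 3))
```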

To measure the effect of each individual intervention strategy, effect sizes were adjusted for other intervention components that appeared in at least one third of the studies on each strategy. Adjustment for other interventions that appeared less frequently was not possible due to small numbers. To evaluate the relative effectiveness of different intervention components, covariates were included in the model.
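
The covariate adjustment described above can likewise be sketched under stated assumptions. The fragment below is not the authors' model: it is a simple random-effects meta-regression of per-comparison log odds ratios on 0/1 indicators for intervention components, so that each coefficient reflects a component's contribution with the co-occurring components held constant. The component names, the data and the crude moment estimate of the between-study variance are all illustrative assumptions.

```python
# Illustrative random-effects meta-regression with component indicators
# (hypothetical data; simplified moment estimator of tau^2, not the
# authors' SAS PROC MIXED model).
import numpy as np

log_or = np.array([0.9, 0.4, 1.2, 0.2, 0.7])        # per-comparison log odds ratios
var_i  = np.array([0.10, 0.08, 0.15, 0.05, 0.12])   # within-comparison variances
X = np.array([                                      # intercept, reminders, feedback
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 1],
    [1, 0, 0],
    [1, 1, 0],
], dtype=float)

def wls(y, X, w):
    """Weighted least squares: solve (X' W X) beta = X' W y."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Step 1: fixed-effect fit (tau^2 = 0), then a crude moment estimate of tau^2.
beta_fe = wls(log_or, X, 1 / var_i)
resid = log_or - X @ beta_fe
k, p = X.shape
tau2 = max(0.0, (np.sum(resid ** 2 / var_i) - (k - p)) / np.sum(1 / var_i))

# Step 2: random-effects fit with the between-study variance added to the weights.
beta_re = wls(log_or, X, 1 / (var_i + tau2))
print("adjusted odds ratios (intercept, reminders, feedback):",
      np.round(np.exp(beta_re), 2))
```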

Results

The 53 trials yielded 81 comparisons. The appendix gives an overview of these studies and the intervention components of each comparison, compared with the intervention components, if present, in the control group [see Additional file 1]. The trials comprised 39 randomised controlled trials (of which 32 were cluster randomised controlled trials), 7 controlled clinical trials and 7 controlled before-and-after studies; 19 were inpatient studies [17–36], 28 were outpatient studies [37–64] and 6 had mixed settings [65–70]. Of the 81 comparisons, 22 involved a single intervention. The mean number of intervention components per comparison was 2.5 (SD 1.3).

Table 1 shows the results of the meta-analysis on the effects of various intervention components. Taken together, the odds ratio across all intervention strategies was 2.13 (95% CI 1.72–2.65). These overall results are visualised in a forest plot in figure 2. The odds ratio in the forest plot is slightly different because we could not correct for different interventions within studies. Single interventions consisted of reminders or feedback; no other intervention strategies were applied as a single intervention strategy. The overall odds ratio was 2.18 in single-intervention studies, versus 1.77 in studies with more than one intervention component, the so-called multifaceted intervention studies. The intervention components applied most frequently were reminders, feedback, educational meetings and educational material. Only a few studies included outreach visits, consensus meetings, financial interventions or the involvement of an opinion leader. With regard to the specific components of individual intervention strategies, we found that all components showed positive effects, except for consensus meetings, outreach visits and financial interventions, possibly due to small numbers.

Figure 2 Forest plot: total effect of guideline implementation in hospitals.

Table 1 Effectiveness of specific intervention components

To learn more about the contribution of each intervention strategy to a multifaceted approach, we adjusted for co-occurring intervention components in order to identify the unique contribution of a specific intervention component while the other components were held constant. Although adjustments could only be made for intervention components that co-occurred in at least one third of the comparisons, we saw substantial changes in the results. The effects of educational material, reminders and feedback remained statistically significant, while the effects of educational meetings and patient-mediated interventions disappeared. The effect of the latter might be explained by the other co-occurring intervention components. Furthermore, revision of professional roles appeared to be a strong component of the intervention strategies, alongside organisational interventions, although the latter were not significant. Sensitivity analysis showed that, within the 47 comparisons in which cluster sizes could be calculated, adjustment for clustering effects produced only small changes in the effect sizes. No effect sizes became non-significant after adjustment for the design effect. Table 2 describes the effect of the different organisational factors on outcome measures. For most organisational effect modifiers, no significant differences in outcomes were found. Only in inpatient care did academic hospitals show greater improvements than community general hospitals.

Table 2 Effects of the presence of organisational features to improve the quality of care at hospitals

However, in outpatient studies, community hospitals showed significantly larger effects. Furthermore, interventions that had not been developed internally, but had originated from outside the hospital, led to better outcomes, especially in outpatient studies.

Discussion

This is the first systematic review to make an in-depth exploration of guideline implementation in relation to the organisational characteristics of hospitals. Contrary to our expectation that multifaceted interventions would prevail[69], single interventions seemed to be effective as well as multifaceted ones. Single intervention strategies, particularly reminders, known to be effective in other settings, also appeared to be effective in hospitals. Although a multifaceted intervention including reminders may be effective, a single reminder strategy might provide a clearer or more consistent message and thus have more impact. Furthermore, educational material, reminders, feedback and revision of professional roles had more effect than other intervention strategies. We did not confirm our hypotheses on the influence of organisational factors, except for a learning environment in inpatient settings. Contrary to our expectations, effects were greater at community hospitals in outpatient studies. In some multi-centre studies, no difference was found between academic and community hospitals [27, 28], or only a moderate positive effect was found for academic hospitals[70]. External factors, such as the origin of the intervention, seemed to have more impact than internal factors.

This review of the research literature has the limitation that it was an explorative, retrospective study that may have suffered from publication bias. There may also have been reporting bias, because studies with different aims were brought together and compared, even though measuring the influence of contextual factors was not their research question. Because the studies contained limited organisational data, the validity of the measurements of organisational effect modifiers can be challenged. A related limitation concerns the validity of the determinants, for example the use of teaching hospitals as a proxy for learning environment when no other valid measurements of learning environment could be found in the manuscripts. Other potential effect modifiers, such as organisational slack[9], had to be disregarded because no data were available. In addition, the number of studies was limited, partly because studies were excluded if they did not report dichotomous measures or did not have a concurrent control group (interrupted time series designs). Adding these studies might have given our analysis greater power and perhaps led to other conclusions. The analysis was not corrected for clustering effects, which may have led to an overestimation of the statistical significance of the effects. Finally, the study is limited by the fact that the exhaustive search strategy and data extraction precluded the inclusion of studies published after 1998. An update of this review in a new study is recommended, including the possibility of exploring the inclusion of interrupted time series designs and studies with continuous measures.

Theories on organisational effect modifiers are mostly based on study results from a wide range of fields inside and outside health care, and when they are based on health care, this mostly concerns primary care. Although organisational factors are widely regarded as crucial for quality improvement, there is only limited research evidence for the claims made. The influence of possible organisational effect modifiers on the quality of care at hospitals needs more research attention, and more evidence from inside rather than outside hospitals is needed for the development of theories in this field. Ongoing research should develop validated measures of potentially important organisational constructs and explore their influence on quality of care.

Conclusion

There is no 'magic bullet' in terms of the most effective strategy or organisational effect modifiers for the implementation of change within hospitals. At the organisational level, barriers to and facilitators of effective interventions are unclear. Depending on management policy and other local factors, such as the funds available and the motivation of health care personnel, hospitals might wish to focus on building a learning organisation [71] or on adopting proven, effective strategies from outside.

References

  1. Grol R, Grimshaw JM: From best evidence to best practice: effective implementation of change in patients' care. Lancet. 2003, 362: 1225-1230. 10.1016/S0140-6736(03)14546-1.

  2. Grimshaw JM, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, Whitty P, Eccles MP, Matowe L, Shirran L, Wensing M, Dijkstra R, Donaldson C: Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004, 8 (6).

  3. Di Blasi Z, Harkness E, Ernst E, Georgiou A, Kleijnen J: Influence of context effects on health outcomes: a systematic review. Lancet. 2001, 357 (9258): 757-762. 10.1016/S0140-6736(00)04169-6.

  4. Vaughn TE, McCoy K, BootsMiller BJ, Woolson RF, Sorofman B, Tripp-Reimer T, Perlin J, Doebbeling BN: Organizational predictors of adherence to ambulatory care screening guidelines. Med Care. 2002, 40 (12): 1172-1185. 10.1097/00005650-200212000-00005.

  5. Grimshaw JM, Shirran L, Thomas R, Mowatt G, Fraser C, Bero L, Grilli R, Harvey E, Oxman A, O'Brien MA: Changing provider behavior: an overview of systematic reviews of interventions. Med Care. 2001, 39 (8 Suppl 2): 2-45.

  6. Lipman T: Power and influence in clinical effectiveness and evidence-based medicine. Fam Pract. 2000, 17 (6): 557-563. 10.1093/fampra/17.6.557.

  7. Gustafson DH, Hundt AS: Findings of innovation research applied to quality management principles for health care. Health Care Manage Rev. 1995, 20 (2): 16-33.

  8. Weiner JP, Parente ST, Garnick DW, Fowles J, Lawthers AG, Palmer RH: Variation in office-based quality. A claims-based profile of care provided to Medicare patients with diabetes. JAMA. 1995, 273 (19): 1503-1508. 10.1001/jama.273.19.1503.

  9. Nevis EC, Di Bella AJ, Gould JM: Understanding organizations as learning systems. Sloan Management Review. 1995, 73: 73-85.

  10. Damanpour F: Organizational innovation: a meta-analysis of effects of determinants and moderators. Acad Manage J. 1991, 34: 555-590. 10.2307/256406.

  11. Dickinson E: Using marketing principles for healthcare development. Qual Health Care. 1995, 4: 40-4.

  12. Shortell SM, Jones RH, Rademaker AW, Gillies RR, Dranove DS, Hughes EF, Budetti PP, Reynolds KS, Huang CF: Assessing the impact of total quality management and organizational culture on multiple outcomes of care for coronary artery bypass graft surgery patients. Med Care. 2000, 38 (2): 207-217. 10.1097/00005650-200002000-00010.

  13. Plsek PE, Wilson T: Complexity, leadership, and management in healthcare organisations. BMJ. 2001, 323 (7315): 746-749. 10.1136/bmj.323.7315.746.

  14. Tassone MR, Heck CS: Motivational orientations of allied health professionals participating in continuing education. J Contin Educ Health Prof. 1997, 17: 97-105.

  15. Bero L, Grilli R, Grimshaw JM, Mowatt G, Oxman A, Zwarenstein M: Cochrane Effective Practice and Organisation of Care Review Group. The Cochrane Library. 2001, 3

  16. Campbell M, Grimshaw J, Steen N: Sample size calculations for cluster randomised trials. Changing Professional Practice in Europe Group. J Health Serv Res Policy. 2000, 5: 12-6.

  17. Hay JA, Maldonado L, Weingarten SR, Ellrodt AG: Prospective evaluation of a clinical guideline recommending hospital length of stay in upper gastrointestinal tract hemorrhage. JAMA. 1997, 278 (24): 2151-2156. 10.1001/jama.278.24.2151.

  18. Overhage JM, Tierney WM, Zhou XH, McDonald CJ: A randomized trial of "corollary orders" to prevent errors of omission. J Am Med Inform Assn. 1997, 4 (5): 364-375.

  19. Leviton LC, Goldenberg RL, Baker CS, Schwartz RM, Freda MC, Fish LJ, Cliver SP, Rouse DJ, Chazotte C, Merkatz IR, Raczynski JM: Methods to encourage the use of antenatal corticosteroid therapy for fetal maturation: a randomized controlled trial. JAMA. 1999, 281 (1): 46-52. 10.1001/jama.281.1.46.

  20. Weingarten S, Riedinger MS, Conner L, Johnson B, Ellrodt AG: Reducing lengths of stay in the coronary care unit with a practice guideline for patients with congestive heart failure Insights from a controlled clinical trial. Med Care. 1994, 32 (12): 1232-1243.

  21. Weingarten S: Practice guidelines and reminders to reduce duration of hospital stay for patients with chest pain. An interventional trial. Ann Intern Med. 1994, 120 (4): 257-263.

  22. Kong GK, Belman MJ, Weingarten S: Reducing length of stay for patients hospitalized with exacerbation of COPD by using a practice guideline. Chest. 1997, 111 (1): 89-94.

  23. Weingarten S, Riedinger MS, Hobson P, Noah MS, Johnson B, Giugliano G, Norian J, Belman MJ, Ellrodt AG: Evaluation of a pneumonia practice guideline in an interventional trial. Am J Resp Crit Care. 1996, 153 (3): 1110-1115.

  24. Fowkes FG, Davies ER, Evans KT, Green G, Hartley G, Hugh AE, Nolan DJ, Power AL, Roberts CJ, Roylance J: Multicentre trial of four strategies to reduce use of a radiological test. Lancet. 1986, 1 (8477): 367-370. 10.1016/S0140-6736(86)92327-5.

  25. Robinson MB, Thompson E, Black NA: Evaluation of the effectiveness of guidelines, audit and feedback: improving the use of intravenous thrombolysis in patients with suspected acute myocardial infarction. Int J Qual Health Care. 1996, 8: 211-222. 10.1016/1353-4505(96)00033-6.

  26. Soumerai SB, McLaughlin TJ, Gurwitz J, Guagagnoli E, Hauptman PL, Borbas C, Morris N, McLaughlin B, Gao X, Willison DJ, Asinger R, Gobel F: Effect of local medical opinion leaders on quality of care for acute myocardial infarction: a randomized controlled trial. JAMA. 1998, 279 (17): 1358-1363. 10.1001/jama.279.17.1358.

  27. Wirtschafter DD, Sumners J, Jackson JR, Brooks CM, Turner M: Continuing medical education using clinical algorithms. A controlled-trial assessment of effect on neonatal care. Am J Dis Child. 1986, 140 (8): 791-797.

  28. Soumerai SB, Salem-Schatz S, Avorn J, Casteris CS, Ross-Degnan D, Popovsky MA: A controlled trial of educational outreach to improve blood transfusion practice. JAMA. 1993, 270 (8): 961-966. 10.1001/jama.270.8.961.

  29. Anderson FA, Wheeler HB, Goldberg RJ, Hosmer DW, Forier A, Patwardhan NA: Changing clinical practice. Prospective study of the impact of continuing medical education and quality assurance programs on use of prophylaxis for venous thromboembolism. Arch Intern Med. 1994, 154 (6): 669-677. 10.1001/archinte.154.6.669.

  30. Sommers LS, Sholtz R, Shepherd RM, Starkweather DB: Physician involvement in quality assurance. Med Care. 1984, 22 (12): 1115-1138.

  31. Overhage JM, Tierney WM, McDonald CJ: Computer reminders to implement preventive care guidelines for hospitalized patients. Arch Intern Med. 1996, 156 (14): 1551-1556. 10.1001/archinte.156.14.1551.

  32. Lomas J, Enkin MW, Anderson GM, Hannah WJ, Vayda E, Singer J: Opinion leaders vs audit and feedback to implement practice guidelines. Delivery after previous cesarean section. JAMA. 1991, 265 (17): 2202-2207. 10.1001/jama.265.17.2202.

  33. Dranitsaris G, Warr D, Puodziunas A: A randomized trial of the effects of pharmacist intervention on the cost of antiemetic therapy with ondansetron. Supportive Care in Cancer. 1995, 3 (3): 183-189. 10.1007/BF00368888.

  34. Danchaivijitr S, Chokloikaew S, Tangtrakool T, Waitayapiches S: Does indication sheet reduce unnecessary urethral catheterization?. J Med Assoc Thai. 1992, 75 (Suppl 2): 1-5.

  35. Girotti MJ, Fodoruk S, Irvine-Meek J, Rotstein OD: Antibiotic handbook and pre-printed perioperative order forms for surgical antibiotic prophylaxis: do they work?. Can J Surg. 1990, 33 (5): 385-388.

  36. MacCosbe PE, Gartenberg G: Modifying empiric antibiotic prescribing: experience with one strategy in a medical residency program. Hosp Formul. 1985, 20: 986-999.

  37. Struewing JP, Pape DM, Snow DA: Improving colorectal cancer screening in a medical residents' primary care clinic. Am J Prev Med. 2002, 7 (2): 75-81.

  38. Aucott JN, Pelecanos E, Dombrowski R, Fuehrer SM, Laich J, Aron DC: Implementation of local guidelines for cost-effective management of hypertension. A trial of the firm system. J Gen Intern Med. 1996, 11 (3): 139-146.

  39. Cheney C, Ramsdell JW: Effect of medical records' checklists on implementation of periodic health measures. Am J Med. 1987, 83 (1): 129-136. 10.1016/0002-9343(87)90507-9.

  40. Cohen DI, Littenberg B, Wetzel C, Neuhauser D: Improving physician compliance with preventive medicine guidelines. Med Care. 1982, 20 (10): 1040-1045.

  41. Cohen SJ, Christen AG, Katz BP, Drook CA, Davis BJ, Smith DM, Stookey GK: Counseling medical and dental patients about cigarette smoking: the impact of nicotine gum and chart reminders. Am J Public Health. 1987, 77 (3): 313-316.

  42. Litzelman DK, Dittus RS, Miller ME, Tierney WM: Requiring physicians to respond to computerized reminders improves their compliance with preventive care protocols. J Gen Intern Med. 1993, 8 (6): 311-317.

  43. McDonald CJ, Hui SL, Smith DM, Tierney WM, Cohen SJ, Weinberger M, McCabe GP: Reminders to physicians from an introspective computer medical record. A two-year randomized trial. Ann Intern Med. 1984, 100 (1): 130-138.

  44. Thomas JC, Moore A, Qualls PE: The effect on cost of medical care for patients treated with an automated clinical audit system. J Med Syst. 1983, 7 (3): 307-313. 10.1007/BF00993294.

  45. Herman CJ, Speroff T, Cebul RD: Improving compliance with immunization in the older adult: results of a randomized cohort study. J Am Geriatr Soc. 1994, 42 (11): 1154-1159.

  46. Mayefsky JH, Foye HR: Use of a chart audit: teaching well child care to paediatric house officers. Med Educ. 1993, 27 (2): 170-174.

  47. Robie PW: Improving and sustaining outpatient cancer screening by medicine residents. South Med J. 1988, 81 (7): 902-905.

  48. Nattinger AB, Panzer RJ, Janus J: Improving the utilization of screening mammography in primary care practices. Arch Intern Med. 1989, 149 (9): 2087-2092. 10.1001/archinte.149.9.2087.

  49. Nilasena DS, Lincoln MJ: A computer-generated reminder system improves physician compliance with diabetes preventive care guidelines. Proc Annu Symp Comput Appl Med Care. 1995, 640-645.

  50. Zenni EA, Robinson TN: Effects of structured encounter forms on pediatric house staff knowledge, parent satisfaction, and quality of care. A randomized, controlled trial. Arch Pediat Adol Med. 1996, 150 (9): 975-980.

  51. McDonald CJ: Use of a computer to detect and respond to clinical events: its effect on clinician behavior. Ann Intern Med. 1976, 84 (2): 162-167.

  52. Callahan CM, Hendrie HC, Dittus RS, Brater DC, Hui SL, Tierney WM: Improving treatment of late life depression in primary care: a randomized clinical trial. J Am Geriatr Soc. 1994, 42 (8): 839-846.

  53. Marton KI, Tul V, Sox HC: Modifying test-ordering behavior in the outpatient medical clinic. A controlled trial of two educational interventions. Arch Intern Med. 1985, 145 (5): 816-821. 10.1001/archinte.145.5.816.

  54. Mazzuca SA, Vinicor F, Einterz RM, Tierney WM, Norton JA, Kalasinski LS: Effects of the clinical environment on physicians' response to postgraduate medical education. American Educational Research Journal. 1990, 27: 473-488. 10.2307/1162932.

  55. Becker DM, Gomez EB, Kaiser DL, Yoshihasi A, Hodge RH: Improving preventive care at a medical clinic: how can the patient help?. Am J Prev Med. 1989, 5 (6): 353-359.

  56. Rogers JL, Haring OM, Wortman PM, Watson RA, Goetz JP: Medical information systems: assessing impact in the areas of hypertension, obesity and renal disease. Med Care. 1982, 20 (1): 63-74.

  57. Headrick LA, Speroff T, Pelecanos HI, Cebul RD: Efforts to improve compliance with the National Cholesterol Education Program guidelines. Results of a randomized controlled trial. Arch Intern Med. 1992, 152 (12): 2490-2496. 10.1001/archinte.152.12.2490.

  58. Belcher DW: Implementing preventive services. Success and failure in an outpatient trial. Arch Intern Med. 1990, 150 (12): 2533-2541. 10.1001/archinte.150.12.2533.

  59. Safran C, Rind DM, Davis RB, Ives D, Sands DZ, Currier J, Slack WV, Makadon HJ, Cotton DJ: Guidelines for management of HIV infection with computer-based patient's record. Lancet. 1995, 346 (8971): 341-346. 10.1016/S0140-6736(95)92226-1.

  60. Gonzalez JJ, Ranney J, West J: Nurse-initiated health promotion prompting system in an internal medicine residents' clinic. South Med J. 1989, 82 (3): 342-344.

  61. Brady WJ, Hissa DC, McConnell M, Wones R: Should physicians perform their own quality assurance audits?. J Gen Intern Med. 1988, 3 (6): 560-565.

  62. Hopkins JA, Shoemaker WC, Greenfield S: Treatment of surgical emergencies with and without an algorithm. Arch Surg. 1980, 115 (6): 745-750.

  63. Vissers MC, Biert J, Linden CJ vd, Hasman A: Effects of a supportive protocol processing system (ProtoVIEW) on clinical behaviour of residents in the accident and emergency department. Comp Methods Programs Biomed. 1996, 49 (2): 177-184. 10.1016/0169-2607(95)01714-3.

  64. Auleley GR, Ravaud P, Giraudeau B, Kerboull L, Nizard R, Massin P, Garreau de Loubresse C, Vallee C, Durieux P: Implementation of the Ottawa ankle rules in France. A multicenter randomized controlled trial. JAMA. 1997, 277 (24): 1935-1939. 10.1001/jama.277.24.1935.

  65. Lee TH, Pearson SD, Johnson PA, Garcia TB, Weisberg MC, Guadagnoli E, Cook EF, Goldman L: Failure of information as an intervention to modify clinical management. A time-series trial in patients with acute chest pain. Ann Intern Med. 1995, 122 (6): 434-437.

  66. Anonymous: CCQE-AHCPR guideline criteria project: develop, apply, and evaluate medical review criteria and educational outreach based upon practice guidelines: final project report. Edited by: Jefferson St. 1996, Rockville, MD: U.S. Dept. of Health and Human Services, Public Health Service, Agency for Health Care, Policy and Research

  67. Chassin MR: A randomized trial of medical quality assurance. Improving physicians' use of pelvimetry. JAMA. 1986, 256 (8): 1012-1016. 10.1001/jama.256.8.1012.

  68. Szilagyi PG, Rodewald LE, Humiston SG, Pollard L, Klossner K, Jones AM, Barth R, Woodin KA: Reducing missed opportunities for immunizations. Easier said than done. Arch Pediat Adol Med. 1996, 150 (11): 1193-1200.

  69. Burack RC, Gimotty PA, George J, Stengle W, Warbasse L, Moncrease A: Promoting screening mammography in inner-city settings: a randomized controlled trial of computerized reminders as a component of a program to facilitate mammography. Med Care. 1994, 32 (6): 609-624.

  70. Evans AT, Rogers LQ, Peden JG Jr, Seelig CB, Layne RD, Levine MA, Levin ML, Grossman RS, Darden PM, Jackson SM, Ammerman AS, Settle MB, Stritter FT, Fletcher SW: Teaching dietary counseling skills to residents: patient and physician outcomes. The CADRE Study Group. Am J Prev Med. 1996, 12 (4): 259-265.

  71. Garside P: The learning organisation: a necessary setting for improving care?. Qual Health Care. 1999, 8 (4): 211.

Acknowledgements

We are grateful to our colleagues involved in the systematic review of guideline dissemination and implementation strategies across all settings, especially Cynthia Fraser, Graeme MacLennan, Craig Ramsay, Paula Whitty, Martin Eccles, Lloyd Matowe and Liz Shirran. The systematic review of guideline dissemination and implementation strategies across all settings was funded by the UK NHS Health Technology Assessment Programme. Dr Ruth Thomas is funded by a Wellcome Training Fellowship in Health Services Research (grant number GR063790MA). The Health Services Research Unit is funded by the Chief Scientist Office of the Scottish Executive Department of Health. Dr Jeremy Grimshaw holds a Canada Research Chair in Health Knowledge Transfer and Uptake. However, the views expressed are those of the authors and not necessarily those of the funders.

Author information

Corresponding author

Correspondence to Rob Dijkstra.

Additional information

Competing interests

The author(s) declare that they have no competing interests.

Authors' contributions

RD, MW, JB and RG drew up the design and framework of this manuscript. Data extraction on professional performance and intervention strategies was done by RT, JG, RD and MW. Data extraction on organisational features was done by RD and MW. Statistical analysis was done by RA. All authors read and approved the final manuscript.

Electronic supplementary material

12913_2005_230_MOESM1_ESM.doc

Additional File 1: Studies to implement guidelines at hospitals. The reviewed studies are listed with details on type of study, setting, target, number of patients, intervention strategies and post-study percentages on the primary outcome measure. (DOC 133 KB)

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Dijkstra, R., Wensing, M., Thomas, R. et al. The relationship between organisational characteristics and the effects of clinical guidelines on medical performance in hospitals, a meta-analysis. BMC Health Serv Res 6, 53 (2006). https://doi.org/10.1186/1472-6963-6-53

  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1186/1472-6963-6-53

Keywords