
Advancing clinical decision support using lessons from outside of healthcare: an interdisciplinary systematic review

Abstract

Background

Greater use of computerized decision support (DS) systems could address continuing safety and quality problems in healthcare, but the healthcare field has struggled to implement DS technology. This study surveyed DS experience across multiple non-healthcare disciplines for new insights that are generalizable to healthcare provider decisions. In particular, it sought design principles and lessons learned from other disciplines that could inform efforts to accelerate the adoption of clinical decision support (CDS).

Methods

Our systematic review drew broadly from non-healthcare databases in the basic sciences, social sciences, humanities, engineering, business, and defense: PsychINFO, BusinessSource Premier, Social Sciences Abstracts, Web of Science, and Defense Technical Information Center. Because our interest was in DS that could apply to clinical decisions, we selected articles that (1) provided a review, overview, discussion of lessons learned, or an evaluation of design or implementation aspects of DS within a non-healthcare discipline and (2) involved an element of human judgment at the individual level, as opposed to decisions that can be fully automated or that are made at the organizational level.

Results

Clinical decisions share some similarities with decisions made by military commanders, business managers, and other leaders: they involve assessing new situations and choosing courses of action with major consequences, under time pressure, and with incomplete information. We identified seven high-level DS system design features from the non-healthcare literature that could be applied to CDS: providing broad, system-level perspectives; customizing interfaces to specific users and roles; making the DS reasoning transparent; presenting data effectively; generating multiple scenarios covering disparate outcomes (e.g., effective; effective with side effects; ineffective); allowing for contingent adaptations; and facilitating collaboration. The article provides examples of each feature. The DS literature also emphasizes the importance of organizational culture and training in implementation success. The literature contrasts “rational-analytic” and “naturalistic-intuitive” decision-making styles, but a balanced approach that combines both styles is often best. It is also important for DS systems to enable exploration of multiple assumptions and incorporation of new information in response to changing circumstances.

Conclusions

Complex, high-level decision-making has common features across disciplines as seemingly disparate as defense, business, and healthcare. National efforts to advance the health information technology agenda through broader CDS adoption could benefit by applying the DS principles identified in this review.

Background

Healthcare lags behind many other disciplines in the diffusion of technology. In the area of computer-based decision support (DS), other disciplines have been using DS systems since the 1970s[1]. In the fields of defense, energy, environment, finance, business strategy, and public policy[2], tools such as knowledge management, expert systems, exploratory analysis, and data mining are ubiquitous. Although the need for DS to support clinical decision-making is considerable, the spread of health information technology (HIT) with clinical decision support (CDS) in U.S. healthcare has been slow[3].

While both clinical and non-clinical DS systems have faced implementation challenges, lessons learned from other disciplines, particularly those in which DS use is more widespread, could inform efforts to advance the adoption of CDS. Interdisciplinary approaches, when used in health services research, can be useful in finding solutions that generalize across problems of a similar fundamental nature, identifying the full complexity of problems, and generating new insights[4]. Patient-safety initiatives have benefited from applying strategies from commercial aviation to reduce medical errors[5]; likewise, HIT efforts could benefit from an interdisciplinary examination of DS applications.

Current CDS systems vary widely in their structure and function, ranging from medication dosing support to triage tools to workflow planning[6], and this variation is present in non-healthcare disciplines as well. We do not make comparisons between specific types of clinical and non-clinical DS systems (e.g., computerized-physician-order entry vs. an analogous non-clinical tool), but instead aim to synthesize general lessons learned from the body of literature on design features of DS tools in relevant, non-healthcare applications.

Previous research has documented common features of successful CDS. Kawamoto et al. described 15 features that were repeatedly found in the literature; of these, 4 were statistically associated with success (integration with workflow, giving recommendations rather than assessments, giving DS at the time and place of decision-making, and giving computer-based DS)[7]. Mollon et al. identified 28 technical, user interaction, logic-based, and developmental/administrative environment features[8]. Bates et al. used experiences in CDS to recommend ten principles for effective CDS design[9]. The existing research focuses on design features and policy recommendations to encourage adoption[10].

Early CDS efforts arose from the same Bayesian reasoning principles that gave rise to early DS tools in non-healthcare disciplines[11]. However, these early CDS tools did not find their way into routine clinical practice. The degree to which subsequent CDS efforts were informed by non-healthcare experience is unknown, but we are not aware of explicit discussions of how and why non-healthcare applications might be transferable. This review contributes to the literature by summarizing broad lessons learned, based on a fresh, and perhaps novel, interdisciplinary look outside of the clinical realm. In non-healthcare research, Burstein and Holsapple compiled examples of DS systems and applications[12, 13], but not as a comprehensive review. We found no studies that reviewed how other disciplines’ experience with DS could inform CDS.

Objective

Our goal was to survey the literature on DS system design and implementation in non-healthcare disciplines, in order to contribute to our understanding of how to accelerate the adoption of CDS. We summarize key successes, best practices, and lessons learned from non-healthcare decision-making processes, and we identify tools that may inform the design of CDS. The primary focus is on DS systems, but as systems are only as good as the foundations they are built upon, this paper also briefly describes literature on decision-making principles.

Healthcare CDS has been defined as: "Health information technology functionality that builds upon the foundation of an electronic health record (EHR) to provide persons involved in care processes with general and person-specific information, intelligently filtered and organized, at appropriate times, to enhance health and health care"[14]. Our intent was, in reviewing non-healthcare DS, to take a broad view that would include applications based on artificial intelligence, knowledge management, neural networks, collaborative decision support, expert systems, and other methods.

Methods

This study was an interdisciplinary systematic review of literature pertaining to DS systems in non-healthcare disciplines. Interdisciplinary research can provide valuable new insights, but synthesizing articles across disciplines with highly varied standards, formats, terminology, and methods required an adapted approach. The study methodology applied the basic framework used for systematic reviews in healthcare[15]. We: 1) defined a scope of decisions and DS types that might be generalized to healthcare, 2) identified appropriate databases and performed searches within a defined timeframe for a fixed set of search terms, 3) selected abstracts for full-text review based on multiple-reviewer agreement, 4) developed selection criteria to exclude lower-quality articles, and 5) reviewed the remaining articles for common themes of interest, abstracting qualitative data. Our approach to generalizing healthcare systematic review methods was similar to that used by Evans and Benefield in examining educational research[16].

With the aid of a reference librarian who identified relevant sources, in August 2010 we searched a number of databases for peer-reviewed and grey literature published in 2005–2010, broadly representing the basic sciences, social sciences, humanities, engineering, and business disciplines. The databases were PsychINFO, BusinessSource Premier, Social Sciences Abstracts, and Web of Science. Search terms were narrowed using an iterative process, based on the number of results returned and a review of a sample of each set of results for relevance. The final search terms were: decision support, decision system, and expert system; broader search parameters yielded too many irrelevant results. Articles with terms related to healthcare topics in the title or abstract were excluded. We also used a comprehensive database for unclassified defense work, the Defense Technical Information Center (DTIC). Using DTIC required a modified search strategy: we extended the timeframe to 2000–2010 in order to capture more reports, added “commander” to the search terms so as to exclude documents about low-level tactical decisions not generalizable to physicians, and relaxed constraints on publication type because relatively little of the relevant high-quality information in defense-sponsored work is disseminated in journals.

The literature on DS is enormous, and a large proportion of articles were descriptions of a single system from the subjective, and possibly biased, viewpoint of developers or users. Given the wide variability in article quality and format across multiple non-healthcare disciplines, we applied criteria that selected for more objective review-style work and for decision support relating to high- or moderately high-level decisions akin to those made by physicians. The first selection criterion was that the work was a review, overview, discussion of lessons learned, or evaluative piece across a discipline, or for a design or implementation aspect of DS. This was an indirect strategy for assessing DS system success or quality, as systems included in reviews were felt to represent at least some level of success. The second criterion required the work to describe decisions with an element of individual human judgment, to ensure comparability to CDS. This excluded systems that fully automated decisions, such as models and systems related to optimization, industry/manufacturing, and systems engineering. We also excluded systems for which the object of decision-making was the organization, such as those dealing with organizational finance, long-term strategy, forecasting, or monitoring, as these also do not generalize well to clinical decisions for patient care. We abstracted findings related to one of three themes:

  1. Characteristics of decisions being supported – general features of the decisions that the systems in question were designed to support, with a focus on areas of similarity between healthcare and non-healthcare settings.

  2. Principles used in decision-making – theoretical approaches about how to structure and make decisions, which in turn could be used to conceptualize DS tools.

  3. Design features of DS tools – lessons learned, successful examples, and recommendations on how to improve the design and implementation of non-clinical DS aids, tools, and systems.

The initial search of five databases generated 890 unique results covering a wide range of disciplines. Of this set, one study author reviewed the titles and abstracts to identify a preliminary set that met the selection criteria, identifying 74 for further review. These 74 articles represented many disciplines, including environment, agriculture, natural resource management, urban planning, defense, law, and business. The other two study authors reviewed the preliminary set and identified 38 articles that met both selection criteria for full-text review. Where there was disagreement between authors, we erred on the side of inclusion. During the full-text review, 11 articles were excluded because they did not meet the two inclusion criteria or did not contain information related to the three themes of interest, leaving a final sample of 27 references represented in this review. Of these 27 references, 9 related to defense topics[17–25], 2 to business[26, 27], 1 to law[28], and 15 to decision science generally[29–43]. The initial set of articles included a broad range of disciplines, but the articles that passed the selection criteria for study quality and relevance, and that included content related to one of the three common themes, came from a smaller set of disciplines. We also included a few select works that we knew were highly relevant from our expertise in decision-making theory, but that did not appear in the literature search due to limitations of the search strategy[14, 44–48]. These additional references sharpen key concepts and principles that did arise in the initial search.

Results

Decision characteristics

Although the content of decisions varies across disciplines, the nature of the decisions and the DS tools used to inform them may have similarities. Table 1 compares similar characteristics of clinical and non-clinical decision parameters, with the purpose of setting the theoretical framework for subsequent DS tool development. Clinical and military decision making may appear to be polar opposites on the surface, but the examples in Table 1 show how high-level decision makers face many of the same types of challenges. Namely, they must make decisions that: 1) are urgent and require a quick initial response, 2) require rapid adaptation to changing circumstances and new information, 3) have life-or-death consequences, 4) have uncertain responses from opponents/adversaries due to incomplete, imperfect information and unpredictable behavior, and 5) require an understanding of, and an ability to synthesize, the interaction of various risks. DS tools serve to operationalize decision-making approaches, so it is important to view decision characteristics as the framework for subsequent nuts-and-bolts comparisons of specific tools.

Table 1 Analogies between clinical and defense decisions

Cognitive biases are also similar across clinical and non-clinical decision-makers. Individuals ranging from military commanders to manufacturing floor managers to trial juries may be subject to cognitive distortions that include availability, representativeness, anchoring and adjustment, and confirmation biases[17, 28, 33, 37]. Humans’ probabilistic reasoning ability is poor, and both physicians and patients may inaccurately interpret single probabilities, conditional probabilities, and relative risks when making decisions[52]. An objective, accurate understanding of the complete situation – often termed “situational awareness” or “the common operating picture” in defense – can mitigate these biases, which would otherwise lead the decision-maker down the wrong path[17, 22]. Likewise, new clinical diagnoses must be made in the context of the patient’s overall medical history as opposed to a purely problem-focused evaluation; otherwise, the problem may be misidentified.
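The weakness in probabilistic reasoning noted above is easy to make concrete. The short Python sketch below, which uses purely hypothetical test characteristics rather than figures from the reviewed literature, applies Bayes' theorem to show why a positive result from a seemingly accurate test can still imply a low probability of disease; restating the same calculation in natural frequencies, as Gigerenzer and Edwards recommend[52], is one way a DS tool could present such information.

```python
# Illustrative only: the prevalence, sensitivity, and specificity below are
# hypothetical and are not taken from any article in this review.

def positive_predictive_value(prevalence: float, sensitivity: float, specificity: float) -> float:
    """P(disease | positive test) via Bayes' theorem."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(prevalence=0.01, sensitivity=0.90, specificity=0.91)
print(f"Probability of disease given a positive test: {ppv:.1%}")  # about 9.2%

# Natural-frequency restatement: of 1,000 patients, about 10 have the disease and
# 9 of them test positive, while about 89 of the 990 without it also test positive,
# so only roughly 9 of 98 positive results reflect true disease.
```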

Principles of decision-making

Decision support is only as good as its underlying foundation, and an understanding of how to frame decision-making, regardless of how or whether DS tools are used, is a useful starting point. Principles of decision-making identified in the review address: 1) the distinction between rational-analytic and naturalistic-intuitive decision-making styles, 2) the utility of a flexible, adaptive, robust approach that considers multiple criteria and possibilities (often reflected as alternative scenarios), and 3) the notion of appropriate levels of trust in recommendations.

Two general approaches to decision-making have taken turns in the spotlight over time[53], described by Davis, Kulick and Egner[44] and Davis and Kahan[18] as the rational-analytic and naturalistic-intuitive styles. The former draws upon data, models, and breadth-first evaluation of a set of options to make decisions, whereas the latter exploits experience and judgment. Rational-analytic decision-making drove much of the initial development of DS systems[22]. It is useful in synthesizing large amounts of information and mitigating some cognitive and intellectual biases, but it is limited by narrowly defined computer-based rules and models, which cannot adapt to contextual, big-picture considerations[17, 18, 29]. Naturalistic-intuitive decision-making, on the other hand, considers a human element to be critical even in situations highly amenable to automation, as people can identify novel patterns, exploit synergies, and be more creative – i.e., finding solutions that would not have been found through procedural reasoning within a narrow methodology. However, it may lead to preventable error and is prone to cognitive and other biases. Even in an industrial manufacturing application, which is highly amenable to automation, Metaxiotis et al. noted that “the production manager is the expert who knows the whole production environment and its special features and who can handle changing and unexpected events, multiple objectives and other similar aspects where human flexibility is necessary”[35].

A well-rounded approach draws upon the strengths of each decision-making style while minimizing their respective weaknesses[18]. Figure 1 provides an example of one way of framing decisions, which could be used in a DS system, in which rational-analytic methods hypothetically generate a set of scenarios that could, in turn, be selected among using naturalistic-intuitive reasoning. Multi-scenario analysis such as this uses rational analysis not to optimize and provide a single answer, but to provide the decision-maker with a broad view of multiple hypotheses[22], relevant systems[20], or “branches and sequels” of possibilities[19]. As stated by Davis and Kahan, “high-level decision makers are commonly afflicted with deep uncertainties that materially affect the choice of a course of action … the solution is to adopt a course of action that is as flexible, adaptive, and robust as possible”[18]. Decisions that depend upon the validity of a single set of assumptions and criteria are not robust to uncertainties[18] and allow little room for judgment[33]. The key in DS system design is to provide enough information to give decision-makers a comprehensive view that mitigates the “fog of war,” but not to overload them with so much information that it creates a “glare of war” – i.e., information overload[22, 45, 46].

Figure 1. Example of rational-analytic and naturalistic decision styles combined.
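To make Figure 1's division of labor concrete, the following Python sketch is one hypothetical way to encode the pattern: a rational-analytic step enumerates "branches" of possibilities by projecting each candidate course of action under each set of assumptions, and the system then presents the full spread rather than a single optimized answer, leaving the final, naturalistic-intuitive selection to the decision-maker. The data structures, function names, and outcome labels are invented for illustration and are not drawn from any reviewed system.

```python
from dataclasses import dataclass
from itertools import product
from typing import Callable, Dict, List

@dataclass
class Branch:
    option: str                  # a candidate course of action (e.g., a treatment)
    assumptions: Dict[str, str]  # the uncertain conditions this branch assumes
    projected_outcome: str       # e.g., "effective", "effective with side effects", "ineffective"

def generate_branches(options: List[str],
                      uncertainties: Dict[str, List[str]],
                      project: Callable[[str, Dict[str, str]], str]) -> List[Branch]:
    """Rational-analytic step: project every option under every combination of assumptions."""
    names = list(uncertainties)
    branches = []
    for option in options:
        for combo in product(*(uncertainties[n] for n in names)):
            assumptions = dict(zip(names, combo))
            branches.append(Branch(option, assumptions, project(option, assumptions)))
    return branches

def present_for_selection(branches: List[Branch]) -> None:
    """Deliberately stops short of recommending one answer; the human applies judgment."""
    for b in branches:
        print(f"{b.option:<12} under {b.assumptions} -> {b.projected_outcome}")
```

In use, the `project` callable would be whatever model or rule base the rational-analytic side provides; the essential design choice illustrated here is that its output is a set of labeled scenarios rather than a single recommendation.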

Additionally, options generated by a DS system must be viewed with an appropriate level of trust – e.g., trust about whether the system performs as expected, or whether it has the right content and structure to provide accurate and appropriate recommendations for the situation. Some decision-makers may place inappropriately low levels of trust in DS systems, due to fear of technology and automation, concern about the reliability of the overall system or its recommendations, or discomfort with black-box methodologies[23, 24, 26]. Other users may be overly confident about the system’s ability, in which case the value of their intuitive expertise is diminished.
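As a small, hypothetical illustration of how a DS tool might encourage appropriate trust, in the spirit of informing users about a decision aid's reliability[24], the sketch below displays each recommendation together with the aid's measured historical accuracy for that kind of question, rather than as an unqualified answer. The accuracy figure and wording are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class CalibratedAdvice:
    advice: str
    historical_accuracy: float  # measured on past, validated cases of this question type

    def display(self) -> str:
        # Show the aid's track record alongside its advice so users can weigh it
        # against their own judgment, rather than trusting it blindly or not at all.
        return (f"{self.advice} "
                f"(this aid was correct on {self.historical_accuracy:.0%} of similar past cases)")

print(CalibratedAdvice(advice="Consider diagnosis X", historical_accuracy=0.82).display())
```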

Decision support system design features

In problem domains where DS is expected to be valuable, developers in non-healthcare fields have devised successful strategies for handling many problematic characteristics of decisions, while also paying attention to the principles of decision-making just discussed. These may be described in the literature as successes, lessons learned, or recommendations for improvement. They are organized in Table 2 as specific features of the design and implementation of DS tools, aids, and systems – ranging from non-technical decision aids to fully networked systems. The design features that emerged from the literature as keys to successful DS are elaborated upon in Table 2 and were grouped into six categories:

  1. DS that provides broad, system-level, big-picture perspectives helps to mitigate tunnel vision and support informed decision-making;

  2. DS customized to users, settings, and purposes improves performance, whereas generic systems tend to perform poorly in specific problem domains;

  3. DS protocols must be transparent in order to gain the appropriate level of user confidence (a minimal illustrative sketch follows Table 2);

  4. DS must organize and present data effectively in order to fully mitigate the cognitive barriers to processing large volumes of information;

  5. DS should allow users to generate/view multiple scenarios and options simultaneously in order to reap the benefits of both rational-analytic and naturalistic-intuitive decision-making; and

  6. Collaborative, group-based DS should be designed to draw upon multi-disciplinary expertise.

Table 2 Design features of non-clinical decision support applications
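As a minimal sketch of design features 3 and 4 above (transparent protocols and effective data presentation), the hypothetical rule evaluator below attaches a human-readable reasoning trace to every recommendation, so users can see which rule fired on which data and calibrate their trust accordingly. The rule name and threshold are invented for illustration and are not clinical guidance.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    applies: Callable[[Dict[str, float]], bool]
    advice: str

@dataclass
class Recommendation:
    advice: str
    because: List[str] = field(default_factory=list)  # the reasoning trace shown to the user

def evaluate(rules: List[Rule], data: Dict[str, float]) -> List[Recommendation]:
    """Return advice together with the rule and data that produced it (transparency)."""
    recommendations = []
    for rule in rules:
        if rule.applies(data):
            recommendations.append(Recommendation(
                advice=rule.advice,
                because=[f"rule '{rule.name}' matched on inputs {data}"]))
    return recommendations

# Hypothetical example with a made-up threshold:
rules = [Rule(name="renal-dose-check",
              applies=lambda d: d.get("creatinine_mg_dl", 0.0) > 1.5,
              advice="Review doses of renally cleared medications")]
for rec in evaluate(rules, {"creatinine_mg_dl": 2.1}):
    print(rec.advice, "| because:", "; ".join(rec.because))
```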

Decision support system implementation

Good system design is an obvious prerequisite for effective implementation, as detailed in Table 2. Beyond design considerations, two overall themes emerged related to implementation: 1) the need for continual system improvements, and 2) effective user training that addresses individual and organizational issues during adoption. Although these are not new lessons for CDS research, their recurrence in non-clinical DS research should reinforce them as a priority in CDS. We also describe a few insights into how these issues are addressed in non-healthcare realms.

Continual improvement is important in non-clinical DS because it addresses the need for systems to be flexible and adaptive and to avoid perpetuating rules or data that are inaccurate or outdated. DS systems should be continually re-evaluated and fine-tuned, and implementation should not be viewed as a one-time task. Systems should never be viewed as final, since they must change frequently based on the nature of the problem, users, and environment. DS development should follow a three-part cycle – initiation, analysis, and delivery – which should be revisited[33]. With respect to user training, DS implementation may fail because users are not properly trained on how to use the system, do not fully understand the system’s capabilities and efficient ways to take advantage of them, or do not understand their roles and responsibilities. The consequence is inappropriate use or non-use of a tool. The user training process could avoid these problems by informing users about the reliability of the decision aid to encourage appropriate levels of trust[24]; instructing users on the nature of change and teaching them new behaviors in order to “unfreeze” them from old ways of thinking about DS[33]; encouraging continuous learning from the tool[35]; and providing realistic training simulations[26].

In addition to the content of the training, a strong theme was the importance of organizational support. Leadership should motivate decision-makers to use systems by building an accepting, supportive, non-coercive environment and encouraging the consistent, continued use of systems[43]. Leaders should find a champion – an in-house expert who is also an end user and decision-maker – who will promote the system within the organization[27]. Finally, they should define the roles and responsibilities of each user and decision-maker, clarify processes and expectations for using the tool within the current system, outline how the tool will measure performance, and identify incentives and rewards for using the tool or demonstrating a change in performance[27].

Discussion

The non-healthcare literature on DS offers a wealth of information that can be used to advance CDS. Characteristics of many decisions are similar between high-level clinical and non-clinical decision-makers, with elements of complexity, high uncertainty, unpredictable adversaries, rapid change, and the potential for cognitive biases. The basic principles that guide decision-making across fields are also worth noting. Both the rational-analytic and naturalistic-intuitive decision-making styles offer benefits and drawbacks, and users should understand how much to trust rational-analytic systems, in addition to being aware of naturalistic-intuitive biases.

This review focused on successes and best practices related to decision-making in non-healthcare disciplines. The literature consistently supports the notion that optimal DS system design depends heavily on situational factors – e.g., the state of the knowledge and sciences, as well as organizational, political, and cultural context. Tools must be developed that solve the right problem – “a knowledge base for bridge design is not useful if it’s unclear a bridge is needed at all”[34]. Although the literature provides common lessons learned about how to design effective DS, there is no single “best” design, as this depends on the needs of users, the nature of the problem, and the system context. The key lessons summarized in Table 2, in particular, should not be viewed as a prescription for how CDS should be designed, but rather used to reinforce the use of similar approaches in CDS and to suggest what else might work. Implementation of CDS systems is also crucial. It must consider how individual factors and work processes motivate proper use, and how to set the right organizational context to support uptake. Although those working in CDS implementation may already be well aware of this, the non-healthcare literature underscores the notion that, no matter what the discipline, good system design is necessary but not sufficient for success.

Complex decisions call for a combination of naturalistic-intuitive and rational-analytic approaches. CDS based on rational-analytic methods, such as artificial intelligence technologies in medicine, must still incorporate intuitive judgments to be useful – a balanced approach. Whether these analytic approaches are embedded into computerized DS systems or not, decision-makers can draw upon the benefits of both approaches by prioritizing strategies that are flexible, adaptive, and robust (i.e., flexible enough to accommodate changes of mission or objective, adaptive to circumstances, and robust to adverse shocks). Having a suitable strategy of this sort is of little help, however, unless the effort is made to gain information as treatment proceeds, and to review and modify the strategy accordingly. Thus, another principle of decision support should be to define a program for monitoring, information collection, review, and adjustment. In a clinical setting, such a program might include: (1) a written plan (even if informal) with anticipatable branches and potential surprise events, either good or bad, noted; (2) follow-up procedures to check on patient outcomes after decision support is used; (3) monitoring the evolution of knowledge, via colleagues and other experts, that may prove relevant to the treatment program over time; (4) scheduled laboratory tests and examinations; and (5) organized big-picture reviews that encourage fresh “rethinking” of the problem and the re-direction of treatment approaches if small adjustments to the current ones are not working. Computerized aids could help with most of these, whether or not they are seen as “decision aids.”
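One hypothetical way a computerized aid could carry such a monitoring program is sketched below in Python; every field name, interval, and trigger is invented for illustration and is not a recommendation from the reviewed literature. The point is simply that a written plan, its anticipatable branches, and its scheduled checks are easy to represent so that due actions can be surfaced rather than remembered.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class PlannedBranch:
    trigger: str   # an anticipatable event, good or bad (e.g., "rash develops")
    response: str  # the pre-planned adjustment (e.g., "switch to alternative agent")

@dataclass
class MonitoringPlan:
    written_plan: str
    branches: List[PlannedBranch] = field(default_factory=list)
    follow_up_days: List[int] = field(default_factory=list)   # outcome checks, days after start
    scheduled_tests: List[str] = field(default_factory=list)  # laboratory tests and examinations
    big_picture_review_day: int = 90                          # prompt a fresh rethinking of the problem

def due_actions(plan: MonitoringPlan, start: date, today: date) -> List[str]:
    """List follow-ups and reviews that have come due, so small signals are not missed."""
    elapsed = (today - start).days
    actions = [f"Follow-up outcome check (day {d})" for d in plan.follow_up_days if d <= elapsed]
    if elapsed >= plan.big_picture_review_day:
        actions.append("Big-picture review: reconsider the overall treatment approach")
    return actions
```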

Limitations

Our interdisciplinary review process offered a novel approach to examining a problem within the healthcare discipline, but interdisciplinary research also poses many known challenges[4]. We targeted a high-priority and relevant subset of a diverse but inconsistent literature, a process that inevitably overlooked some experiences and publications. Although DS has been used in fields such as aviation, emergency management, nuclear energy, agriculture, and environmental planning[12, 13, 33], our quality- and relevance-based selection criteria yielded applicable work from only the disciplines of defense, law, and business, with only 27 of 890 (3.0%) search results being abstracted. This article selection rate is similar to Eom and Kim’s DS system survey, which retained 210 of 5400 (3.9%) articles[33]. Although the number of articles and disciplines selected for full-text extraction was limited, the value of the result should probably be judged by whether the conclusions are helpful in considering clinical DS, rather than by whether some additional conclusions might have been found with an even more exhaustive search.

We did not evaluate the effectiveness of specific DS systems described in the literature, but used the inclusion of a study in a review-type article as an indicator of quality or importance. This approach was necessitated by the challenges of interdisciplinary systematic review: descriptions of study quality, outcomes, and measures of DS success could not be reliably abstracted, as they would be in a typical healthcare systematic review. Also, the work represented in this review is not comprehensive or representative of the universe of DS applications, but rather a sample of recent, available work related to high-level, complex decision-making. For example, a number of studies on, and tools for, strategic planning address issues related to decision making but are not discussed at length due to the focus of this paper[45, 48]. Our study also does not represent the heterogeneity in DS function and purpose, whether clinical or non-clinical, but aims to distill lessons to a sufficiently high level to be valuable for all types of CDS systems. A more extensive review process that included non-review articles and developed a methodology to evaluate the quality of DS “success” models in other fields would be valuable in future CDS research.

Conclusion

Much has been written recently about how to remove barriers to the adoption of CDS as part of a broader national HIT agenda. These efforts would benefit from taking the view that clinical decisions have similarities to high-level decisions in many non-healthcare fields, and that HIT design can benefit from the long history of DS development and implementation outside the healthcare realm. In summary, we found that CDS systems may be better designed to support complex healthcare decisions if they incorporate the following features: providing broad, system-level perspectives; customizing interfaces to users and purposes; making protocols transparent; organizing and presenting data effectively; providing the ability to generate multiple scenarios; and facilitating collaboration where appropriate. Moreover, as systems are only as good as the principles upon which they are built, it is crucial for both CDS users and developers to consider how to apply both rational-analytic and naturalistic-intuitive approaches to complex healthcare decision-making.

Authors’ information

HWW holds a PhD in Policy Analysis from the Pardee RAND Graduate School, which supports an interdisciplinary approach to conducting decision analysis and solving complex problems. She also has an M.S. in Health Evaluation Sciences, with a focus on Health Services Research and Outcomes Evaluation. She has advised on HIT implementation efforts at a major academic medical center.

PKD is a Senior Principal Researcher at RAND and a Professor in the Pardee RAND Graduate School. He has published extensively on concepts, methods, and tools for supporting senior decision makers operating amidst great uncertainty. He has led the development of ways to identify adaptive courses of action that can deal with both favorable and unfavorable developments.

DSB is a Research Scientist at RAND and an Associate Professor in the Department of Medicine at UCLA whose research focuses on the design and implementation of health information technologies. He is principal investigator of the Advancing Clinical Decision Support project. He is also co-director of the Biomedical Informatics Program of the UCLA Clinical and Translational Sciences Institute.

References

  1. Arnott D, Pervan G: Eight key issues for the decision support systems discipline. Decis Support Syst. 2008, 44 (3): 657-672. 10.1016/j.dss.2007.09.003.

  2. Keefer D, Kirkwood C, Corner J: Perspective on decision analysis applications, 1990–2001. Decis Anal. 2004, 1 (1): 4-22. 10.1287/deca.1030.0004.

  3. Hersh W: Health care information technology: progress and barriers. JAMA. 2004, 292 (18): 2273-2274. 10.1001/jama.292.18.2273.

  4. Giacomini M: Interdisciplinarity in health services research: dreams and nightmares, maladies and remedies. J Health Serv Res Policy. 2004, 9 (3): 177-183. 10.1258/1355819041403222.

  5. Reason J: Human error: models and management. BMJ. 2000, 320 (7237): 768-770. 10.1136/bmj.320.7237.768.

  6. Wright A, Sittig DF, Ash JS, Feblowitz J, Meltzer S, McMullen C, Guappone K, Carpenter J, Richardson J, Simonaitis L: Development and evaluation of a comprehensive clinical decision support taxonomy: comparison of front-end tools in commercial and internally developed electronic health record systems. J Am Med Inform Assoc. 2011, 18 (3): 232-242. 10.1136/amiajnl-2011-000113.

  7. Kawamoto K, Houlihan C, Balas E, Lobach D: Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. Br Med J. 2005, 330 (7494): 765-768. 10.1136/bmj.38398.500764.8F.

  8. Mollon B, Chong J, Holbrook A, Sung M, Thabane L, Foster G: Features predicting the success of computerized decision support for prescribing: a systematic review of randomized controlled trials. BMC Medical Informatics and Decision Making. 2009, 9 (1): 11-10.1186/1472-6947-9-11.

  9. Bates D, Kuperman G, Wang S, Gandhi T, Kittler A, Volk L, Spurr C, Khorasani R, Tanasijevic M, Middleton B: Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2003, 10 (6): 523-530. 10.1197/jamia.M1370.

  10. Osheroff JA, Teich JM, Middleton B, Steen EB, Wright A, Detmer DE: A roadmap for national action on clinical decision support. J Am Med Inform Assoc. 2007, 14 (2): 141-10.1197/jamia.M2334.

  11. Gorry GA, Barnett G: Sequential diagnosis by computer. JAMA. 1968, 205 (12): 849-854. 10.1001/jama.1968.03140380053012.

  12. Burstein F, Holsapple CW: Handbook on Decision Support Systems 1: Basic Themes. 2008, Berlin, Germany: Springer

  13. Burstein F, Holsapple CW: Handbook on Decision Support Systems 2: Variations. 2008, Berlin, Germany: Springer

  14. U.S. Department of Health and Human Services - Centers for Medicare & Medicaid Services: Electronic Health Record Incentive Program - Final Rule. Fed Regist. 2010, 4435-

  15. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JPA, Clarke M, Devereaux PJ, Kleijnen J, Moher D: The PRISMA Statement for Reporting Systematic Reviews and Meta-Analyses of Studies That Evaluate Health Care Interventions: Explanation and Elaboration. PLoS Med. 2009, 6 (7): e1000100-10.1371/journal.pmed.1000100.

  16. Evans J, Benefield P: Systematic Reviews of Educational Research: Does the medical model fit?. Br Educ Res J. 2001, 27 (5): 527-541. 10.1080/01411920120095717.

  17. Hibner DH: A Cognitive Assessment of Military Approaches to Understanding. 2008, Fort Leavenworth, Kansas: School of Advanced Military Studies, United States Army Command and General Staff College

  18. Davis PK, Kahan JP: Theory and Methods for Supporting High Level Military Decisionmaking. 2007, Santa Monica, CA: RAND Corporation

  19. North Atlantic Treaty Organisation - Research and Technology Organisation: Decision Support to Combined Joint Task Force and Component Commanders. 2004, Neuilly-Sur-Seine Cedex, France: North Atlantic Treaty Organisation-Research and Technology Organisation

  20. Allen JG, Corpac PS, Frisbie KR: Integrated Battle Command Program: Decision Support Tools for Planning and Conducting Unified Action Campaigns in Complex Contingencies. Command and Control Research and Technology Symposium. 2006, San Diego, CA: Defense Advanced Research Projects Agency

  21. Hutchins SG, Kemple WG, Adamo R, Boger D, Nelson BW, Penta HL: Collaboration Tool Suites Developed to Support Joint Command and Control Planning and Decisionmaking. 2002, Monterey, CA: Naval Postgraduate School

  22. North Atlantic Treaty Organisation-Research and Technology Organization: Tactical Decision Aids and Situational Awareness. 2002, Neuilly-Sur-Seine Cedex, France: North Atlantic Treaty Organisation-Research and Technology Organisation

  23. Cohen MS: A Situation Specific Model of Trust in Decision Aids. Proceedings of Human Performance, Situation Awareness & Automation: User-Centered Design for the New Millennium. 2000, Savannah, GA, 143-148.

  24. Dzindolet MT, Beck HP, Pierce LG: Encouraging Human Operators to Appropriately Rely on Automated Decision Aids. Proceedings of the 2000 Command and Control Research and Technology Symposium. 2000, Monterey, CA

  25. Gordon SC: Decision Support Tools for Warfighters. 2000, Orlando, FL: Air Force Agency for Modeling and Simulation

  26. Hunton JE, Rose JM: 21st century auditing: advancing decision support systems to achieve continuous auditing. Accounting Horizons. 2010, 24 (2): 297-312. 10.2308/acch.2010.24.2.297.

  27. Olavson T, Fry C: Spreadsheet decision-support tools: lessons learned at Hewlett-Packard. Interfaces. 2008, 38 (4): 300-310. 10.1287/inte.1080.0368.

  28. Nissan E: Select topics in legal evidence and assistance by artificial intelligence techniques. Cybern Syst. 2008, 39 (4): 333-394. 10.1080/01969720802039537.

  29. Richards D: Two decades of ripple down rules research. Knowl Eng Rev. 2009, 24 (2): 159-184. 10.1017/S0269888909000241.

  30. He J, King WR: The role of user participation in information systems development: implications from a meta-analysis. J Manag Inf Syst. 2008, 25 (1): 301-331.

  31. Chen MY, Chen AP: Knowledge management performance evaluation: a decade review from 1995 to 2004. J Inf Sci. 2006, 32 (1): 17-38. 10.1177/0165551506059220.

  32. Power DJ: Understanding data-driven decision support systems. Inf Syst Manag. 2008, 25 (2): 149-154. 10.1080/10580530801941124.

  33. Eom S, Kim E: A survey of decision support system applications (1995–2001). J Oper Res Soc. 2006, 57 (11): 1264-1278. 10.1057/palgrave.jors.2602140.

  34. Mackenzie A, Pidd M, Rooksby J, Sommerville I, Warren I, Westcombe M: Wisdom, decision support and paradigms of decision making. Eur J Oper Res. 2006, 170 (1): 156-171. 10.1016/j.ejor.2004.07.041.

  35. Metaxiotis K, Askounis DT, Nikolopoulos K: Identifying the characteristics of successful expert systems: an empirical evaluation. Int J Inf Technol Manag. 2006, 5 (1): 1-19.

  36. Duana Y, Edwards JS, Xu MX: Web-based expert systems: benefits and challenges. Inf Manag. 2005, 42 (6): 799-811. 10.1016/j.im.2004.08.005.

  37. Mathieson K: Towards a design science of ethical decision support. Journal of Business Ethics. 2007, 76 (3): 269-292. 10.1007/s10551-006-9281-4.

  38. Bhargava HK, Power DJ, Sun D: Progress in web-based decision support technologies. Decis Support Syst. 2007, 43 (4): 1083-1095. 10.1016/j.dss.2005.07.002.

  39. Power DJ, Sharda R: Model-driven decision support systems: concepts and research directions. Decis Support Syst. 2007, 43 (3): 1044-1061. 10.1016/j.dss.2005.05.030.

  40. Lusk EJ: Email: its decision support systems inroads - an update. Decis Support Syst. 2006, 42 (1): 328-332. 10.1016/j.dss.2005.01.001.

  41. Arnott D: Cognitive biases and decision support systems development: a design science approach. Inf Syst J. 2006, 16 (1): 55-78. 10.1111/j.1365-2575.2006.00208.x.

  42. Chae B, Paradice D, Courtney JF, Cagle CJ: Incorporating an ethical perspective into problem formulation: implications for decision support systems design. Decis Support Syst. 2005, 40 (2): 197-212. 10.1016/j.dss.2004.02.002.

  43. Workman M: Expert decision support system use, disuse, and misuse: a study using the theory of planned behavior. Comput Hum Behav. 2005, 21 (2): 211-231. 10.1016/j.chb.2004.03.011.

  44. Davis PK, Kulick J, Egner M: Implications of Modern Decision Science for Military Decision-Support Systems. 2005, Santa Monica: RAND

  45. Davis PK, Dreyer P: RAND’s Portfolio Analysis Tool. 2009, Santa Monica, CA: RAND

  46. Davis PK, Shaver RD, Beck J: Portfolio Analysis Methods for Evaluating Capability Options. 2008, Santa Monica, CA: RAND

  47. Boehm B, Turner R: Balancing Agility and Discipline: a Guide for the Perplexed. 2004, Boston, MA: Pearson Education

  48. Lempert RJ, Groves DG, Popper SW, Bankes SC: A General, Analytic Method for Generating Robust Strategies and Narrative Scenarios. Manag Sci. 2006, 52 (4): 514-528. 10.1287/mnsc.1050.0472.

  49. Michalowski W, Slowinski R, Wilk S, Farion K, Pike J, Rubin S: Design and development of a mobile system for supporting emergency triage. Methods Inf Med. 2005, 44 (1): 14-24.

  50. Sittig D, Gardner R, Morris A, Wallace C: Clinical evaluation of computer-based respiratory care algorithms. Int J Clin Monit Comput. 1990, 7 (3): 177-185. 10.1007/BF02915583.

  51. Ravdin P, Siminoff L, Davis G, Mercer M, Hewlett J, Gerson N, Parker H: Computer program to assist in making decisions about adjuvant therapy for women with early breast cancer. J Clin Oncol. 2001, 19 (4): 980-991.

  52. Gigerenzer G, Edwards A: Simple tools for understanding risks: from innumeracy to insight. BMJ. 2003, 327 (7417): 741-744. 10.1136/bmj.327.7417.741.

  53. Buchanan L, O'Connell A: A brief history of decision making. Harv Bus Rev. 2006, 84 (1): 32-39.

  54. Tetlock P: Expert political judgment: How good is it? How can we know?. 2005, Princeton, New Jersey: Princeton University Press

Acknowledgements

The authors would like to thank Roberta Shanman for expert literature searching assistance and Drs. Gordon Schiff and Blackford Middleton of Harvard Medical School for their critical review of the manuscript. This work was funded by a task order entitled “Advancing Clinical Decision Support” from the Office of the National Coordinator for Health IT, U.S. Department of Health and Human Services Contract # HHSP23320095649WC, Task order HHSP23337009T.

Author information

Corresponding author

Correspondence to Douglas S Bell.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors read and approved the final manuscript. HW performed the literature review, synthesized results, and drafted the manuscript. DB conceived of the study and contributed to the manuscript. PD provided specialized content expertise, participated in the literature review, and contributed to the manuscript.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Wu, H.W., Davis, P.K. & Bell, D.S. Advancing clinical decision support using lessons from outside of healthcare: an interdisciplinary systematic review. BMC Med Inform Decis Mak 12, 90 (2012). https://doi.org/10.1186/1472-6947-12-90
