
Study protocol

Measuring the impact of a health information exchange intervention on provider-based notifiable disease reporting using mixed methods: a study protocol

Brian E Dixon123, Shaun J Grannis124 and Debra Revere5*

Author Affiliations

1 Indiana University, School of Informatics and Computing, Indianapolis, IN, USA

2 Regenstrief Institute, Center for Biomedical Informatics, Indianapolis, IN, USA

3 Department of Veterans Affairs, Health Services Research & Development Service, Center for Implementing Evidence-Based Practice, Indianapolis, IN, USA

4 Indiana University, School of Medicine, Indianapolis, IN, USA

5 University of Washington, School of Public Health, Seattle, WA, USA

BMC Medical Informatics and Decision Making 2013, 13:121  doi:10.1186/1472-6947-13-121


The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1472-6947/13/121


Received: 6 September 2013
Accepted: 29 October 2013
Published: 30 October 2013

© 2013 Dixon et al.; licensee BioMed Central Ltd.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Abstract

Background

Health information exchange (HIE) is the electronic sharing of data and information between clinical care and public health entities. Previous research has shown that using HIE to electronically report laboratory results to public health can improve surveillance practice, yet there has been little utilization of HIE for improving provider-based disease reporting. This article describes a study protocol that uses mixed methods to evaluate an intervention to electronically pre-populate provider-based notifiable disease case reporting forms with clinical, laboratory and patient data available through an operational HIE. The evaluation seeks to: (1) identify barriers and facilitators to implementation, adoption and utilization of the intervention; (2) measure impacts on workflow, provider awareness, and end-user satisfaction; and (3) describe the contextual factors that impact the effectiveness of the intervention within heterogeneous clinical settings and the HIE.

Methods/Design

The intervention will be implemented over a staggered schedule in one of the largest and oldest HIE infrastructures in the U.S., the Indiana Network for Patient Care. Evaluation will be conducted utilizing a concurrent design mixed methods framework in which qualitative methods are embedded within the quantitative methods. Quantitative data will include reporting rates, timeliness and burden and report completeness and accuracy, analyzed using interrupted time-series and other pre-post comparisons. Qualitative data regarding pre-post provider perceptions of report completeness, accuracy, and timeliness, reporting burden, data quality, benefits, utility, adoption, utilization and impact on reporting workflow will be collected using semi-structured interviews and open-ended survey items. Data will be triangulated to find convergence or agreement by cross-validating results to produce a contextualized portrayal of the facilitators and barriers to implementation and use of the intervention.

Discussion

By applying mixed research methods and measuring context, facilitators and barriers, and individual, organizational and data quality factors that may impact adoption and utilization of the intervention, we will document whether and how the intervention streamlines provider-based manual reporting workflows, lowers barriers to reporting, increases data completeness, improves reporting timeliness and captures a greater portion of communicable disease burden in the community.

Keywords:
Evaluation; Health information exchange; Mixed methods; Public health reporting

Background

Health information exchange (HIE) is the capacity to electronically transfer and share health information among health care-related stakeholders and organizations such as clinics, laboratories, payers, hospitals, pharmacies and public health [1]. HIEs promise to reduce health care costs [2], improve patient safety [3], and provide access to more timely surveillance data for public health organizations [4]; however, the success of HIE implementations and their system improvements can depend on the context in which data flow and clinical care workflow are integrated. For example, an investigation of one HIE’s impact on emergency department (ED) charges reported a reduction of approximately $26 per encounter (p = 0.03) at one hospital with no effect on charges at a second independent hospital in the same HIE [5]. Similarly, in a comparison of HIE utilization among municipalities, Maenpaa et al. (2012) reported a difference in system usage between specialized and primary care providers depending on the numbers of ED visits, laboratory tests, radiology exams and appointments [6]. An investigation of HIE usage in EDs and ambulatory clinics also found that system utilization varied by site, patient population and characteristics, specialization of services, and site policies governing use and administrative access [7]. Studies such as these suggest that outcomes resulting from HIE interactions, access and interventions can be context-sensitive.

An unexplored area is the reporting, within an HIE, of communicable and infectious diseases such as pertussis, tuberculosis, salmonella, Chlamydia and Hepatitis C to public health. Most states utilize a dual reporting structure: mandatory case reporting of a disease by providers and mandatory reporting of test results by laboratories to public health authorities. Conventional provider-initiated paper-based reports, transmitted through fax and mail, have been shown to be incomplete, error-prone and untimely. Report completeness ranges from 9 to 99 percent [8]. Highly prevalent diseases like sexually transmitted infections are reported approximately 79 percent of the time, and many diseases like pertussis and Lyme disease are reported less than 50 percent of the time [8]. Reporting timeliness ranges from one day to three weeks after diagnosis, depending on the disease [9]. In addition, provider reports often lack demographic details that public health workers need, requiring them to perform follow-up calls to obtain this additional information [10].

In comparison to conventional paper-based reporting, electronic laboratory reporting (ELR) has been successful in delivering more timely laboratory test results to public health [11] and increasing the proportion of notifiable disease reports that reach public health [11,12]. However, implementation of ELR can strain or exceed local investigative capacity by significantly increasing the volume of cases reported to public health agencies [13,14] and, like provider reports, ELR can lack clinical and treatment details, such as complete patient demographics, vital signs, pregnancy status or prescribed drugs, needed to fully characterize or prioritize a case for public health purposes [15,16].

Integration of ELR into electronic health record (EHR) systems and HIE networks—i.e., an ELR-EHR-HIE infrastructure—has been shown to connect clinical and public health stakeholders without interrupting existing workflows or adding burden to clinical providers [17]. Given that many states have (or will have in the near future) an infrastructure supporting ELR and HIE [18] and EHRs contain clinical and treatment data often missing from ELR, there is potential to use an integrated ELR-EHR-HIE infrastructure to improve provider-based notifiable disease reporting beyond existing improvements to lab-based reporting.

The “Improving Population Health through Enhanced Targeted Regional Decision Support” study aims to leverage an available ELR-EHR-HIE infrastructure to electronically pre-populate provider-submitted notifiable disease report forms with available clinical, lab and patient data. This intervention has the potential to streamline provider-based reporting workflows, lower barriers to reporting, increase data completeness, improve reporting timeliness and capture a greater portion of communicable disease burden in the community.

We hypothesize clinics that implement the intervention will effectively incorporate the pre-populated reporting form into their workflow, providers will report high satisfaction with the intervention, and that, compared to clinics in which the intervention is not implemented, barriers to reporting will be reduced, timeliness of reporting and completeness of report data fields will improve, and need for providers to provide supplemental or corrected data to public health will be reduced. The implementation of the intervention will be evaluated using mixed methods. The study protocol described will evaluate the implementation of the pre-populated form intervention in the context of an operational HIE.

Methods/Design

The evaluation-specific objectives are to assess the implementation of the reporting form intervention at the clinic level to: (1) identify barriers and facilitators to implementation, adoption and utilization of the pre-populated reporting form; (2) measure impacts of the tool on workflow, provider awareness, and end-user satisfaction; and (3) describe the contextual factors that impact the effectiveness of the intervention within heterogeneous clinical settings and the HIE.

Setting

The intervention will be implemented within one of the largest and oldest HIE infrastructures in the U.S., the Indiana Network for Patient Care (INPC). The INPC is anchored by the Regenstrief Medical Record System which collects data from a variety of sources, including hospitals, clinics, pharmacies, and laboratories [19]. Lab results are routinely delivered to ordering physicians using the Regenstrief DOCS4DOCS® software and analyzed as ELR transactions by the Notifiable Condition Detector (NCD) which identifies mandated reportable diseases and notifies local and state public health departments of these laboratory results [20].
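To make the NCD’s role concrete, the following is a minimal illustrative sketch, not the actual NCD implementation, of how incoming ELR transactions might be screened against a list of notifiable conditions. The condition codes, result values and message fields shown here are hypothetical.

```python
# Hypothetical sketch of notifiable-condition screening over ELR messages.
# The codes, result strings and message structure are illustrative only;
# they do not reflect the actual Notifiable Condition Detector logic.

NOTIFIABLE = {
    ("5009-6", "POSITIVE"): "Hepatitis B",   # hypothetical code/result pairs
    ("548-8", "POSITIVE"): "Chlamydia",
}

def screen_elr(messages):
    """Return (patient_id, condition) pairs for results matching the list."""
    hits = []
    for msg in messages:
        key = (msg["loinc"], msg["result"].upper())
        if key in NOTIFIABLE:
            hits.append((msg["patient_id"], NOTIFIABLE[key]))
    return hits

sample = [
    {"patient_id": "p1", "loinc": "5009-6", "result": "positive"},
    {"patient_id": "p2", "loinc": "2345-7", "result": "normal"},
]
print(screen_elr(sample))  # only the matching result is flagged
```

In the operational system, flagged results trigger notifications to local and state public health departments; here the sketch simply returns the matches.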

Participants

The study will include outpatient primary care practices operating in urban and rural settings that are part of the INPC, representing multiple health systems and clinics. Clinics that do not use the DOCS4DOCS® software will be excluded.

Intervention

Two technical interventions will be deployed over a staggered schedule at participating clinics: 1) “standard” pre-populated forms and 2) “enhanced” pre-populated forms. The “standard” forms intervention will use EHR (patient demographics and clinic information) and ELR (notifiable disease test results) data available in the HIE to pre-populate and deliver an electronic version of the official state notifiable disease reporting form to the provider. Providers will be able to review the pre-populated form, add any additional information, and fax completed forms to their local health department. The “enhanced” forms intervention will pre-populate an alternative reporting form with an expanded set of data available in the HIE. For example, the “enhanced” form will not only include test results data for a case of hepatitis B but also corollary results on the patient’s liver enzymes; this is information the health department typically requests from the provider in a follow-up phone call when investigating the reported case of hepatitis. Providers will still be able to review the pre-populated “enhanced” form, add any additional information, and fax completed forms to their local health department. Since deployment is staggered, at any point in time the non-intervention sites can act as natural controls for the intervention sites without the selection bias that is generally present in non-randomized experiments. Therefore, the study protocol is theoretically comparable to a traditional randomized controlled experiment in its ability to generate causal evidence.

Research questions

Mixed methods studies require that research questions be linked to and drive the data collection and analysis methods, as well as inform the study design, sample size, sampling, instruments developed and administered, and data analysis techniques [21]. Our primary research questions are:

1. What individual, organizational and data quality factors may act as barriers or facilitators to the successful adoption and utilization of pre-populated reporting forms and enhanced data transaction processing to public health; and

2. What is the relationship of these barriers and facilitators to fostering improvements in provider-based population health reporting workflows, lowering barriers to reporting and case follow-up, increasing data completeness, and enabling greater capture of communicable disease burden in the community?

Data collection

Using a concurrent mixed methods design, data collection will be conducted during the three project phases: baseline or pre-implementation; post-implementation of the standard form; and post-implementation of the enhanced form. In each phase, qualitative and quantitative data are collected in tandem as coordinated but independent studies. This design will allow us to triangulate the quantitative results from surveys, time-series, and data quality measures with qualitative interview and open-ended survey results to understand experiences with public health reporting before and after each form implementation. Table 1 summarizes the categories of data collected.

Table 1. Summary of study constructs, data collection, analysis approaches and outcomes measurements by method (qualitative, quantitative)

Quantitative data collection

The following data will be collected at baseline (retrospectively for the 12 months prior to introduction of the intervention) and at 6-, 9-, and 12-months after implementation of both form interventions: reporting rates (the number of reports for individual diseases and in aggregate submitted to public health daily); report data completeness (completeness of fields) and accuracy (errors); reporting timeliness (length of time between the laboratory test date and report submission to public health); treatment timeliness (length of time between the laboratory test date and treatment); and reporting burden, defined as communication volume (number of phone calls or fax communications between public health and clinics/providers or laboratories) and duration (total number of minutes), measured at the level of individual phone calls.
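As an illustration of how these measures can be derived from report records, the following sketch computes per-report reporting timeliness and field completeness. The record structure and field names are assumptions for illustration, not the study’s actual data schema.

```python
from datetime import date

# Illustrative computation of two of the study's quantitative measures from
# hypothetical report records; the schema below is an assumption, not the
# study's actual data model.

reports = [
    {"lab_test": date(2013, 6, 1), "report_submitted": date(2013, 6, 4),
     "fields": {"name": "A", "dob": "1970-01-01", "phone": None, "address": "x"}},
    {"lab_test": date(2013, 6, 2), "report_submitted": date(2013, 6, 3),
     "fields": {"name": "B", "dob": None, "phone": None, "address": None}},
]

def timeliness_days(r):
    """Reporting timeliness: days from laboratory test to report submission."""
    return (r["report_submitted"] - r["lab_test"]).days

def completeness(r):
    """Report data completeness: fraction of report fields populated."""
    f = r["fields"]
    return sum(v is not None for v in f.values()) / len(f)

print([timeliness_days(r) for r in reports])   # days elapsed per report
print([completeness(r) for r in reports])      # fraction of fields filled
```

Aggregating such per-report values over time yields the daily reporting-rate and timeliness series used in the pre-post comparisons.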

Qualitative data collection

Semi-structured interviews conducted with representative clinical and public health workers will collect qualitative data regarding provider and public health perceptions of completeness, accuracy, timeliness and burden associated with notifiable disease reporting prior to and after each form intervention. In addition, perceptions regarding data quality, benefits, utility, adoption, utilization and impact on reporting workflow prior to and after the intervention will be collected at baseline and at 12 months after implementation of each form intervention. Interviews will be digitally audio-taped and transcribed. Public health practitioner input regarding enhancements and supplemental data preferences for the enhanced report form will be captured by convening focus groups to establish consensus on desirable data elements.

Data analysis

Data analysis methods will be more thoroughly reported in the methods sections of future publications presenting the findings of specific analyses.

Data analysis (quantitative)

Quantitative analysis will provide measurable evidence of the impacts of the intervention and enable us to establish likely cause and effect. Longitudinal effects of the interventions will be evaluated in aggregate and across covariates on reporting rates, reporting timeliness and treatment timeliness using an interrupted time-series design to test for the intervention effect and the time trend post-intervention. This approach has been used in other decision support and time-series evaluations [22] and has been recommended for multiple baseline time-series designs that involve two or more sites that are repeatedly assessed while an intervention is introduced into one site at a time [23]. Since the effects of the interventions may vary across clinics, stratified regression analyses will also be performed. Following comparative assessment of reporting completeness, accuracy and communication burden, these data will be added to the regression model. Assessment of data completeness, timeliness, and accuracy will involve pre-post comparison of data quality metrics [15].
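The interrupted time-series approach described above is commonly operationalized as a segmented regression with terms for baseline level, baseline trend, post-intervention level change, and post-intervention slope change. The sketch below fits such a model to a synthetic monthly series via ordinary least squares; the data and parameter values are illustrative assumptions, not study results.

```python
import numpy as np

# Segmented-regression sketch for an interrupted time series:
#   y = b0 + b1*t + b2*post + b3*(t - t0)*post
# where `post` flags observations after the intervention at time t0.
# The series below is synthetic and noise-free, purely for illustration.

t = np.arange(24)                  # 24 monthly observations
t0 = 12                            # intervention introduced at month 12
post = (t >= t0).astype(float)
y = 10 + 0.2 * t + 4 * post + 0.5 * (t - t0) * post

# Design matrix: intercept, baseline trend, level change, slope change.
X = np.column_stack([np.ones_like(t, dtype=float), t, post, (t - t0) * post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3 = coef
print(round(b2, 2), round(b3, 2))  # estimated level change and slope change
```

With noise-free data the fit recovers the generating level change (4) and slope change (0.5); with real reporting data, standard errors and autocorrelation adjustments would of course be needed.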

Data analysis (qualitative)

Qualitative analysis will provide in-depth context regarding facilitators and barriers to implementation, adoption, benefits and use of the intervention; identify and describe their impacts on individual and organizational HIE processes; and examine the broad range of interconnected processes or causes at play regarding data quality. Analyses will be conducted using qualitative software by experienced coders using the constant comparative method of analysis [24] and utilizing standard approaches to ensure credibility, consistency and robustness of the findings. Transcribed interviews and open-ended survey items will undergo a series of well-established steps to identify emerging themes and trends and, ultimately, build a model that describes the intervention phenomenon in conceptual form [25]. The process will begin with a coding scheme that combines concepts derived a priori from the conceptual frameworks driving the study with concepts identified inductively as the analysis proceeds. Content will be grouped into nodes, a codebook will be built, and codes or code combinations will be summarized and stratified by contextual factors such as demographics, respondent role, etc. These summaries will be entered into appropriate data displays that specify interactions between the intervention, its context and its effects as preparation for triangulation.

Triangulation of quantitative and qualitative data

Data will be triangulated to find convergence or agreement by cross-validating results. The first step will be exploratory, to determine the most appropriate “transformation” of the data: conversion of quantitative data into narrative data (“qualitized”) or conversion of qualitative data into numerical codes (“quantitized”) that can be represented statistically. There are several approaches to quantifying qualitative data, including enumerating the frequency of themes within a sample, the percentage of themes associated with a given category of respondent, or the percentage of people selecting specific themes. In these approaches the quantified data can be statistically compared to quantitative data collected concurrently but separately. Another strategy for quantifying qualitative data enumerates whether or not qualitative responses included certain codes—i.e., rather than counting how many times a certain code was provided by each participant or the frequency with which codes appeared, the presence or absence of each code for each participant is recorded as a dichotomous variable (0 = absent, 1 = present). We will determine the most appropriate approach after reviewing the descriptive analysis. Depending on which approach is used, this mixed data will be correlated (quantitative data correlated with qualitized data or vice versa) [26]. Guided by the nature of the data collected, quantitative and qualitative data may be collated to create new or consolidated variables or data sets in order to further compare and integrate data.
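The dichotomous “quantitizing” strategy can be sketched as follows: for each participant, record 1 or 0 for the presence or absence of each coded theme, producing a matrix that can be correlated with the quantitative measures. Participant identifiers and code names here are hypothetical.

```python
# Sketch of quantitizing qualitative codes as dichotomous (0/1) variables.
# Participant IDs and code names are hypothetical illustrations.

coded_responses = {
    "p1": ["workflow_burden", "data_quality"],
    "p2": ["data_quality"],
    "p3": [],
}
codes = ["workflow_burden", "data_quality", "training_needs"]

# 1 if the code was assigned to the participant's responses, else 0.
matrix = {
    pid: {c: int(c in assigned) for c in codes}
    for pid, assigned in coded_responses.items()
}
print(matrix["p1"])  # presence/absence indicators for participant p1
```

Each row of such a matrix becomes a set of dichotomous variables that can be stratified by respondent role or site and correlated against the concurrently collected quantitative data.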

Once transformed, we will calculate simple correlations, stratify codes by provider type, demographics, geographic location or organization attributes or look at similarity among respondents. This process will allow us to identify areas in which findings agree (convergence), contradict (discrepant or dissonant) or deepen understanding on the same issue (complementarity) [27]. We anticipate that the results will allow us to identify, analyze and explain social, behavioral and environmental similarities and differences between HIE settings, provider types and perceptions across pre- and post-implementation time units that will describe identified barriers and facilitators to the intervention.

Limitations and biases

There are several limitations to our proposed work. First, some clinical sites may be more open to recruitment and enrollment in the study and thus introduce a bias in our sample. For example, some sites may serve patient populations that are more likely to require notifiable disease reporting (for example, a women’s clinic with a high Chlamydia reporting history) and thus be more incentivized to participate. It is possible, given use of an interrupted time-series design, that confounding may occur due to covariates that change over time. The INPC is a growing HIE, so it is possible that policy, governance, legal mandates or information technology changes could occur during the course of this study that impact data collection or introduce additional confounding. This HIE includes an academic affiliation and medical training program, so providers may rotate through training sites at different stages of the intervention, which may introduce confounding. Introduction and adoption of the intervention may be more rapid in some settings (small ambulatory clinic) than others (large hospital), as the workplace may accommodate or adjust to the intervention more easily or require administrative protocols to support the intervention. It is also possible that the staggered implementation of the intervention may complicate quality control of data collection. Also, because data collection covers only one baseline year and a maximum of two years post-intervention, depending on site, this short time frame may limit the ability of the analysis to account for seasonal trends in reporting. Our findings may also be limited by the context in which this study is conducted. The INPC is one of the most advanced HIEs in the country; therefore, our results may have limited generalizability. However, we believe that, by emphasizing context and clearly documenting the characteristics of the sites in which the intervention is implemented, our findings could inform implementation in other HIE settings.
Given current mandates to build systems and infrastructures for ELR and HIE in communities where they are absent, to expand efforts in communities where ELR is already occurring, and the requirement for eligible hospitals to routinely transmit reportable laboratory results to a public health agency, our findings may provide insights and inform roadmaps for more nascent HIEs to move toward better notifiable disease surveillance.

Ethical considerations

The project received approval from the Institutional Review Board of Indiana University, with a concurrent Institutional Review Board deferral from the University of Washington to Indiana University. Informed consent will be obtained from all participants and confidentiality will be ensured. All data will be stored according to the rules of the research ethics committee. Because this study does not meet the ICMJE definition of a clinical trial, it has not been registered in a publicly accessible registry.

Discussion

Context is recognized as a critical element for understanding the effects of intervention components individually and in combination [28]; appropriately generalizing findings across multiple sites [29]; and accelerating the process of translating research into practice [30]. Context includes the characteristics of an organization and its environment, e.g. size, organizational structure, personnel, training, governance, financial factors, leadership, legal mandates, communication flows, which can influence the implementation and effectiveness of an intervention. However, the use of conventional quantitative research methods alone often limits detailed understanding of the phenomena under study, ignores the context of the problem or intervention, minimizes the impact of the changing nature of the study subject and/or its context, and excludes outliers in preference to generating implications from the mean [31]. Sensitivity to context and “context-heterogeneity”—i.e., how differences in context change, shape and are shaped by the phenomena under study—is one of the values of qualitative methods.

Mixed or multi-methods research, which integrates quantitative and qualitative data collection and analysis in a single study or a program of inquiry, provides methodologies for measuring the effects of an intervention within complex, growing and evolving contexts while also exploring the context of the intervention itself [32,33]. In the HIE setting, employing mixed methods could contribute to better understanding of the contextual issues that influence the potential quality, safety and efficiency outcomes associated with HIE implementations [34].

Combining quantitative and qualitative methods in our evaluation of a complex HIE intervention has two potential benefits: complementarity and multiplicativeness, i.e., the methods may illuminate different aspects of the intervention impacts and the combination of the two methods may provide greater insight than either method individually [35]. By measuring context, facilitators and barriers, and individual, organizational and data quality factors that may impact adoption and utilization of the intervention, we will document whether and how the intervention streamlines provider-based manual reporting workflows, lowers barriers to reporting, increases data completeness, improves reporting timeliness and captures a greater portion of communicable disease burden in the community. We believe that our approach demonstrates a multi-faceted and integrative research “attitude” towards the problems we are concerned with, the insights we are seeking, the comprehensive ways in which we desire to generate knowledge, and the creative and scientific curiosity required to rigorously tackle an interdisciplinary investigation.

Conclusion

HIEs are rapidly expanding in the U.S., fueled by national policies that are investing in the integration of ELR, EHR, and other information systems. Yet only one-third of public health departments are engaged in local efforts to electronically exchange data between clinical and public health [36]. Our study not only presents the opportunity to study the evolving public health infrastructure with integration of ELR-EHR-HIE technologies but also the use of mixed methods to better understand the context in which HIE and HIE-based interventions are implemented. To our knowledge, this approach is unique as we were unable to identify previous articles that detail a systematic and rigorous application of mixed research methods to the evaluation of HIE. By testing the feasibility of using mixed methods to study a complex informatics intervention in a complex health care setting, we believe our approach extends prior work in informatics that calls for incremental evaluation of maturing HIE models [34,37]. Similar studies of HIE and public health informatics interventions should consider leveraging the combination of quantitative and qualitative methods to more fully explore, observe, and document the impact of interventions on population outcomes as well as information system adoption and use.

Abbreviations

EHR: Electronic health record; ELR: Electronic laboratory reporting; ED: Emergency department; HIE: Health information exchange; INPC: Indiana Network for Patient Care.

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

SG, DR and BD conceived of the study. All authors were involved in study design. DR and BD wrote the draft manuscript. All authors reviewed and participated in revisions of this manuscript. All authors read and approved the final manuscript.

Acknowledgments

The “Improving Population Health through Enhanced Targeted Regional Decision Support” research project is a collaboration between Indiana University, Regenstrief Institute and the University of Washington. The authors wish to acknowledge the project team: Janet Arno, Roland Gamache, Joseph Gibson, Rebecca Hills, Uzay Kirbiyik, Patrick Lai, Melissa McMaster, and Jennifer Williams. This project was supported by grant number R01HS020909 from the Agency for Healthcare Research and Quality (AHRQ). The content is solely the responsibility of the authors and does not necessarily represent the official views of AHRQ or the Department of Veterans Affairs.

References

  1. AHRQ:

    Health information exchange. 2009.

    http://www.healthit.ahrq.gov/hie#Background3 webcite

    OpenURL

  2. Walker J, Pan E, Johnston D, Adler-Milstein J, Bates DW, Middleton B: The value of health care information exchange and interoperability.

    Health Aff (Millwood) 2005, Suppl Web Exclusives:W5-10-W5-18. OpenURL

  3. Kaelber DC, Bates DW: Health information exchange and patient safety.

    J Biomed Inform 2007, 40(6 Suppl):S40-S45. PubMed Abstract | Publisher Full Text OpenURL

  4. Smith PF, Hadler JL, Stanbury M, Rolfs RT, Hopkins RS: “Blueprint version 2.0”: updating public health surveillance for the 21st century.

    J Public Health Manag Pract 2013, 19:231-239. PubMed Abstract | Publisher Full Text OpenURL

  5. Overhage JM, Dexter PR, Perkins SM, Cordell WH, McGoff J, McGrath R, McDonald CJ: A randomized, controlled trial of clinical information shared from another institution. Ann Emerg Med 2002, 39:14-23.

  6. Maenpaa T, Asikainen P, Gissler M, Siponen K, Maass M, Saranto K, Suominen T: The utilization rate of the regional health information exchange: how it impacts on health care delivery outcomes. J Public Health Manag Pract 2012, 18:215-223.

  7. Johnson KB, Unertl KM, Chen Q, Lorenzi NM, Nian H, Bailey J, Frisse M: Health information exchange usage in emergency departments and clinics: the who, what, and why. J Am Med Inform Assoc 2011, 18:690-697.

  8. Doyle TJ, Glynn MK, Groseclose SL: Completeness of notifiable infectious disease reporting in the United States: an analytical literature review. Am J Epidemiol 2002, 155:866-874.

  9. Jajosky RA, Groseclose SL: Evaluation of reporting timeliness of public health surveillance systems for infectious diseases. BMC Public Health 2004, 4:29.

  10. Sickbert-Bennett EE, Weber DJ, Poole C, MacDonald PD, Maillard JM: Completeness of communicable disease reporting, North Carolina, USA, 1995–1997 and 2000–2006. Emerg Infect Dis 2011, 17:23-29.

  11. Overhage JM, Grannis S, McDonald CJ: A comparison of the completeness and timeliness of automated electronic laboratory reporting and spontaneous reporting of notifiable conditions. Am J Public Health 2008, 98:344-350.

  12. Nguyen TQ, Thorpe L, Makki HA, Mostashari F: Benefits and barriers to electronic laboratory results reporting for notifiable diseases: the New York City department of health and mental hygiene experience. Am J Public Health 2007, 97(Suppl 1):S142-S145.

  13. Kite-Powell A, Hamilton JJ, Hopkins RS, DePasquale JM: Potential effects of electronic laboratory reporting on improving timeliness of infectious disease notification—Florida, 2002–2006. MMWR Morb Mortal Wkly Rep 2008, 57:1325-1328.

  14. McHugh LA, Semple S, Sorhage FE, Tan CG, Langer AJ: Effect of electronic laboratory reporting on the burden of Lyme disease surveillance—New Jersey, 2001–2006. MMWR Morb Mortal Wkly Rep 2008, 57:42-45.

  15. Dixon BE, McGowan JJ, Grannis SJ: Electronic laboratory data quality and the value of a health information exchange to support public health reporting processes. AMIA Annu Symp Proc 2011, 2011:322-330.

  16. Lazarus R, Klompas M, Campion FX, McNabb SJ, Hou X, Daniel J, Haney G, DeMaria A, Lenert L, Platt R: Electronic support for public health: validated case finding and reporting for notifiable diseases using electronic medical data. J Am Med Inform Assoc 2009, 16:18-24.

  17. Klompas M, McVetta J, Lazarus R, Eggleston E, Haney G, Kruskal BA, Yih WK, Daly P, Oppedisano P, Beagan B, Lee M, Kirby C, Heisey-Grove D, DeMaria A Jr, Platt R: Integrating clinical practice and public health surveillance using electronic medical record systems. Am J Public Health 2012, 102(Suppl 3):S325-S332.

  18. Grannis SJ, Stevens K, Merriwether R: Leveraging health information exchange to support public health situational awareness: the Indiana experience. Online J Public Health Inform 2010, 2:3213.

  19. McDonald CJ, Overhage JM, Barnes M, Schadow G, Blevins L, Dexter PR, Mamlin B, INPC Management Committee: The Indiana network for patient care: a working local health information infrastructure. An example of a working infrastructure collaboration that links data from five health systems and hundreds of millions of entries. Health Aff (Millwood) 2005, 24:1214-1220.

  20. Fidahussein M, Friedlin J, Grannis S: Practical challenges in the secondary use of real-world data: the notifiable condition detector. AMIA Annu Symp Proc 2011, 2011:402-408.

  21. Onwuegbuzie AJ, Leech NL: Linking research questions to mixed methods data analysis procedures. Qual Report 2006, 11:474-498.

  22. Goldberg HI, Neighbor WE, Cheadle AD, Ramsey SD, Diehr P, Gore E: A controlled time-series trial of clinical reminders: using computerized firm systems to make quality improvement research a routine part of mainstream practice. Health Serv Res 2000, 34:1519-1534.

  23. Biglan A, Ary D, Wagenaar AC: The value of interrupted time-series experiments for community intervention research. Prev Sci 2000, 1:31-49.

  24. Strauss A, Corbin J: Basics of qualitative research. London: Sage; 1998.

  25. Reeder B, Revere D, Hills RA, Baseman JG, Lober WB: Public health practice within a health information exchange: information needs and barriers to disease surveillance. Online J Public Health Inform 2012, 4:4277.

  26. Tashakkori A, Teddlie C: Handbook on mixed methods in the behavioral and social sciences. Thousand Oaks CA: Sage; 2003.

  27. O’Cathain A, Murphy D, Nicholl J: Three techniques for integrating data in mixed methods studies. BMJ 2010, 341:1147-1150.

  28. Bonell C, Fletcher A, Morton M, Lorenc T, Moore L: Realist randomised controlled trials: a new approach to evaluating complex public health interventions. Soc Sci Med 2012, 75:2299-2306.

  29. Ovretveit JC, Shekelle PG, Dy SM, McDonald KM, Hempel S, Pronovost P, Rubenstein L, Taylor SL, Foy R, Wachter RM: How does context affect interventions to improve patient safety? An assessment of evidence from studies of five patient safety practices and proposals for research. BMJ Qual Saf 2011, 20:604-610.

  30. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C: National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health 2012, 102:1274-1281.

  31. Ovretveit J: The contribution of new social science research to patient safety. Soc Sci Med 2009, 69:1780-1783.

  32. Johnson RB, Onwuegbuzie AJ: Mixed methods research: a research paradigm whose time has come. Educ Res 2004, 33:14-26.

  33. Zhang W, Creswell J: The use of “mixing” procedure of mixed methods in health services research. Med Care 2013, in press.

  34. Ash JS, Guappone KP: Qualitative evaluation of health information exchange efforts. J Biomed Inform 2007, 40(6 Suppl):S33-S39.

  35. Protheroe J, Bower P, Chew-Graham C: The use of mixed methodology in evaluating complex interventions: identifying patient factors that moderate the effects of a decision aid. Family Pract 2007, 24:594-600.

  36. Hessler BJ, Soper P, Bondy J, Hanes P, Davidson A: Assessing the relationship between health information exchanges and public health agencies. J Public Health Manag Pract 2009, 15:416-424.

  37. Shapiro JS: Evaluating public health uses of health information exchange. J Biomed Inform 2007, 40(6 Suppl):S46-S49.

Pre-publication history

The pre-publication history for this paper can be accessed here:

http://www.biomedcentral.com/1472-6947/13/121/prepub