
The evaluation of an evidence-based clinical answer format for pediatricians

Abstract

Background

Clinicians are increasingly using electronic sources of evidence to support clinical decision-making; however, there are multiple demands on clinician time, and summarised and synthesised evidence is needed. Clinical Answers (CAs) have been developed to address this need; a CA is a synthesised evidence-based summary that supports point-of-care clinical decision-making. The aim of this paper is to report on a survey used to test and improve the CA format.

Methods

An online survey was sent to pediatricians via e-mail and posted on a child health clinical standards website. Quantitative data analysis consisted primarily of descriptive statistics; qualitative data analysis consisted of content analysis.

Results

Eighty-three pediatricians responded to the survey. Most respondents found the CA useful or very useful (93%) and agreed or strongly agreed that the layout was effective and allowed them to quickly locate critical information (82%). Quantitative and qualitative data suggested that respondents thought there should be less detail in the linked figures and tables (p = 0.0002), but overall respondents seemed to think there was an appropriate level of detail in most sections of the CA.

Conclusions

Based on the quantitative and qualitative survey responses, major and minor modifications to the CA format were implemented, such as removing forest plots, adding links in each addendum to bring the user back to the front page, and adding an 'Implications for practice' section to the CA. Findings suggest that CAs will be a useful tool for pediatricians; thus, the research team has now begun creating CAs to assist busy clinicians in their day-to-day clinical practice by providing high-quality information for decision-making at the point-of-care.

Background

The Cochrane Child Health Field is a small knowledge-intensive organisation [1] that functions within The Cochrane Collaboration to advocate for high-quality scientific evidence that reflects the particular needs of infants, children and youth. Furthermore, the Child Health Field works within the child health community to advocate for decision-making based on finding, understanding, and using the best available evidence. Out of these mandates came the Clinical Answers (CAs) project, which aims to develop short, one-page summaries of evidence to quickly answer questions posed by pediatricians.

Studies suggest that clinicians are increasingly turning to electronic evidence to find information to support clinical decision-making [2–4]. In a study conducted among pediatricians in Ireland, many believed that the quality of patient care depended on the ability of clinicians to use online resources [5]. Among the many online resources available to clinicians are summaries of research evidence to support clinical decision-making, such as Best Bets or UpToDate. Best Bets was originally developed for topics in emergency medicine and is written by volunteers who register to develop a Best Bet; evidence is presented in a table and includes primary studies that are evaluated by the authors. UpToDate covers a wide range of topics and is written by physicians; the information included ranges from textbook material to primary studies and typically runs several pages in length. Ketchum et al [6] reviewed several of these resources and found that point-of-care products varied greatly in content, type of evidence, and currency of the evidence. There is a need for consistent and current high-quality evidence delivered in a systematic format; we aim to meet this need by creating and publishing CAs.

Developing short summaries of up-to-date, high-quality evidence is important because clinicians are often unable to keep abreast of current relevant literature [2, 4, 5, 7]. Alper et al [8] estimate (from 2005 database figures) that clinicians would require 627.5 hours per month to evaluate all relevant articles (over 20 hours per day), an impossible task. For example, worldwide productivity in asthma research (measured by the number of articles published) doubled between 1994 and 2004 and continues to grow [9, 10]. Riordan et al [11] found that pediatricians did not have time to track down information and critically appraise evidence while on call; they conclude that pediatricians need easy access to evidence-based answers to common clinical questions.

We chose to develop CAs in the area of respiratory child health because almost 20% of all reviews published in the Cochrane Database of Systematic Reviews are related to respiratory medicine in children [12]. CAs present evidence from Cochrane systematic reviews, and potentially non-Cochrane systematic reviews, in short one-page electronic summaries with in-text links to addendums; the summaries will also be readable in hardcopy. The CA format that we tested consisted of eight separate sections: Question, Answer, Background, Search strategy, Included reviews, Results, Limitations, and References. The addendums included the full search strategy, tables and figures (including forest plots), and references (Additional file 1). Topics for our initial CAs were suggested by pediatricians on the Child Health Field listserv and by pediatricians associated with the Cochrane Airways and Acute Respiratory Infections Review Groups. Because CAs are concise, two pediatricians on our team selected the most important outcomes to present for each CA, and decision rules for data extraction and analysis were agreed upon by our team.

CAs have the potential to be useful to child health clinicians in point-of-care decision-making by providing current, high-quality evidence in a consistent way to answer common clinical questions. A survey was prepared to test the CA among its intended users. The purpose of the survey was to test the format of the CA, not the particular clinical content of the example used (for example, the survey included questions on the amount and type of information presented, but not whether the clinician agreed with the recommendations for treating bronchiolitis or with the methodology used to produce the CA content). In this paper, we report the survey results and discuss the changes we will implement in the CA format. Conducting this survey assisted us in refining and producing a better product for our future users.

Method

Potential survey respondents included practicing pediatricians working in diverse clinical and research settings, ranging from anaesthesia to oncology. Pediatricians were not randomly sampled; instead, a convenience sample was used. Each survey invitation also asked clinicians to forward the invitation to colleagues. Our recruitment methods were aimed primarily at European sources because the CAs will be published in Evidence-Based Child Health, a journal sponsored by the European Pediatric Association during the project term. Several recruitment methods were employed:

• Email invitations (n = 73) were sent by the Child Health Field to pediatricians in the Field's membership listserv; further invitations were sent by the Cochrane Acute Respiratory Infections Group (n = 46), and Cochrane Airways Group (n = 26) to their contacts.

• An invitation to complete the online survey (including the survey URL) was published in the European Paediatric Association Newsletter, which is sent through e-mail to the Association's membership list.

• A search was conducted through Google Scholar for clinicians who had published in respiratory child health, and where possible, an email invitation (n = 45) to complete the survey was forwarded to these individuals.

• The survey invitation was posted on a European country's national Pediatric Clinical Standards website.

At our university, program evaluation/quality assurance studies (the category assigned to this project by the research ethics board) are not subject to Research Ethics Board review and approval.

Measures

The online survey was developed by an evaluation research consultant, based on initial input and ongoing review by the research team. The survey addressed several key aspects of the CA format: overall value and efficacy, adequacy of the detail provided in sub-sections, and suggestions for both format improvements and the content of future CAs (Additional file 2). An item was also included to determine the Net Promoter Score (NPS), a tool used to measure customer satisfaction. The NPS is derived from a single question that asks respondents to indicate - on an 11-point scale anchored at 0 (not at all likely) and 10 (very likely) - how likely they are to recommend a product to a friend or colleague [13–15]. The NPS is calculated by subtracting the percentage of individuals who are unlikely to recommend a product, referred to as "Detractors" (0-6 on the 11-point scale), from the percentage of individuals who are highly likely to recommend it, called "Promoters" (9-10 on the 11-point scale). A bad recommendation to a colleague carries a potential risk to reputation; as such, most individuals are only likely to recommend a product if they are extremely confident in its quality [13–15]. A positive NPS is beneficial in that it represents a potential increase in awareness and use of a product or service.
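
As an illustration of this calculation (a minimal sketch of the standard NPS formula, not the authors' own code; the ratings shown are hypothetical), the NPS for a set of 0-10 responses can be computed as follows:

    from typing import Sequence

    def net_promoter_score(ratings: Sequence[int]) -> float:
        """Compute the NPS from 0-10 likelihood-to-recommend ratings.

        Promoters score 9-10, Detractors score 0-6; Passives (7-8)
        count toward the total but toward neither group.
        """
        n = len(ratings)
        promoters = sum(1 for r in ratings if r >= 9)
        detractors = sum(1 for r in ratings if r <= 6)
        return 100.0 * (promoters - detractors) / n

    # Hypothetical ratings from ten respondents: 5 Promoters, 2 Detractors
    print(net_promoter_score([10, 9, 9, 8, 7, 6, 10, 9, 3, 8]))  # 30.0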

Analysis

Quantitative data analysis consisted primarily of descriptive statistics. In addition, for items that asked respondents to comment on the level of detail provided in various sections of the CA, a binomial test was performed to determine whether the proportion of participants who thought there was too much detail (responses of "4" and "5" on the 5-point scale), or too little detail (responses of "1" and "2"), was significantly different from 50%.
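
For instance, a binomial test of this kind could be run as in the following sketch (an illustration only; the counts are hypothetical, and the software used for the original analysis is not specified). SciPy 1.7 or later provides scipy.stats.binomtest:

    from scipy.stats import binomtest

    # Hypothetical counts for one CA section, among respondents who did
    # not rate the level of detail as appropriate ("3" on the 5-point scale):
    too_much = 16   # responses of "4" or "5"
    too_little = 2  # responses of "1" or "2"

    # Two-sided test of whether the too much / too little split differs from 50/50
    result = binomtest(too_much, n=too_much + too_little, p=0.5)
    print(result.pvalue)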

A content analysis was performed on the qualitative survey data. For each qualitative question, coding categories were developed by examining participants' responses for recurring patterns. Individual comments were then coded into the appropriate response category by a researcher.

Results

Characteristics of survey respondents

Characteristics of our 83 survey respondents are shown in Table 1. On average, respondents had been practicing pediatricians for 17.5 years (SD: 9.78), with the greatest proportion of respondents having practiced between 6-10 and 16-20 years (23% each). Almost all respondents worked in a clinical setting (96%) and over half were engaged in teaching (60%) or research (55%). Forty percent of respondents were involved in clinical care, teaching and research, and 28% were involved in clinical care only. The most common specializations were general pediatrics (28%), hematology/oncology (21%), and pulmonology or respiratory medicine (11%).

Table 1 Characteristics of respondents *

Survey responses to questions about the Clinical Answer

Overall, respondents found the CA useful (Table 2). Most respondents found the CA useful or very useful in presenting evidence (93%), agreed or strongly agreed that the CA contained the type of evidence necessary to support clinical decisions (88%) and that they would likely make use of CAs in the future (83%), and indicated that they would likely make use of CAs instead of full-length Cochrane reviews (58%). The positive survey responses were supported by respondents' written feedback: four times as many respondents provided positive feedback (e.g. "this is excellent - the idea is good and the structure both simple and informative") as negative feedback (e.g. "lazy way of presenting information").

Table 2 Overall usefulness of the Clinical Answer

A main purpose of our survey was to identify whether each section of the CA contained an appropriate level of detail (Table 3). Between 65% and 89% of respondents felt there was an appropriate level of detail in each section of the CA, with more than 75% of respondents finding the amount of detail in the 'Question', 'Answer' and 'Included reviews' sections appropriate. Among respondents who did not think the level of detail was appropriate, a significant proportion thought there was too much detail in the 'Question' (p = 0.04) and 'Linked figures and tables' (p = 0.0002) sections and too little detail in the 'Limitations' section (p = 0.008).

Table 3 Level of detail in each section of the Clinical Answer

Respondents were asked qualitative questions to determine what type of information they felt should be added to or removed from each section of the CA. Most comments related to four recurring themes in the 'Results' section: 1) there should be additional information about the population (location of trials, ages of participants) and intervention (dose and schedule); 2) the word 'significant' should be replaced with the magnitude of the effect; 3) information about heterogeneity should be provided; and 4) statistical terms should be spelled out in full prior to being abbreviated. With the exception of respondents wanting less detail in the search strategy, the qualitative comments overwhelmingly suggested that our CAs should be slightly more detailed. One respondent stated: "I find myself just wanting more information before I view something like this and make a potential change in the way I approach a given clinical situation." Four respondents also shared the opinion that adding "a short concluding remark on what the authors recommend... would be useful."

Regarding questions related to the format of the CA (Table 4), 82% of respondents agreed or strongly agreed that the layout of the CA was an effective way to present content and that it allowed them to quickly locate critical information. Five of nine qualitative comments related to the CA format suggested minor improvements, such as providing links at the end of each section to return to the top of the page. Sixty-five percent of respondents were familiar with the use of GRADE assessments (Grading of Recommendations Assessment, Development and Evaluation), and only 61% of these respondents (40% overall) agreed or strongly agreed that including GRADE assessments in the CA would greatly enhance the quality of the evidence.

Table 4 Clinical Answer format

Lastly, we examined the likelihood of respondents recommending CAs to their colleagues and calculated an NPS of +32.6 from these data (Table 5).

Table 5 Net Promoter Score: likelihood of recommending Clinical Answers to colleagues (N = 83)

CA format changes

To decide whether to alter the level of detail in any one section of the CA, we created a three-step decision rule. First, we identified the sections in which fewer than 75% of respondents felt there was an appropriate level of detail (Table 3); second, we carried out the binomial test described above; and lastly, we used the qualitative data to determine which types of changes to implement. The binomial test indicated that the amount of detail in the 'Limitations' and 'Linked figures and tables' sections potentially needed to be changed. Upon review of the relevant qualitative comments, we did not find clear direction for increasing the amount of detail in the 'Limitations' section, as there were no patterns in the responses; thus we will not be changing this section. For the 'Linked figures and tables' section, the forest plots will be removed because qualitative comments indicated that forest plots are a less familiar presentation format. Furthermore, while the p-value for the 'Question' section was significant, respondents strongly believed there was an adequate amount of detail (89%); therefore this section will not be changed.
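
The first two (quantitative) steps of this decision rule can be summarised in a short sketch (our own illustration, assuming a conventional 0.05 significance threshold; the final step remains a qualitative judgement and is not modelled):

    def flag_section_for_review(pct_appropriate: float, binomial_p: float,
                                alpha: float = 0.05) -> bool:
        """Return True if a CA section is a candidate for change.

        Step 1: fewer than 75% of respondents rated the detail appropriate.
        Step 2: the binomial test on the remaining responses is significant.
        Step 3 (not modelled here): qualitative comments decide what to change.
        """
        return pct_appropriate < 0.75 and binomial_p < alpha

    # Illustrative values (the first percentage is hypothetical):
    print(flag_section_for_review(0.70, 0.008))  # flagged -> qualitative review in step 3
    print(flag_section_for_review(0.89, 0.04))   # not flagged: detail judged adequate by >= 75%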

Participants also offered a variety of small, easily implemented suggestions that will improve the overall usability of the CA. We will be adding information on geographic location, type of study, age of participants, and dose and schedule where possible; spelling out statistical terms in full before abbreviating them; and replacing the word 'significant' with the magnitude of the effect. The formatting changes include adding links to the systematic reviews and trials included in the CAs, and adding a link in each addendum to return to the CA front page.

GRADE assessments

During development of the CA, the team was uncertain whether to add GRADE assessments [16] to the format. Because only 65% of respondents knew what GRADE was, and considering the cost in time and resources relative to the benefit of adding GRADE, we decided not to include GRADE in the CA format. Instead, a new section will be added: 'Implications for practice.' This section will provide a treatment recommendation and will take into account the overall quality of both the systematic reviews and their included trials for all outcomes of importance.

Discussion

The findings of our survey suggest that we have developed a useful product for pediatricians that has potential for point-of-care use. Results are positive and indicate that we need to make only minor changes to the CA format before it is ready for publication. Furthermore, we obtained a positive NPS, meaning that, in general, clinicians are quite likely to tell their colleagues about CAs. This is of great value in spreading awareness of CAs and facilitating their future use, as Jones et al [17] found that medical colleagues are one of the information sources perceived to be of great importance to pediatricians' clinical practice.

The potential benefit of CAs

Attempting to answer all clinical queries as they arise is an important component of evidence-based medicine [5]. In a study conducted in two emergency departments in the US, emergency medicine physicians attempted to answer clinical queries as they arose 81% of the time [18] and were reported to be successful 87% of the time. CAs that address frequently arising clinical questions may increase the chance of success.

Clinicians are often concerned about their ability to keep up to date with the constant flow of new literature relevant to their practice [2, 5, 7]. Cochrane and other systematic reviews sift through all the available evidence to summarize the results for a specific clinical question, but they still take substantial time to read, whereas a one-page CA summary could provide a much quicker way to keep up to date.

Limitations

Because we did not use random sampling, our survey is susceptible to response bias. Those contacted were likely to be more literate in evidence-based medicine because of the sources through which they were contacted, and all respondents reported experience in teaching or research. Furthermore, the CA tested was on respiratory child health, and only a minority of our respondents (11%) specialized in that area. However, we were testing the format and not the content of CAs, and the broad range of child health specialists will help ensure that our CA format is suitable for a wide variety of topics.

Although the invitations potentially reached several thousand clinicians, we received only 83 completed online surveys. Despite the low response rate and the potential bias introduced by the mix of specializations among respondents, we found that the survey results provided coherent, clear information for refining our product. Furthermore, 21% of respondents were hematology-oncology specialists and 28% were general pediatricians, which does not reflect the ratio in practice. A possible explanation is related to our decision to measure the NPS: a strong Promoter may have forwarded our survey invitation to a large number of hematology-oncology specialist colleagues.

Lastly, for CAs to be useful we rely on an increasingly internet- and e-resource-savvy clinical population [2, 3]; however, clinicians also need to be trained in searching for and evaluating information for quality and reliability, preferably as early as medical school or residency [19].

Conclusions

The results of this survey have shown us that the format of the CA, as well as the amount and type of information presented, were well received by evidence users, suggesting that, if more are developed, CAs will be a useful, high-quality source of synthesised and summarised evidence that can be used at the point-of-care. We have sent the project's final report to the Cochrane Collaboration, and the Cochrane Editorial Unit is currently continuing the work on Clinical Answers. A potential future direction would be to implement CAs at the point-of-care and measure the extent of their effect in practice through further surveys or other evaluative methods. We aim to build a product that is useful, sustainable, and efficacious for the child health clinical community: one that meets the critical information needs of busy clinicians in their day-to-day practice decision-making.

References

1. Makani J, Marche S: Towards a typology of knowledge-intensive organizations: determinant factors. Knowl Man Res Pract. 2010, 8: 265-277. 10.1057/kmrp.2010.13.

2. D'Alessandro DM: Evidence-Based Medicine and Pediatrics. Pediatric Informatics. Edited by: Lehmann CU, Kim GR, Johnson KB. 2009, New York: Springer, 133-147.

3. Davies K, Harrison J: The information-seeking behaviour of doctors: a review of the evidence. Health Info Libr J. 2007, 24: 78-94. 10.1111/j.1471-1842.2007.00713.x.

4. D'Alessandro DM, Kreiter CD, Peterson MW: An evaluation of information-seeking behaviors of general pediatricians. Pediatrics. 2004, 113: 64-69.

5. Prendiville TW, Saunders J, Fitzsimons J: The information-seeking behaviour of paediatricians accessing web-based resources. Arch Dis Child. 2009, 94: 633-635. 10.1136/adc.2008.149278.

6. Ketchum AM, Saleh AA, Jeong K: Type of evidence behind point-of-care clinical information products: a bibliometric analysis. J Med Internet Res. 2011, 13: e21. 10.2196/jmir.1539.

7. McCord G, Smucker WD, Selius BA, Hannan S, Davidson E, Schrop SL, et al: Answering questions at the point of care: do residents practice EBM or manage information sources? Acad Med. 2007, 82: 298-303. 10.1097/ACM.0b013e3180307fed.

8. Alper BS, Hand JA, Elliott SG, Kinkade S, Hauan MJ, Onion DK, et al: How much effort is needed to keep up with the literature relevant for primary care? JMLA. 2004, 92: 429-437.

9. Klaewsongkram J, Reantragoon R: Asthma research performance in Asia-Pacific: a bibliometric analysis by searching PubMed database. J Asthma. 2009, 46: 1013-1020. 10.3109/02770900903242696.

10. Michalopoulos A, Falagas ME: A bibliometric analysis of global research production in respiratory medicine. Chest. 2005, 128: 3993-3998. 10.1378/chest.128.6.3993.

11. Riordan FA, Boyle EM, Phillips B: Best paediatric evidence; is it accessible and used on-call? Arch Dis Child. 2004, 89: 469-471. 10.1136/adc.2003.029413.

12. Bow S, Klassen J, Chisholm A, Tjosvold L, Thomson D, Klassen T, et al: A descriptive analysis of child-relevant systematic reviews in the Cochrane Database of Systematic Reviews. BMC Pediatrics. 2010, 10: 34. 10.1186/1471-2431-10-34.

13. Reichheld FF: The one number you need to grow. Harvard Bus Rev. 2003, 81: 46-54.

14. Reichheld FF: The ultimate question: driving good profits and true growth. 2006, Boston: Harvard Business School Press.

15. Brooks LL, Owen R: Answering the ultimate question: how Net Promoter can transform your business. 2009, San Francisco: Jossey-Bass.

16. Schunemann HJ, Oxman AD, Brozek J, Glasziou P, Jaeschke R, Vist GE, et al: Grading quality of evidence and strength of recommendations for diagnostic tests and strategies. BMJ. 2008, 336: 1106-1110. 10.1136/bmj.39500.677199.AE.

17. Jones TH, Hanney S, Buxton MJ: The information sources and journals consulted or read by UK paediatricians to inform their clinical practice and those which they consider important: a questionnaire survey. BMC Pediatrics. 2007, 7: 1. 10.1186/1471-2431-7-1.

18. Graber MA, Randles BD, Ely JW, Monnahan J: Answering clinical questions in the ED. Am J Emerg Med. 2008, 26: 144-147. 10.1016/j.ajem.2007.03.031.

19. Abbas J, Schwartz DG, Krause R: Emergency medical residents' use of Google® for answering clinical questions in the emergency room. Proc Am Soc Info Sci Tech. 2010, 47: 1-4.

Acknowledgements

The authors would like to thank all of the participants of our survey, the Cochrane Airways and Cochrane Acute Respiratory Infections Groups, and the European Paediatric Association. The authors would also like to thank Denise Thomson and David Tovey for their guidance during the preparation of this manuscript.

We received funding from the Cochrane Collaboration Opportunities Fund.

Author information

Corresponding author

Correspondence to Iva Seto.

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

IS managed the project, coordinated the team, recruited participants, carried out initial data analysis, and wrote the introduction and discussion. MF prepared the clinical answer for the survey, carried out further data analysis and wrote the results section. BA prepared the survey and wrote the methods section. TK and KW provided guidance on the project and revised the manuscript. All authors read and approved the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite this article

Seto, I., Foisy, M., Arkison, B. et al. The evaluation of an evidence-based clinical answer format for pediatricians. BMC Pediatr 12, 34 (2012). https://doi.org/10.1186/1471-2431-12-34
