Getting physicians to open the survey: little evidence that an envelope teaser increases response rates

Abstract

Background

Physician surveys are an important tool for assessing the attitudes, beliefs and self-reported behaviors of this policy-relevant group. Before physicians can respond to a mailed survey, they must first open the envelope. While there is some evidence that elements of the mailing package can affect physician response rates, the effect of an envelope teaser is unknown. Here we assess this by testing whether adding a brightly colored "$25 incentive" sticker to the outside of the envelope affects response rates and nonresponse bias in a survey of physicians.

Methods

In the second mailing of a survey assessing physicians' moral beliefs and views on controversial health care topics, initial nonrespondents were randomly assigned to receive a survey in an envelope with a colored "$25 incentive" sticker (teaser group) or in an envelope without a sticker (control group). Response rates were compared between the teaser and control groups overall and by age, gender, region of the United States, specialty and years in practice. Nonresponse bias was assessed by comparing the demographic composition of respondents with that of nonrespondents in the experimental and control conditions.

Results

No significant differences in response rates were observed between the experimental and control conditions overall (p = 0.38) or after stratifying by age, gender, region, or practice type. Within the teaser condition, there was some variation in response rate by years since graduation. There was no independent effect of the teaser on response when simultaneously controlling for demographic characteristics (OR = 0.875, p = 0.4112).

Conclusions

Neither response rates nor nonresponse bias were impacted by the use of an envelope teaser in a survey of physicians in the United States.

Background

Health services researchers depend on physicians' participation in surveys to assess provider attitudes, beliefs and self-reported behaviors such as guideline adherence. There is some evidence that physician response rates have been falling in recent years [1, 2]. While low response rates are known to reduce statistical power and raise the cost per completed survey, their impact on nonresponse bias may not be as strong as previously thought [3]. Nevertheless, exploring methods to increase response rates in surveys of physicians remains an important area of study, one recently called for in a review of the physician response literature by VanGeest et al. [4]. In November 2010, the National Cancer Institute convened a Provider Survey Methods Workshop at which a research agenda for the field was developed, underscoring the broad attention to surveying this population and, specifically, to methods designed to enhance participation. Given the growing evidence that response rate is not necessarily correlated with nonresponse bias [3], these calls also urge systematic examination of nonresponse bias, as is undertaken here.

For an individual to respond to a mailed survey, a series of steps must occur, not the least of which is opening the envelope in which the survey is sent. While this may seem trivial, for physicians it may be less so. There is evidence that physicians receive a large volume of material (including surveys) in the mail [5], so taking this important first step is far from guaranteed. In a qualitative study asking nonresponding general practitioners why they did not respond, fully 34% said that the questionnaire "got lost in a pile of paper work" [6]. An additional barrier to obtaining survey responses from physicians is the potential presence of gatekeepers: individuals who screen physicians' mail and pass along only those materials that, in the absence of any clarifying information about their contents, are deemed "important" by a third party. While the extent of such gatekeeping has not been empirically documented, it has been hypothesized to exert downward pressure on response rates [4, 7–9]. We have anecdotal evidence from our own earlier work that this is the case: in preparation for that work, we conducted informal focus groups in a physician population about potential barriers to responding to surveys, and not opening one's mail was a cited barrier [9].

One way to motivate the person regularly responsible for opening the mail (either the potential respondent or the gatekeeper, if gatekeepers do exist as hypothesized) to open an envelope is to place a "teaser" on the outside of the envelope itself. The teaser entices or suggests to the recipient that it is worth their time to open the letter or package, lessening the chance that the envelope is "lost in a pile of paperwork". A recent meta-analysis of methods to increase response rates suggested that including an envelope teaser increased response rates more than three-fold [10]. This conclusion, however, was based on a single study with a sample size of 190 in which recipients were assigned either to an envelope teaser condition (i.e. a stamp on the envelope reading "Did you know you were entitled to more money?") or to a control group with no such stamp [11]. While this finding is suggestive, the small sample size does not support a strong inference, nor can the findings be generalized to a physician population. Moreover, the context of mailed surveys is likely quite different more than a decade after that initial publication.

More generally, however, there is evidence that other elements of packaging can significantly affect response rates in physician surveys (for a review see Kellerman & Herold [1]). For example, Asch and Christakis [12] manipulated the envelope so that the letter appeared to come from either a University Medical School or a Veterans Affairs Hospital. Response rates were significantly higher for the latter, suggesting that aspects of the envelope itself can matter: respondents were more likely to open an envelope whose exterior indicated it came from the Veterans Affairs Hospital than one from the University Medical School. Moreover, the executive summary from the National Cancer Institute's provider methods workshop cited the importance of engaging physicians with the envelope itself, observing that the decision about whether to open an envelope may be made "in seconds" [13].

Here we report the results of a randomized study testing whether an envelope teaser can increase response rates in a nationally representative sample of physicians receiving a mailed questionnaire. The manipulation occurred in the second-wave mailing to initial survey nonrespondents.

Methods

Samples and procedures

The envelope teaser experiment was embedded in a survey assessing physicians' moral beliefs and views on controversial health care topics; its substantive methods and findings are reported elsewhere [14]. In May 2009, we mailed a self-administered, 8-page survey to 2,000 practicing U.S. physicians aged 65 and younger, representing all specialties. The random sample was drawn from the AMA Physician Masterfile.

The first mailing of the survey included a book ("The Quotable Osler") as an incentive and promised an additional $25 check to all respondents. Nonrespondents to the first mailing (n = 1,250) were sent another copy of the survey six weeks later with the same promised $25 check upon survey completion. In this second wave mailing, we randomized nonrespondents to receive either an envelope with a brightly colored sticker reading "$25 incentive" (n = 630) or an envelope with no sticker (n = 620). This study was approved by the Mayo Clinic Institutional Review Board.
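The published report does not describe the mechanics of the random assignment. As a minimal sketch only, one way to split the 1,250 second-wave nonrespondents into the 630/620 envelope conditions is a simple shuffle-and-slice; the placeholder IDs and the fixed seed below are assumptions, not details from the study.

```python
# Illustrative sketch: the study reports only the resulting group sizes
# (630 teaser, 620 control), not how the assignment was implemented.
import random

nonrespondent_ids = list(range(1, 1251))  # placeholder IDs for the 1,250 first-wave nonrespondents
rng = random.Random(2009)                 # fixed seed chosen here for reproducibility (assumed)
rng.shuffle(nonrespondent_ids)

teaser_group = nonrespondent_ids[:630]    # envelope with the "$25 incentive" sticker
control_group = nonrespondent_ids[630:]   # envelope with no sticker
assert len(teaser_group) == 630 and len(control_group) == 620
```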

Analysis

Overall response rates in the teaser and control groups were compared using a chi-square test. Response rates were also compared by gender, age, geographic region, years since graduation from medical school, and practice type (i.e. office-based practice or full-time hospital staff). Multivariate analysis was undertaken to isolate the effect of the teaser while controlling for the available demographic and practice characteristics. To further assess nonresponse bias, the demographic composition of responders was compared with that of nonresponders within each condition.
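The analysis code is not published. As a rough, hedged illustration of the two core comparisons, the sketch below back-calculates approximate counts from the reported group sizes and response rates (roughly 100 of 630 teaser and 88 of 620 control respondents); the variable names are ours, and the unadjusted odds ratio it produces will not match the covariate-adjusted OR of 0.875 reported in the Results.

```python
# Sketch only: counts are approximated from 15.9% of 630 and 14.2% of 620.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2_contingency

# 2x2 table: rows = teaser / control, columns = responded / did not respond
table = np.array([[100, 530],
                  [88, 532]])
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, p = {p:.2f}")  # lands near the reported p = 0.38

# Unadjusted logistic regression of response on the teaser indicator,
# expanded from the same aggregate counts. The paper's multivariate model
# also controls for age, gender, region, years since graduation and practice
# type, which cannot be reconstructed from the published text.
records = ([(1, 1)] * 100 + [(1, 0)] * 530 +   # teaser arm
           [(0, 1)] * 88 + [(0, 0)] * 532)     # control arm
df = pd.DataFrame(records, columns=["teaser", "responded"])
fit = sm.Logit(df["responded"], sm.add_constant(df["teaser"])).fit(disp=0)
print("unadjusted OR for teaser:", round(float(np.exp(fit.params["teaser"])), 3))
```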

Results

Overall response rates did not differ significantly between the teaser (15.9%) and control (14.2%) groups (p = 0.38) (see Table 1). Within the teaser group there were no differences in response rates after stratifying on age, gender, region or practice type (office or hospital), indicating no differential nonresponse bias. This was not the case, however, for years since graduation: among physicians in the teaser group, a higher response rate was observed among those who graduated 5-9 years ago (27.1%) than among those who graduated 10-19 years ago (11.6%) (p = 0.03), indicating a possible small nonresponse bias with respect to years since graduation when a teaser is used, although this relationship did not hold in the multivariate analysis (results not shown). Within the control condition there were no differences in response rates by any of the examined demographic or practice characteristics. Moreover, when controlling simultaneously for all of the demographic and practice characteristics, the teaser did not predict response (OR = 0.875, p = 0.4112) (results not shown).

Table 1 Response Rates Between and Within Groups

Discussion

Adding a teaser sticker to the outside of the envelope in a mailed survey did not significantly increase response rates among a nationally representative sample of physicians in the United States. However, because the relative cost of adding an envelope teaser is minimal, the evidence that it introduces nonresponse bias is small, and the existing literature on envelope teasers is sparse, future work in this area is warranted. Specifically, variations in the wording of the teaser sticker itself and in the incentive amount and type (monetary vs. nonmonetary, guaranteed vs. lottery) should be tested in a physician population. It is possible that a teaser advertising a $25 incentive was insufficient to change behavior in this population, as suggested by the findings of Keating et al. [15], who observed significantly higher response rates in a survey of physicians when a prepaid check for $50 was issued rather than a prepaid check for $20.

This study has several limitations. One mechanism through which the envelope teaser is hypothesized to work is by increasing the chance that a gatekeeper passes the survey along to the potential respondent, thereby increasing the likelihood of response. Because the presence or absence of a gatekeeper was not measured, the present study cannot determine whether response to the envelope teaser differed by gatekeeping. Future research should move beyond the proposed existence and impact of gatekeepers to documenting the extent to which gatekeeping actually affects survey response; once that is established, the mechanism by which something like a teaser functions with or without gatekeeping can be elucidated. Because the manipulation occurred in a follow-up mailing, the findings cannot be generalized to an initial mailing: everyone in this experiment had already failed to respond to an initial request. To better understand the true potential of envelope teasers, and given their low cost, their value should be tested in initial as well as later mailings. Finally, the extent to which these findings generalize beyond physicians in the United States is unknown.

Conclusion

In conclusion, even though neither response rates nor nonresponse bias were affected by the use of an envelope teaser in this survey of physicians in the United States, it is important that health survey methodologists continue to respond to the calls of Cull et al. [16], Kellerman and Herold [1], Klabunde et al. [13], McMahon et al. [17], and VanGeest and Johnson [4] by further testing methods for enhancing survey participation among elite populations such as physicians while simultaneously examining nonresponse bias. While there is mounting evidence that there is no linear relationship between response rate and nonresponse bias, it is important to consider the potential for nonresponse bias in each survey context [3]. Otherwise, the physician perspective may not be adequately represented in debates germane to the practice of medicine or to health care reform.

References

  1. Kellerman SE, Herold J: Physician response to surveys. A review of the literature. Am J Prev Med. 2001, 20 (1): 61-67. 10.1016/S0749-3797(00)00258-0.

  2. Thorpe C, Ryan B, McLean SL, Burt A, Stewart M, Brown JB, Reid GJ, Harris S: How to obtain excellent response rates when surveying physicians. Fam Pract. 2009, 26 (1): 65-68.

  3. Groves R: Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly. 2006, 70 (5): 646-675. 10.1093/poq/nfl033.

  4. VanGeest JB, Johnson TP, Welch VL: Methodologies for improving response rates in surveys of physicians: a systematic review. Eval Health Prof. 2007, 30 (4): 303-321. 10.1177/0163278707307899.

  5. Thran S, Hixson J: Physician surveys: recent difficulties and proposed solutions. ASA Proc Sec Survey Res Methods. 2000, 233-237.

  6. Kaner EF, Haighton CA, McAvoy BR: "So much post, so busy with practice - so, no time!": A telephone survey of general practitioners' reasons for not participating in postal questionnaire surveys. British Journal of General Practice. 1998, 48: 1067-1069.

  7. Heywood A, Mudge P, Ring I, Sanson-Fisher R: Reducing systematic bias in studies of general practitioners: the use of a medical peer in the recruitment of general practitioners in research. Fam Pract. 1995, 12 (2): 227-231. 10.1093/fampra/12.2.227.

  8. Parsons J, Warnecke R, Cazja R, Barnsley J, Kaluzny A: Factors associated with response rates in a national survey of primary-care physicians. Eval Rev. 1991, 18 (6): 756-766.

  9. Beebe TJ, Locke GR, Barnes SA, Davern ME, Anderson KJ: Mixing web and mail methods in a survey of physicians. Health Serv Res. 2007, 42 (3 Pt 1): 1219-1234.

  10. Edwards P, Roberts I, Clarke M, DiGuiseppi C, Pratap S, Wentz R, Kwan I, Cooper R: Methods to increase response rates to postal questionnaires. Cochrane Database Syst Rev. 2007, (2): MR000008.

  11. Dommeyer C, Elganayan D, Umans C: Increasing mail survey response with an envelope teaser. J Mark Res Soc. 1991, 33 (2): 139-140.

  12. Asch DA, Christakis NA: Different response rates in a trial of two envelop styles in mail survey research. Epidemiology. 1994, 5 (3): 364-365. 10.1097/00001648-199405000-00020.

  13. Klabunde C, McLeod C, Willis G: Designing and fielding high quality surveys of physicians and medical group practices: A research agenda. In: Tenth Conference on Health Survey Research Methods. 2011, Peachtree, Georgia.

  14. Antiel RM, Curlin FA, James KM, Tilburt JC: Physicians' beliefs and U.S. health care reform--a national survey. N Engl J Med. 2009, 361 (14): e23-10.1056/NEJMp0907876.

  15. Keating NL, Zaslavsky AM, Goldstein J, West DW, Ayanian JZ: Randomized trial of $20 versus $50 incentives to increase physician survey response rates. Med Care. 2008, 46 (8): 878-881. 10.1097/MLR.0b013e318178eb1d.

  16. Cull WL, O'Connor KG, Sharp S, Tang SF: Response rates and response bias for 50 surveys of pediatricians. Health Serv Res. 2005, 40 (1): 213-226. 10.1111/j.1475-6773.2005.00350.x.

  17. McMahon SR, Iwamoto M, Massoudi MS, Yusuf HR, Stevenson JM, David F, Chu SY, Pickering LK: Comparison of e-mail, fax, and postal surveys of pediatricians. Pediatrics. 2003, 111 (4 Pt 1): e299-e303.

Acknowledgements

This publication was made possible by Mayo Clinic Department of Medicine funding to Dr. Tilburt and by Grant Number 1 KL2 RR024151 from the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH), and the NIH Roadmap for Medical Research.

Corresponding author

Correspondence to Jeanette Y Ziegenfuss.

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

JZ wrote the manuscript, supervised the analysis and led the interpretation of findings. KB and KMJ assisted with the writing and interpretation of findings. LH performed the analysis. JCT led the study in which the present study was embedded and designed this study. TJB oversaw data collection and interpretation of findings. All authors approved a final version of the manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite this article

Ziegenfuss, J.Y., Burmeister, K., James, K.M. et al. Getting physicians to open the survey: little evidence that an envelope teaser increases response rates. BMC Med Res Methodol 12, 41 (2012). https://doi.org/10.1186/1471-2288-12-41
