
Open Access Research article

Using quantitative and qualitative data in health services research – what happens when mixed method findings conflict? [ISRCTN61522618]

Suzanne Moffatt*, Martin White, Joan Mackintosh and Denise Howel

Author Affiliations

Public Health Research Group, School of Population & Health Sciences, Faculty of Medical Sciences, William Leech Building, Framlington Place, Newcastle upon Tyne NE2 4HH, UK

BMC Health Services Research 2006, 6:28 doi:10.1186/1472-6963-6-28

Published: 8 March 2006

Abstract

Background

In this methodological paper we document the interpretation of a mixed methods study and outline an approach to dealing with apparent discrepancies between qualitative and quantitative research data. The data come from a pilot study evaluating whether welfare rights advice has an impact on health and social outcomes among a population aged 60 and over.

Methods

Quantitative and qualitative data were collected contemporaneously. Quantitative data were collected from 126 men and women aged over 60 within a randomised controlled trial. Participants received a full welfare benefits assessment, which identified additional financial and non-financial resources for 60% of them. A range of demographic, health and social outcome measures was assessed at baseline and at 6, 12 and 24 months' follow-up. Qualitative data were collected from a sub-sample of 25 participants, purposively selected to take part in individual interviews examining the perceived impact of welfare rights advice.

Results

Separate analysis of the quantitative and qualitative data revealed discrepant findings. The quantitative data showed little evidence of significant differences of a size that would be of practical or clinical interest, suggesting that the intervention had no impact on these outcome measures. The qualitative data suggested wide-ranging impacts, indicating that the intervention had a positive effect. Six ways of further exploring these data were considered: (i) treating the methods as fundamentally different; (ii) exploring the methodological rigour of each component; (iii) exploring dataset comparability; (iv) collecting further data and making further comparisons; (v) exploring the process of the intervention; and (vi) exploring whether the outcomes of the two components match.

Conclusion

The study demonstrates how using mixed methods can lead to different and sometimes conflicting accounts and, using this six step approach, how such discrepancies can be harnessed to interrogate each dataset more fully. Not only does this enhance the robustness of the study; it may also lead to different conclusions from those that would have been drawn by relying on one method alone, and it demonstrates the value of collecting both types of data within a single study. More widespread use of mixed methods in trials of complex interventions is likely to enhance the overall quality of the evidence base.