Models of care for musculoskeletal health: a cross-sectional qualitative study of Australian stakeholders’ perspectives on relevance and standardised evaluation

Abstract

Background

The prevalence and impact of musculoskeletal conditions are predicted to escalate rapidly in the coming decades. Effective strategies are required to minimise ‘evidence-practice’, ‘burden-policy’ and ‘burden-service’ gaps and optimise health system responsiveness for sustainable, best-practice healthcare. One mechanism by which evidence can be translated into practice and policy is through Models of Care (MoCs), which provide a blueprint for health services planning and delivery. While evidence supports the effectiveness of musculoskeletal MoCs for improving health outcomes and system efficiencies, no standardised national approach to evaluating their ‘readiness’ for implementation and their ‘success’ after implementation is yet available. Further, the value assigned to MoCs by end users is uncertain. This qualitative study aimed to explore end users’ views on the relevance of musculoskeletal MoCs to their work and the value of a standardised evaluation approach.

Methods

A cross-sectional qualitative study was undertaken. Subject matter experts (SMEs) with health, policy and administration, and consumer backgrounds were drawn from three Australian states. A semi-structured interview schedule was developed and piloted to explore perceptions about musculoskeletal MoCs, including: i) aspects important to their work (or life, for consumers); ii) the usefulness of standardised evaluation frameworks to judge ‘readiness’ and ‘success’; and iii) challenges associated with standardised evaluation. Verbatim transcripts were analysed by two researchers using a grounded theory approach to derive key themes.

Results

Twenty-seven SMEs (19 [70.4%] female), including five (18.5%) consumers, participated in the study. MoCs were perceived as critical for influencing and initiating changes towards best-practice healthcare planning and delivery, and for providing practical guidance on how to implement and evaluate services. A ‘readiness’ evaluation framework assessing whether critical components across the health system had been considered prior to implementation was strongly supported, while ‘success’ was perceived as an already familiar evaluation concept. Perceived challenges associated with standardised evaluation included identifying, defining and measuring key ‘readiness’ and ‘success’ indicators; the impacts of system and context changes; cost; meaningful stakeholder consultation; and developing a widely applicable framework.

Conclusions

A standardised evaluation framework that includes a strong focus on ‘readiness’ is important to ensure successful and sustainable implementation of musculoskeletal MoCs.

Background

Australians enjoy one of the most accessible and highest-quality healthcare systems in the world when benchmarked against other Organisation for Economic Co-operation and Development member nations [1, 2]. As in other developed and, increasingly, developing nations, Australian health delivery systems and health policy makers face the challenge of adapting to the changing needs of consumers [3]. Population health profiles are shifting healthcare priorities from acute care needs and communicable diseases to chronic, non-communicable conditions, owing to their increasing prevalence [4], particularly musculoskeletal conditions [5]. The system is also challenged by an active reform agenda for primary and hospital-based care that includes changes to funding allocations, policy and governance, and workforce roles [6].

In Australia, the morbidity and mortality burden imposed by chronic musculoskeletal conditions is second only to cancer [7]. The prevalence and impact of musculoskeletal conditions are projected to soar in Australia and other developed nations [4, 8–11]. Furthermore, across all areas of healthcare, including musculoskeletal healthcare, evidence on best practice frequently fails to be translated from research into patient care [12–15]. Addressing these issues requires research evidence to articulate more efficiently with policy and practice to minimise the ‘evidence-practice’, ‘burden-policy’ and ‘burden-service’ gaps. Further changes are required in how health services deliver care, how health professionals are trained and practise, and the level of engagement by consumers in the management of their own health [7]. These shifts are required to optimise health system responsiveness, sustainability, and population physical and mental well-being.

Historically, musculoskeletal conditions have not been considered in a comprehensive manner in health policy for chronic disease management or primary prevention [16–19]. In recent years, this situation has been somewhat redressed, particularly in Australia, with the introduction of state-wide and national Models of Care (MoCs) for musculoskeletal health [7, 20] and other conditions. MoCs provide evidence-informed blueprints for the delivery of consumer-centred health services that are tailored to meet local operational requirements. As such, MoCs serve as a vehicle to drive evidence into policy and practice at a jurisdictional level. Although the structural components of a MoC necessarily vary according to its clinical or population focus, the core elements and information provided are based around the framework of right care, right time, right team, right place and right resources [20]. The development of MoCs will also vary depending on the governance and health system processes of a given nation. In Australia, a Health Network model is used to develop MoCs, whereby cross-discipline stakeholders (including consumers and carers) are connected to discuss, debate and recommend consumer-centred strategies to address health issues [20]. A growing body of evidence substantiates the effectiveness of implementing specific musculoskeletal MoCs, as reported in our recent international review [21], as well as their uptake at a jurisdictional level [22–25].

There remains no standardised national approach to the evaluation of musculoskeletal MoCs despite evidence to support their effectiveness in improving consumer outcomes and system efficiencies, and indications that they continue to evolve and be promoted in Australia [7, 26] and internationally [27]. Furthermore, there is a dearth of information in relation to evaluating the ‘readiness’ of MoCs for implementation, as well as indicators of successful implementation. Here, we broadly define ‘readiness’ as the content and development processes required to ensure implementation success and sustainability, while ‘success’ is broadly defined as the outcomes achieved after implementation. There are also limited data on whether Australian end users assign value to musculoskeletal MoCs in their work or to a standardised evaluation approach. The aim of this study was to explore end users’ views on the relevance of musculoskeletal MoCs to their work and the value of a standardised evaluation approach.

Methods

Design

A cross-sectional qualitative study was undertaken and is reported according to the COREQ 32-item checklist (refer to Additional file 1) [28].

Sampling strategy

Subject matter experts (SMEs) across broad health, policy and consumer backgrounds were sampled for this study, consistent with a recently published framework [29]. SMEs were identified based on their established role(s) in the development or implementation of MoCs. Identification was made through established musculoskeletal networks, clinical leads of musculoskeletal service and research programs, and policy makers who have been involved in the development or implementation of musculoskeletal MoCs, as we have reported previously [20]. SMEs were sampled across three Australian states: Western Australia (WA), New South Wales (NSW) and Victoria (VIC). These states currently have the most active policy, multidisciplinary engagement in musculoskeletal healthcare, and health services research activity [7]. We also used snowballing methods to identify additional SMEs, not necessarily known to the research team, who met the inclusion criteria. To ensure national representation and adequate heterogeneity of skill sets, inclusion criteria and a minimum sample size for each discipline group were defined a priori by the research team (Table 1).

Table 1 Inclusion criteria for subject matter experts. SMEs identified in each discipline were required to meet all the inclusion criteria

SMEs were invited to participate by personal email from the research team. The email invitation provided a hyperlink to a secure portal, powered by Qualtrics™ software (Provo, Utah, USA). At this portal, SMEs were provided with a background to the study and invited to consent to, or decline, the invitation to participate. On accepting the invitation to participate, SMEs were required to answer screening questions to determine eligibility and to provide demographic data. SMEs who declined the invitation were invited to provide a reason for their decision and to nominate other SMEs, facilitating snowballing recruitment.

Ethics, consent and permissions

The Human Research Ethics Committee at Curtin University approved this study (RDHS-08-14) and all participants provided informed consent, including consent to report de-identified participant-level data.

Development of interview schedule

Our multidisciplinary research team iteratively developed a semi-structured interview schedule to explore SMEs’ perceptions about:

i) aspects of musculoskeletal MoCs important to their work (or life, for consumers); ii) the usefulness of standardised evaluation frameworks to judge the ‘readiness’ of a MoC prior to its implementation and the successful implementation of a MoC; and iii) challenges associated with evaluating MoCs pre- and post-implementation.

A draft interview schedule was pilot tested with eight SMEs who met the inclusion criteria (different from those included in this study). Two from each stakeholder group were interviewed using a cognitive interviewing approach to determine whether the questions were generating the anticipated information [30]. A probing paradigm was adopted, whereby the interviewer (JEJ) asked direct questions to assess comprehension of each question, understanding of the terminology used, and whether it was easy or difficult to answer. To allow for refinement of questions, two rounds of four interviews were conducted one week apart. Based on feedback from the first round of piloting, a separate consumer interview schedule was developed. The consumer schedule was shorter than the original and was re-worded to explore the same concepts using consumer-centred language. The second round of interviews did not provide any further insights in relation to comprehension of the questions, and therefore no further pilot interviews were undertaken.

Data collection

Data were collected in April 2015 by four interviewers experienced in qualitative methods (AMB, HS, MJ, RS). The interviewers had a broad range of career skills including clinical practice in musculoskeletal healthcare (AMB, HS, MJ), clinical practice across chronic diseases (RS), health service management experience (MJ), education (HS), health services research (AMB, HS, RS), health policy (AMB, RS) and MoC development, implementation and evaluation (AMB, HS, MJ, RS). Given the cross-sector and cross-discipline experience of the interviewers and the sampling strategy used, some of the SMEs were known to the interviewers. In this context, the possible threat of responder bias was unavoidable, since we identified a priori that a minimum level of contextual experience among SMEs was needed in order to provide meaningful data. To minimise any risk of dependent relationships, interviewers only interviewed SMEs with whom no direct reporting, working, personal or financial relationship existed. To ensure consistency between interviewers, a senior qualitative researcher (JEJ) provided training to the interviewer team. Specifically, this involved tailoring interviews to the knowledge and experience of interviewees, using specified probing questions, and paraphrasing answers to confirm understanding [31]. Interviews were conducted by telephone and audio-recorded.

Data analysis

Interviews were transcribed verbatim and analysed independently by two researchers: a qualitative methods expert (JEJ) and a content expert (AMB). An inductive analysis based on grounded theory was used [32]. All transcripts were initially reviewed and codes were derived from the data. Through an iterative process of reviewing transcripts and generating codes, the codes were further refined, grouped into concepts where similarities amongst codes existed, and key themes were developed. Where discordance between the analysts was identified, and where appropriate to gain consensus, codes were discussed and transcripts re-reviewed. Descriptive statistics were used to characterise the SMEs.

Results

Twenty-seven SMEs (19 [70.4%] female) participated in the study across three states (WA: 12; NSW: 10; VIC: 5). Interview duration ranged from 18 to 52 min. Five (18.5%) of the 27 SMEs were consumers. Three (11.1%) SMEs reported some awareness of the Australian musculoskeletal MoCs, 10 (37.0%) reported having reviewed at least one MoC, and 14 (51.9%) reported having been involved in the development and/or implementation of at least one MoC. A summary of SME demographic characteristics is provided in Table 2.

Table 2 Demographic characteristics of the sample

Qualitative results are presented under the following headings:

  1. Aspects of musculoskeletal MoCs that are important to stakeholders’ work.

  2. Perceptions of standardised evaluation frameworks to judge ‘readiness’ and ‘success’ of musculoskeletal MoCs.

  3. Challenges associated with evaluating musculoskeletal MoCs.

Hereafter, the term MoC refers to musculoskeletal MoCs.

Aspects of musculoskeletal MoCs that are important to stakeholders’ work

MoCs were perceived as critical for providing a platform to influence and initiate changes to healthcare delivery so as to align it with best practice (see MoC as platform for change section), as well as for providing practical guidance on how to implement and evaluate services (see MoC providing practical guidance to implement and evaluate services section).

“…we should all be evidence-based practitioners and we should be all trying to implement evidence into what we do, and so I see models of care and evaluation of Models of Care as an integral component of that” (SME 8)

MoC as platform for change

In considering MoCs as providing a platform for sector change, four critical information components were identified that imbue a MoC with credibility and increase the likelihood of uptake by stakeholders:

  • the development and consultation processes

  • a compelling case for change

  • a clear outline of objectives, core elements and anticipated outcomes, along with performance indicators to evaluate outcomes

  • a description of how services and resources would be delivered.

Inclusiveness in development of a MoC

The strongest theme to emerge was the need to demonstrate that a broad range of stakeholders had been consulted from inception of the MoC. This included representatives across the care continuum (i.e. primary through to tertiary sector) and different sectors (i.e. private, public, non-government, consumers, community).

“That the end product can demonstrate that its had the input from the right professions… it’s very much having a representative group…you know, multi-disciplinary, as well as having the people with influence, the colleges, the association, that kind of stuff. And involving them from the beginning. So, not just bringing them to review something at the end. It’s actually they’re involved in drafting and developing.” (SME 7)

Further, SMEs warned that without endorsement from influential change champions and organisations, MoCs can be perceived as lacking comprehensiveness, which would negatively affect engagement or, at worst, result in stakeholders being unaware of the MoC’s existence or ambivalent towards it.

“Well I think the bad thing is, as a community pharmacist, a Model of Care is really not important to my work at all - it should be, but it’s not. And I think that’s because as a community pharmacist I had absolutely no awareness of [its] existence…” (SME 15)

Establishment of a case for change

A compelling case for change was also considered vital for uptake. This included justification based not only on best evidence, but also on local (jurisdictional/national) data, previous experiences and learnings, and local expert opinion. Detailing the benefits of the MoC, how it would address ‘known’ local problems within the sector, and how proposed services would be integrated into existing service delivery models were also perceived as essential.

“I would be interested in…the argument for the evidence in which the Models of Care are based and how that fits in the service delivery models both in terms of an economic perspective and also an integrated sort of model.” (SME 2)

“It’s got to be safe, so the safety elements have to be considered strongly. It must either improve or at least maintain the quality of care that’s already likely one would hope improve, but certainly not in any way diminish. It has to ensure that the effectiveness of care is going to be improved or again at least maintained…It should improve the patient journey and the patient experience. It should increase the efficiency from a service or system perspective.” (SME 5)

Providing a clear outline

Across disciplines, strong emphasis was placed on the MoC clearly defining its scope, objectives and outcomes. The language in the document needed to be meaningful to different stakeholders and to delineate between the core and non-essential components of the model. SMEs from primary care, in particular, stressed the need for simple, translatable strategies to assist implementation.

“One of the most important things is having a document which is actually going to be understandable to a person outside of that special [musculoskeletal] area…that specific managers understand it as well as clinicians [and] consumers.” (SME 13)

“And they’re [General Practitioners] not interested in pages of documentation, they’re not interested in the eighty page Model of Care document; they’re interested in two pages… Give it to me simply; have you got an assessment for me with my package that I can use, and how do I do it, and who do I refer to? Keep it really simple with a link to the more detailed document.” (SME 22)

Describing how services could be delivered

Detailing how musculoskeletal services would be delivered was of high importance to SMEs involved in providing health services and consumers. This included access and service components, as well as education strategies and support mechanisms for both health professionals and consumers.

“… as a clinician I think the most important things are that it [MoC] sets down…the way in which care will be delivered for people with musculoskeletal conditions…so that there’s hopefully a clear pathway with services available and accessible that need to be delivered in line with the model of care.” (SME 14)

Consumers specifically advocated that MoCs were important in facilitating the provision of accessible multidisciplinary services that are ideally located in the community. Examples included outreach and telehealth services. Consumers also identified the need for MoCs to include an emphasis on early identification and management, psychosocial health, and transition services for adolescents.

“I think probably the key point is access…consumers need to be able to access these opportunities easily with the minimal cost and reduced waiting times, and to be able to have a service that gives them as much multidisciplinary care as possible.” (SME 24)

MoC providing practical guidance to implement and evaluate services

The second key aspect of MoCs that was important to SMEs’ work was providing practical and detailed considerations for implementation and evaluation of changes to service delivery. A fundamental component was the demonstration of pilot testing across diverse sites to provide insight into outcomes and adaptability.

“I’d like to see whether a pilot study has been run in a small number of sites of different, maybe, locations and environments to make sure that the Model of Care is both effective, but is also generalizable.” (SME 12)

“…particularly when you see a preliminary evaluation– if there’s pilot studies done and we’ve, you know, had that advantage in New South Wales obviously to do some pilot studies and then present the results from the pilot studies, then I think that’s really important because they [stakeholders] say “Oh, I want that too…””(SME 20)

SMEs cited several practical components that were considered important to include in a MoC to assist with implementation. These included identifying system requirements for implementation; detailing how a new MoC would replace an old or existing MoC; prioritising components for implementation; and providing implementation plans or business cases. Similarly, guidance was also sought in relation to evaluation processes and the identification of data sources to assist with evaluation.

“I think in advance it’s also helpful to make sure that the evaluation that you’d like to do at the end is thought about…so what on a day do you want to capture and systems that are required for that, and the person burden, but also resource burden in developing and implementing that…” (SME 10)

“…there’s a step between the Models of Care and then what it is that you plan to implement. The work we did…was trying to look at how you turn the Model of Care into discreet programs. I think there needs to be a lot more work of that nature. The real lesson…is that those processes then need rigorous project methodology and do need very structured implementation targets.” (SME 19)

Perceptions of standardised evaluation frameworks to judge ‘readiness’ and ‘success’ of musculoskeletal MoCs

There was unequivocal endorsement for a framework that assessed whether critical components across the broader health system had been considered, or were ready, prior to implementation of a MoC; that is, a ‘readiness’ framework (see Evaluating the readiness of a MoC section). While evaluating the ‘success’ of an implemented MoC was considered important (see Evaluating success of an implemented MoC section), this was perceived as an already familiar concept with a reasonably well-established and understood approach, compared with an appraisal of ‘readiness’. The need for a structured approach to judging readiness resonated strongly with SMEs, based on their experiences with premature and unsuccessful implementation.

“I think an “evaluability assessment” or “readiness assessment”, whatever it’s called, is really, really important… [there have been] so many evaluations where you’ve gone in to do it and not only has it not been ready to implement, the enablers aren’t in place to support a new Model of Care, staff perceptions are quite obtuse, but there might not even be a baseline” (SME 4)

Evaluating the readiness of a MoC

A readiness evaluation framework was perceived as valuable in providing assurance to stakeholders that essential elements required for implementation are present in the target environment and that known barriers have been removed or worked through.

“…because it is difficult to know [when a MoC is ready] because so many parts of the system and such a wide variety of things need to align that it’s difficult to know when all of that stuff is aligning up…so some sort of framework or structured way or a model that could take you through…and give you the confidence that it’s now ready would be very valuable.” (SME 1)

The concept of readiness included assessing stakeholders’ willingness to change and whether the MoC was usable in the current environment. This was perceived as critical in building receptiveness towards the MoC.

“… it comes down to the whole principles of change management and whether you’ve got a receptive audience. Do you have fertile ground in which to plant this thing?” (SME 19)

“So I think it’s really important to make sure you’ve got everything, all the ingredients right, the context is right, that the people involved are ready and engaged with it, ready to deliver it and embrace it before you actually implement it and evaluate it.” (SME 8)

SMEs were also wary of any readiness evaluation framework being overly simplistic, and thus not capable of capturing the complexities of a dynamic healthcare environment or adapting across different environments. Similarly, SMEs cautioned that an evaluation framework should not be too prescriptive so that it became an obstacle to conducting meaningful evaluations.

“…in some ways I think there can be a tendency to boil everything down to a checklist, and that concerns me as well.” (SME 4)

“…it’s got to be seen as a value add exercise and not something that’s just kind of ticking a box to do a function that you know may or may not be useful.”(SME 14)

Evaluating success of an implemented MoC

SMEs strongly identified the need to distinguish between an evaluation of the implementation of a MoC (process evaluation) and an evaluation of the outcomes associated with the MoC itself (impact evaluation).

“…if you want to make statements or judgments about the effectiveness of a Model of Care, which presumably at some point you would want to because the whole – one of the purposes behind a Model of Care is to improve…and you can’t make judgments about effectiveness if you don’t know whether it’s been implemented appropriately in the first instance…” (SME 8)

“I guess we need to measure what we do to know whether we’re making a difference…I mean, we’re in a place of increasing need and demand, and we have finite resources, so…measuring what we do is important to make sure that it’s delivering value, so value in terms of quality and cost”. (SME 7)

Using a framework to inform a structured approach to exploring and documenting lessons learnt from implementation efforts was also deemed valuable. This provided opportunities for continued engagement with stakeholders and ongoing quality improvement for implementation efforts.

“I think measuring people’s commitment to the implementation and to the Model of Care is very telling so that’s really very important…how they’ve been engaged and enabled to change whatever it is that they have needed to change. That can be rich I think in terms of what you might do the next time, both good and bad…so what we might not do, but also but what you might want to replicate.” (SME 2)

“…evaluation is obviously also critically important for consumer satisfaction and potential need for modifying the Model of Care, assessing the consistency of uptake and obviously the acceptability to the stakeholders” (SME 9)

A standardised framework that facilitated evaluation of MoCs across sites, and even jurisdictions, was perceived as highly useful for the sector.

“…if there were a particular framework that could be used, then there is some measure of comparison rather than, well we evaluated this [MoC] implementation by this process but another evaluation process was used for [another] Model of Care…” (SME 3)

Challenges associated with evaluating musculoskeletal MoCs

Whilst SMEs supported standardised evaluation frameworks to judge the readiness and success of MoCs, they were cognisant of the many associated challenges. Of these, the strongest themes to emerge were: (i) identifying, defining and measuring key indicators that adequately reflect the ‘readiness’ and ‘success’ of a MoC (see Identifying, defining and measuring indicators of readiness and success section); and (ii) recognising the potential impacts of system or context changes within a dynamic healthcare environment on prospective evaluations (see Impacts of systems or context changes on prospective evaluations section). Other challenges included system constraints, the cost of evaluation, meaningful consultation with stakeholders and developing a framework that is applicable across diverse jurisdictions and healthcare settings (see Other challenges section).

Identifying, defining and measuring indicators of readiness and success

While SMEs strongly indicated the need for an evaluation framework to be flexible and to take into account different contexts and changing environments, they also identified perceived challenges. These included developing specific indicators that appropriately reflect the readiness and success of MoCs, and achieving consensus amongst stakeholders as to how these indicators should be defined and measured.

“…Models of Care are pretty high level documents and as such they often lack that specificity or that operational level of detail and specificity of activities that allow for betterment, so it can be very difficult to know where to target your measures and determine what to measure. And then, yeah, there’s often a lack of agreement I think about how to measure, is something ready to be measured…” (SME 1)

SMEs identified that musculoskeletal conditions are highly diverse in their presentations and are very commonly co-morbid with other chronic health conditions. This complexity creates considerable heterogeneity in the cohorts being evaluated and poses a challenge to evaluation initiatives.

“ one [challenge] that I’ve also eluded to is the comorbidity - the fact that our patients are wonderfully diverse and that’s often our biggest dilemma is trying to decide whether somebody who’s got so many comorbidities is still worthwhile trying to include in it [MoC] or whether you know they are so different in all other respects and there only going to be a confounder.” [SME 11]

Impacts of systems or context changes on prospective evaluations

SMEs noted that the speed at which healthcare environments change could make evaluation difficult or render it out of date quickly. Risks were also perceived in attributing causation of observed outcomes to a MoC (either positively or negatively) given multiple influential factors present in any healthcare setting.

“… but the Model of Care is probably only one of the things that accounts for improved or changed outcomes in the patient population. Because almost invariably there are other things happening at the same time, some of them planned and some of them unplanned. So the evaluation framework around the Model of Care is very important but I think we have to also be a bit pragmatic and remind ourselves that sometimes when there is a change in outcomes, whether it’s good or bad, there is probably a number of factors impacting upon it and that’s always the dilemma…” (SME 11)

Other challenges

Of the several other challenges cited by SMEs relating to the development of an evaluation framework (Table 3), creating a framework that is applicable to different MoCs across jurisdictions was considered the most complex. This related not only to the range of condition-specific musculoskeletal MoCs available, and those likely to be developed, but also to geographic variability in implementation sites, such as urban and rural settings, and the consequent variability in the availability of resources and workforce, modes of care delivery, and funding models.

Table 3 Other challenges associated with evaluating Models of Care

“…but national consistency and quality in a country like Australia, not just the [geographical] vastness, but the different jurisdictions, is always going to be problematic…” (SME 6)

Discussion

Main findings

Australian stakeholders in this study perceived MoCs to be critical enablers of system and service delivery reform and a blueprint for the implementation and evaluation of new services. Our data are consistent with previously published reports in which MoCs have been identified as a mechanism to guide service planning and delivery for chronic health conditions [33–35], particularly musculoskeletal conditions [7, 21, 27, 36]. Importantly, we identified that MoCs hold different value for different stakeholders and that these value propositions could be enhanced and sustained by recognising this diversity in the development of MoCs. Further, despite a range of challenges identified to achieving meaningful and valid evaluation outcomes, stakeholders agreed that an evaluation framework for musculoskeletal MoCs that includes a strong focus on ‘readiness’ is important to ensure successful and sustainable implementation. While our focus was on musculoskeletal MoCs specifically, we suggest that our findings have relevance to MoCs for chronic diseases more broadly.

Relevance of MoCs to stakeholders’ work

Stakeholders stressed the need for MoCs to be user-relevant and system-relevant, and considered broad sector buy-in to a MoC fundamental to achieving any meaningful level of sustainable implementation. Health and policy professionals cited the need for MoCs to adopt robust and comprehensive consultation methods, to clearly articulate a case for change, and to identify performance indicators. In contrast, consumers identified the need for a MoC to address accessibility of services, particularly multidisciplinary care; to consider support structures for families and adolescents; and to directly address psychosocial health. These consumer-centred views on service planning and delivery align with other recent consumer reports [37–39] and highlight the importance of engaging these perspectives alongside those of other stakeholders. These different foci of relevance reflect the distinct ways in which the two stakeholder groups interact with the health system: one as a service administrator and delivery agent, for whom processes, governance and adoption are important; the other as a consumer, for whom point-of-care experience, access, scope and equity matter. Notably, a recent systematic review examining the effectiveness of chronic care models for health outcomes and health practices (for diabetes, cardiovascular disease, depression and chronic respiratory diseases) identified that, of the 77 papers included in the review, only two included family support as part of the model, perhaps indicative of inadequate consumer input into model design [33]. Further, another recent systematic review reported greater operational success in chronic care models when consumer perspectives were included [40].

Meaningful consultation, inclusive of a broad range of stakeholders, was considered an overall imperative in the development and implementation planning of a MoC. Inadequate or tokenistic engagement and consultation is a recognised barrier to uptake of programs, policies and models [41]. Australian MoCs, for example, are generally developed through a network-based model of engagement, consultation and iterative implementation [20]. Such network-developed models are reported to be acceptable to stakeholders [26, 42], associated with improved outcomes for consumers [43], and identified as a facilitator of implementation of chronic care models [40]. Implicit within consultation was the need for identification and engagement of clinical and administrative ‘change champions’ to support the development and implementation of a MoC; an established strategy for achieving knowledge translation into policy and practice [44–46]. Organisational champions have similarly been identified as important agents for facilitating organisational adoption of new programs or MoCs [33, 40, 47].

A compelling case for change was also identified as an important enabler for sector engagement. Bleser and colleagues reported that providing evidence-informed arguments is an effective strategy for achieving intellectual buy-in amongst stakeholders [47]. SMEs identified that this could be facilitated by using local data, such as local costing [48], activity [49] or access data [50], that are meaningful to health administrators. While burden of disease and national or international data are compelling, where feasible these should be supplemented by comparable data at a local level. This enabler has also been reported in other recent studies [47].

All stakeholders perceived value in MoCs as a blueprint for guiding implementation efforts and, specifically, for providing detail on how services could be delivered to a community. The use of locally derived pilot data to demonstrate proof of concept for a particular MoC was considered an important enabler in this regard. For example, the extended rollout of a system-inversion MoC for pain management services in Western Australia has been facilitated by initial pilot work in program delivery [51] and aligned health workforce and consumer education [52–54]. In this context, a MoC could be considered a knowledge translation intervention. Support for pragmatic, health services and implementation research is therefore an important consideration for those tasked with the development and implementation of MoCs.

A further identified enabler was the provision of a detailed implementation plan, or guidance on the development of a business plan. This enabler resonates, for example, with international initiatives to develop fracture liaison services for osteoporotic fractures [55]. A concerted international effort has been made to evaluate MoCs for re-fracture prevention and to provide guidance on specific implementation strategies and quality standards [56].

Development of an evaluation framework and associated challenges

SMEs identified evaluation of MoCs as a fundamental principle, and ideally a routine practice after implementation, to confirm improvements in service quality, efficiency, and consumers’ access and satisfaction – arguably, the elements defining ‘success’. However, the concept of evaluating ‘readiness’ for implementation was less familiar, yet very strongly supported, particularly in the context of assessing organisational readiness for change as one important component of an overall judgement of readiness. Notably, this was also the most significant challenge identified by the SMEs in this study. Although the state of New South Wales uses formative evaluation to examine readiness and applicability, and summative or impact evaluation to determine success, this approach is not completely standardised or nationally adopted [57].

In the business sector, organisational readiness for change is routine and well described, for example in Kotter’s 8-Step model [58]. However, these concepts, skillsets and evaluation capabilities are arguably less well developed in healthcare environments, which is perhaps one factor underpinning unsuccessful implementation of MoCs [25, 59]. In one of the largest systematic reviews conducted to date on the effectiveness of chronic care models, Davy and colleagues [33] reported that no differences could be identified in health and practice outcomes between variant models. It was, therefore, impossible to identify which aspects of the chronic care models led to improvements in health outcomes and practices. They concluded that factors other than the implementation of the model per se, particularly organisational, personal and communication-related factors, have an important influence on outcomes. A recent review of organisational readiness for change in healthcare identified that psychological and structural domains were the most important to consider [60]. However, currently available tools to assess these domains lacked reliability and content, construct and predictive validity [60]. That review, and others [61–63], support the concept that a framework to evaluate readiness is needed, particularly as it relates to implementation of chronic disease MoCs [59], and that it should consider psychological and structural factors [62]. It is not surprising, then, that several protocol papers describing the development of organisational readiness for change tools have been published recently [59, 64, 65]; importantly, such tools reflect just one component of an overall judgement of MoC readiness.

Strengths and limitations

This is the first study to qualitatively explore Australian stakeholders’ views on contemporary MoCs for musculoskeletal health. Strengths include purposeful sampling across multiple stakeholder groups, including consumers [41]; iterative development of an interview schedule informed by pilot testing; and data analysis by a content expert as well as a methods expert. While clear and recurrent themes were identified across stakeholders, suggestive of data redundancy, a limitation in the context of a grounded theory analytical approach is that we did not validate our conceptualisation of MoC relevance and evaluation with a wider range of stakeholders [66]. There is also a possibility of selection bias among the individuals who participated in this research; for example, these individuals may have been more interested in the topic and therefore more willing to participate. Further, our inclusion criteria and sampling strategy were established a priori to ensure that SMEs had a minimum level of senior professional experience relevant to our research question and a minimum level of understanding of, or experience with, Australian musculoskeletal MoCs, in order to collect meaningful and appropriately contextualised data. Our data therefore reflect the views of stakeholders with experience and a contextual background in the area, and not of stakeholders who are naïve to Australian musculoskeletal MoCs. Although this design choice limits the generalisability of the findings to the categories of stakeholders sampled, we do not consider this a limitation, but rather a focused design strategy to address the research question; an approach congruent with qualitative research principles, particularly purposive sampling [67]. Further, our purposive sampling approach ensured a spread of experiences and familiarity with the MoCs (Table 1). A limitation of our sampling approach was the absence of representation from the orthopaedic surgery discipline. In the current study, we intentionally did not aim to define the components of, or criteria for, a ‘readiness’ and ‘success’ evaluation framework, as this is the focus of ongoing research by our group.

Conclusions

MoCs appear critical for influencing and initiating changes to best-practice healthcare delivery, and for practical guidance on how to implement and evaluate services. An evaluation framework for musculoskeletal MoCs that is particularly focused on capturing ‘readiness’ is important to ensure successful and sustainable implementation efforts. Balancing sufficient detail against an overly simplistic evaluation framework and identifying organisational readiness for change are key challenges in this regard.

Abbreviations

MoC: Model of care

References

  1. Mathers CD, Murray CJL, Salomon JA, Sadana R, Tandon A, Lopez AD, et al. Healthy life expectancy: comparison of OECD countries in 2001. Aust N Z J Public Health. 2003;27(1):5–11.

  2. Van Doorslaer E, Clarke P, Savage E, Hall J. Horizontal inequities in Australia’s mixed public/private health care system. Health Policy. 2008;86(1):97–108.

  3. World Health Organisation. Innovative Care for Chronic Conditions: Building Blocks for Action: Global Report. Geneva: WHO; 2002.

  4. Vos T, Flaxman AD, Naghavi M, Lozano R, Michaud C, Ezzati M, et al. Years lived with disability (YLDs) for 1160 sequelae of 289 diseases and injuries 1990–2010: a systematic analysis for the Global Burden of Disease Study 2010. Lancet. 2013;380(9859):2163–96.

  5. March L, Smith EU, Hoy DG, Cross MJ, Sanchez-Riera L, Blyth F, et al. Burden of disability due to musculoskeletal (MSK) disorders. Best Pract Res Clin Rheumatol. 2014;28(3):353–66.

  6. Department of Health and Ageing. Primary Health Care Reform in Australia. Report to Support Australia’s First National Primary Health Care Strategy. Canberra: Commonwealth of Australia; 2009.

  7. Briggs AM, Towler SC, Speerin R, March LM. Models of care for musculoskeletal health in Australia: now more than ever to drive evidence into health policy and practice. Aust Health Rev. 2014;38(4):401–5.

  8. Arthritis and Osteoporosis Victoria. A problem worth solving. The rising cost of musculoskeletal conditions in Australia. Melbourne: Arthritis and Osteoporosis Victoria; 2013.

  9. Watts JJ, Abimanyi-Ochom J, Sanders KM. Osteoporosis costing all Australians. A new burden of disease analysis 2012–2022. Sydney: Osteoporosis Australia; 2013.

  10. The European Bone and Joint Health Strategies Project. European Action Towards Better Musculoskeletal Health: A Public Health Strategy to Reduce the Burden of Musculoskeletal Conditions. A Bone and Joint Decade Report. Lund: Department of Orthopaedics, University Hospital; 2004.

  11. Mithal A, Ebeling P, Kyer CS. The Asia-Pacific Regional Audit. Epidemiology, costs & burden of osteoporosis in 2013. Nyon: International Osteoporosis Foundation; 2013.

  12. Grol R. Successes and failures in the implementation of evidence-based guidelines for clinical practice. Med Care. 2001;39(8 Suppl 2):II46–54.

  13. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–45.

  14. Schuster MA, McGlynn EA, Brook RH. How good is the quality of health care in the United States? 1998. Milbank Q. 2005;83(4):843–95.

  15. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients’ care. Lancet. 2003;362(9391):1225–30.

  16. Hunter DJ, Neogi T, Hochberg MC. Quality of osteoarthritis management and the need for reform in the US. Arthritis Care Res. 2011;63(1):31–8.

  17. Woolf AD, Akesson K. Understanding the burden of musculoskeletal conditions. The burden is huge and not reflected in national health priorities. BMJ. 2001;322(7294):1079–80.

  18. Al Maini M, Adelowo F, Al Saleh J, Al Weshahi Y, Burmester GR, Cutolo M, et al. The global challenges and opportunities in the practice of rheumatology: White paper by the World Forum on Rheumatic and Musculoskeletal Diseases. Clin Rheumatol. 2014;34:819–29.

  19. Brand C, Hunter D, Hinman R, March L, Osborne R, Bennell K. Improving care for people with osteoarthritis of the hip and knee: how has national policy for osteoarthritis been translated into service models in Australia? Int J Rheum Dis. 2011;14(2):181–90.

  20. Briggs AM, Bragge P, Slater H, Chan M, Towler SC. Applying a Health Network approach to translate evidence-informed policy into practice: a review and case study on musculoskeletal health. BMC Health Serv Res. 2012;12:394.

  21. Speerin R, Slater H, Li L, Moore K, Chan M, Dreinhofer K, et al. Moving from evidence to practice: Models of care for the prevention and management of musculoskeletal conditions. Best Pract Res Clin Rheumatol. 2014;28(3):479–515.

  22. Department of Health (Western Australia). Results of the models of care survey: A snapshot of how models of care have been implemented in Western Australia. Perth: Health Networks Branch; 2012.

  23. Consulting A. Formative evaluation of the osteoporotic re-fracture prevention project. Final evaluation report. Sydney: NSW Health Agency for Clinical Innovation; 2012.

  24. Advisory O’C. Pain Management Model of Care: Formative Evaluation. Sydney: NSW Agency for Clinical Innovation; 2015.

  25. Department of Health (Western Australia). Implementation of models of care and frameworks – progress report 2015. Perth: Strategic System Policy and Planning Division, WA Department of Health; 2015.

  26. Cunningham FC, Ranmuthugala G, Westbrook JI, Braithwaite J. Net benefits: assessing the effectiveness of clinical networks in Australia through qualitative methods. Implement Sci. 2012;7:108.

  27. MacKay C, Veinot P, Badley EM. Characteristics of evolving models of care for arthritis: a key informant study. BMC Health Serv Res. 2008;8:147.

  28. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.

  29. Zardo P, Collie A, Livingstone C. External factors affecting decision-making and use of evidence in an Australian public health policy environment. Social Sci Med. 2014;108:120–7.

  30. Beatty PC, Willis GB. Research synthesis: The practice of cognitive interviewing. Public Opin Q. 2007;71(2):287–311.

  31. Patton M. Qualitative Research & Evaluation Methods 3rd ed. California: Sage Publications; 2002.

  32. Strauss A, Corbin J. Basics of qualitative research. Techniques and procedures for developing grounded theory. 2nd ed. California: Sage Publications; 1998.

  33. Davy C, Bleasel J, Liu HM, Tchan M, Ponniah S, Brown A. Effectiveness of chronic care models: opportunities for improving healthcare practice and health outcomes: a systematic review. BMC Health Serv Res. 2015;15:194.

  34. Lauvergeon S, Burnand B, Peytremann-Bridevaux I. Chronic disease management: a qualitative study investigating the barriers, facilitators and incentives perceived by Swiss healthcare stakeholders. BMC Health Serv Res. 2012;12:176.

  35. Hroscikoski MC, Solberg LI, Sperl-Hillen JM, Harper PG, McGrail MP, Crabtree BF. Challenges of change: A qualitative study of chronic care model implementation. Ann Fam Med. 2006;4(4):317–26.

  36. Davis AM, Cott C, Wong R, Landry M, Li LD, Jones A, et al. Models of care for arthritis: Drivers, facilitators and barriers to their development and implementation. Arthritis Rheum. 2013;65:S822–S3.

  37. Briggs AM, Slater H, Bunzli S, Jordan JE, Davies SJ, Smith AJ, et al. Consumers’ experiences of back pain in rural Western Australia: access to information and services, and self-management behaviours. BMC Health Serv Res. 2012;12:357.

  38. Tong A, Jones J, Speerin R, Filocamo K, Chaitow J, Singh-Grewal D. Consumer perspectives on pediatric rheumatology care and service delivery: a qualitative study. J Clin Rheumatol. 2013;19(5):234–40.

  39. Arthritis and Osteoporosis Victoria. Exploring the needs of Arthritis and Osteoporosis Victoria’s stakeholders: Consumers. Melbourne: Arthritis and Osteoporosis Victoria; 2013.

  40. Kadu MK, Stolee P. Facilitators and barriers of implementing the chronic care model in primary care: a systematic review. BMC Family Prac. 2015;16:12.

  41. Gill SD, Gill M. Partnering with consumers: national standards and lessons from other countries. Med J Aust. 2015;203(3):134–6.

  42. Cunningham FC, Morris AD, Braithwaite J. Experimenting with clinical networks: the Australasian experience. J Health Organ Manag. 2012;26(6):685–96.

  43. Cunningham FC, Ranmuthugala G, Plumb J, Georgiou A, Westbrook JI, Braithwaite J. Health professional networks as a vector for improving healthcare quality and safety: a systematic review. BMJ Qual Saf. 2011;21:239–49.

  44. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50.

  45. Doumit G, Gattellari M, Grimshaw J, O’Brien MA. Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2007;1:CD000125.

  46. Mitton C, Adair CE, McKenzie E, Patten SB, Waye PB. Knowledge transfer and exchange: review and synthesis of the literature. Milbank Q. 2007;85(4):729–68.

  47. Bleser WK, Miller-Day M, Naughton D, Bricker PL, Cronholm PF, Gabbay RA. Strategies for achieving whole-practice engagement and buy-in to the patient-centered medical home. Ann Fam Med. 2014;12(1):37–45.

  48. Briggs AM, Sun W, Miller LJ, Geelhoed E, Huska A, Inderjeeth CA. Hospitalisations, admission costs and refracture risk related to osteoporosis in Western Australia are substantial: a 10-year review. Aust N Z J Public Health. 2015. doi:10.1111/1753-6405.12381

  49. Burke ALJ, Denson LA, Mathias JL, Hogg MN. An analysis of multidisciplinary staffing levels and clinical activity in australian tertiary persistent pain services. Pain Med. 2015;16(6):1221–37.

  50. Hogg MN, Gibson S, Helou A, DeGabriele J, Farrell MJ. Waiting in pain: A systematic investigation into the provision of persistent pain services in Australia. Med J Australia. 2012;196(6):386–90.

  51. Davies S, Quintner J, Parsons R, Parkitny L, Knight P, Forrester E, et al. Preclinic group education sessions reduce waiting times and costs at public pain medicine units. Pain Med. 2011;12(1):59–71.

  52. Slater H, Briggs AM, Bunzli S, Davies SJ, Smith AJ, Quintner JL. Engaging consumers living in remote areas of Western Australia in the self-management of back pain: a prospective cohort study. BMC Musculoskelet Disord. 2012;13:69.

  53. Slater H, Briggs AM, Smith AJ, Bunzli S, Davies SJ, Quintner JL. Implementing evidence-informed policy into practice for health care professionals managing people with low back pain in Australian rural settings: a preliminary prospective single-cohort study. Pain Med. 2014;15(10):1657–68.

  54. Slater H, Davies SJ, Parsons R, Quintner JL, Schug SA. A policy-into-practice intervention to increase the uptake of evidence-based management of low back pain in primary care: A prospective cohort study. PLoS ONE. 2012;7(5):e38037.

  55. Akesson K, Mitchell P. Capture the fracture. A global campaign to break the fragility fracture cycle. Nyon: International Osteoporosis Foundation; 2012.

  56. Akesson K, Marsh D, Mitchell PJ, McLellan AR, Stenmark J, Pierroz DD, et al. Capture the Fracture: a best practice framework and global campaign to break the fragility fracture cycle. Osteoporos Int. 2013;24(8):2135–52.

  57. NSW Agency for Clinical Innovation. Understanding Program Evaluation. An ACI Framework. Sydney: NSW Agency for Clinical Innovation; 2013.

  58. Kotter JP. Accelerate. Harv Bus Rev. 2012;90:45–58.

  59. Gagnon MP, Labarthe J, Legare F, Ouimet M, Estabrooks CA, Roch G, et al. Measuring organizational readiness for knowledge translation in chronic care. Implement Sci. 2011;6:72.

  60. Weiner BJ, Amick H, Lee SY. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008;65(4):379–436.

  61. Holt DT, Armenakis AA, Harris SG, Feild HS. Toward a comprehensive definition of readiness for change: A review of research and instrumentation. Res Organ Change Development. 2007;16:289–336.

  62. Walker HJ, Armenakis AA, Bernerth JB. Factors influencing organizational change efforts - An integrative investigation of change content, context, process and individual differences. J Organ Chang Manag. 2007;20(6):761–73.

  63. Weiner BJ. A theory of organizational readiness for change. Implement Sci. 2009;4:67.

  64. Helfrich CD, Blevins D, Smith JL, Kelly PA, Hogan TP, Hagedorn H, et al. Predicting implementation from organizational readiness for change: a study protocol. Implement Sci. 2011;6:76.

  65. Khan S, Timmings C, Moore JE, Marquez C, Pyka K, Gheihman G, et al. The development of an online decision support tool for organizational readiness for change. Implement Sci. 2014;9:56.

  66. Mays N, Pope C. Qualitative research in health care. Assessing quality in qualitative research. BMJ. 2000;320(7226):50–2.

  67. Sandelowski M. The problem of rigor in qualitative research. Adv Nurs Sci. 1986;8:27–37.

Acknowledgements

The 27 SMEs who participated in the study are gratefully acknowledged for sharing their time and expertise. Funding for this study was provided by partnership grants awarded by the Australian Physiotherapy Research Foundation, the NSW Agency for Clinical Innovation and the Department of Health, Government of Western Australia. Funding bodies have not influenced the design, data collection, analysis, and interpretation of data; and have not been involved in the writing of the manuscript or in the decision to submit the manuscript for publication.

Author information

Corresponding author

Correspondence to Andrew M. Briggs.

Additional information

Competing interests

RS and JC are employees of the agencies that have provided grant funding to support this project. The agencies have not influenced interpretation or presentation of the data in the manuscript.

Authors’ contributions

All authors were involved in drafting the article or revising it critically for important intellectual content, and all authors approved the final version to be submitted for publication. AMB had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. All authors contributed to the study conception and design. AMB, HS, MJ, and RS were responsible for acquisition of data. JEJ and AMB were responsible for data analysis and all authors were responsible for interpretation of the data.

Additional file

Additional file 1:

Consolidated criteria for reporting qualitative studies (COREQ): 32-item checklist. (DOCX 110 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Briggs, A.M., Jordan, J.E., Speerin, R. et al. Models of care for musculoskeletal health: a cross-sectional qualitative study of Australian stakeholders’ perspectives on relevance and standardised evaluation. BMC Health Serv Res 15, 509 (2015). https://doi.org/10.1186/s12913-015-1173-9
