
Validation of public health competencies and impact variables for low- and middle-income countries

Abstract

Background

The number of Master of Public Health (MPH) programmes in low- and middle-income countries (LMICs) is increasing, but questions have been raised regarding the relevance of their outcomes and impacts on context. Although processes for validating public health competencies have taken place in recent years in many high-income countries, validation in LMICs is needed. Furthermore, impact variables of MPH programmes in the workplace and in society have not been developed.

Method

A set of public health competencies and impact variables in the workplace and in society was designed using the competencies and learning objectives of six participating institutions offering MPH programmes in or for LMICs, and the set of competencies of the Council on Linkages Between Academia and Public Health Practice as a reference. The resulting competencies and impact variables differ from those of the Council on Linkages in scope and emphasis on social determinants of health, context specificity and intersectoral competencies. A modified Delphi method was used in this study to validate the public health competencies and impact variables; experts and MPH alumni from China, Vietnam, South Africa, Sudan, Mexico and the Netherlands reviewed them and made recommendations.

Results

The competencies and variables were validated across two Delphi rounds, first with public health experts (N = 31) from the six countries, then with MPH alumni (N = 30). After the first expert round, competencies and impact variables were refined based on the quantitative results and qualitative comments. Both rounds showed high consensus, more so for the competencies than the impact variables. The response rate was 100%.

Conclusion

This is the first time that public health competencies have been validated in LMICs across continents. It is also the first time that impact variables of MPH programmes have been proposed and validated in LMICs across continents. The high degree of consensus between experts and alumni suggests that these public health competencies and impact variables can be used to design and evaluate MPH programmes, as well as for individual and team assessment and continuous professional development in LMICs.


Background

Responding to the crisis in human resources for health and the need for a well-established public health workforce, the number of Master of Public Health (MPH) training programmes has increased, especially in low- and middle-income countries (LMICs) [1–4]. As the LMIC context differs markedly from that of high-income countries, the question has been posed whether existing LMIC programmes equip public health alumni to be effective, and whether the competencies taught in these programmes are relevant to their contexts [4–6]. Since the 1990s, in schooling and higher education globally, detailed descriptions of expected performance, or competencies, have been commonly used as drivers of curriculum development, programme evaluation, job function delineation and continuous professional development assessments [7–9]. For some of these purposes, competencies are defined as the "effective application of available knowledge, skills, attitudes and values in complex situations" [7].

Over the past decade, public health competencies have received considerable attention, and have been developed and refined in a range of countries: in the United States of America (USA) they were formulated by the Council on Linkages Between Academia and Public Health Practice in 2001, with revisions adopted in 2010 [10]; the Public Health Agency of Canada published a list in 2007 [11]; and in Europe, the Association of Schools of Public Health in the European Region (ASPHER) drafted a list in 2008, which was redefined in 2011 [12]. In the same year (2008), the United Kingdom (UK) Public Health Skills and Career Framework was endorsed [13], while in Australia, the Foundation Competencies for Master of Public Health alumni [14] were published in 2009. These public health competencies were, in many instances, developed through group discussions and a modified Delphi method, with varying degrees of input from academia and public health practitioners at different levels [8, 12–14].

In LMICs, the Public Health Foundation of India held a multi-country conference in 2008, attended by a wide range of local and international delegates and experts. Some delegates were commissioned to develop reports on the state of public health training in their own countries, with a view to informing the development of the public health curriculum in India [15]. Since then, public health competencies have also been developed in Latin America [16]. However, public health competencies have been neither accepted nor validated across LMICs.

Furthermore, although restructuring curricula in terms of competencies constitutes a statement of intent on behalf of the provider, it does not demonstrate whether these competencies have been acquired, nor whether the selected competencies have had an impact in the workplace or in society. A review by Zwanikken et al. [17] revealed that very few Master's programmes in health and health care have defined their intended impact on the workplace and on society in general by specifying outcome or impact indicators.

When six institutions offering MPH programmes came together in December 2011 to design a comparative impact evaluation across programmes, each brought to the discussion the set of key competencies that had guided its programme over the previous decade. These were to serve as the basis for formulating the competencies and impact variables against which to evaluate the impact of the MPH programmes across all six institutions involved in this study. As part of the design, these competencies and impact variables were to be validated using a Delphi process. All the institutions were engaged in training health and allied health professionals working in LMICs and included: School of Public Health, University of the Western Cape, South Africa; Hanoi School of Public Health, Vietnam; School of Public Health, Fudan University, China; National Institute of Public Health, Mexico; University of Medical Science and Technology (UMST), Sudan (through the Ministry of Health); and the Royal Tropical Institute, the Netherlands.

Methods

The team used a multistep process, starting with the December 2011 meeting of MPH programme convenors from six countries, to reach consensus on a set of public health competencies, develop a list of draft impact variables and design the validation process. The process of developing the competencies and variables aimed to represent the diversity amongst institutional competencies and learning objectives, as well as to harmonize and streamline the competency statements sufficiently to establish a shared basis for the evaluation. Specific competencies articulated by particular schools were discussed until consensus was reached on whether and how to include them. The resulting set of competencies includes the common competencies of all schools and seeks to articulate key areas of public health performance. Modification took place through discussion, careful deliberation and consensus building during the face-to-face meeting, email communications and two Skype meetings.

After the competencies were defined and agreed upon, impact variables were developed. The impact variables were divided into impact on the workplace, such as developing improved working procedures within a work unit, and impact on the sector or society, such as improved quality of care for patients. The team formulated the impact variables through inductive logic while taking into consideration the public health competencies that had already been defined. Although these two levels of intended impact are linked, they were not constructed to be directly equivalent.

Prior to embarking on the study, ethical approval was granted by the ethics committees of the six participating institutions: the University of the Western Cape Senate Research and Ethics Committee, the Hanoi School of Public Health Ethics Committee, the Fudan University School of Public Health Institutional Review Board, the Sudan Medical and Scientific Research Institute (SUMASRI) Ethical Clearance Committee, the National Institute of Public Health Ethics Committee and the Royal Tropical Institute Research Ethics Committee.

The initial meeting was followed by a period of refinement and discussion of the validation design within the team, by email and Skype conferencing. Validation was undertaken using a modified Delphi process. A number of researchers have used the Delphi method to generate consensus on the public health competencies of different health professionals [18–21]. Although the original conceptualisation of the Delphi method includes at least two and sometimes three rounds of feedback by the same experts [21], a modified Delphi process was chosen: this involved consulting a group of experienced public health experts (first round), invited by each convenor on the basis of maximum diversity, with a second expert round if needed, followed by consultation of a similarly selected group of programme alumni (second or third round), and possibly a further round with the alumni. The rationale for including alumni was that their experience would enable them to critique the competencies and impact variables from the perspective of what they regard as relevant in their field of work. Maximum professional, gender and cohort diversity criteria were used in their selection.

The five public health experts from each country (N = 31) were asked to review and validate the public health competencies and impact variables using a Likert scale graded from 1 (signifying that the competence was of 'poor' relevance) to 5 (indicating 'excellent' relevance to the field of public health practice). The intention was that the decision to undertake further rounds of consultation with the same groups should be based on the degree of consensus found. Qualitative comments and suggestions were also invited.

Responses were entered and stored in Microsoft Office Excel® 2003 (Microsoft Corporation, USA), and calculations were made using Excel. Since the results from the first round showed considerable consensus, a further round was not deemed to be required. The experts' feedback, however, guided further refinement of the competencies and impact variables, which were then circulated to alumni (N = 30) across the six MPH programmes for further validation. Once again, based on the level of consensus in the results of the alumni round, a further round with them was not deemed necessary.

No agreement exists in the literature on how to measure consensus; measures of central tendency and measures of dispersion are often used [22]. According to Argyrous [23], the median can be used with ranked data (ordinal and interval/ratio), while the mean can be used for data that are not skewed but is not considered useful for scales with few values. Initially, we set the cut-off points at a mean ≤ 3.9 or a coefficient of variation ≥ 0.26. However, the data appeared to be skewed, so the median was chosen: a median of 4 or 5 was considered to indicate a good degree of consensus.
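For readers who wish to reproduce these consensus measures, the following minimal Python sketch shows how a mean, median and coefficient of variation can be computed per item and checked against the cut-offs described above. It is only an illustration, not the authors' actual Excel workflow, and the rating values and item labels are invented for the example.

```python
import statistics

# Hypothetical Likert ratings (1-5) from a panel of raters for two items;
# the scores below are invented for illustration only.
ratings = {
    "Systems thinking": [5, 4, 5, 4, 5, 3, 4],
    "Commissions research": [4, 3, 4, 5, 3, 4, 4],
}

for item, scores in ratings.items():
    mean = statistics.mean(scores)
    median = statistics.median(scores)
    cv = statistics.pstdev(scores) / mean  # coefficient of variation
    # The study initially flagged items with mean <= 3.9 or CV >= 0.26;
    # because the data were skewed, a median of 4 or 5 was taken as good consensus.
    consensus = median >= 4
    print(f"{item}: mean={mean:.2f}, median={median}, CV={cv:.2f}, consensus={consensus}")
```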

Developing the public health competencies and impact variables

The competencies and impact variables which were validated in the Delphi process had been developed through the deliberations of the six LMIC schools of public health. At an early stage, the competencies were compared with those of the Council on Linkages Between Academia and Public Health Practice [10], as these competencies have been widely used elsewhere as a basis for curriculum design [24–26]. Having considered this framework, the team decided to structure the list of competencies into seven competency clusters. Taking into account the argument against 'atomization', or fragmentation of competencies/outcomes into component parts, and the recognition that the overarching competence is often considered the best expression thereof [27, 28], the team used the clusters to condense and clarify the competencies. In the course of this clustering process, we decided to group 'analytical/assessment skills' with 'public health sciences skills', as we consider 'analytical/assessment' skills to be embedded in, and the active component of, 'public health skills'.

This comparison highlighted several additions to our competencies that we view as important in the LMIC context. Gender issues were absent from the Council on Linkages framework. The 'pro-poor and equity-based approach' was specifically added in response to the population needs of LMICs. Furthermore, the team added competencies related to the social determinants of health, as these are considered an important foundation for public health practice, as acknowledged in the work of the WHO Commission on the Social Determinants of Health and others [29]. The conceptualization of cultural competencies was also expanded to encompass 'context-sensitive competencies', in recognition that health status is determined by far more than cultural or background factors, including social, economic, political and gender factors. 'Policy advocacy' was added to the policy domain, as well as the need for 'context sensitivity' of policies. 'Intersectoral competencies' were added to community competencies, because intersectoral engagement is regarded as a critical principle in furthering the impact of public health. A further contrast with the Council on Linkages framework is that where it assigns an important role to financial planning and management, this group's competencies emphasize planning and management as a whole, including finances (Figure 1).

Figure 1. Public health competencies.

Results

As described in the Methods section, the competencies and impact variables were validated using two Delphi rounds, yielding quantitative and qualitative results.

Validation by experts

To ensure a maximum variation sample of experts in the public health field, the following criteria were used: reviewers should have a broad view on public health, at least 10 years' work experience in public health, and work at different workplace types. The expert group was required to include both men and women, at least one person from a university (e.g. professor/programme trainer/policymaker of an educational department), one from health system management (i.e. at national/provincial level), one from a service delivery institution (e.g. Centers for Disease Control and Prevention or other public health institutions) and one from a non-governmental organization (NGO). Respondents were recruited by the MPH convener of the respective schools by email or telephone.

Respondents were provided with the competencies and impact variables and asked to rate the relevance of each competency and impact variable using a Likert scale from 1 ('poor' relevance) to 5 ('excellent' relevance). The key to the competencies and impact variables noted: 'Relevance in this study means that this particular competency is expected of a Public Health Masters graduate working in the field of Public Health'. As regards impact variables, 'excellent relevance' indicated that this effect on the graduate's workplace or sector of society was very important; for example, respondents were asked to rate the relevance of 'Contributed to equity/pro-poor orientation towards health access at all levels'. Comments and additional suggestions were also received from the expert group.

Responses were received over two months from 31 experts (21 men and 10 women). Eighteen of the experts were from universities, seven from health system management, three from service delivery institutions and three from NGOs, including international agencies such as the United Nations Family Planning Association. All public health experts had more than 10 years of experience in public health. In the analysis of data, medians were calculated for all scores as well as for each country to identify cross-country variability (see Additional file 1).
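The per-country breakdown of medians mentioned above amounts to a simple grouping operation. The sketch below shows one way to compute it in Python with pandas; it is illustrative only (the authors worked in Excel), and the scores and competency label are invented, with country names taken from the study.

```python
import pandas as pd

# Invented example data: one row per expert rating of one competency.
df = pd.DataFrame({
    "country": ["Vietnam", "Vietnam", "South Africa", "South Africa", "Mexico", "Mexico"],
    "competency": ["Systems thinking"] * 6,
    "score": [5, 4, 4, 5, 3, 4],
})

# Median per competency across all raters, and per country to reveal variability.
overall_median = df.groupby("competency")["score"].median()
per_country = df.pivot_table(index="competency", columns="country",
                             values="score", aggfunc="median")
print(overall_median)
print(per_country)
```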

Quantitative analysis across all six countries revealed that 11 of the competencies had a median of 4, while 12 competencies had a median of 5, showing a high degree of consensus. Qualitative comments focused mainly on the formulation of selected competencies, either pointing out vagueness of expression or a dual focus within a single competency, or adding to and improving formulations.

Results for the workplace impact variables also showed a high level of consensus: 19 of them had a median of 4, two had a median of 4.5, and three had a median of 5 (Table 1). Two of the lowest-rated variables (with a median of 3) were: the graduate had 'published book chapters', which was considered too high an expectation for an MPH graduate; and 'Projects [were] rewarded and by what amount', which was also considered too ambitious, with concern expressed that raising funds could often not be attributed to one person alone. Some experts commented that several workplace variables required a scale, that some were difficult to measure, and that some were too broad or too ambitious, given certain contexts.

Table 1 Results of two Delphi validation rounds

All nine of the impact variables on society had a median of 4 (Table 1) across the six countries. General feedback by some experts was that it was difficult to score these variables, as the scoring would depend on the context in which a graduate works. The importance of each variable would depend on the level and role of the graduate, as well as the specific field in which they work, for example as a policy maker, implementation manager, educator or researcher. One expert commented that it would be difficult to attribute an indicator to one person, as so often public health workers operate in teams. Qualitative comments suggested that some society level impact variables were too broad, and too ambitious given certain contexts; in some cases, clarifications were suggested.

Some cross-country variability was identified, with one school scoring lower overall on the competencies and variables. In this school's survey, four out of 23 competencies had a median of 3, while other schools had only one competency with a median of 3, or none. For the impact variables in the workplace, the same school gave three variables a median of less than 3, while other schools gave between one and three variables a median of 3.

For impact variables on society, there were two schools which scored four variables with a median of 3. The competencies and variables which were scored lower in the one school were also not rated highly in other schools, and were changed.

Based on the quantitative results and the qualitative comments from the experts, 14 of the 23 competencies were reworded to improve clarity (Tables 1 and 2). Two of the workplace impact variables were changed because their median of 3 was low: 'Achievements which can be attributed to the leadership of the graduate, e.g. individual or organisational awards' was deleted as it was seen as overlapping with another variable, and the following variable was added, based on comments from experts: 'Participated in building a successful partnership'. Eleven of the 26 impact variables in the workplace were reworded, based on qualitative feedback (Tables 1 and 3). Four of the nine impact variables on society were reworded based on the qualitative feedback, and one variable was added, based on comments from experts: 'Influenced better understanding of public health measures amongst the general population' (Tables 1 and 4).

Table 2 Results of validation of competencies
Table 3 Results of validation of impact variables in workplace
Table 4 Results of validation of impact variables on society

Validation by alumni

After the researchers reached agreement on the revisions, each school sent the competencies and impact variables out to five alumni for a second round of validation. The selection of alumni was based on maximum variation per school, using criteria of gender, geographical location, year of graduation (between 2004 and 2010) and workplace type. Respondents were again asked to rate the relevance of each competency on a five-point Likert scale, i.e. whether each competency is expected of an MPH graduate working in public health; on the advice of one of the experts, the key for scoring was revised for greater clarity and ranged from 1 ('Not a key competency') to 5 ('Highly relevant'). Respondents were also asked to rate the drafted impact variables from 1 ('Not a key variable') to 5 ('Highly relevant'), and to provide comments and additional suggestions. It took a further two months to gather feedback from these 14 men and 16 women. One of the alumni graduated in 2004; two in 2005; five in 2006; four in 2007; six in 2008; six in 2009; five in 2010; and one in 2011. Ten alumni worked for higher education institutions, 12 in health system management, seven for service delivery institutions and one for an NGO.

Medians for each country were computed to identify cross-country variability. Quantitative results revealed that ten competencies had a median of 4, and 13 competencies a median of 5 (Tables 1 and 2), showing a high degree of consensus (see Additional file 2).

In relation to the impact variables in the workplace, 19 of the 26 variables had a median of 4, and seven had a median of 5 (Tables 1 and 3). Seven of the impact variables on society had a median of 4, and three had a median of 5 (Tables 1 and 4).

As regards cross-country variability for the competencies, alumni from one school rated two competencies at 3. Alumni from two different schools gave five out of 23 workplace impact variables a median of 3, while alumni from other schools gave a median of 3 to zero, one or three variables. With regard to the impact variables on society, alumni from two different schools gave two and three different impact variables, respectively, a median of 3.

General qualitative feedback suggested that the impact variables were dependent on the actual job or workplace of an MPH graduate, as well as on the expectations of a student at the start of a programme, which concurred with the feedback from the experts. Qualitative feedback suggested that wording could be more specific in 12 of the 23 competencies, 11 of the 26 impact variables on the workplace and five of the 10 impact variables on society. For example, regarding the fourth competency, 'Commissions research', two alumni (A15, male; A20, male) commented: 'Consider adding application of ethical principles'. For the second impact variable on society, 'Contributed to changed guidelines, regulations, ordinances beyond the workplace', alumni commented: 'It's hard because of the old thought about what public health is, but we're pushing for the change' (A23, female) and [it is] 'not easy to demonstrate' (A27, female) (see Additional file 3).

Discussion

A set of competencies and impact variables for MPH programmes was formulated and validated with public health experts and alumni of the programmes. This is the first time, to our knowledge, that public health competencies have been validated for MPH programmes located in or intended for LMICs, across continents. It is also the first time that impact variables of MPH programmes have been formulated and validated. Although there were some variations across countries, the results show an overall consensus on the 23 public health competencies, 26 impact variables on the workplace and 10 impact variables on society.

The process of competency development has differed across the globe: in the USA as well as in the UK, a large number of experts were involved [8, 13], but in both cases the process was criticized for being too strongly directed by the higher ranks [12]. Another approach was taken by ASPHER in the European region, which involved local employer and workforce representatives [12]. Other recent studies reviewing public health competency formulation surveyed only specific stakeholders, such as employers [29], experts [20], or academic practitioners and employers, but not alumni [18, 19]. Larger panels than the one used in this study were sometimes convened [18, 30]; however, the response rates and levels of consensus were lower. None of the initiatives reviewed in the literature developed impact variables.

Public health is in different stages of development in the different countries. However, in spite of this and other contextual diversity factors, the validation process yielded high consensus. It is possible, however, that if experts and alumni had been asked to prioritize or weight the competencies and impact variables, differences would have become more pronounced [30].

The Delphi method is qualitative in design and does not seek statistical representativeness in the number of experts invited; rather, it attempts to achieve maximum variation in the characteristics of the experts, based on purposeful selection [21]. Future work could engage a larger number of experts or ensure a wider variety of experts.

In this validation process, the 'public health science skills' as well as the 'context-sensitive competencies' received the highest ratings from both experts and alumni: clearly the addition of the 'context-sensitive competencies' was deemed important in the context of LMICs. Further high-scoring competencies were 'planning and management', 'communication', 'leadership' and 'systems thinking'. Slightly lower ratings were assigned to 'policy development' and 'community and intersectoral competencies'. Though still highly rated, the 'policy development competencies' and 'community competencies' might be less valued because of assumptions regarding the working level and roles played by MPH alumni.

The highest-scoring impact variables on the workplace amongst experts and alumni were: 'Created evidence for decision making', 'Developed a study or research proposal', 'Reported and made recommendations on population health status or needs', and 'Implemented performance improvement strategies in response to monitoring and evaluation findings'. Interestingly, the variable 'Contributed to addressing determinants of health' was scored higher by alumni than by experts, possibly reflecting the alumni's current experience in the field as well as recent public health developments emphasising the determinants. As for the impact variables on society, there was not much difference between the ratings of experts and alumni, except for the first indicator, 'Contributes to changes in policy or strategy in general', which was rated higher by the alumni. The highest-rated competencies and variables were clearly strongly endorsed and indicated a high degree of consensus.

Limitations

Although the experts came from four different continents and six countries, there was a high degree of consensus regarding the rating of competencies, though this applied to a lesser extent to the impact variables. It is possible that consensus was promoted by social desirability: although the experts were anonymous to one another, they were selected by the MPH programme convenors who had developed the competencies and variables. Intra-rater variability, however, spanned scores from 1 to 5. In addition, no prioritization of competencies was requested, which might have elicited even greater social desirability bias. By using selection criteria for experts to ensure maximum variation, this bias was reduced.

The validation by alumni yielded similar results, with increased congruency, i.e. no impact variable with a median of 3, and more with a median of 5. For the alumni respondents, it is possible that social desirability influenced the results, as they were informed that the competencies and variables had already been reviewed by experts. However, given that there was less consensus regarding the impact variables and that intra-rater variability was mostly between 2 and 5, this was probably not the case. In retrospect, it might have been better to engage the alumni in the first rather than the second round, to avoid this possible risk of bias. The question is, however, whether that would have yielded different results, given the already high consensus in the first round. The fact that the alumni were only invited to participate in the second round created the opportunity to improve the competencies and impact variables before they received them. The selection of alumni may also have been biased, as it was undertaken by the MPH programme convenors; this bias was minimized through the use of explicit selection criteria.

The validated impact variables yielded relatively high consensus, although less than the competencies. Both the experts and alumni commented that contextual factors, such as the position of the graduate or the level at which the graduate was working, influenced whether these variables could be measured; this was raised as a greater concern for the impact variables on society. Some cross-country variability was identified; however, given the small number of experts and alumni per country, the fact that no specific school could be identified as differing markedly from the others, and the otherwise high consensus, these findings were not thought to be material to the study.

Conclusion

This study contributes to the debates and deliberations on the appropriate selection of public health competencies in LMICs. The validation of the competencies and impact variables suggests that public health competencies in LMICs should differ from those in high-income countries by placing emphasis on factors which impact on the health of their populations, such as the social determinants of health, context specificity and intersectoral competencies, with less emphasis on financial planning and management.

Given that this formulation and validation of public health competencies is understood to be a first initiative, and that impact variables for MPH alumni working in LMICs have not previously been developed, or at least not publicised, the study can be said to have provided a foundation for further refinement, and it suggested a surprising degree of consensus across countries. Although the social, cultural and political situation and the state of public health development differ considerably between the countries where the six MPH programmes are situated, clear consensus emerged as to what the public health competencies and impact variables should entail.

These public health competencies and impact variables can, therefore, be used to design or evaluate MPH programmes and to assess the competencies of individuals engaged in formal programmes and continuous professional education.

Authors' contributions

PZ conceived the article. PZ, LA, NTH, QX, LMV, XHY, MCGR, MAW and SN were all involved in the data collection. All authors contributed to data analysis, writing and review. All authors read and approved the final manuscript.

Abbreviations

A: Alumnus

ASPHER: Association of Schools of Public Health in the European Region

MPH: Master of Public Health

LMIC: Low- and middle-income country

NGO: Non-governmental organisation

UK: United Kingdom

USA: United States of America

References

1. Hongoro C, McPake B: How to bridge the gap in human resources for health. Lancet. 2004, 364: 1451-1458.

2. World Health Organization: The World Health Report 2006: Working together for health. 2006, Geneva: WHO Press

3. Petrakova A, Sadana R: Problems and progress in public health education. Bull World Health Organ. 2007, 85 (12): 963-965. 10.2471/BLT.07.046110.

4. Sadana R, Petrakova A: Shaping public health education around the world to address health challenges in the coming decades. Bull World Health Organ. 2007, 85 (12): 902. 10.2471/BLT.07.049247.

5. Sadana R, Chowdhury AMR, Petrakova A: Strengthening public health education and training to improve global health. Bull World Health Organ. 2007, 85: 163. 10.2471/BLT.06.039321.

6. Plugge E, Cole D: Oxford graduates' perceptions of a global health master's degree: a case study. Human Res Health. 2011, 9 (1): 26. 10.1186/1478-4491-9-26.

7. Calhoun JG, Davidson PL, Sinioris ME, Vincent ET, Griffith JR: Toward an understanding of competency identification and assessment in health care management. Qual Manag Health Care. 2002, 11 (1): 14-38. 10.1097/00019514-200211010-00006.

8. Calhoun J, Ramiah K, McGean Weist E, Shortell SM: Development of a core competency model for the Master of Public Health degree. Am J Public Health. 2008, 98: 1598-1607. 10.2105/AJPH.2007.117978.

9. Frank JR, Mongroo R, Ahmad Y, Wang M, De Rossi S, Horsley T: Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010, 32: 631-637. 10.3109/0142159X.2010.500898.

10. The Council on Linkages Between Academia and Public Health Practice: Core competencies for Public Health Professionals Tier 1, Tier 2 and Tier 3 (adopted 3 May 2010). http://www.phf.org/resourcestools/Documents/Core_Competencies_for_Public_Health_Professionals_2010May.pdf

11. Public Health Agency of Canada: Core competencies for public health in Canada Release 1.0. 2007, http://www.phac-aspc.gc.ca/php-psp/ccph-cesp/stmts-enon-eng.php

12. Birt CA, Foldspang A: The developing role of systems of competences in public health education and practice. Public Health Rev. 2011, 33 (1): 134-147.

13. Wright J, Rao M, Walker K: The UK public health skills and career framework – could it help to make public health the business of every workforce?. Public Health. 2008, 122: 541-544. 10.1016/j.puhe.2008.03.006.

14. Genat B, Robinson P, Parker E: Foundation competencies for Master of Public Health graduates in Australia. 2009, Brisbane: QUT Publications, Australian Network of Academic Public Health Institutions

15. Public Health Foundation of India: Report of International Conference on New Directions for Public Health Education in Low and Middle Income Countries: Processes, Proceedings and Proposed Next Steps. 2008, Hyderabad, India: Public Health Foundation of India

16. Magaña Valladares L: Essential Skills Regional Framework Public Health, Latin America, working paper. 2011, Cuernavaca: INSP/MEX, Dr. Charles Godue, OPS/OMS

17. Zwanikken PAC, Dieleman M, Samaranayake D, Akwataghibe N, Scherpbier A: A systematic review of outcome and impact of Master's in health and health care. BMC Med Educ. 2013, 13: 18. doi:10.1186/1472-6920-13-18

18. Retoo KN, Harrington JM, Macdonald EB: Required competencies of occupational physicians: a Delphi survey of UK customers. Occup Environ Med. 2005, 62: 406-413. 10.1136/oem.2004.017061.

19. Jonsdottir S, Hughes R, Thorsdottir I, Yngve A: Consensus on the competencies required for public health nutrition workforce development in Europe – the JobNut project. Public Health Nutr. 2011, 14: 1439-1449. 10.1017/S1368980010000625.

20. Kennie-Kaulbach N, Farrell B, Ward N, Johnston S, Gubbels A, Eguale T, Dolovich L, Jorgenson D, Waite N, Winslade N: Pharmacist provision of primary health care: a modified Delphi validation of pharmacists' competencies. BMC Family Pract. 2012, 13: 27. 10.1186/1471-2296-13-27.

21. Powell C: The Delphi technique: myths and realities. J Adv Nurs. 2003, 41 (4): 376-382. 10.1046/j.1365-2648.2003.02537.x.

22. Von der Gracht HA: Consensus measurement in Delphi studies: review and implications for future quality assurance. Technol Forecast Soc Change. 2012, doi:10.1016/j.techfore.2012.04.013

23. Argyrous G: Statistics for Research. 2005, London: Sage Publications, 2nd edn

24. Borders S, Blakely C, Quiram B, McLeroy K: Considerations for increasing the competences and capacities of the public health workforce: assessing the training needs of public health workers in Texas. Human Res Health. 2006, 4: 18. 10.1186/1478-4491-4-18.

25. Van der Putten M, Vichit-Vadakan N, Chuchat A, Love EJ: Assessing required skill mastery in public health competencies in Thailand. J Ed Health. 2006, 19 (2): 233-243.

26. Hagopian A, Spigner C, Gorstein JL, Mercer MA, Pfeiffer J, Frey S, Benjamin L, Gloyd S: Developing competencies for a graduate school curriculum in international health. Public Health Reports. 2008, 123 (3): 408-414.

27. Tuck R: An Introductory Guide to National Qualifications Frameworks: Conceptual and Practical Issues for Policy Makers. 2007, Geneva: ILO

28. Ten Cate O: Trust, competence, and the supervisor's role in postgraduate training. Br Med J. 2006, 333 (7571): 748-751. 10.1136/bmj.38938.407569.94.

29. Biesma RG, Pavlova M, Vaatstra R, Van Merode GG, Czabanowska K, Smith T, Groot W: Generic versus specific competencies of entry-level public health graduates: employers' perceptions in Poland, the UK, and the Netherlands. Adv Health Sci Ed. 2008, 13: 325-343. 10.1007/s10459-006-9044-0.

30. Pfeiffer J, Beschta J, Hohl S, Wasserheit J: Competency-based curricula to transform global health: redesign with the end in mind. Acad Med. 2013, 88. doi:10.1097/ACM.0b013e318276bdf4



Acknowledgements

The contribution of the respondents to the Delphi survey ā€“ both experts and alumni ā€“ is greatly appreciated. The study for this article was funded by the respective institutions and partly funded by the Ministry of Foreign Affairs of the Netherlands.

Author information

Corresponding author

Correspondence to Prisca AC Zwanikken.

Additional information

Competing interests

The authors report no competing interests.


Rights and permissions

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



Cite this article

Zwanikken, P.A., Alexander, L., Huong, N.T. et al. Validation of public health competencies and impact variables for low- and middle-income countries. BMC Public Health 14, 55 (2014). https://doi.org/10.1186/1471-2458-14-55
