
A tool for self-assessment of communication skills and professionalism in residents

Abstract

Background

Effective communication skills and professionalism are critical for physicians to provide optimal care and achieve better health outcomes. The aims of this study were to evaluate residents' self-assessment of their communication skills and professionalism in dealing with patients, and to evaluate the psychometric properties of a self-assessment questionnaire.

Methods

A modified version of the American Board of Internal Medicine's (ABIM) Patient Assessment survey was completed by 130 residents in 23 surgical and non-surgical training programs affiliated with a single medical school. Descriptive, regression and factor analyses were performed. Internal consistency, inter-item gamma scores, and discriminative validity of the questionnaire were determined.

Results

Factor analysis suggested two groups of items: one group relating to developing interpersonal relationships with patients and one group relating to conveying medical information to patients. Cronbach's alpha (0.86) indicated internal consistency. Males rated themselves higher than females in items related to explaining things to patients. When compared to graduates of U.S. medical schools, graduates of medical schools outside the U.S. rated themselves higher in items related to listening to the patient, yet lower in using understandable language. Surgical residents rated themselves higher than non-surgical residents in explaining options to patients.

Conclusion

This appears to be an internally consistent and reliable tool for residents' self-assessment of communication skills and professionalism. Some demographic differences in self-perceived communication skills were noted.

Background

Excellent communication skills are essential to the practice of medicine.[1] Effective communication during medical encounters has been associated with significant benefits in areas such as patient recall and understanding, adherence to treatment plans, symptom resolution, physiological outcomes, and medical decision-making, as well as satisfaction of both patients and physicians.[2, 3]

Positive doctor-patient relations can increase the patient's perceptions of physician competence.[4] Research has shown that physicians who exhibit negative communication behaviors are more likely to have been sued in the past for malpractice than those with more positive doctor-patient relations.[5–8] Beckman et al.[9] found that communication problems between physicians and patients were identified in 70% of malpractice depositions.

Several national organizations have recognized the importance of fostering and evaluating communication skills and professionalism in physicians in training and in practice.[1, 10–12] The Association of American Medical Colleges (AAMC) Medical School Objectives Project urges medical schools to teach these skills.[13] The Accreditation Council for Graduate Medical Education (ACGME) has defined these skills as core competencies that programs must teach and evaluate.[14] Finally, the American Board of Internal Medicine (ABIM) includes an evaluation of communication skills and professionalism in the recertification process of practicing physicians.[15]

The wide agreement on the importance of communication skills and professionalism challenges medical educators to develop effective methods and tools to evaluate them. Current evaluation methods in residency training programs include attending physician assessment, peer evaluation, standardized patient assessment, actual patient assessment, and self-assessment.[16–18] One of the available evaluation tools is the ABIM Patient Assessment survey, a widely used instrument for patient evaluation of physicians.[19–23]

The aims of this study were to evaluate residents' self-assessment of their communication skills and professionalism in dealing with patients, and to evaluate the psychometric properties of the ABIM Patient Assessment survey, modified as a self-assessment questionnaire for this purpose. Self-assessment, though it has some limitations,[24–30] may be used as part of a multi-source evaluation scheme for those studying and practicing medicine.[1, 10, 31–35]

Methods

Setting and participants

The Office of Graduate Medical Education of the University at Buffalo School of Medicine and Biomedical Sciences organizes a yearly core curriculum "master session" on professionalism. Residents in all of the 23 affiliated training programs are required to attend this master session at least once by the time they graduate from their residency.

Study design

We invited all residents attending the 2007 master session on professionalism to voluntarily participate in the study by completing a two-page anonymous questionnaire. We distributed the questionnaire at the beginning of the session along with other lecture-related material. At the end of the session, those who chose to participate left the survey in a box anonymously. The Institutional Review Board of the University at Buffalo approved the study.

Survey questionnaire

The survey questionnaire was a modified version of the ABIM's Patient Assessment survey, which is part of the Patient and Physician Peer Assessment Module for maintenance of certification.[36] The Patient Assessment survey includes 11 questions about different aspects of the communication skills and professionalism of the physician. Patients rate each of these aspects on a five-point Likert scale ranging from 1 (poor) to 5 (excellent). Lipner[19] applied generalizability theory to the survey as administered to patients about their physicians. The variance component for participants was 0.01, and the generalizability coefficient was 0.67 with a 95% confidence interval of ± 0.14.
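
For readers less familiar with generalizability theory, the expression below gives one standard form of the generalizability coefficient for a simple one-facet design (physicians crossed with raters). The exact design used by Lipner et al. is not restated here, so this is illustrative only.

```latex
% Illustrative only: generalizability coefficient for relative decisions in a
% one-facet physician-by-rater design (not necessarily the design used in [19]).
% \sigma^2_p     : variance attributable to physicians (the "participant" component)
% \sigma^2_{pr,e}: residual physician-by-rater variance, confounded with error
% n_r            : number of raters (patients) per physician
E\rho^{2} \;=\; \frac{\sigma^{2}_{p}}{\sigma^{2}_{p} + \sigma^{2}_{pr,e}/n_{r}}
```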

We modified the Patient Assessment survey into a physician self-assessment survey by phrasing items in the third person (e.g., "greeting them warmly") rather than the second person (e.g., "greeting you warmly"), while otherwise preserving the exact wording. The questionnaire included additional questions on resident demographic characteristics (age, sex) and educational characteristics (year of residency, country of graduation [U.S. versus international], number of years since medical school graduation, and specialty).

Statistical analysis

We performed a descriptive analysis (frequencies and percentages) of the demographic and educational characteristics of the participants. We conducted factor analysis to search for items showing a high level of commonality. Prior to conducting the factor analysis, we tested the suitability of the scale using the Kaiser-Meyer-Olkin (KMO) test and Bartlett's Test of Sphericity. Because the data were skewed toward the higher values, we re-categorized the answer choices into three groups: "poor or fair," "good," and "very good or excellent."
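
As an illustration of the two preparatory steps described above, the Python sketch below re-categorizes the skewed five-point responses and computes Bartlett's test and the overall KMO measure from their standard formulas. The data, item names, and helper functions are placeholders for illustration; this is not the authors' analysis code (the study used SPSS).

```python
# Illustrative sketch only (placeholder data, not the study dataset).
import numpy as np
import pandas as pd
from scipy.stats import chi2

# One row per resident, one column per survey item, values 1 (poor) to 5 (excellent).
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 6, size=(130, 11)),
                         columns=[f"item_{i}" for i in range(1, 12)])

# Re-categorize the skewed 5-point scale: 1-2 -> "poor or fair", 3 -> "good",
# 4-5 -> "very good or excellent".
recoded = responses.apply(lambda col: pd.cut(
    col, bins=[0, 2, 3, 5],
    labels=["poor or fair", "good", "very good or excellent"]))

def bartlett_sphericity(items: pd.DataFrame):
    """Bartlett's test that the item correlation matrix is an identity matrix."""
    n, p = items.shape
    corr = items.corr().to_numpy()
    statistic = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    return statistic, chi2.sf(statistic, df)

def kmo(items: pd.DataFrame) -> float:
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    corr = items.corr().to_numpy()
    inv = np.linalg.inv(corr)
    # Anti-image (partial) correlations from the inverse correlation matrix.
    partial = -inv / np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    np.fill_diagonal(corr, 0.0)
    np.fill_diagonal(partial, 0.0)
    return (corr ** 2).sum() / ((corr ** 2).sum() + (partial ** 2).sum())

chi_sq, p_value = bartlett_sphericity(responses)
print(f"Bartlett chi-square = {chi_sq:.1f}, p = {p_value:.4f}, KMO = {kmo(responses):.3f}")
```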

We then conducted a series of analyses to evaluate the psychometric properties of the survey questionnaire. We measured internal consistency reliability (Cronbach's alpha) to assess whether all the items were contributing to the measurement of professionalism. We also calculated inter-item gamma scores to examine the strength of the relationships between the items in the questionnaire; gamma replaced the standard Pearson r correlation because of the categorical nature of the data. We assessed associations using cross-tabulation with chi-square statistics to determine whether the discriminative validity of the proposed instrument held regardless of the demographic and educational characteristics of the participants. In addition, we used the Kruskal-Wallis K Independent Samples test to examine differences in total score across demographic and educational groups. The total score was computed as the sum of all 11 items to provide a continuous measure of communication skills and professionalism. We used SPSS version 13.0 (SPSS, Inc., Chicago, Illinois) for all analyses.
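
The sketch below illustrates these analyses on placeholder data: Cronbach's alpha and the Goodman-Kruskal gamma are implemented from their textbook definitions, and the chi-square and Kruskal-Wallis tests come from SciPy. The grouping variable and data are hypothetical; this is a minimal sketch, not the SPSS procedure the authors ran.

```python
# Illustrative sketch only: Cronbach's alpha, an inter-item gamma, a chi-square
# cross-tabulation, and a Kruskal-Wallis test of the 11-item total score.
import numpy as np
import pandas as pd
from scipy.stats import kruskal, chi2_contingency

rng = np.random.default_rng(1)
responses = pd.DataFrame(rng.integers(1, 6, size=(130, 11)),
                         columns=[f"item_{i}" for i in range(1, 12)])
sex = rng.choice(["male", "female"], size=130)   # hypothetical grouping variable

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

def goodman_kruskal_gamma(x, y) -> float:
    """gamma = (concordant - discordant) / (concordant + discordant), ties ignored."""
    x, y = np.asarray(x), np.asarray(y)
    concordant = discordant = 0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            prod = (x[i] - x[j]) * (y[i] - y[j])
            if prod > 0:
                concordant += 1
            elif prod < 0:
                discordant += 1
    return (concordant - discordant) / (concordant + discordant)

total_score = responses.sum(axis=1)              # continuous 11-item total
groups = [total_score[sex == g] for g in np.unique(sex)]

print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
print("gamma(item_1, item_2):", round(goodman_kruskal_gamma(responses["item_1"],
                                                            responses["item_2"]), 3))
chi2_stat, chi2_p, _, _ = chi2_contingency(pd.crosstab(responses["item_1"], sex))
print("chi-square (item_1 x sex):", round(chi2_stat, 2), "p =", round(chi2_p, 3))
print("Kruskal-Wallis (total score by sex):", kruskal(*groups))
```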

Results

Of 149 individuals who registered for the session, 132 participated in the survey but only 130 completed the entire questionnaire (87.2% response rate). Table 1 lists participants' demographic and educational characteristics.

Table 1 Demographic Characteristics

Factor analysis

The scale was suitable for factor analysis: the KMO value was 0.806 (recommended minimum is 0.6), and Bartlett's test was statistically significant (p < 0.001).

Factor analysis revealed the presence of three components with eigenvalues over 1, explaining 60.5% of the total variance. These three components were groupings of questions based on common themes. Inspection of the scree plot obtained during the initial analysis showed a break between factors two and three, so we retained only the first two factors for further investigation. The two retained factors accounted for a total of 51.4% of the variance. We defined these factors as items related to "Interpersonal Relations" and items related to "Conveying Medical Information." To further aid interpretation, Varimax rotation was performed. Table 2 shows the rotated factors, demonstrating a relatively simple structure with items loading on only one component in all but two cases. Items loading strongly on component one (interpersonal relations) include Item 1 (being truthful, telling the patient everything, etc.), Item 2 (greeting them warmly, etc.), Item 3 (treating the patient on the same level, not "talking down", etc.), Item 4 (letting the patient tell his/her story, not interrupting, etc.), Item 5 (showing interest in the patient, not acting bored or uninterested), and Item 10 (using understandable words, explaining medical and technical terms).

Table 2 Rotated Component Matrix

Items loading strongly on component two (conveying medical information) include Item 6 (warning the patient prior to a physical exam), Item 7 (discussing options with the patient, offering choices, etc.), Item 8 (encouraging the patient to ask questions), and Item 9 (explaining to the patient what they need to know about their medical issue). There was overlap on two items: Item 10 (using words the patient can understand) and Item 11 (overall level of professionalism). Of particular note is the overlap between the two components on Item 11 (overall level). Additional data should be collected to further examine this overlap.

Item 10 loads on both component 1 (interpersonal relations) and component 2 (conveying medical information), indicating that clarity in speaking is important both for interpersonal relations and for conveying medical information. However, it loads more strongly on component 1 (0.563 versus 0.369).
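
For readers who want to see the extraction-and-rotation procedure in code form, the sketch below runs a principal-component extraction, applies the Kaiser criterion and a scree inspection, and varimax-rotates the retained loadings. It uses placeholder data and a from-scratch varimax routine written from the standard algorithm, so it illustrates the steps rather than reproducing the study's SPSS output.

```python
# Illustrative sketch only: principal-component extraction, Kaiser criterion,
# scree inspection, and varimax rotation on placeholder data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
responses = pd.DataFrame(rng.integers(1, 6, size=(130, 11)),
                         columns=[f"item_{i}" for i in range(1, 12)])

# Eigendecomposition of the item correlation matrix (principal components).
corr = responses.corr().to_numpy()
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                 # descending order for the scree plot
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
print("eigenvalues:", np.round(eigvals, 2))       # Kaiser criterion: components > 1
print("% variance explained:", np.round(100 * eigvals / eigvals.sum(), 1))

n_retain = 2                                      # chosen from the break in the scree plot
loadings = eigvecs[:, :n_retain] * np.sqrt(eigvals[:n_retain])

def varimax(L: np.ndarray, max_iter: int = 100, tol: float = 1e-6) -> np.ndarray:
    """Orthogonal varimax rotation of a loading matrix (standard algorithm)."""
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        rotated = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p))
        R = u @ vt
        if s.sum() < d * (1 + tol):
            break
        d = s.sum()
    return L @ R

rotated_loadings = varimax(loadings)              # analogue of the rotated component matrix
print(pd.DataFrame(rotated_loadings, index=responses.columns,
                   columns=["component_1", "component_2"]).round(3))
```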

Psychometric properties of the survey questionnaire

The Cronbach's alpha level was 0.86, well above the minimally acceptable level of 0.7. In addition, each component's internal consistency reliability was assessed: Component 1 (Interpersonal Relations) showed an internal consistency reliability of 0.82, and Component 2 (Conveying Medical Information) showed an internal consistency reliability of 0.80. Both components therefore show a high level of internal consistency reliability, suggesting that the items within each component measure a common underlying construct.

The Kruskal-Wallis K Independent Samples test showed no significant differences between groups for any of the demographic or educational variables, suggesting that the total professionalism score is not meaningfully influenced by membership in any one of these groups.

The gamma tests showed significant associations between some of the self-rating items and some of the educational and demographic characteristics. Males, compared with females, rated themselves significantly more positively on Item 1 (telling the patient everything, etc.; p = 0.032), Item 9 (explaining to the patient what they need to know, etc.; p = 0.040), and Item 10 (using words they can understand, etc.; p = 0.017).

There were two statistically significant differences between graduates of U.S. medical schools and graduates of medical schools outside the U.S. International graduates, compared with U.S. graduates, rated themselves significantly more positively on Item 4 (letting patients tell their story, etc.; p = 0.003). On the other hand, U.S. graduates, compared with international graduates, rated themselves significantly more positively on Item 10 (using words they can understand, etc.; p = 0.003).

Finally, residents in surgical specialties, compared with residents in non-surgical specialties, rated themselves significantly more positively on Item 7 (discussing options with patients, etc.; p = 0.005).

Discussion

The main utility of our educational exercise (the professionalism presentation) was to raise residents' awareness of issues in doctor-patient communication and to expose them to a tool that has been used to survey patients about their perceptions of their doctors. Our psychometric analysis of the adaptation of the ABIM patient satisfaction survey for resident self-assessment shows highly acceptable levels of internal consistency reliability. In addition, factor analysis suggests the emergence of two distinct components of professionalism ("interpersonal relations" and "conveying medical information"), which would allow for the potential creation of separate sub-scales. Of particular interest is the loading of the overall professionalism item on both components. This overlap could be due to a number of influences; for instance, it could be that both subscales adequately measure overall professionalism and that the item is important in both components. One might speculate that items related to "conveying information" are fairly mechanistic, whereas items related to "interpersonal relations" require being more attuned to the patient's emotions. Ginsburg and colleagues[37] noted that the nature of professionalism is context-specific. Perhaps physicians need to rely on a different set of communication skills depending on the context of the clinical interaction (e.g., delivering a poor prognosis for cancer versus describing the risks, benefits, and alternatives of epidural anesthesia during labor). The issue of sub-scales of professionalism could be explored further by administering the survey to patients and residents immediately after a clinical encounter and noting the context of the interaction.

The results also show significant associations between self-ratings and certain participant characteristics. Males tended to rate themselves higher than females on certain items. While this may be their self-perception, it may not reflect actual performance. In a study by Minter,[38] female surgical residents were found to underestimate their skills relative to attending physician evaluations. Additional studies have found that females score better than males after a training course in communication skills,[39–42] that female physicians engage in more patient-centered communication than do males,[43] and that female graduates of medical schools outside the U.S. received slightly higher communication and interpersonal skills ratings than did male graduates of medical schools outside the U.S. on the United States Medical Licensing Examination™ Step 2 Clinical Skills exam.[44] When the original ABIM survey was administered to patients for purposes of assessing physicians, female doctors received higher overall professionalism ratings from their patients.[19]

International graduates reported doing a better job of listening to their patients, whereas U.S. graduates reported using more understandable language. This may have more to do with confidence in English language proficiency than with basic communication skills: international graduates might listen more closely so as not to misunderstand their patients, and their lack of confidence in speaking English might lead them to perceive that they are not speaking in language that is easily understandable to their patients. This would need to be pursued in a further study, particularly one comparing resident self-assessment to actual patient perception.

Surgical residents rated themselves more positively in discussing options with patients than did residents in non-surgical specialties. The quality of such communication has a substantial impact on patient outcomes, including superior recovery rates in patients undergoing surgical procedures,[45] reduced need for analgesics to treat post-operative pain,[46] and improved emotional and functional adjustment in adult cancer patients.[47] We might speculate that surgical residents' perceptions of their ability to discuss options are higher than non-surgical residents' because the options in surgery are often more concrete (e.g., surgery versus no surgery), whereas patient management options in the non-surgical specialties are often less concrete (e.g., one medication regimen versus another) and perhaps more difficult to articulate to patients.

Current schemes for evaluating communication skills and professionalism in students and residents have flaws when used in isolation.[37] For instance, although standardized patients have been shown to be useful in evaluating the mechanics of clinical and communication skills,[48] their reliability in assessing elements of professionalism is less established. These settings may also be considered contrived. Faculty evaluations of professional behavior using instruments with Likert scales have been shown to have poor reliability, primarily because the attending physician usually does not interact extensively with the student in the work environment.[49–51] Peer evaluations, though they provide good information about interpersonal skills, are problematic because of the "halo effect" (i.e., if a person is popular, this "halo" may affect the evaluation of specific traits) and because peers are often reluctant to comment on poor behavior in colleagues.[1, 10, 16, 52, 53] Actual patient evaluations would seem to be the best source of evaluation of communication skills, but can be influenced greatly by the setting in which they are performed.

Recent reviews of self-assessment in the health professions raise questions about the ability of professionals to generate accurate judgments of their own performance.[24, 25] Of even more concern is that those who perform the least well on external assessment may also overrate their performance on self-assessment exercises.[24, 26–28] Most studies of self-assessment are in areas of technical knowledge and ability.[24] Even in concrete areas such as these, self-assessment has been found to be inaccurate.[29, 30, 37] This may be of even more concern when the area of assessment is laden with value judgments, as is the case in communication skills and professionalism.[54] In one study, medical residents perceived a high level of competence to discuss end-of-life issues, but failed to engage in recommended behaviors for such discussions.[55] When surveys use self-assessment, they are subject to social desirability bias.[1] This may limit the usefulness of these self-assessments to formative assessment and the formation of personal goals.[37]

A more representative picture of professional behavior may be obtained by surveying a variety of patients and members of the healthcare team, because these individuals interact with students and residents on a more regular basis throughout their academic careers.[1, 10, 31–33] This type of assessment is referred to as a "360-degree Evaluation." Resident self-assessment of professionalism skills may therefore be more useful when compared with the assessments made by others.[34, 35]

There may also be educational benefit simply from the process of self-reflection. Reflection, on both the process and the content of learning, can help students monitor their own learning.[56, 57] Reflection-in-learning is related to readiness for self-regulation of learning and may be conducive to enhanced diagnostic ability.[58] Studies have found that greater reflective effort is associated with a more positive or meaningful learning experience.[58, 59] The rationale for encouraging reflection in the promotion of self-directed learning is extensive.[60] Indeed, from a theoretical viewpoint, reflection is conceived as one of the metacognitive skills, or cognitive regulation strategies, required for the development of self-regulated learning.[61]

Conclusion

Through psychometric analysis, we have shown that the adaptation of the ABIM patient satisfaction survey is reliable for self-assessment of communication skills in residents. Although the face validity of the items has been reviewed elsewhere,[19] we hope to further establish the validity of this tool for resident self-assessment by administering the survey to patients and residents immediately following a clinical encounter, and comparing aggregate patient evaluations with the residents' own self-assessment of communication and professionalism skills. Although, as part of the ABIM Maintenance of Certification process, physicians are asked to answer the patient survey questions as a "self-assessment," we are aware of no published studies comparing the physician and patient data. We would like to explore comparing these data with results we have obtained or will obtain in our own studies of resident physicians. We would also like to use the resident self-assessment in conjunction with other resident evaluation methods currently in place at our institution.

References

  1. Stern DT: Measuring medical professionalism. 2006, Oxford; New York: Oxford University Press

  2. Stewart MA: Effective physician-patient communication and health outcomes: a review. CMAJ. 1995, 152: 1423-1433.

  3. Stewart M, Brown JB, Boon H, Galajda J, Meredith L, Sangster M: Evidence on patient-doctor communication. Cancer Prev Control. 1999, 3: 25-30.

  4. Moore PJ, Adler NE, Robertson PA: Medical malpractice: the effect of doctor-patient relations on medical patient perceptions and malpractice intentions. West J Med. 2000, 173: 244-250. 10.1136/ewjm.173.4.244.

  5. Hickson GB, Clayton EW, Entman SS, Miller CS, Githens PB, Whetten-Goldstein K, Sloan FA: Obstetricians' prior malpractice experience and patients' satisfaction with care. JAMA. 1994, 272: 1583-1587. 10.1001/jama.272.20.1583.

  6. Hickson GB, Clayton EW, Githens PB, Sloan FA: Factors that prompted families to file medical malpractice claims following perinatal injuries. JAMA. 1992, 267: 1359-1363. 10.1001/jama.267.10.1359.

  7. Levinson W, Roter D: Physicians' psychosocial beliefs correlate with their patient communication skills. J Gen Intern Med. 1995, 10: 375-379. 10.1007/BF02599834.

  8. Levinson W, Roter DL, Mullooly JP, Dull VT, Frankel RM: Physician-patient communication. The relationship with malpractice claims among primary care physicians and surgeons. JAMA. 1997, 277: 553-559. 10.1001/jama.277.7.553.

  9. Beckman HB, Markakis KM, Suchman AL, Frankel RM: The doctor-patient relationship and malpractice. Lessons from plaintiff depositions. Arch Intern Med. 1994, 154: 1365-1370. 10.1001/archinte.154.12.1365.

  10. Arnold L: Assessing professional behavior: yesterday, today, and tomorrow. Acad Med. 2002, 77: 502-515.

  11. Stern DT, Friedman Ben-David M, Norcini J, Wojtczak A, Schwarz MR: Setting school-level outcome standards. Med Educ. 2006, 40: 166-172. 10.1111/j.1365-2929.2005.02374.x.

  12. Stern DT, Frohna AZ, Gruppen LD: The prediction of professional behaviour. Med Educ. 2005, 39: 75-82. 10.1111/j.1365-2929.2004.02035.x.

  13. Learning objectives for medical student education – guidelines for medical schools: report I of the Medical School Objectives Project. Acad Med. 1999, 74: 13-18.

  14. Toolbox for the evaluation of competence. [http://www.acgme.org]

  15. Improve your Practice With PIMs. [http://www.abim.org/online/pim/demo.aspx]

  16. Lynch DC, Surdyk PM, Eiser AR: Assessing professionalism: a review of the literature. Med Teach. 2004, 26: 366-373. 10.1080/01421590410001696434.

  17. Terry R, Hiester E, James GD: The use of standardized patients to evaluate family medicine resident decision making. Fam Med. 2007, 39: 261-265.

  18. Franzese CB: Pilot study of an Objective Structured Clinical Examination ("the Six Pack") for evaluating clinical competencies. Otolaryngology Head Neck Surg. 2008, 138: 143-148. 10.1016/j.otohns.2007.10.017.

  19. Lipner RS, Blank LL, Leas BF, Fortna GS: The value of patient and peer ratings in recertification. Acad Med. 2002, 77: S64-66. 10.1097/00001888-200210001-00021.

  20. Fan VS, Reiber GE, Diehr P, Burman M, McDonell MB, Fihn SD: Functional status and patient satisfaction: a comparison of ischemic heart disease, obstructive lung disease, and diabetes mellitus. J Gen Intern Med. 2005, 20: 452-459. 10.1111/j.1525-1497.2005.40057.x.

  21. Harris LE, Swindle RW, Mungai SM, Weinberger M, Tierney WM: Measuring patient satisfaction for quality improvement. Med Care. 1999, 37: 1207-1213. 10.1097/00005650-199912000-00004.

  22. Rider EA, Hinrichs MM, Lown BA: A model for communication skills assessment across the undergraduate curriculum. Med Teach. 2006, 28: e127-134. 10.1080/01421590600726540.

  23. Yudkowsky R, Alseidi A, Cintron J: Beyond fulfilling the core competencies: an objective structured clinical examination to assess communication and interpersonal skills in a surgical residency. Curr Surg. 2004, 61: 499-503. 10.1016/j.cursur.2004.05.009.

  24. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L: Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006, 296: 1094-1102. 10.1001/jama.296.9.1094.

  25. Eva KW, Regehr G: Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005, 80: S46-54. 10.1097/00001888-200510001-00015.

  26. Hodges B, Regehr G, Martin D: Difficulties in recognizing one's own incompetence: novice physicians who are unskilled and unaware of it. Acad Med. 2001, 76: S87-89. 10.1097/00001888-200110001-00029.

  27. Lynn DJ, Holzer C, O'Neill P: Relationships between self-assessment skills, test performance, and demographic variables in psychiatry residents. Adv Health Sci Educ Theory Pract. 2006, 11: 51-60. 10.1007/s10459-005-5473-4.

  28. Kruger J, Dunning D: Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999, 77: 1121-1134. 10.1037/0022-3514.77.6.1121.

  29. Gordon MJ: Self-assessment programs and their implications for health professions training. Acad Med. 1992, 67: 672-679. 10.1097/00001888-199210000-00012.

  30. Regehr G, Hodges B, Tiberius R, Lofchy J: Measuring self-assessment skills: an innovative relative ranking model. Acad Med. 1996, 71: S52-54.

  31. Harris GD: Professionalism: part II – teaching and assessing the learner's professionalism. Fam Med. 2004, 36: 390-392.

  32. De Haes JC, Oort FJ, Hulsman RL: Summative assessment of medical students' communication skills and professional attitudes through observation in clinical practice. Med Teach. 2005, 27: 583-589. 10.1080/01421590500061378.

  33. de Haes JC, Oort F, Oosterveld P, ten Cate O: Assessment of medical students' communicative behaviour and attitudes: estimating the reliability of the use of the Amsterdam attitudes and communication scale through generalisability coefficients. Patient Educ Couns. 2001, 45: 35-42. 10.1016/S0738-3991(01)00141-0.

  34. Brinkman WB, Geraghty SR, Lanphear BP, Khoury JC, Gonzalez del Rey JA, Dewitt TG, Britto MT: Effect of multisource feedback on resident communication skills and professionalism: a randomized controlled trial. Arch Pediatr Adolesc Med. 2007, 161: 44-49. 10.1001/archpedi.161.1.44.

  35. Wood J, Collins J, Burnside ES, Albanese MA, Propeck PA, Kelcz F, Spilde JM, Schmaltz LM: Patient, faculty, and self-assessment of radiology resident performance: a 360-degree method of measuring professionalism and interpersonal/communication skills. Acad Radiol. 2004, 11: 931-939.

  36. Patient and Physician Peer Assessment Module. [http://www.abim.org/pdf/pim-related/parpsq.pdf]

  37. Ginsburg S, Regehr G, Hatala R, McNaughton N, Frohna A, Hodges B, Lingard L, Stern D: Context, conflict, and resolution: a new conceptual framework for evaluating professionalism. Acad Med. 2000, 75: S6-S11. 10.1097/00001888-200010001-00003.

  38. Minter RM, Gruppen LD, Napolitano KS, Gauger PG: Gender differences in the self-assessment of surgical residents. Am J Surg. 2005, 189: 647-650. 10.1016/j.amjsurg.2004.11.035.

  39. Smith DK, Shaw RW, Slack J, Marteau TM: Training obstetricians and midwives to present screening tests: evaluation of two brief interventions. Prenat Diagn. 1995, 15: 317-324. 10.1002/pd.1970150404.

  40. Marteau TM, Humphrey C, Matoon G, Kidd J, Lloyd M, Horder J: Factors influencing the communication skills of first-year clinical medical students. Med Educ. 1991, 25: 127-134. 10.1111/j.1365-2923.1991.tb00038.x.

  41. Holm U, Aspegren K: Pedagogical methods and affect tolerance in medical students. Med Educ. 1999, 33: 14-18. 10.1046/j.1365-2923.1999.00332.x.

  42. Holm U: The affect reading scale: a method of measuring the prerequisites for empathy. Scandinavian Journal of Educational Research. 1996, 40: 239-253. 10.1080/0031383960400304.

  43. Roter DL, Hall JA, Aoki Y: Physician gender effects in medical communication: a meta-analytic review. JAMA. 2002, 288: 756-764. 10.1001/jama.288.6.756.

  44. van Zanten M, Boulet JR, McKinley DW, DeChamplain A, Jobe AC: Assessing the communication and interpersonal skills of graduates of international medical schools as part of the United States Medical Licensing Exam (USMLE) Step 2 Clinical Skills (CS) Exam. Acad Med. 2007, 82: S65-68.

  45. Mumford E, Schlesinger HJ, Glass GV: The effect of psychological intervention on recovery from surgery and heart attacks: an analysis of the literature. Am J Public Health. 1982, 72: 141-151. 10.2105/AJPH.72.2.141.

  46. Melamed BG, Siegal L: Behavioural medicine. Practical applications in health care. 1980, New York: Springer

  47. Meyer TJ, Mark MM: Effects of psychosocial interventions with adult cancer patients: a meta-analysis of randomized experiments. Health Psychol. 1995, 14: 101-108. 10.1037/0278-6133.14.2.101.

  48. Stern D: Outside the classroom: teaching and evaluating future physicians. Spec Law Dig Health Care Law. 2006, 9-42.

  49. Swick HM, Szenas P, Danoff D, Whitcomb ME: Teaching professionalism in undergraduate medical education. JAMA. 1999, 282: 830-832. 10.1001/jama.282.9.830.

  50. Veloski JJ, Fields SK, Boex JR, Blank LL: Measuring professionalism: a review of studies with instruments reported in the literature between 1982 and 2002. Acad Med. 2005, 80: 366-370. 10.1097/00001888-200504000-00014.

  51. Ainsworth MA, Szauter KM: Medical student professionalism: are we measuring the right behaviors? A comparison of professional lapses by students and physicians. Acad Med. 2006, 81: S83-86. 10.1097/00001888-200610001-00021.

  52. Arnold EL, Blank LL, Race KE, Cipparrone N: Can professionalism be measured? The development of a scale for use in the medical environment. Acad Med. 1998, 73: 1119-1121. 10.1097/00001888-199810000-00025.

  53. Shue CK, Arnold L, Stern DT: Maximizing participation in peer assessment of professionalism: the students speak. Acad Med. 2005, 80: S1-5. 10.1097/00001888-200510001-00004.

  54. Hassett JM, Zinnerstrom K, Nawotniak RH, Schimpfhauser F, Dayton MT: Utilization of standardized patients to evaluate clinical and interpersonal skills of surgical residents. Surgery. 2006, 140: 633-638. 10.1016/j.surg.2006.07.014. discussion 638–639

  55. Buss MK, Alexander GC, Switzer GE, Arnold RM: Assessing competence of residents to discuss end-of-life issues. J Palliat Med. 2005, 8: 363-371. 10.1089/jpm.2005.8.363.

  56. Boud D, Keogh R, Walker D: Promoting reflection in learning: a model. Reflection: Turning Experience Into Learning. Edited by: Boud D, Keogh R, Walker D. 1985, London: Kogan Page

  57. Gibbs G: Improving the Quality of Student Learning. 1992, Bristol: Technical and Educational Services

  58. Sobral DT: An appraisal of medical students' reflection-in-learning. Med Educ. 2000, 34: 182-187. 10.1046/j.1365-2923.2000.00473.x.

  59. Mitchell R: The development of the cognitive behaviour survey to assess medical student learning. Teach Learn Med. 1994, 6: 161-167.

  60. Hammond M, Collins R: Self-Directed Learning in Practice. 1991, London: Kogan Page

  61. Boekaerts M: Self-regulated learning: a new concept embraced by researchers, policy makers, educators, teachers, and students. Learning Instruction. 1997, 7: 161-186. 10.1016/S0959-4752(96)00015-1.

Acknowledgements

The authors would like to acknowledge Andrew Danzo for reviewing and editing the paper.

Author information

Corresponding author

Correspondence to Andrew B Symons.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

ABS: substantial contributions to research design, the acquisition, analysis and interpretation of data; drafting the paper; approval of the submitted and final versions. AS: substantial contributions to the interpretation of data; revising the paper critically; approval of the submitted and final versions. DM: substantial contributions to the interpretation of data; revising the paper critically; approval of the submitted and final versions. SO: acquisition of data, revising the paper critically; approval of the submitted and final versions. EAA: study design, data analysis and interpretation, revising the paper critically; approval of the submitted and final versions.

Andrew B Symons, Andrew Swanson, Denise McGuigan, Susan Orrange and Elie A Akl contributed equally to this work.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Symons, A.B., Swanson, A., McGuigan, D. et al. A tool for self-assessment of communication skills and professionalism in residents. BMC Med Educ 9, 1 (2009). https://doi.org/10.1186/1472-6920-9-1
