Emergency Medicine: Open Access


Research Article - (2014) Volume 4, Issue 6

How do Emergency Medicine Attending Physicians Evaluate their Trainees? A Multicenter Focus Group Study

Abdulmohsen Alsaawi2,3*#, Mishal Almarshady3, Abdullah Alzabin3, Abdullah Alanazi1,3, Majid Alsalamah1,3 and Mohammed Alsultan2,3#
1College of Applied Medical Sciences, King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia
2College of Medicine, King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia
3Emergency Department, King Abdulaziz Medical City, Riyadh, Saudi Arabia
#Contributed equally to this work
*Corresponding Author: Abdulmohsen Alsaawi, P.O. Box 36228, Postal Code 11419, Riyadh, Saudi Arabia, Tel: 966-547777818

Abstract

Background: In-training evaluations play an invaluable role in assessing trainees' clinical competence. In this study, we explore which trainee characteristics have the strongest impact on their evaluation and whether these characteristics fit within the Royal College of Physicians and Surgeons of Canada's CanMEDS Physician Competency Framework, which describes, across seven roles, the abilities physicians need to achieve better patient outcomes.

Methods: Emergency medicine attending physicians who supervise residents at the four main emergency medicine residency training sites in Riyadh, Saudi Arabia participated in focus group sessions to identify the resident characteristics most frequently noted and their impact on the overall evaluation. The interview process followed a standard format. All interviews were audiotaped, and field notes were taken. Two independent coders coded the interviews using the CanMEDS competencies as a framework, and the frequency with which each characteristic was mentioned was recorded. Following the interviews, participants were also asked to complete a questionnaire about the CanMEDS competencies they routinely or rarely assess. Results are presented descriptively.

Results: Six focus group sessions were held with 19 participants and yielded a total of 145 characteristics. Characteristics relating to medical expertise had the strongest impact on the evaluation, followed by professionalism-related characteristics, while characteristics relating to health advocacy and managerial skills had the weakest impact.

Conclusion: Consistent with the previous literature, our results show that evaluators tend to base their evaluations on certain competencies and fail to evaluate across the entire CanMEDS spectrum.

Keywords: Emergency medicine; Residency training; Evaluation; Assessment; Feedback; CanMEDS

Background

In-training evaluations are an integral part of assessing trainees' clinical competence and progress. The quality of these evaluation reports, most of which are completed by clinical supervisors, has been repeatedly questioned in the literature [1-3]. Emergency medicine is one of the most challenging environments in which to evaluate trainees and assess their performance: shift work and scheduling conflicts mean that trainees are often evaluated on the basis of short encounters, emergency department overcrowding can significantly limit clinical exposure and supervision, and the components being evaluated in an emergency medicine resident are themselves complex [4,5]. These factors, to name only a few, make it extremely challenging to deliver high-quality clinical supervision and evaluation in such a chaotic context.

A number of studies have addressed what trainees expect from their trainers [6,7]. However, to our knowledge, the opposite scenario, what trainers expect from their trainees, has not been explored in the emergency medicine literature. The primary objective of this study was to explore which characteristics have the strongest positive or negative impact on residents' evaluation results among emergency medicine attending physicians and educators. In addition, because the Saudi Commission for Health Specialties has formally adopted CanMEDS as an evaluation framework, we aimed to examine whether the current evaluation process adequately assesses the different CanMEDS competencies and whether factors unrelated to CanMEDS affect the evaluation [8,9].

Methods

The theoretical framework of this qualitative research is based on grounded theory and ethnography. We followed a purposive sampling strategy in which eligible attending physicians with the highest potential for providing relevant and rich information were invited to a focus group interview session about their evaluation of residents. Participants were chosen from the four major emergency medicine residency training sites in Riyadh. Most participants were heavily involved in residents' mentorship and evaluation, and all were required to have at least 1 year of experience as a mentor to be eligible for inclusion. Participants were assured that all data would remain anonymous and would not be linked to individuals. Scheduling conflicts made it extremely difficult to find interview times that suited all eligible physicians; nonetheless, we believe the 19 participants we recruited constitute a rich subset. Most interviews were conducted in a conference room and included three to five participants. Residents were not allowed to attend the interviews, to encourage participants to speak freely.

Data collection

The study was conducted from August to October 2012. The interview process followed a standard format in terms of prompting questions (Index 1). Participants were asked to identify positive and negative characteristics of residents and to list them in order of the weight each carries in the overall evaluation. The same interviewer conducted and audiotaped all interviews, and another facilitator took field notes. Interviews were conducted in English unless participants preferred to express their thoughts in Arabic, which rarely happened. Interviews lasted 45 to 60 minutes, depending on the number of participants, which was enough to reach data saturation in most interviews. After each focus group session, participants were asked to complete a questionnaire listing the CanMEDS competencies and indicating whether they rarely or routinely assessed each one.

Data analysis

Two coders listened to the interviews and performed the coding independently. In addition, the facilitator who took field notes coded his notes. Discrepancies were resolved by consensus. The coding framework was based mainly on the CanMEDS competencies. The three sets of codes were then combined into a single document, which was reviewed by another coauthor to ensure correct categorization of the characteristics.

The number of times each general theme and specific qualifier occurred in the transcripts was recorded and used as an estimate of the popularity of that characteristic among participants. Coded themes and qualifiers were arranged in descending order of frequency and tabulated according to the CanMEDS framework. Any new characteristic that did not belong to the CanMEDS framework was evaluated by a consensus of three emergency medicine attending physicians with significant experience as program directors, two of whom also held a master's degree in medical education. The aim of the consensus was to assess the appropriateness of the characteristic as an evaluation parameter.
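For illustration only, the frequency tabulation described above can be reproduced with a short script such as the following minimal sketch; it is not the authors' actual tooling, and the coded items shown are hypothetical placeholders rather than study data.

```python
# Minimal sketch: tally coded (CanMEDS role, characteristic) pairs and list
# them per role in descending order of frequency. Items are hypothetical.
from collections import Counter

coded_items = [
    ("Medical expert", "recognizes acute illness"),
    ("Professionalism", "lack of punctuality"),
    ("Medical expert", "recognizes acute illness"),
    ("Collaborator", "works well as part of the team"),
]

# Count each pair; the count serves as a rough estimate of how often a
# characteristic was mentioned across the focus groups.
counts = Counter(coded_items)

# Group by CanMEDS role, then print characteristics in descending frequency.
by_role = {}
for (role, characteristic), n in counts.items():
    by_role.setdefault(role, []).append((characteristic, n))

for role, items in by_role.items():
    print(role)
    for characteristic, n in sorted(items, key=lambda x: -x[1]):
        print(f"  {characteristic}: {n}")
```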

Results

A total of six focus group sessions with 19 participants were held. The focus group sessions yielded a total of 145 characteristics. The average experience of the participants as instructors for emergency medicine residents was 4 years. Fifteen participants had fewer than 5 years of experience, while two had more than 10 years. Participants came from all four main institutes involved in residency training in Riyadh.

Figure 1 shows the distribution, across the CanMEDS framework, of the competencies that participants mentioned as having the highest positive or negative impact on the evaluation. The focus group results showed that evaluators base their evaluations mainly on medical expertise, followed by professionalism; professionalism was typically evaluated in a negative context (i.e., lack of professionalism). Collaboration ranked third among the seven competencies, followed by communication and scholar competencies. Health advocacy and managerial competencies had the least impact on the evaluation in our sample (Figure 1).


Figure 1: The competencies that participants mentioned as having the highest positive or negative impact on the evaluation

A total of 19 questionnaires were completed. Medical expert competencies had the highest percentage of “routinely assessed” items, while health advocacy had the highest percentage of “rarely assessed” items (Table 1).

CanMEDS competency | Routinely assessed (%) | Rarely assessed (%)
Medical expert
Function effectively as consultants, integrating all CanMEDS roles to provide optimal, ethical, and patient-centered medical care | 74 | 26
Establish and maintain clinical knowledge, skills, and attitudes appropriate to their practice | 95 | 5
Perform a complete and appropriate assessment of a patient | 100 | 0
Use preventive and therapeutic interventions effectively | 68 | 32
Demonstrate proficient and appropriate use of procedural skills, both diagnostic and therapeutic | 100 | 0
Seek appropriate consultation from other health professionals, recognizing the limits of their expertise | 100 | 0
Communicator
Develop rapport, trust, and ethical therapeutic relationships with patients and families | 63 | 37
Accurately elicit and synthesize relevant information and perspectives of patients and families, colleagues, and other professionals | 79 | 21
Accurately convey relevant information and explanations to patients and families, colleagues, and other professionals | 79 | 21
Develop a common understanding on issues, problems, and plans with patients, families, and other professionals to develop a shared plan of care | 47 | 53
Convey effective oral and written information about a medical encounter | 89 | 11
Collaborator
Participate effectively and appropriately in an interprofessional healthcare team | 68 | 32
Effectively work with other health professionals to prevent, negotiate, and resolve interprofessional conflict | 74 | 26
Manager
Participate in activities that contribute to the effectiveness of their healthcare organizations and systems | 32 | 68
Manage their practice and career effectively | 26 | 74
Allocate finite healthcare resources appropriately | 58 | 42
Serve in administration and leadership roles, as appropriate | 74 | 26
Health advocate
Respond to individual patient health needs and issues as part of patient care | 95 | 5
Respond to the health needs of the communities that they serve | 21 | 79
Identify the determinants of health for the populations that they serve | 11 | 89
Promote the health of individual patients, communities, and populations | 32 | 68
Scholar
Maintain and enhance professional activities through ongoing learning | 79 | 21
Critically evaluate medical information and its sources and apply this appropriately to practice decisions | 79 | 21
Facilitate the learning of patients, families, students, residents, other health professionals, the public, and others, as appropriate | 63 | 37
Contribute to the development, dissemination, and translation of new knowledge and practices | 42 | 58
Professionalism
Demonstrate a commitment to their patients, profession, and society through ethical practice | 100 | 0
Demonstrate a commitment to their patients, profession, and society through participation in profession-led regulation | 37 | 63
Demonstrate a commitment to physician health and sustainable practice | 42 | 58

Table 1: Questionnaire items (CanMEDS competencies) and the percentage of participants who reported routinely or rarely assessing each
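As a rough illustration only (not the authors' analysis code), the percentages in Table 1 follow directly from the 19 completed questionnaires; the sketch below uses a hypothetical response count to show the arithmetic.

```python
# Minimal sketch: derive "routinely assessed" / "rarely assessed" percentages
# for one questionnaire item from 19 responses. The count of 14 is hypothetical.
TOTAL_RESPONDENTS = 19

def item_percentages(routine_count: int, total: int = TOTAL_RESPONDENTS) -> tuple[int, int]:
    """Return (routinely %, rarely %) rounded to whole percentages."""
    routinely = round(100 * routine_count / total)
    return routinely, 100 - routinely

# Example: 14 of 19 participants marking an item "routinely assessed"
# yields roughly 74% / 26%, the granularity seen in Table 1.
print(item_percentages(14))  # (74, 26)
```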

A total of five characteristics were identified as not belonging to the CanMEDS framework and were reviewed by our medical education experts (MA, AA, MS). The consensus opinion was that four of these were inappropriate and should not be used as evaluation characteristics (Table 2). Table 3 shows sample quotes of commonly mentioned positive and negative resident features.

Feature | Decision on appropriateness
Impact of my evaluation on the trainee's self-esteem or future | Inappropriate (special counseling should be provided to the resident, if needed)
Fun to work with | Inappropriate
Lack of personal hygiene (clothing, body odor) | Appropriate (professionalism-related)
Flirting with the opposite sex while on duty | Inappropriate
I tend to give popular or showy residents worse evaluations | Inappropriate

Table 2: Features not fitting into the CanMEDS framework and the experts' consensus opinion regarding their appropriateness as an evaluation parameter

Positive | Negative
Ability to recognize acute illness | Unreliable, dishonest
Good communication with patients, families, and colleagues | Overconfident
Team leader | Lack of interest in learning

Table 3: Example quotes of commonly mentioned positive and negative resident features

Discussion

Competency-based assessment is increasingly common in medical education. Different frameworks have been adopted in different countries, including the Accreditation Council for Graduate Medical Education framework in the United States, Tomorrow's Doctors in the United Kingdom, and CanMEDS in Canada. The latter has recently been adopted in several other countries, including Saudi Arabia. The Saudi Commission for Health Specialties, which regulates residency training across the country, recently began requiring training programs to base their training and assessments on the CanMEDS framework.

Emergency medicine attending physicians are expected to assess and evaluate their trainees, even though most have not had adequate training in assessment and evaluation [10]. The Saudi Commission for Health Specialties is in the process of training the trainers on the CanMEDS framework in collaboration with the Royal College of Physicians and Surgeons of Canada. However, training most program directors and others involved in training will require significant time and effort. In the meantime, faculty development sessions have been shown to be an effective tool for improving the quality of evaluation reports [1], and feedback has been shown to positively affect the quality of evaluations [2]. The results of the focus groups and the written survey were consistent, especially in the order of importance of the seven competencies.

Our results show that medical expert competencies are the main focus of evaluators. This is consistent with the previous literature; however, the importance given to the other competencies differed markedly between our study and earlier work [11]. Interestingly, our results indicate that evaluators expect their residents to be professional, so a lack of professionalism has a strong negative impact, whereas behaving professionally, being expected, does not carry as strong a positive impact. It is worth noting that most of the "rarely assessed" items in the questionnaire concerned performance at the organizational, community, or population level rather than at the level of individual patient care.

Limitations

One of the main limitations of this study is the small sample size. Scheduling conflicts and the relatively small number of emergency medicine physicians involved in training and evaluation in the city made it hard to increase the sample size. Another limitation is the limited experience of the participants (average, 4 years); emergency medicine is a young specialty in Saudi Arabia, and junior attending physicians outnumber senior attending physicians. Lastly, the use of survey methodology in the design is an additional weakness.

Conclusions

Appropriate evaluation of emergency medicine residents remains a challenge. Faculty development workshops should be offered to evaluators to improve the quality of their evaluation reports. Emergency medicine educators should create a systematic method of assessing and evaluating competencies related to the resident’s performance at the population and healthcare system level rather than only at the individual patient care level.

Authors' contributions

Abdulmohsen Alsaawi: Facilitator, field notes, 3rd coder, manuscript writing, submission, corresponding author.

Mohammed Alsultan: Interviewer, manuscript revision, consensus expert member.

Majid Alsalamah: Methodology expert who ensured coding and categorization accuracy, manuscript revision, consensus expert member.

Abdullah Alanazi: Manuscript revision, consensus expert member.

Mishal Almarshady: 1st coder, organization of focus group sessions.

Abdullah Alzabin: 2nd coder, organization of focus group sessions.

References

  1. Dudek NL, Marks MB, Wood TJ, Dojeiji S, Bandiera G, et al. (2012) Quality evaluation reports: Can a faculty development program make a difference? Med Teach 34: e725-731.
  2. Dudek NL, Marks MB, Bandiera G, White J, Wood TJ (2013) Quality in-training evaluation reports--does feedback drive faculty performance? Acad Med 88: 1129-1134.
  3. Dudek NL, Marks MB, Wood TJ, Lee AC (2008) Assessing the quality of supervisors' completed clinical evaluation reports. Med Educ 42: 816-822.
  4. Mahler SA, McCartney JR, Swoboda TK, Yorek L, Arnold TC (2012) The impact of emergency department overcrowding on resident education. J Emerg Med 42: 69-73.
  5. Rogers RL (ed) (2009) Practical Teaching in Emergency Medicine. Oxford, UK: Wiley-Blackwell.
  6. Thurgur L, Bandiera G, Lee S, Tiberius R (2005) What do emergency medicine learners want from their teachers? A multicenter focus group analysis. Acad Emerg Med 12: 856-861.
  7. Nation JG, Carmichael E, Fidler H, Violato C (2011) The development of an instrument to assess clinical teaching with linkage to CanMEDS roles: A psychometric analysis. Med Teach 33: e290-296.
  8. Frank JR (2005) The CanMEDS 2005 Physician Competency Framework. Ottawa, Canada: The Royal College of Physicians and Surgeons of Canada.
  9. Frank JR, Danoff D (2007) The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach 29: 642-647.
  10. Srinivasan M, Li ST, Meyers FJ, Pratt DD, Collins JB, et al. (2011) "Teaching as a Competency": competencies for medical educators. Acad Med 86: 1211-1220.
  11. Stutsky BJ, Singer M, Renaud R (2012) Determining the weighting and relative importance of CanMEDS roles and competencies. BMC Res Notes 5: 354.

Citation: Alsaawi A, Almarshady M, Alzabin A, Alanazi A, Alsalamah M, Alsultan M (2014) How do Emergency Medicine Attending Physicians Evaluate their Trainees? A Multicenter Focus Group Study. Emergency Med 4: 215.

Copyright: © 2014 Alsaawi A, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.