OBJECTIVE

There is evidence that pediatric attending physicians value receiving feedback from trainees. With this study, we sought to determine the extent to which pediatric hospitalists value, solicit, and receive feedback from residents and medical students on specific areas of the attending’s performance and identify perceived barriers to trainees’ providing feedback.

METHODS

A web-based survey was sent to pediatric hospitalists at 9 institutions in 2022. Survey questions were developed from existing literature, trainee input, faculty expertise, and a framework on the qualities of exemplary pediatric educators. Respondents answered yes-no and multi-item Likert scale questions and selected answers from predetermined lists related to feedback solicitation from trainees. χ-square and Wilcoxon rank-sum tests were used to analyze responses.

RESULTS

Responses were gathered from 91 of 189 surveyed individuals (response rate: 48.1%). Respondents almost unanimously “agreed” or “strongly agreed” that feedback from medical students (88, 96.7%) and residents (89, 97.8%) can be valuable, but feedback was considered more valuable from residents (P <.05). Attending physicians asked for and received feedback more often from residents than from medical students (P <.05). Attending physicians most commonly asked for feedback on “teaching skills.” The largest perceived barriers to receiving feedback from trainees were trainees’ lack of comfort with giving feedback, trainees’ lack of awareness that providing feedback is within their role, and fear of retaliation.

CONCLUSIONS

Although pediatric hospitalist respondents nearly unanimously valued feedback from trainees, attending physicians were inconsistent in their feedback solicitation practices. Attending physicians were more likely to ask for and receive feedback from residents than from medical students.

Pediatric hospitalist attending physicians play a critical role in trainee education and development. Therefore, the development of attending teaching skills is an essential component of improving medical education.1,2  Previous literature has identified specific areas of development for master clinical educators and describes 4 domains of exemplary clinical educators: teaching skills, patient care skills, personal qualities, and role modeling.3  The recently piloted Clinician Educator Milestone Project provides a framework for the assessment of faculty educational skills, including competencies such as leadership, feedback, and teaching.4  All of these areas are observed by clinical trainees, making trainees uniquely positioned to provide external formative feedback to clinical medical educators.

This formative assessment, or upward feedback, in which trainees or employees provide feedback to supervisors, has been shown to be effective in nonmedical professional settings.5,6  Much like feedback to trainees, upward feedback is specific, bidirectional advice that can guide the future performance of attending physicians in a specific activity. In contrast, evaluation is unidirectional and summative and does not allow for conversation to further address an attending’s learning priorities.7  In the medical field, Robins et al conducted interviews with attending physicians and identified multiple potential benefits of upward feedback, such as improved attending teaching ability and the cultivation of feedback delivery skills in trainees.8  By requesting and obtaining feedback from residents and medical students, attending physicians can gain external insight into their educational, clinical, and professional abilities. However, multiple barriers to upward feedback from trainees to attending physicians exist, including fear of retaliation, lack of time, and lack of preceptor receptivity to feedback.6,8  These factors may deter a trainee from providing feedback to a supervisor, especially in unstructured settings in which the trainee has not been tasked with providing feedback.6

Despite evidence that clinical attending physicians find value in upward feedback for their continued professional development,6  no multisite studies have examined current practices and opinions on this process. Previous literature has revealed that nearly all medical students and residents work with pediatric hospitalists,2  making this cohort an ideal sample with which to begin studying upward feedback in medical training. This study’s goals were to determine the extent to which pediatric hospitalists (1) value feedback from residents and medical students on areas of the attending physicians’ performance, (2) facilitate feedback by asking for it, (3) receive feedback, and (4) perceive barriers to trainees’ providing feedback. We hypothesized that the majority of pediatric hospitalists would report valuing feedback, would value feedback on different domains differently, and would not report soliciting or receiving feedback from trainees.

In the winter of 2022, the authors designed a survey to evaluate attending physicians’ perceived value of feedback from medical students and residents, quantify the frequency with which attending physicians solicit feedback from trainees, and describe potential barriers to obtaining that feedback. Survey questions were developed utilizing previous literature,2,6,8  trainee input, and faculty expertise with a framework based on the qualities of exemplary pediatric educators as described by Fromme et al3  and the Clinician Educator Milestone Project.4  The logic and rigor of questions were developed to meet the standards put forth by Artino et al.9  The validity and logic of questions were confirmed through cognitive think-aloud interviews with 5 hospitalists, with changes incorporated after discussions with the research team.10 

The survey assessed each respondent’s opinion of the importance of trainee feedback, the frequency of asking for and receiving feedback from trainees in each domain of exemplary pediatric educators, satisfaction with existing feedback received from trainees, and potential barriers to upward feedback from trainees. The 4 domains were defined in our survey by using definitions from Fromme et al.3  The survey (Supplemental Fig 3) defines patient skills as “medical knowledge, knowledge acquisition, communication with patient, clinical care plans, system knowledge, etc,” personal qualities as “self-reflection, empathy, enthusiasm, motivation, respect for others, professional behavior,” teaching skills as “patient-centered teaching, setting expectations, stimulation of learner problem-solving, setting learning climate, etc,” and leadership skills as “encouraging autonomy, time management, conflict resolution, etc.” The survey had yes-no questions, multi-item Likert scale questions (1 = “Strongly Disagree,” 5 = “Strongly Agree” or 1 = “Always,” 5 = “Never”), and a predetermined list of answer choices to a question on barriers to upward feedback that included a free-text box if the respondent chose “other.”

Participating institutions were a convenience sample of 9 pediatric hospitalist programs in the Northeastern, Central, Southern, and Western regions of the United States. They were selected for diversity in program location, number of non-ICU inpatient pediatric hospital beds, size of medical school, and size of residency (Supplemental Table 4). Physicians who attend as pediatric hospitalists were sent invitations to an anonymous, web-based survey (REDCap) via an email from site leads. Sites included Advocate Children’s Hospital at Oak Lawn and at Park Ridge (Chicago, IL), Children’s Hospital of Richmond at Virginia Commonwealth University (Richmond, VA), Comer Children’s Hospital at University of Chicago (Chicago, IL), LA County-Harbor-UCLA Pediatrics (Los Angeles, CA), Lucile Packard Children’s Hospital at Stanford (Palo Alto, CA), NorthShore University Hospital (Evanston, IL), University of Florida Health Shands Children’s Hospital (Gainesville, FL), University of Rochester-Golisano Children’s Hospital (Rochester, NY), and University of South Alabama Health Children’s & Women’s Hospital (Mobile, AL). All respondents worked with residents and medical students. Respondents reported how many weeks per year they work with students. The answer choices were in intervals of 5 weeks because many hospitalists are on service for 1 or 2 weeks at a time; thus, 5 weeks captures 3 to 5 instances of being on service with students. Participants were sent 2 subsequent reminder emails over 4 weeks in March 2022. Participants received no compensation for completion. Site leads were offered anonymous composite data from their site after study completion.

Deidentified data were collected via REDCap and transferred to Microsoft Excel. The demographic characteristics of respondents were summarized by using frequencies and percentages. χ-square tests were used to analyze yes-no questions, and the Wilcoxon rank-sum test was used to compare Likert-scale questions. A P value <.05 was considered significant for an association between 2 questions. All analyses were conducted by using R (version 4.1.3) and RStudio. The statistical methods of the study were determined in consultation with The University of Chicago’s Biostatistics Laboratory.
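
To illustrate the analytic approach described above, the following minimal R sketch compares hypothetical yes-no responses with a χ-square test and hypothetical Likert-scale responses with a Wilcoxon rank-sum test. The data and variable names are invented for illustration and are not the study dataset or the authors' code.

# Minimal sketch with hypothetical data; not the study dataset or analysis code.
set.seed(1)
n <- 91  # number of respondents in this illustration

# Hypothetical yes-no responses ("Can trainees provide valuable feedback?"),
# recorded once about medical students and once about residents.
student_yes  <- rbinom(n, 1, 0.80)
resident_yes <- rbinom(n, 1, 0.95)

# Chi-square test on the 2 x 2 table of yes/no counts by trainee type.
tab <- table(
  trainee  = rep(c("medical student", "resident"), each = n),
  response = c(student_yes, resident_yes)
)
chisq.test(tab)

# Hypothetical 5-point Likert ratings (1 = Strongly Disagree, 5 = Strongly Agree).
student_likert  <- sample(1:5, n, replace = TRUE, prob = c(0.05, 0.10, 0.20, 0.35, 0.30))
resident_likert <- sample(1:5, n, replace = TRUE, prob = c(0.02, 0.05, 0.15, 0.38, 0.40))

# Wilcoxon rank-sum test comparing the two sets of Likert ratings.
wilcox.test(student_likert, resident_likert)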

This study was determined to be exempt by The University of Chicago’s Institutional Review Board.

Responses were gathered from 91 of the 189 surveyed individuals (response rate: 48.1%). Less than one-half of respondents (42, 46.2%) reported receiving formal training on feedback solicitation. Nearly two-thirds of respondents (60, 65.9%) reported working as hospitalist attending physicians for 5 or more years. The majority of respondents spent >5 weeks per year working with medical students on their clerkship inpatient pediatric rotation (77, 84.4%) (Table 1).

TABLE 1

Characteristics of Attending Respondents (n = 91)

n (percentage)
Job title  
 Pediatric Hospitalist 85 (93.4%) 
 Pediatric Hospitalist Fellow 1 (1.1%) 
 General Pediatrician 1 (1.1%) 
 Subspecialty Pediatrician 1 (1.1%) 
 Other 3 (3.3%) 
Years attending as hospitalist  
 0 to 4 y 31 (34.1%) 
 5 to 10 y 29 (31.9%) 
 11 to 15 y 16 (17.6%) 
 16 to 20 y 9 (9.9%) 
 21 or more y 5 (5.5%) 
 N/A: in fellowship 1 (1.1%) 
Weeks of service per y with medical students 
 1 to 5 wks 14 (15.6%) 
 6 to 10 wks 27 (30.0%) 
 11 to 15 wks 26 (28.9%) 
 16 to 20 wks 13 (14.4%) 
 21 or more wks 10 (11.1%) 
Region of practice  
 Central 45 (49.5%) 
 Western 17 (18.7%) 
 Southern 22 (24.1%) 
 Northeastern 7 (7.7%) 
Formal training in feedback solicitation  
 Yes 42 (46.2%) 
 No 49 (53.8%) 

Nearly all respondents strongly agreed or agreed (88, 96.7%) that feedback from medical students was valuable; similarly, nearly all respondents (89, 97.8%) strongly agreed or agreed that resident feedback was valuable. Nevertheless, feedback from residents was considered significantly more valuable (P <.001).

Attending physicians believed trainees could provide useful feedback in all 4 domains associated with a master clinical educator. For feedback from medical students, feedback on teaching skills was considered potentially useful by the largest proportion of attending physicians (90, 98.9%), whereas feedback on patient care skills was considered useful by the smallest proportion (71, 78.0%) (Table 2). For feedback from residents, feedback on personal qualities was considered potentially useful by all attending physicians (91, 100.0%), whereas feedback on patient care skills was considered useful by the smallest proportion (86, 94.5%) (Table 2). In the domains of patient care and leadership skills, there was a significant difference between the perceived utility of feedback from medical students and the perceived utility of feedback from residents (P <.001) (Table 2).

TABLE 2

Comparison of Attending Physicians’ Perception of Feedback From Trainees (n = 91)

Can trainees provide valuable feedback in (Yes / No):
                        Medical Student, n (%)    Resident, n (%)          χ2 (P)
 Patient care           71 (78.0) / 20 (22.0)     86 (94.5) / 5 (5.5)      <.001
 Personal qualities     89 (97.8) / 2 (2.2)       91 (100.0) / 0 (0.0)     N/A a
 Teaching skills        90 (98.9) / 1 (1.1)       90 (98.9) / 1 (1.1)      1.000
 Leadership             83 (91.2) / 8 (8.8)       88 (96.7) / 3 (3.3)      <.001
Satisfaction with feedback (Strongly agree or agree / Neither agree nor disagree, disagree, or strongly disagree):
                        Medical Student, n (%)    Resident, n (%)          Wilcoxon rank (P)
 Quantity of feedback   23 (25.3) / 68 (74.7)     38 (41.8) / 53 (58.2)    .010
 Quality of feedback    25 (27.5) / 66 (72.5)     37 (40.7) / 54 (59.3)    .041
a Cannot be calculated because 100% of respondents indicated that residents can provide valuable feedback on personal qualities.

Attending physicians were generally dissatisfied with the feedback they received from medical students and residents. Approximately one-quarter of attending physicians were satisfied with the quantity (23, 25.3%) and quality (25, 27.5%) of feedback from medical students (Table 2). Approximately 40% of attending physicians were satisfied with the quantity (38, 41.8%) or quality (37, 40.7%) of feedback from residents (Table 2). Satisfaction with the quantity and quality of feedback from residents was significantly greater than satisfaction with feedback from medical students (P <.05) (Table 2). Attending physicians believed the 3 largest barriers to upward feedback for both medical students and residents were (1) trainees’ lack of comfort with giving feedback (medical students: 81, 89.0%; residents: 69, 75.8%), (2) lack of awareness that providing feedback is part of the trainees’ role (medical students: 72, 79.1%; residents: 64, 70.3%), and (3) fear of retaliation (medical students: 63, 69.2%; residents: 55, 60.4%).

With medical students, attending physicians were most likely to frequently or always ask for feedback on their teaching skills (56, 62.2%) and least likely to frequently or always ask about leadership skills (20, 24.1%; Table 3; Fig 1). With residents, they were most likely to frequently or always ask for feedback on their teaching skills (67, 74.4%) and least likely to frequently or always ask about patient care skills (37, 43.0%; Table 3; Fig 2). When comparing feedback solicitation practices across trainee types, attending physicians were significantly more likely to report asking for feedback frequently or always from residents than from medical students (P <.001; Table 3).

TABLE 3

Comparison of Attending Physicians’ Feedback Solicitation Practices (n = 91)

Ask for feedback (Always or frequently / Sometimes, rarely, or never; missing responses noted where reported):
                        Medical Student, n (%)         Resident, n (%)          Wilcoxon rank sum (P)
 Patient care           19 (26.8) / 52 (73.2); 20      37 (43.0) / 49 (57.0)    <.001
 Personal qualities     35 (39.3) / 54 (60.7)          50 (55.0) / 41 (45.0)    <.001
 Teaching skills        56 (62.2) / 34 (37.8)          67 (74.4) / 23 (25.6)    <.001
 Leadership             20 (24.1) / 63 (75.9)          49 (55.7) / 39 (44.3)    <.001
Receive feedback (Always or frequently / Sometimes, rarely, or never; missing responses noted where reported):
 Patient care           9 (14.5) / 53 (85.5); 29       20 (24.1) / 63 (75.9)    <.001
 Personal qualities     17 (20.7) / 65 (79.3)          26 (29.6) / 62 (70.4)    <.001
 Teaching skills        34 (39.5) / 52 (60.5)          42 (47.2) / 47 (52.8)    <.001
 Leadership             12 (16.2) / 62 (83.8); 17      34 (41.5) / 48 (58.5)    <.001
FIGURE 1

For medical students: attending physicians’ reported frequency (n) of asking for feedback and frequency of receiving feedback stratified by domain.

FIGURE 2

For residents: attending physicians’ reported frequency (n) of asking for feedback and frequency of receiving feedback stratified by domain.


With medical students, attending physicians were most likely to frequently or always receive feedback on teaching skills (34, 39.5%) and least likely to frequently or always receive feedback on patient care skills (9, 14.5%; Table 3; Fig 1). With residents, they were most likely to frequently or always receive feedback on teaching skills (42, 47.2%) and least likely to frequently or always receive feedback on patient care skills (20, 24.1%; Table 3; Fig 2). Similar to the findings on asking for feedback, attending physicians were significantly more likely to report receiving feedback frequently or always from residents than from medical students (P <.001; Table 3).

This is the first study, to our knowledge, to explore pediatric hospitalists’ perceptions of upward feedback on clinical educator competencies and to describe the frequency with which attending physicians solicit such feedback from trainees. We identified that attending physicians would like feedback from trainees in all domains demonstrated by exemplary pediatric educators.3  We found variability in attending physicians’ feedback solicitation practices and dissatisfaction with the quantity and quality of feedback from trainees. Given the known barriers to providing upward feedback,6,8,11  it is unsurprising that there is an unmet need for feedback from trainees.

We found that attending physicians value feedback from medical students and residents, which is consistent with previous literature and with the expectation that residents provide feedback to attending physicians (Supplemental Fig 4).8,12–14  This study adds a layer of detail to our understanding of attending physicians’ perceptions of this feedback. Attending physicians value feedback more in certain domains, primarily teaching skills and personal qualities, than in others. The idea that attending physicians want feedback on particular components of their performance is consistent with publications revealing that attending physicians’ opinion of the usefulness of feedback is influenced by the type of feedback received.6,8,11  Our findings suggest that, although all areas of feedback are valued by attending physicians and are worth addressing, additional learning opportunities for faculty and trainees that focus on increasing upward feedback on teaching skills and personal qualities might result in higher-yield changes for attending physicians.

The surveyed pediatric hospitalists were dissatisfied with the quantity and quality of feedback from trainees, and they were significantly more dissatisfied with the quality and quantity of feedback from medical students than from residents. This dissatisfaction may be, in part, explained by the variance in attending feedback solicitation practices. One conceptual framework of the components of effective upward feedback suggests that safety and trust in the deliverer-recipient relationship are critical to the attending physicians’ judgment of the validity and appropriateness of feedback.11,15,16  Residents are more experienced members of the team and work more closely and often longitudinally with attending physicians, and our findings corroborate the idea that this connection may result in attending physicians’ more positive opinions of resident feedback. Moreover, the Association of American Medical Colleges explicitly encourages residents to provide feedback to attending physicians,14  which may help normalize residents giving feedback. For all trainees, there are multiple documented barriers to upward feedback in medical education that may be contributing to this dissatisfaction.6,8  We identified that most attending physicians think a lack of trainee comfort, a lack of awareness that giving feedback is a trainee responsibility, and a fear of retaliation are barriers to trainees providing upward feedback. These barriers may contribute to the dissatisfaction that attending physicians experience with the feedback they obtain. The path to generating satisfying feedback is likely influenced by attending and institutional differences that are worth identifying at the local level. An ethnographic study on providing feedback at a children’s hospital provides evidence that efforts to change the institutional culture around feedback can result in improved experiences with communicating feedback.17  Attending physicians may be able to create safer environments for medical students and residents to voice concerns if they preface discussions with assurances that there will be no repercussions on their evaluations and if they set a tone of learning and opportunity for change before feedback conversations. Focusing the conversation on learning and change for future learners may strengthen teacher-student alliances and create more comfortable environments for upward feedback.15,17

Asking for feedback is one strategy to generate specific and actionable feedback.8,18,19  The attending physicians in our sample reported requesting feedback from residents more often than from medical students, yet they saw potential value in medical student feedback and were dissatisfied with the feedback they received from medical students. Currently, there is no guidance from professional societies (e.g., the Association of American Medical Colleges or the American Academy of Pediatrics) regarding feedback solicitation by attending physicians from medical students. Formalizing recommendations regarding feedback solicitation from medical students may generate more frequent or satisfying feedback. Although we found 1 curriculum on how trainees can solicit feedback from attending physicians,20  we could not find publicly available curricula to help attending physicians solicit feedback from trainees. Yet, nearly one-half of the respondents in this study reported formal training on feedback solicitation. In developing upward feedback, providing both trainees and attending physicians with skills and advice on how to engage in bidirectional feedback could mitigate dissatisfaction with feedback.

Our study has several limitations. First, this survey is subject to response bias. Second, the generalizability of this study is limited by our study population and sample size. Third, the barriers to upward feedback that attending physicians could select were presented as a predetermined list based on previous literature, with an opportunity to add an unlisted barrier as free text.6,8,11  Although few attending physicians selected “other,” we could be missing additional barriers to upward feedback. Fourth, de novo survey instruments have limitations in their ability to obtain empirical evidence, although the Fromme et al3  framework and survey piloting provided validity evidence for the survey definitions. Finally, this was a self-report survey based on attending physicians’ perceptions; therefore, the results do not reflect the perceptions of learners.

Although upward feedback from trainees to attending physicians was nearly unanimously viewed as valuable by the pediatric hospitalists in our sample, attending physicians were inconsistent in their solicitation practices and were dissatisfied with the feedback they received from trainees. Attending physicians were more likely to ask for and receive feedback from residents than from medical students. Exploring barriers to upward feedback and increasing training of trainees in feedback delivery could improve attending physicians’ ability to generate helpful feedback. Future steps in researching upward feedback include investigating trainee perspectives on upward feedback, including barriers to and strategies for providing it. Future work should also focus on developing curricula on upward feedback for both attending physicians and trainees and determining whether such curricula impact the quality and quantity of feedback given.

FUNDING: The study was internally funded by the Department of Pediatrics, Section of Hospital Medicine at The University of Chicago, Comer Children’s Hospital.

The authors are responsible for the entirety of the manuscript’s development and preparation, without any influence from the funding source.

CONFLICT OF INTEREST DISCLOSURES: The authors have indicated they have no potential conflicts of interest relevant to this article to disclose.

Mr McKenzie conceptualized and designed the study, developed the survey instrument, recruited survey respondents, monitored data collection and assisted with analysis, and drafted the initial manuscript; Ms Vaughen assisted in the conceptualization and design of the study and assisted in data interpretation; Dr Black contributed to the design of the study and conducted the analysis and interpretation of data; Dr Shao assisted in the conceptualization and design of the study, assisted in development of the survey instrument, and assisted in data interpretation; Dr Fromme contributed to the conceptualization and design of the study and the development of the data collection tool, recruited survey respondents, and participated in data collection, analysis, and interpretation; and all authors reviewed and revised the manuscript, approved the final manuscript as submitted, and agree to be accountable for all aspects of the work.

1. Freed GL, Dunham KM, Lamarand KE; Research Advisory Committee of the American Board of Pediatrics. Hospitalists’ involvement in pediatrics training: perspectives from pediatric residency program and clerkship directors. Acad Med. 2009;84(11):1617–1621.
2. Heydarian C, Maniscalco J. Pediatric hospitalists in medical education: current roles and future directions. Curr Probl Pediatr Adolesc Health Care. 2012;42(5):120–126.
3. Fromme HB, Bhansali P, Singhal G, et al. The qualities and skills of exemplary pediatric hospitalist educators: a qualitative study. Acad Med. 2010;85(12):1905–1913.
4. The Accreditation Council for Continuing Medical Education, The Association of American Medical Colleges, The American Association of Colleges of Osteopathic Medicine. The clinical educator milestone project, version 1 draft.
5. Hall JL, Leiaecker JK, DiMarco C. What we know about upward appraisals of management: facilitating the future use of UPAs. Hum Resour Dev Q. 1996;7(3):209–226.
6. Fluit CV, Bolhuis S, Klaassen T, et al. Residents provide feedback to their clinical teachers: reflection through dialogue. Med Teach. 2013;35(9):e1485–e1492.
7. Geringer JL, Surry LT, Battista A. Divergence and dissonance in residents’ reported and actual feedback to faculty in a military context [published online ahead of print December 20, 2022]. Mil Med.
8. Robins L, Smith S, Kost A, et al. Faculty perceptions of formative feedback from medical students. Teach Learn Med. 2020;32(2):168–175.
9. Artino AR Jr, Phillips AW, Utrankar A, et al. “The questions shape the answers”: assessing the quality of published survey instruments in health professions education research. Acad Med. 2018;93(3):456–463.
10. Ericsson KA, Simon HA. Protocol Analysis: Verbal Reports as Data. Cambridge, MA: The MIT Press; 1984:426.
11. Kost A, Combs H, Smith S, et al. A proposed conceptual framework and investigation of upward feedback receptivity in medical education. Teach Learn Med. 2015;27(4):359–361.
12. Snell L, Tallett S, Haist S, et al. A review of the evaluation of clinical teaching: new perspectives and challenges. Med Educ. 2000;34(10):862–870.
13. Dent MM, Boltri J, Okosun IS. Do volunteer community-based preceptors value students’ feedback? Acad Med. 2004;79(11):1103–1107.
14. AAMC. Compact between resident physicians and their teachers.
15. Sargeant J, Mann K, Manos S, et al. R2C2 in action: testing an evidence-based model to facilitate feedback and coaching in residency. J Grad Med Educ. 2017;9(2):165–170.
16. Bowen L, Marshall M, Murdoch-Eaton D. Medical student perceptions of feedback and feedback behaviors within the context of the “educational alliance”. Acad Med. 2017;92(9):1303–1312.
17. Balmer DF, Tenney-Soeiro R, Mejia E, Rezet B. Positive change in feedback perceptions and behavior: a 10-year follow-up study. Pediatrics. 2018;141(1):e20172950.
18. Adarkwah MA. The power of assessment feedback in teaching and learning: a narrative review and synthesis of the literature. SN Soc Sci. 2021;1(3):75.
19. Fainstad T, McClintock AA, Van der Ridder MJ, et al. Feedback can be less stressful: medical trainee perceptions of using the prepare to ADAPT (Ask-Discuss-Ask-Plan Together) framework. Cureus. 2018;10(12):e3718.
20. Mariani A, Schumann SA, Fromme HB, et al. Asking for feedback: helping learners get the feedback they deserve. MedEdPORTAL. 2015;11:10228.
