OBJECTIVES:

Progress notes communicate providers’ assessments of patients’ diagnoses, progress, and treatment plans; however, providers perceive that note quality has degraded since the introduction of electronic health records. In this study, we aimed to (1) develop a tool to evaluate progress note assessments and plans with high interrater reliability and (2) assess whether a bundled intervention was associated with improved intern note quality without delaying note file time.

METHODS:

An 8-member stakeholder team developed a 19-item progress note assessment and plan evaluation (PNAPE) tool and bundled intervention consisting of a new note template and intern training curriculum. Interrater reliability was evaluated by calculating the intraclass correlation coefficient. Blinded assessors then used PNAPE to evaluate assessment and plan quality in pre- and postintervention notes (fall 2017 and 2018).

RESULTS:

PNAPE revealed high internal interrater reliability between assessors (intraclass correlation coefficient = 0.86; 95% confidence interval: 0.66–0.95). Total median PNAPE score increased from 13 (interquartile range [IQR]: 12–15) to 15 (IQR: 14–17; P = .008), and median file time decreased from 4:30 pm (IQR: 2:33 pm–6:20 pm) to 1:13 pm (IQR: 12:05 pm–3:59 pm; P < .001) in pre- and postintervention notes. In the postintervention period, a higher proportion of assessments and plans indicated the primary problem requiring ongoing hospitalization and progress of this problem (P = .0016 and P < .001, respectively).

CONCLUSIONS:

The PNAPE tool revealed high reliability between assessors, and the bundled intervention may be associated with improved intern note assessment and plan quality without delaying file time. Future studies are needed to evaluate whether these improvements can be sustained throughout residency and reproduced in future intern cohorts and other inpatient settings.

Progress note assessments and plans communicate important information about a hospitalized patient’s diagnoses, treatment, and discharge plans while also fulfilling requirements for billing and legal documentation.1 Although widespread electronic health record (EHR) adoption has led to some efficiencies in note writing, convenient EHR functions, such as copy forward, copy and paste, and auto-populate, have introduced new issues that may impair note quality.1–3 For example, these functions can lead to inclusion of redundant data (“note bloat”) and perpetuation of outdated or erroneous information,1,4,5 which may hinder synthesis of and clear communication about a patient’s hospital problem(s), progress, and plan of care.6 These issues are further exacerbated during residency, when interns are still honing their note-writing skills.

Few published tools to evaluate, or interventions to improve, the quality of inpatient notes exist,7–10 and none focus on the assessment and plan. Stetson et al9,10 developed and validated the Physician Documentation Quality Instrument for progress notes written in the adult inpatient setting. This tool uses a Likert scale with 9 broad attributes to assess note quality but does not outline key components specific to the assessment and plan. Dean et al7 developed a tool that targets the reduction of copy and paste and clutter in pediatric progress notes; however, the authors cited limitations in the tool’s ability to assess synthesis of information in the assessment and plan.

As part of an initiative to improve the quality of progress note assessments and plans written by interns, the objectives of this study were the following: (1) to develop a tool to evaluate assessment and plan quality with high interrater reliability and (2) to use the tool to determine if an intervention bundle was associated with improvements. We hypothesized that a combined intervention of a new EHR note template and intern training curriculum would be associated with improvements in assessment and plan quality without delaying the time of note filing.

This prospective pre- and postintervention study was conducted on a pediatric hospitalist service at a 111-bed quaternary Midwest children’s hospital, which serves as the primary training site for 15 pediatric interns per year. Interns write daily progress notes for all inpatients on this service using a standard note template within the organization’s EHR (Epic Systems, Verona, WI). These notes are reviewed, edited, and signed by hospitalist faculty. This study was deemed programmatic improvement by the university’s institutional review board and did not require review in accordance with federal regulations.

An 8-member stakeholder team convened twice monthly for 1 year to iteratively develop the evaluation tool and intervention bundle. Stakeholders included pediatric intern and resident physicians, hospitalist and intensivist faculty, 2 associate residency program directors, and the chief medical informatics officer; faculty had between 5 and 15 years of experience. This team first generated a list of ideal progress note assessment and plan components on the basis of review of published literature,1,2,4,11  institutional best practice guidelines,7  and consultation with a departmental billing expert. From this ideal state, the team then developed and iteratively refined the 19-item progress note assessment and plan evaluation (PNAPE) tool (Fig 1) and bundled intervention consisting of a new EHR note template and educational workshop.12  Any disagreements (eg, PNAPE items, phrasing, and note template changes) were resolved through consensus.

FIGURE 1

The PNAPE tool. NA, not applicable.


Four faculty assessors from the stakeholder group used the PNAPE tool to independently review 10 intern progress notes identified from a convenience sample of fall 2017 hospitalist service notes. Tool interrater reliability was evaluated by calculating the intraclass correlation coefficient using a multilevel random intercept model to account for rater effect.13 
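The intraclass correlation referenced above can be illustrated with the two-way ANOVA formulation of ICC(2,1) (two-way random effects, absolute agreement, single rater), the closed-form analog of a random intercept model for balanced data. This is a minimal sketch in Python rather than the authors’ analysis, and the 10×4 rating matrix is hypothetical, not the study data:

```python
import numpy as np

# Hypothetical ratings: 10 notes (rows) scored by 4 assessors (columns).
# The study's actual ratings are not published; these values merely
# illustrate high agreement (raters differ from each other by at most 1).
scores = np.array([
    [13, 14, 13, 13],
    [15, 15, 16, 15],
    [10, 11, 10, 10],
    [17, 16, 17, 17],
    [12, 12, 13, 12],
    [14, 14, 14, 15],
    [ 9, 10,  9,  9],
    [16, 16, 15, 16],
    [11, 11, 12, 11],
    [18, 17, 18, 18],
])

n, k = scores.shape                                          # n notes, k raters
grand = scores.mean()
ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()     # between-note SS
ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()     # between-rater SS
ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols   # residual SS

ms_rows = ss_rows / (n - 1)
ms_cols = ss_cols / (k - 1)
ms_err = ss_err / ((n - 1) * (k - 1))

# ICC(2,1): two-way random effects, absolute agreement, single rater
icc = (ms_rows - ms_err) / (
    ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
)
```

A multilevel random intercept model, as the authors describe, estimates the same note and rater variance components by maximum likelihood; the mean-squares formulation above is the equivalent closed form when every rater scores every note.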

Team members delivered the educational workshop12  to all pediatric interns in August 2018. This 2.5-hour workshop provided a didactic overview of the purpose of progress notes, note-writing best practices, and the new template, along with an opportunity for interns to practice writing and assessing notes using the tool. The new EHR note template was made available immediately after the workshop. Inpatient health care team size, structure, senior resident availability, and resident didactic training related to notes were otherwise unchanged throughout the pre- and postintervention study period.

The PNAPE tool was used to assess 39 progress notes from 13 interns each from the pre- and postintervention periods (August to October 2017, August to October 2018). Notes were eligible if they were written by an intern about a patient with a length of stay of ≥2 days; progress notes from hospital day 2 were evaluated. The first 3 notes written by each intern during the study period that met these inclusion criteria were selected. Eligible notes were identified by using a computer‐generated report. The file time (the time signed by the intern) was recorded for each note, and the assessments and plans were isolated, deidentified, and randomly assigned. Two faculty assessors (K.A.M.N. and K.A.S.), blinded to the pre- and postintervention period and patient and intern name, then used the tool to each score half of the notes.

Pre- and postintervention differences in the 19 individual PNAPE item binary responses (yes or no) were compared using the χ2 or Fisher’s exact test. Differences in median total PNAPE score (the sum of positive responses, maximum 19) and note file time (a balancing measure) were compared using the nonparametric Wilcoxon rank-sum test. Effect sizes were calculated using Cohen’s d. The study was powered to detect an anticipated 2-point mean difference in pre- and postintervention total PNAPE scores with 90% power at the 2-sided .05 significance level, assuming an SD of 3.7 points. All reported P values are 2-sided, and P < .05 was used to define statistical significance. Data analysis was conducted using SAS software (version 9.4; SAS Institute, Inc, Cary, NC).
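The comparisons described above can be sketched with standard SciPy routines. The 2×2 counts below are taken from item 6 of Table 1, but the score arrays are hypothetical, and the sample-size line uses the generic normal-approximation formula for two independent means, which may differ from the authors’ exact SAS power calculation:

```python
import math
import numpy as np
from scipy import stats

# Item 6 of Table 1 ("Progress of primary problem(s)"):
# 17/39 preintervention vs 36/39 postintervention notes
table = np.array([[17, 22],   # pre:  yes, no
                  [36,  3]])  # post: yes, no
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Fisher's exact test, preferred when expected cell counts are small
odds, p_fisher = stats.fisher_exact(table)

# Wilcoxon rank-sum (Mann-Whitney U) on hypothetical total PNAPE scores
pre = np.array([13, 12, 15, 14, 12, 13, 16, 11, 13, 15])
post = pre + 2  # illustrative uniform 2-point improvement
u_stat, p_rank = stats.mannwhitneyu(pre, post, alternative="two-sided")

# Cohen's d with a pooled standard deviation
def cohens_d(a, b):
    pooled = math.sqrt(((len(a) - 1) * a.var(ddof=1) +
                        (len(b) - 1) * b.var(ddof=1)) /
                       (len(a) + len(b) - 2))
    return (b.mean() - a.mean()) / pooled

d = cohens_d(pre, post)

# Sample size per group to detect a 2-point mean difference (SD 3.7)
# with 90% power at two-sided alpha = .05 (normal approximation)
z_alpha = stats.norm.ppf(1 - 0.05 / 2)
z_beta = stats.norm.ppf(0.90)
n_per_group = 2 * ((z_alpha + z_beta) * 3.7 / 2) ** 2
```

Note that `chi2_contingency` applies the Yates continuity correction by default for 2×2 tables, so its P value is slightly more conservative than the uncorrected χ2.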

Interrater reliability between assessors using the PNAPE tool was high, with an intraclass correlation coefficient of 0.89 (95% confidence interval: 0.68–0.97). The median total PNAPE score significantly increased from 13 (interquartile range [IQR]: 12–15) to 15 (IQR: 14–17) in pre- and postintervention notes (P = .008, d = 0.62). Specifically, a higher proportion of assessments and plans indicated the primary problem requiring ongoing hospitalization and its progress in the postintervention period (Table 1; P = .0016 and P < .001, respectively). Although service census was not significantly different in the pre- and postintervention periods (mean [SD]: 12.1 [1.9] patients preintervention versus 10.4 [3.7] patients postintervention, P = .28), median note file time decreased from 4:30 pm (IQR: 2:33 pm–6:20 pm) to 1:13 pm (IQR: 12:05 pm–3:59 pm) in pre- and postintervention notes (P < .001, d = 0.99).

TABLE 1

Affirmative Responses to Each of the 19 PNAPE Tool Binary Response Items in Pre- and Postintervention Progress Notes

PNAPE Item | Preintervention (n = 39), n (%) | Postintervention (n = 39), n (%) | P
1. Accurate patient age | 38 (97) | 39 (100) | .99
2. Relevant past medical history | 34 (87) | 37 (95) | .25
3. Admission and/or postoperative date | 36 (92) | 38 (97) | .32
4. Admission problem | 39 (100) | 38 (97) | .99
5. Current primary problem(s) requiring hospitalization | 17 (44) | 31 (79) | .0016
6. Progress of primary problem(s) | 17 (44) | 36 (92) | <.001
7. Avoids rehashing differential diagnosis | 34 (87) | 38 (97) | .12
8. Synthesis of symptoms and events | 18 (46) | 25 (64) | .11
9. Prioritizes problem(s) | 30 (77) | 32 (82) | .58
10. Omits irrelevant problems | 31 (79) | 31 (79) | .99
11. Identifies changes in management | 21 (54) | 20 (51) | .82
12. Provides rationale for changes | 20 (51) | 15 (38) | .26
13. References standard protocol, if relevant | 37 (95) | 35 (90) | .40
14. Avoids listing doses unless new or changed | 16 (41) | 21 (53) | .26
15. Contingency plans | 14 (36) | 10 (26) | .33
16. Discharge or transfer criteria | 25 (64) | 31 (79) | .14
17. Appropriate length for complexity | 25 (64) | 27 (69) | .63
18. Dates, not days | 39 (100) | 39 (100) | .99
19. Avoids errors due to copy and paste | 37 (95) | 38 (97) | .56

The PNAPE tool revealed high internal interrater reliability between faculty assessors, and the bundled intervention was associated with improvements in note quality without delaying file time. The tool and intervention may be used in conjunction to assess and improve documentation of progress note assessments and plans.

Our bundled intervention offers a standardized, simplified note template and curriculum that can be used to teach interns how to write an optimal progress note. This builds on previous work, which suggests that combining educational interventions with best practice note templates can improve some aspects of overall note quality as measured by the Physician Documentation Quality Instrument and decrease note clutter7  and length,5  with variable effects on documentation efficiency. In addition, the PNAPE tool is composed of discrete binary items that allow faculty assessors to evaluate key components specific to the assessment and plan. In the future, the PNAPE tool may also be used by faculty to provide targeted feedback to interns during residency training.

The change in total PNAPE score was largely attributed to improvements in the following 2 items: identification of the patient’s ongoing primary problem and assessment of its progress during the admission. These key items convey perhaps the most important information within progress notes. Improved documentation of these components could facilitate improved care team communication, discharge planning, and hospital coding and billing. Next steps could include the use of the tool and workshop in collaboration with other sites to broaden reach.

Lack of change in other PNAPE items was likely multifactorial. Some items were already routinely documented before the intervention and/or were more easily auto-populated in the EHR, such as patient age and admission date. Others showed improvements that did not reach statistical significance (eg, documentation of past medical history and discharge criteria); detecting these differences may require a larger sample size because the power calculation was based on the total score rather than individual items. Finally, other items may reflect skills refined later in residency training,14 such as prioritizing problems and identifying and providing rationale for management changes. These items highlight areas for future interventions, such as just-in-time feedback.

This study was conducted at a single pediatric academic center, which may limit the generalizability of findings. Although available literature was considered, provider bias or current setting-specific standards may have influenced tool development. Future studies are needed to establish external reliability with a larger sample at other organizations. Notes were assessed from 2 consecutive intern classes, and findings may be subject to cohort effects and confounding; however, repeated sampling of the same class would have introduced bias because of expected improvements throughout internship. In addition, patient complexity was not accounted for and may have influenced note file time. To limit the potential effects of intern experience level and seasonal census changes, the same time period was chosen in each year.

This bundled intervention may be associated with improvements in the quality of assessments and plans in progress notes written by interns. Future studies are needed to evaluate whether these improvements can be sustained throughout residency and reproduced in future intern cohorts and in other inpatient settings and specialties.

Drs Kelly, Sklansky, Nackers, Coller, Dean, and Shadman worked together to conceive and design the study, perform data collection, analysis, and interpretation, and draft and revise the manuscript; Dr Eickhoff assisted with study design, data analyses, and interpretation and made critical manuscript revisions; Dr Bentley participated in creation of the data collection tool and analysis and made critical manuscript revisions; Ms Nacht assisted with literature review and data interpretation and made critical revisions; and all authors approved the final manuscript as submitted.

FUNDING: No external funding.

1. Kuhn T, Basch P, Barr M, Yackel T; Medical Informatics Committee of the American College of Physicians. Clinical documentation in the 21st century: executive summary of a policy position paper from the American College of Physicians. Ann Intern Med. 2015;162(4):301–303

2. Weis JM, Levy PC. Copy, paste, and cloned notes in electronic health records: prevalence, benefits, risks, and best practice recommendations. Chest. 2014;145(3):632–638

3. Stewart E, Kahn D, Lee E, et al. Internal medicine progress note writing attitudes and practices in an electronic health record. J Hosp Med. 2015;10(8):525–529

4. Tsou AY, Lehmann CU, Michel J, Solomon R, Possanza L, Gandhi T. Safe practices for copy and paste in the EHR: systematic review, recommendations, and novel model for health IT collaboration. Appl Clin Inform. 2017;8(1):12–34

5. Kahn D, Stewart E, Duncan M, et al. A prescription for note bloat: an effective progress note template. J Hosp Med. 2018;13(6):378–382

6. Tierney MJ, Pageler NM, Kahana M, Pantaleoni JL, Longhurst CA. Medical education in the electronic medical record (EMR) era: benefits, challenges, and future directions. Acad Med. 2013;88(6):748–752

7. Dean SM, Eickhoff JC, Bakel LA. The effectiveness of a bundled intervention to improve resident progress notes in an electronic health record. J Hosp Med. 2015;10(2):104–107

8. Kotwal S, Klimpl D, Tackett S, Kauffman R, Wright S. Documentation of clinical reasoning in admission notes of hospitalists: validation of the CRANAPL assessment rubric. J Hosp Med. 2019;14(12):746–753

9. Stetson PD, Bakken S, Wrenn JO, Siegler EL. Assessing electronic note quality using the Physician Documentation Quality Instrument (PDQI-9). Appl Clin Inform. 2012;3(2):164–174

10. Stetson PD, Morrison FP, Bakken S, Johnson SB; eNote Research Team. Preliminary development of the physician documentation quality instrument. J Am Med Inform Assoc. 2008;15(4):534–541

11. Shoolin J, Ozeran L, Hamann C, Bria W II. Association of Medical Directors of Information Systems consensus on inpatient electronic health record documentation. Appl Clin Inform. 2013;4(2):293–303

12. Nackers KAM, Shadman KA, Kelly MM, et al. Resident workshop to improve inpatient documentation using the Progress Note Assessment and Plan Evaluation (PNAPE) tool. MedEdPORTAL. 2020;16:11040

13. McGraw KO, Wong SP. Forming inferences about some intraclass correlation coefficients [published correction appears in Psychol Methods. 1996;1(4):390]. Psychol Methods. 1996;1(1):30–46

14. Tolsgaard MG, Arendrup H, Lindhardt BO, Hillingsø JG, Stoltenberg M, Ringsted C. Construct validity of the reporter-interpreter-manager-educator structure for assessing students’ patient encounter skills. Acad Med. 2012;87(6):799–806

Competing Interests

POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.

FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.