Continuous pulse oximetry (oxygen saturation [Spo2]) monitoring in hospitalized children with bronchiolitis not requiring supplemental oxygen is discouraged by national guidelines, but determining monitoring status accurately requires in-person observation. Our objective was to determine if electronic health record (EHR) data can accurately estimate the extent of actual Spo2 monitoring use in bronchiolitis.
This repeated cross-sectional study included infants aged 8 weeks through 23 months hospitalized with bronchiolitis. In the validation phase at 3 children’s hospitals, we calculated the test characteristics of the Spo2 monitor data streamed into the EHR each minute when monitoring was active compared with in-person observation of Spo2 monitoring use. In the application phase at 1 children’s hospital, we identified periods when supplemental oxygen was administered using EHR flowsheet documentation and calculated the duration of Spo2 monitoring that occurred in the absence of supplemental oxygen.
Among 668 infants at 3 hospitals (validation phase), EHR-integrated Spo2 data from the same minute as in-person observation had a sensitivity of 90%, specificity of 98%, positive predictive value of 88%, and negative predictive value of 98% for actual Spo2 monitoring use. Using EHR-integrated data in a sample of 317 infants at 1 hospital (application phase), infants were monitored in the absence of oxygen supplementation for a median 4.1 hours (interquartile range 1.4–9.4 hours). Those who received supplemental oxygen experienced a median 5.6 hours (interquartile range 3.0–10.6 hours) of monitoring after oxygen was stopped.
EHR-integrated monitor data are a valid measure of actual Spo2 monitoring use that may help hospitals more efficiently identify opportunities to deimplement guideline-inconsistent use.
Overuse of continuous pulse oximetry (oxygen saturation [Spo2]) monitoring in hospitalized children with bronchiolitis is increasingly recognized as an appropriate target for widespread deimplementation efforts.1–3 Bronchiolitis is highly prevalent, causing >100 000 infants to be admitted each year and making it the leading reason infants are hospitalized other than being born.4 Despite 3 sets of formal recommendations discouraging continuous Spo2 monitoring in patients with bronchiolitis not requiring supplemental oxygen,5–7 backed by evidence revealing absence of benefit,8,9 potential harm,10–12 and resultant nurse alarm fatigue,13,14 the practice remains widespread.1
The first step toward deimplementing any overused practice is measuring the current extent of the practice.15,16 Previous quality improvement initiatives used the presence of physician orders as a measure of Spo2 monitoring use.17,18 Unfortunately, a more recent study revealed that orders do not accurately represent actual Spo2 monitoring use at the bedside.19 A recent prospective study using in-person observation of monitoring status in 56 hospitals revealed that 46% of children hospitalized with bronchiolitis received Spo2 monitoring although monitoring was unnecessary or inappropriate according to the guidelines.1 Although these data provided valuable insights into the prevalence of guideline-inconsistent monitoring, the in-person observational methods were resource intensive and not sustainable for ongoing measurement.
In contrast, Spo2 data from bedside monitors have the potential to provide comprehensive, longitudinal, automated measurement of actual Spo2 monitoring use to drive audit and feedback interventions. Although these data have been difficult to access in the past, hospitals have increasingly chosen to integrate monitor data into their electronic health records (EHRs) to reduce the work effort nurses spend manually documenting vital signs. When the monitor data are stored in the EHR or data warehouse at high frequency, identifying patients who are actively monitored becomes feasible.
In this study of hospitalized infants with bronchiolitis, we aimed to (1) measure the validity of Spo2 monitor data integrated into the EHR against the gold standard of in-person observation to determine actual Spo2 monitoring use and (2) apply this measure to estimate the extent of Spo2 monitoring use in the absence of supplemental oxygen using EHR-integrated monitor data.
Methods
This work occurred in the context of 2 separate larger projects. Validation was performed as a substudy by using data from a 6-center pilot single-arm clinical trial in the Eliminating Monitor Overuse (EMO) research portfolio (National Clinical Trial Identifier NCT04178941) that measured the feasibility, acceptability, appropriateness, and outcomes of audit and feedback with educational outreach as a strategy to align continuous Spo2 monitoring use in stable patients with bronchiolitis with evidence and guideline recommendations. All participating centers were members of the Pediatric Research in Inpatient Settings Network, an independent, hospital-based research network that aims to improve the health of and health care delivery to hospitalized children and their families. As part of this study, we validated EHR-integrated monitor data against in-person observation in the 3 hospitals with integrated EHRs actively receiving physiologic monitor data. The study was approved by the institutional review board at the coordinating site, with reliance agreements established with the other sites. Waivers of consent, assent, parental permission, and Health Insurance Portability and Accountability Act authorization were granted at all sites.
After validation, we applied the validated measure to a full season of monitor data of patients with bronchiolitis as part of a quality improvement initiative conducted by the Patient Safety Learning Laboratory (PSLL) at the aforementioned coordinating site, a tertiary children’s hospital. The PSLL is funded by the Agency for Healthcare Research and Quality, and one of its primary aims is to locally reengineer the system of monitoring hospitalized children on acute care units, with a focus on reducing noninformative alarms and accelerating nurse responses to critical events. The hospital’s institutional review board determined that the PSLL’s work described in this article was consistent with quality improvement and did not meet criteria for human subjects research.
Validation of EHR-Integrated Monitor Data in 3 Hospitals
Setting and Participants
In the trial that provided the data for the validation, we collected data on infants 8 weeks through 23 months hospitalized with acute bronchiolitis on 8 units that cared for patients with bronchiolitis on generalist services at 3 geographically diverse tertiary children’s hospitals between December 1, 2019, and March 14, 2020. Consistent with previous work in the EMO portfolio,1 we only observed infants who were not currently receiving supplemental oxygen therapy (or room airflow). We excluded patients with premature birth (<28 weeks’ gestation or documented “premature” without gestational age listed), cyanotic congenital heart disease, pulmonary hypertension, home oxygen or positive pressure ventilation requirement, tracheostomy, neuromuscular disease, immunodeficiency, cancer, or diagnosis of heart failure, myocarditis, arrhythmia, or coronavirus disease 2019.
Data Collection
After identifying eligible patients meeting the criteria above, observers at each hospital walked to each patient’s bedside, visualized the physiologic monitor display to determine if Spo2 tracings were present, and recorded the date and time of the observation. Observers then returned to the EHR to obtain demographic and clinical information, including determining presence of integrated monitor data displayed in the EHR. Specifically, observers recorded whether monitor data were captured in the EHR during the same minute as in-person observation (“minute 0”), during each of the 15 minutes before in-person observation (“minutes −1 to −15”), and during each of the 15 minutes after in-person observation (“minutes +1 to +15”). Observers were also prompted to complete an optional free-text field with comments pertaining to monitor data.
Analysis
Using in-person observation as the gold standard for actual Spo2 monitoring use, we determined the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of Spo2 monitor data visible in the EHR at (1) minute 0, (2) minutes −1 to −15, (3) minutes +1 to +15, and (4) minutes −15 to +15, encompassing time periods (1), (2), and (3). We calculated these test characteristics overall and stratified by each participating hospital. Two investigators reviewed and classified observer comments to provide context for cases in which EHR-integrated monitor data and in-person observation were discordant. When there were no associated observer comments, discrepancies were considered “unexplained.” We decided a priori that a PPV and NPV of at least 80% would be sufficient for use in future improvement and research efforts.
Application of Validated Measure to Monitor Data of Patients With Bronchiolitis in 1 Hospital
Setting and Participants
To apply the new method for bronchiolitis quality improvement efforts at a single hospital, we accessed nursing flowsheet documentation data and physiologic monitor data from the EHR on all hospitalized patients aged 8 weeks to 23 months admitted to a general pediatrics service with a discharge diagnosis of acute bronchiolitis between September 3, 2019, and January 1, 2020. Discharge diagnoses were defined by International Classification of Diseases, Ninth Revision (ICD-9) and 10th Revision (ICD-10) codes (466.x from ICD-9; J21.x from ICD-10). We excluded patients with discharge diagnoses corresponding to comorbid respiratory conditions (Supplemental Table 3), patients with complex chronic conditions as previously defined by the complex chronic condition version 2 classification system,20 and patients with an ICU stay during the hospitalization so as to study monitoring practices in a sample of patients who were (1) representative of hospitalized infants with bronchiolitis nationally and (2) less likely to have other reasons for Spo2 monitoring because of comorbidities or recovery from critical illness. For context, physiologic monitors are present in every inpatient ward bed space at this hospital. Monitors are used if ordered by the physician and/or if the nurse deems their use clinically appropriate.
Data Collection
We defined our cohort of interest using EHR data from our institutional data warehouse. We extracted flowsheet data for nursing respiratory assessments with time stamps, through which nurses manually document information including supplemental oxygen (fraction of inspired oxygen [Fio2]), flow rate, and support devices (eg, nasal cannula). We then used a unique visit identifier to link visits in our cohort directly to the EHR’s central database containing the synchronized Spo2 data automatically transmitted every minute from the bedside monitor to the EHR (Clarity; Epic Systems, Verona, WI). Flowsheet data from the data warehouse were merged with the synchronized Spo2 data from Clarity and filtered on the basis of inpatient unit admission and discharge time stamps for each patient, allowing us to eliminate data points corresponding to time accrued in the emergency department.
Analysis
To determine the time intervals when a patient was receiving supplemental oxygen, we first identified initiation times, defined as the time of the first flowsheet assessment documenting its use. There were 3 flowsheet rows that could be used to indicate supplemental oxygen: “support device,” “flow rate,” and “Fio2.” Given the imperfection of clinical data for research, we took a conservative approach to identifying supplemental oxygen that would favor sensitivity over specificity. With that guiding principle in mind, the following entries were defined as indicating supplemental oxygen use: “nasal cannula” or “high flow nasal cannula” support device, or any documented flow rate >0, or Fio2 >21%. We conservatively assumed that supplemental oxygen continued for the entire interval between subsequent entries consistent with supplemental oxygen administration. The time point of supplemental oxygen discontinuation was defined as flowsheet entries with simultaneous documentation of Fio2 of “NA” or “21,” flow rate of NA or 21 (a common documentation error), without simultaneous documentation of a support device of nasal cannula or high flow nasal cannula. If a patient on supplemental oxygen had no flowsheet assessment documenting its discontinuation, a patient was assumed to remain on supplemental oxygen until the time of discharge. We did not make assumptions about supplemental oxygen use before the first flowsheet assessment documented during the hospitalization.
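As a simplified illustration of the flowsheet rules above, the two classification steps could be sketched as follows (Python, with hypothetical field names; the study's actual extraction logic may differ):

```python
OXYGEN_DEVICES = ("nasal cannula", "high flow nasal cannula")

def indicates_supplemental_o2(entry):
    """Return True if a flowsheet entry is consistent with supplemental
    oxygen use, favoring sensitivity over specificity: an oxygen support
    device, any flow rate >0, or FiO2 >21%."""
    device = (entry.get("support_device") or "").lower()
    flow = entry.get("flow_rate")   # liters per minute; None if not charted
    fio2 = entry.get("fio2")        # percent; None if not charted
    if device in OXYGEN_DEVICES:
        return True
    if flow is not None and flow > 0:
        return True
    if fio2 is not None and fio2 > 21:
        return True
    return False

def indicates_discontinuation(entry):
    """Return True if an entry documents room air: FiO2 of NA or 21 and
    flow rate of NA or 21 (a common documentation error), without a
    simultaneously documented oxygen support device."""
    device = (entry.get("support_device") or "").lower()
    if device in OXYGEN_DEVICES:
        return False
    fio2_off = entry.get("fio2") in (None, 21)
    flow_off = entry.get("flow_rate") in (None, 21)
    return fio2_off and flow_off

# Examples (hypothetical entries)
on_o2 = {"support_device": "nasal cannula", "flow_rate": 2, "fio2": None}
room_air = {"support_device": None, "flow_rate": None, "fio2": 21}
print(indicates_supplemental_o2(on_o2), indicates_discontinuation(room_air))
# → True True
```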
To determine when a patient was Spo2 monitored, we considered each minute for which a patient had an Spo2 value streamed from the bedside monitor into the EHR to be a “monitored” minute.
To determine when Spo2 monitoring was occurring in the absence of supplemental oxygen, for each minute of the hospitalization, we directly compared a patient’s supplemental oxygen status with the patient’s monitoring status. To approximate consensus recommendations from a recent expert panel, we added a transitional hour after the final documented use of supplemental oxygen during each hospitalization before considering monitoring use to be possibly inconsistent with guideline recommendations.7 For patients who never received supplemental oxygen during the hospitalization, on the basis of all 3 existing guideline recommendations,5–7 we measured every minute of Spo2 monitoring after the first documented flowsheet assessment.
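The minute-level comparison can be sketched as follows (a simplified Python illustration with hypothetical data structures; for brevity it applies the transitional hour after each oxygen interval rather than only the final one):

```python
from datetime import datetime, timedelta

TRANSITION = timedelta(hours=1)  # grace period after O2 discontinuation

def monitored_minutes_without_o2(monitored_minutes, o2_intervals,
                                 first_assessment):
    """Count monitored minutes outside supplemental oxygen intervals,
    excluding a 1-hour transition after each interval ends and any
    time before the first flowsheet assessment."""
    count = 0
    for minute in monitored_minutes:  # datetimes, one per streamed SpO2 value
        if minute < first_assessment:
            continue
        in_o2_or_transition = any(
            start <= minute < end + TRANSITION
            for start, end in o2_intervals
        )
        if not in_o2_or_transition:
            count += 1
    return count

# Example: monitored 06:00-23:59, on O2 from 08:00 to 20:00
first = datetime(2019, 12, 1, 6, 0)
mins = [first + timedelta(minutes=i) for i in range(18 * 60)]
o2 = [(datetime(2019, 12, 1, 8, 0), datetime(2019, 12, 1, 20, 0))]
print(monitored_minutes_without_o2(mins, o2, first))
# → 300 (120 min before O2 + 180 min after the 1-h transition)
```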
Using these definitions, for each hospitalization, we calculated the total number of monitored minutes in the absence of supplemental oxygen. We calculated monitored time overall and classified monitored time by whether supplemental oxygen was ever received during the hospitalization (yes versus no) and patient age group (2–11 vs 12–23 months). We performed a sensitivity analysis excluding hospitalizations exceeding the 75th percentile for length of stay, which we presumed may reflect atypical hospitalization courses. We used Research Electronic Data Capture 9.10.0 software for data management and Stata (Stata Corp, College Station, TX) version 16 for all analyses.
Results
Validation of EHR-Integrated Monitor Data
We observed 668 patients across 3 hospitals: 410 patients at hospital A, 161 patients at hospital B, and 97 patients at hospital C. Figure 1 reveals the test characteristics of EHR-integrated data for actual Spo2 monitoring use, considering each of the 4 time periods surrounding in-person observation and indicating discrepancies noted by observers between EHR data and in-person observation. The overall PPV among all hospitals over 4 time periods ranged from 88% to 93%; NPV ranged from 94% to 98%. EHR-integrated data present at minute 0 among all hospitals were the most sensitive marker of actual Spo2 monitoring use (90%). The decrements in sensitivity observed for minutes −1 to −15, +1 to +15, and −15 to +15 were explained in part by observer comments noting monitoring use for transient interventions including vital signs and suctioning, and the discontinuation of Spo2 monitoring after in-person observation. Conversely, EHR data were most specific for actual Spo2 monitoring use for minutes −15 to +15 (99%), with minimal decrements to specificity for the other time periods investigated (98% specificity for actual Spo2 monitoring use at minute 0). Test characteristics stratified by each participating hospital are displayed in Table 1; all PPV and NPV were >80% except for minute 0 at hospital A (79%).
Test characteristics of EHR-integrated data compared with actual pulse oximetry monitoring use. EHR-integrated monitor data were compared with in-person observation (as the gold standard for actual pulse oximetry monitoring status) in hospitalized infants with bronchiolitis at 3 hospitals. Sensitivity, specificity, PPV, and NPV were obtained for (A) minute 0, the minute at which in-person observation was completed; (B) minutes –1 to –15, each of the 15 minutes preceding in-person observation; (C) minutes +1 to +15, each of the 15 minutes after in-person observation; and (D) minutes –15 to +15, the minute at which in-person observation was completed and each of the 15 minutes preceding and after in-person observation. Boxes with the dashed outline indicate reasons for discrepancy between actual pulse oximetry monitoring status and EHR-integrated data that were obtained from observer comments during chart review. Discrepancies were considered unexplained when there were no associated observer comments.
Test Characteristics of EHR-Integrated Monitor Data as a Measure of Actual Pulse Oximetry Monitoring Use
Columns indicate the minutes, relative to in-person observation, during which pulse oximetry data are present in the EHR.

| Test Characteristic by Hospital | 0ᵃ | −1 to −15ᵇ | +1 to +15ᶜ | −15 to +15ᵈ |
|---|---|---|---|---|
| Hospital A (n = 410), n of n (%) |  |  |  |  |
| Sensitivity | 41 of 44 (93) | 36 of 44 (82) | 27 of 44 (61) | 25 of 44 (57) |
| Specificity | 355 of 366 (97) | 358 of 366 (98) | 360 of 366 (98) | 361 of 366 (99) |
| PPV | 41 of 52 (79) | 36 of 44 (82) | 27 of 33 (82) | 25 of 30 (83) |
| NPV | 355 of 358 (99) | 358 of 366 (98) | 360 of 377 (95) | 361 of 380 (95) |
| Hospital B (n = 161), n of n (%) |  |  |  |  |
| Sensitivity | 29 of 30 (97) | 24 of 30 (80) | 22 of 30 (73) | 19 of 30 (63) |
| Specificity | 131 of 131 (100) | 131 of 131 (100) | 131 of 131 (100) | 131 of 131 (100) |
| PPV | 29 of 29 (100) | 24 of 24 (100) | 22 of 22 (100) | 19 of 19 (100) |
| NPV | 131 of 132 (99) | 131 of 137 (96) | 131 of 139 (94) | 131 of 142 (92) |
| Hospital C (n = 97), n of n (%) |  |  |  |  |
| Sensitivity | 22 of 28 (79) | 22 of 28 (79) | 19 of 28 (68) | 19 of 28 (68) |
| Specificity | 68 of 69 (99) | 69 of 69 (100) | 67 of 69 (97) | 69 of 69 (100) |
| PPV | 22 of 23 (96) | 22 of 22 (100) | 19 of 21 (90) | 19 of 19 (100) |
| NPV | 68 of 74 (92) | 69 of 75 (92) | 67 of 76 (88) | 69 of 78 (88) |
| All hospitals (n = 668), n of n (%) |  |  |  |  |
| Sensitivity | 92 of 102 (90) | 82 of 102 (80) | 68 of 102 (67) | 63 of 102 (62) |
| Specificity | 554 of 566 (98) | 558 of 566 (99) | 558 of 566 (99) | 561 of 566 (99) |
| PPV | 92 of 104 (88) | 82 of 90 (91) | 68 of 76 (89) | 63 of 68 (93) |
| NPV | 554 of 564 (98) | 558 of 578 (97) | 558 of 592 (94) | 561 of 600 (94) |
ᵃ The minute at which in-person observation was completed.
ᵇ Each of the 15 minutes preceding in-person observation.
ᶜ Each of the 15 minutes after in-person observation.
ᵈ The minute at which in-person observation was completed and each of the 15 minutes before and after in-person observation.
Application of Validated Measure to Monitor Data of Patients With Bronchiolitis in 1 Hospital
A total of 317 hospitalizations were included in the final data set. We excluded 3 hospitalizations for which there were minimal or no monitor data despite >24 hours of supplemental oxygen administration, suggesting that data may not have been transmitted from the bedside monitor to the EHR. As demonstrated in Table 2, patients were monitored overall for a median 17.2 hours (interquartile range [IQR] 2.4–34.4 hours). Excluding a transitional hour after the final documented use of supplemental oxygen, a median 4.1 hours (IQR 1.4–9.4 hours) of monitoring occurred in the absence of supplemental oxygen. Monitoring in the absence of supplemental oxygen accounted for a median 10.6% of patients’ hospital length of stay (IQR 3.7%–24.3%). Patients who received supplemental oxygen during the hospitalization experienced a median 5.6 hours (IQR 3.0–10.6 hours) of monitoring after supplemental oxygen was stopped, and patients who did not receive supplemental oxygen experienced a median 1.4 hours (IQR 0.3–6.7 hours) of monitoring. A total of 94% of patients who received supplemental oxygen experienced >1 hour of monitoring after supplemental oxygen was stopped, and 64% experienced >4 hours of additional monitoring. The extent of Spo2 monitoring was similar by patient age category (2–11 and 12–23 months). A sensitivity analysis that excluded patients above the 75th percentile of length of stay revealed consistent findings.
Pulse Oximetry Monitoring in Hospitalized Infants With Bronchiolitis Using EHR-Integrated Data
| Admission Characteristics | Hospital Length of Stay, h, Median (IQR) | Monitored Time, h, Median (IQR) | Monitored Time Without O2,ᵃ h, Median (IQR) | Proportion of Hospitalization Monitored Without O2, %, Median (IQR) | >1 h Monitored Without O2, n (%) | >4 h Monitored Without O2, n (%) |
|---|---|---|---|---|---|---|
| Overall (N = 317) | 33.1 (21.0–49.5) | 17.2 (2.4–34.4) | 4.1 (1.4–9.4) | 10.6 (3.7–24.3) | 246 (78) | 162 (51) |
| Ever on O2 |  |  |  |  |  |  |
| Yes (n = 186) | 42.2 (31.8–60.5) | 30.4 (19.7–42.2) | 5.6 (3.0–10.6) | 13.1 (6.9–22.1) | 175 (94) | 119 (64) |
| No (n = 131) | 20.4 (16.0–30.7) | 1.4 (0.3–6.7) | 1.4 (0.3–6.7) | 5.4 (1.4–30.0) | 71 (54) | 43 (33) |
| Age, mo |  |  |  |  |  |  |
| 2–11 (n = 220) | 34.2 (20.5–54.3) | 17.1 (2.0–34.8) | 4.2 (1.4–9.9) | 10.1 (3.8–23.9) | 172 (78) | 116 (53) |
| 12–23 (n = 97) | 31.8 (21.5–43.8) | 17.4 (6.3–28.3) | 3.8 (1.2–7.8) | 11.8 (3.5–24.7) | 74 (76) | 46 (47) |
O2, supplemental oxygen.
ᵃ Refers to monitored time >1 h after supplemental oxygen was stopped or any monitored time for patients who did not receive supplemental oxygen.
Discussion
We validated minute-level EHR-integrated Spo2 monitor data as a measure that can approximate actual Spo2 monitoring use. Using this method of identifying Spo2 monitoring use, we estimated that infants hospitalized with bronchiolitis at 1 children’s hospital were monitored in the absence of supplemental oxygen for a median 10.6% of their hospital length of stay.
Compared with orders, the EHR-based method we describe here appears favorable. In our previous study comparing the presence of monitoring orders with in-person observation,19 sensitivity was 49%, specificity was 89%, PPV was 77%, and NPV was 69%; the test characteristics in our study outperform monitoring orders on each of these parameters. Our study also builds on previous work describing the use of Spo2 data streamed from physiologic monitors to estimate aggregate rates of monitored patients at the unit and hospital levels,21,22 as well as a single-center quality improvement initiative that used this method to estimate time on pulse oximetry monitoring in a population of children with wheezing who were not on supplemental oxygen.23 However, to our knowledge, this is the first study to validate this method against in-person observation and to specifically estimate overuse on the basis of monitoring practices after the transition off supplemental oxygen, as recently recommended by consensus guidelines.7 Moreover, because recent guideline recommendations also address continuous Spo2 monitoring use in hospitalized children with other diseases, including asthma, community-acquired pneumonia, and croup,7 leveraging EHR-integrated data may enhance existing quality measurement tools for these diseases and ameliorate the limitations of existing quality indicators in predicting clinical outcomes.24 Finally, in estimating that transitioning patients weaned off supplemental oxygen to intermittent Spo2 monitoring could save up to a median 5.6 hours of monitored time per patient, this study highlights the opportunity for deimplementation efforts in this population to reduce unnecessary Spo2 monitoring and the associated costs,10 iatrogenic harm,11 and alarm fatigue13,14 described in the literature.
We note several limitations of this study. First, although the PPV and NPV of EHR-integrated data for actual Spo2 monitoring use exceeded 80% in nearly all analyses in the validation phase, the PPV for minute 0 at hospital A was 79%. Review of observer comments attributed at least 4 of the 11 discordant instances to bedside monitor display screens that had been turned off: these patients were classified as unmonitored under our in-person observation definition even though monitor data were being transmitted, representing an area to target for improvement. Second, our full understanding of why EHR-integrated data differed from in-person observation is limited by the large number of discrepancies for which observers did not offer an explanation. Third, although generalizability may be limited by the current availability of EHR-integrated data at some hospitals, this study reveals the value of this tool as hospitals increasingly adopt interoperable health information technology.25,26 Fourth, neither Spo2 data stored in the EHR nor data collected in cross-sectional in-person observation distinguishes between continuous Spo2 monitoring and intermittent Spo2 measurement. However, the contribution of intermittent Spo2 measurements captured by coincidence, such as those obtained during routine vital sign measurements, is likely trivial, and our methodology overall provides a conservative estimate of Spo2 monitoring use in the absence of supplemental oxygen. Finally, our estimates of Spo2 monitoring use at a single hospital, such as a low (5.4%) proportion of hospitalization monitored among patients who never received supplemental oxygen, may reflect this hospital’s local monitoring culture and previous participation in the EMO research portfolio and may differ at other sites.
Conclusions
EHR-integrated data can be used reliably to determine actual Spo2 monitoring use at the bedside and, in a sample of infants hospitalized with bronchiolitis, revealed a high burden of Spo2 monitoring that appears to fall outside guideline recommendations after oxygen supplementation has been discontinued. This measure can be leveraged to inform targeted efforts to deimplement unnecessary continuous pulse oximetry monitoring in hospitalized patients with bronchiolitis, as well as in other patient populations.
Acknowledgments
We acknowledge the following individuals for their assistance with data collection: Waheeda Samady, MD, MSCI, and Kristin Van Genderen, MD (Ann and Robert H. Lurie Children’s Hospital of Chicago, Chicago, IL); Gaylan Dascanio, MD; Yomna Farooqi, MD; Mayra Garcia, DNP, RN, PCNS-BC; Sarah Khan, MD; Caitlin Reaves, BSN, RN, CPN; and Hailee Scoggins, BSN, RN, MSN (Children’s Medical Center Dallas, Dallas, TX); and Laura Goldstein, MD, and Muida Menon, RN (Children’s Hospital of Philadelphia, Philadelphia, PA).
FUNDING: Research reported in this publication was supported in part by a Cooperative Agreement from the National Heart, Lung, and Blood Institute of the National Institutes of Health under award U01HL143475 (Dr Bonafide, principal investigator). As a Cooperative Agreement, National Institutes of Health scientists participated in study conference calls and provided ongoing feedback on the conduct and findings of the study. Research reported in this publication was supported in part by a grant from the Agency for Healthcare Research and Quality under award R18HS026620 (Dr Bonafide, principal investigator). The funders had no role in the project design; in the collection, analysis, or interpretation of data; in the decision to submit the article for publication; or in the writing of the article. Funded by the National Institutes of Health (NIH).
Dr Kern-Goldberger conceptualized and designed the study, drafted the initial manuscript, and acquired, analyzed, and interpreted data; Drs Rasooly and Muthu conceptualized and designed the study, critically reviewed and revised the manuscript, and analyzed and interpreted data; Dr Luo conceptualized and designed the study, critically reviewed and revised the manuscript, and acquired data; Drs Craig, Parthasarathy, Sergay, Solomon, and Lucey critically reviewed and revised the manuscript and acquired data; Drs Ferro and Ruppel critically reviewed and revised the manuscript and analyzed and interpreted data; Dr Bonafide conceptualized and designed the study, critically reviewed and revised the manuscript, and acquired, analyzed, and interpreted data; and all authors approved the final manuscript as submitted.
Competing Interests
POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.
FINANCIAL DISCLOSURE: Dr Bonafide discloses additional grant funding from the National Science Foundation for research related to physiologic monitoring; the other authors have indicated they have no financial relationships relevant to this article to disclose.