BACKGROUND AND OBJECTIVES

Safety event management systems (SEMS) are rich sources of patient safety information that can be used to improve organizational safety culture. A SEMS best accomplishes this when it is designed and refined to increase learning and engagement across the organization. To support a global aim of improving overall patient safety and becoming a highly reliable learning health system, focus was directed toward increasing event review and follow-up completion and using this information to drive resource allocation and improvement efforts.

METHODS

A new integrated SEMS was customized, tested, and implemented based on identified organizational need. Revised policies were developed to define expectations for event review and follow-up. The new SEMS incorporated a closed-loop communication process that ensured information from events was shared with event submitters and facilitated shared learning. The expected impacts, improved event reporting and follow-up, were studied and guided ongoing improvements.

RESULTS

After transitioning to the new SEMS, overall reporting increased by 8.6%, and event follow-up, demonstrated by documentation on specified system forms, improved by 53.7%.

CONCLUSIONS

By implementing a new, efficient, and standardized SEMS, which decentralized event management processes, the organization saw increased reporting and better engagement with patient safety event review and follow-up. Overall, these results demonstrated a stronger reporting culture, which allowed for local problem solving and improved learning from every event reported. A robust reporting culture positively impacted the overall organizational culture of safety.

Health care organizations operate in high-risk environments, and patient harm events continue,1 despite increased focus on patient safety.2 In efforts to eliminate this harm, health care systems have tried to understand how organizational culture influences patient safety.3 Organizational engagement with a voluntary safety event management system (SEMS), including the recognition of and response to safety events, can be used to improve organizational safety culture and drive system learning.4 However, multiple barriers exist that can negatively impact engagement, such as inefficient reporting platforms, limited psychological safety of reporters, siloed work, or the incorrect perception that patient safety is not an organizational priority.5–7

Our new accreditation organization, Det Norske Veritas, mandated development of an organizational quality management system. Therefore, the existing SEMS was evaluated through a process audit of our incident reporting policy and associated procedures. Underreporting of events, identified by querying the electronic medical record (EMR), and missing follow-up documentation for reported events were recognized as opportunities for improvement. In aviation, safety reporting is used to develop strategies to address training gaps, ineffective processes, necessary structural changes, and organizational culture. A voluntary SEMS can be a tool to predict problems related to faulty structure and process8–12 and enables improved situational awareness and understanding of event frequency, patterns, and trends.4,8,13 An organization can use the collected data to facilitate learning through standardized, discrete measures for event entry and follow-up while optimizing system efficiency.14 Decentralizing event follow-up to local leadership can facilitate awareness by sharing through multiple venues, including micro and macro system meetings.14 Functional awareness and understanding of complex processes across the organization are integral to reducing harm and uncovering the health system's underlying risks.4,8,13

As in aviation, the organization’s safety culture must support event reporting for healthcare workers in all roles, allowing staff to feel confident that reporting events will not result in retaliation, but rather improvements in patient and employee safety. A literature review identified other critical factors for a successful SEMS, including: (1) eliminating perceptions of event reporting as punitive;2,9,15,16  (2) embedding just culture (JC) principles into event reporting policies and procedures to ensure balanced accountability for both individuals and the organization;16,17  (3) improving ease and efficiency of use for input and export of meaningful event-related data;5,6  (4) utilizing this data for shared learning and driving change;5,6,8,18  and (5) providing closed-loop communication of event follow-up.19 

The global aim of this project was to improve overall safety and support the organizational journey to becoming a highly reliable learning health system (LHS). Improving the organization's SEMS, implementing policies and practices that create a safe environment for reporting, providing user training, and decentralizing event follow-up were expected to help achieve this aim. The SMART aim was to increase the percentage of event reports with completed follow-up from 60.9% to 100% by October 2019 and sustain that level for at least 6 months. Follow-up compliance was defined as local leadership completing the designated areas of the SEMS documentation regarding event review and actions taken to mitigate risk or prevent recurrence.

This quality improvement project occurred at a pediatric health system comprising a freestanding 206-bed children's hospital, 31 service locations, 2 ambulatory surgery centers, 3 urgent care locations, and 20 primary care locations. Mean annual volumes were ∼46 000 total patient days and ∼759 000 ambulatory visits from 2017 to 2020.

The literature, process audit findings, and informal 1:1 and small-group user feedback were reviewed in the evaluation of the SEMS to identify the traits of an ideal SEMS. It was determined that our health system's internally crafted incident reporting system, KiD Katches, would require considerable redesign to be comparable to other commercially available SEMS; however, no commercial options were designed for pediatric health systems. A key driver diagram (Supplemental Fig 5) was developed. One driver was the creation and implementation of a customized version of the Verge Health (Mount Pleasant, South Carolina) Converge Platform Patient & Employee Safety - Event Management module to best address pediatric safety events. The second driver focused on improved learning and engagement with the new SEMS. Our operational definitions and outcome, process, and balancing measures are described in Table 1. This project was undertaken as an organizational quality improvement initiative, did not constitute human subject research, and therefore did not require Institutional Review Board approval.

TABLE 1

Operational Definitions

Term: Operational definition

 Completed event follow-up: Events have been reviewed in the system and follow-up documentation has been provided in the designated areas of the event forms.
 Provider event follow-up: New process to engage medical directors in the event review and follow-up process for events that directly involve providers.
 Increased reporting: Total number of reports entered in the event management system, measured monthly.
 Event entry time: Time elapsed from entering the event management system to begin reporting an event until completion of submission, as automatically measured by the system.
 Anonymous submissions: Event reports submitted by an end user without using their system single sign-on or entering their name in the designated areas of the initial event forms.

Measures

 Outcome measure: Completed event follow-up
 Process measures: Provider event follow-up, anonymous submissions, event entry time
 Balancing measure: Total number of reports entered monthly

All sites used the previous KiD Katches system, which was accessible from the EMR. Entries started with the selection of an overarching event group with one level of subgrouping. The system used narrative, free-text fields for both event entry and follow-up documentation. Over time, some event groups' questions were modified to be more event-specific; however, this was the exception. The information collected was not standardized for data analysis nor readily searchable to facilitate breaking down event data beyond event groups, subgroups, and locations. Event submitters and reviewers were at times frustrated by these workflows, perceived and often experienced as time-intensive, which in turn limited organizational engagement with related safety efforts. The free-text format limited analysis and data trending without time-intensive review and indexing of event details, further deterring use of the prior system to identify patient safety issues. Identification of safety trends from event reports progressively shifted away from local leadership toward reliance on the Quality and Safety Department to complete this essential activity.

Over the course of 3 years (2018–2020), feedback was garnered from across the organization to develop standardized event entry and follow-up forms, with accompanying processes, to permit optimal tracking and trending of safety event data over time and to increase local unit-based engagement, responsibility, and accountability with timely event problem solving.

In the new system, each entry starts with patient demographic information from the EMR and includes key data to support future health equity analysis of events. Event detail forms contain specific questions based on the overarching event type selected. To ensure essential facts are collected, predefined conditional branching logic adds or removes questions based on the answers provided to foundational questions. This format minimizes the number of “click selection” responses required while maximizing discrete data points for building meaningful reports. For example, customized forms contain questions related to compliance with bundle elements used to reduce the risk of hospital-acquired conditions, as defined by the Children’s Hospitals’ Solutions for Patient Safety Learning Network (SPS) (Cincinnati, OH; https://www.solutionsforpatientsafety.org).5,6 As such, in the “Hospital Acquired Infections and Infection Prevention and Control Concerns” event form, when the answer to the first question is “Central Line Associated Blood Stream Infection,” follow-up questions focus data collection on catheter details (eg, type, size, location) and insertion (if applicable) and maintenance bundle elements, which become discrete data points for reports, analysis, and dashboards (Supplemental Fig 6). The follow-up form build mirrored that of the initial entry to capture expected follow-up actions and documentation.
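To make the branching behavior concrete, the sketch below shows one generic way this kind of conditional display logic can be expressed. The question identifiers, trigger answer, and follow-up fields are hypothetical illustrations and do not represent the actual Verge Health Converge form configuration.

```python
# Minimal sketch of conditional branching logic for an event detail form.
# Question IDs, the trigger answer, and the follow-up fields are hypothetical
# and are not the vendor's actual configuration.

BASE_QUESTIONS = ["event_date", "event_location", "infection_type"]

BRANCH_RULES = {
    # (foundational question, answer that triggers the branch) -> added questions
    ("infection_type", "Central Line Associated Blood Stream Infection"): [
        "catheter_type",
        "catheter_size",
        "catheter_location",
        "insertion_bundle_elements",
        "maintenance_bundle_elements",
    ],
}


def visible_questions(answers: dict) -> list:
    """Return the questions to display given the answers entered so far."""
    questions = list(BASE_QUESTIONS)
    for (question_id, trigger), follow_ups in BRANCH_RULES.items():
        if answers.get(question_id) == trigger:
            # Branching adds event-specific questions as discrete data points.
            questions.extend(follow_ups)
    return questions


# Selecting a central line infection pulls in the catheter and bundle questions.
print(visible_questions(
    {"infection_type": "Central Line Associated Blood Stream Infection"}))
```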

To meet evolving organizational needs, the new SEMS is easily modifiable. Improvements to the original customized forms were made based upon frontline feedback over the first 6 months post implementation and continue to occur as ongoing process improvement. For example, a large iteration, completed in July 2019, included creating new event type forms identified through analysis of “Other” events submitted as well as adding questions and answers to address outpatient reporting needs.

“Locations” were created and included departments and service lines. Each was assigned an “Area Manager” responsible for reviewing events for their Location. A new process to improve direct engagement with our providers was initiated by creating new provider Locations. Medical directors were then assigned as Area Managers, including, for the Pediatric Residency program, the chief residents in this role. Because of this new process, the organization monitored provider follow-up separately, as a subset of overall follow-up compliance. Specialized reviews were performed by identified subject matter experts based on event type, such as the medication safety pharmacist for all medication events. The SEMS generated reminder e-mails for outstanding event reviews and color-coded task notices to identify events not completed per policy, creating visual cues and building accountability. When follow-up remained incomplete, the Quality and Safety team provided reeducation and used progressive escalation, up to the chief quality and safety officer, who communicated concerns to senior leadership.
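As a rough illustration of how overdue reviews might be flagged and escalated, the sketch below classifies an open event by its age against a review window. The 7-day window and the escalation tiers are assumptions for illustration only; the actual deadlines are defined by organizational policy, and the production SEMS itself generates the reminder e-mails and color-coded task notices.

```python
# Minimal sketch of flagging and escalating overdue event reviews. The 7-day
# review window and escalation tiers are illustrative assumptions, not the
# organization's policy values.

from datetime import date


def review_status(submitted: date, follow_up_complete: bool, today: date,
                  review_window_days: int = 7) -> str:
    """Classify an event's review status for reminders and visual cues."""
    if follow_up_complete:
        return "complete"
    days_open = (today - submitted).days
    if days_open <= review_window_days:
        return "open"          # within the policy window
    if days_open <= 2 * review_window_days:
        return "overdue"       # reminder e-mail to the Area Manager
    return "escalate"          # persistent incompletion: reeducation, escalation


print(review_status(date(2019, 10, 1), follow_up_complete=False,
                    today=date(2019, 10, 20)))   # -> escalate
```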

Standardized event entry streamlined entry times and facilitated review by local leaders. By ensuring each entry contained the correct information, events were routed to the proper reviewer, thereby increasing the completed follow-up compliance rate. Time for event entry and event review compliance rates were tracked.

Implementation of the new SEMS included establishing expectations for reviewers in the policy and platform training.5 

The event reporting policy revisions included step-by-step procedures to define reporting expectations, local review and follow-up processes, and resource materials. Embedding JC principles9,15,17  and psychological safety20  in the policy enabled balanced reporting to support a safe learning environment and problem-solving culture. A JC algorithm (Supplemental Fig 7) was developed to clearly define when safety events resulting from an individual failure could result in disciplinary actions.

Anonymous submissions were allowed to improve reporting by submitters who did not feel safe identifying themselves. All anonymous submissions were reviewed by quality and safety leadership and, when the submitter identified themself within the event details, follow-up with the submitter occurred. A confidential response e-mail was developed to encourage non-anonymous submissions, explain the system limitations created by anonymous reporting, and, most importantly, affirm psychological safety. Documented follow-up became a means of closed-loop communication with submitters regarding details gathered during event review, corrective actions taken, and lessons learned to prevent recurrence. The new system generated automated submission acknowledgment e-mails and, upon the review's conclusion, follow-up notifications that included event classification and completed causal analysis.

During implementation, multiple modes of training were used, including live 1-on-1 and group sessions, recorded videos, and tip sheets. The policy and training materials are available on our intranet for new users. Annual group training is performed with each class of rising chief residents. Additional training is provided as needed.

It was anticipated that a more efficient SEMS, with user support and resources available to overcome knowledge gaps, would result in increased overall reporting. Coupled with policies that support both JC and psychological safety to eliminate the fear of retaliation, while retaining the anonymous submission option, these changes were expected to encourage a culture of reporting. Anonymous submissions will continue to be monitored to determine whether the anticipated changes in reporting culture are impacting the organization's safety culture.

Statistical process control charts displayed data over time and Shewhart control chart rules were used for analysis and interpretation.21  All data analysis used Microsoft Excel (Redmond, WA) and QI Macros (Denver, CO).
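For readers unfamiliar with these charts, the sketch below shows the standard Shewhart p-chart calculations (centerline, 3-sigma control limits for varying subgroup sizes, and the rule flagging 8 consecutive points on one side of the centerline) that underlie this type of analysis. The actual analysis was performed with QI Macros in Excel; any counts passed to these functions would be illustrative, not study data.

```python
# Minimal sketch of standard Shewhart p-chart calculations: centerline,
# 3-sigma control limits for varying subgroup sizes, and the 8-point shift rule.
# The study's analysis used QI Macros in Excel; inputs here are illustrative.

import math


def p_chart_limits(numerators, denominators):
    """Return (centerline, [(lcl, ucl) per subgroup]) for a p-chart."""
    p_bar = sum(numerators) / sum(denominators)
    limits = []
    for n in denominators:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits


def shift_detected(proportions, centerline, run_length=8):
    """True if `run_length` consecutive points fall on one side of the centerline."""
    streak, last_side = 0, 0
    for p in proportions:
        side = 1 if p > centerline else (-1 if p < centerline else 0)
        streak = streak + 1 if (side == last_side and side != 0) else (1 if side != 0 else 0)
        last_side = side
        if streak >= run_length:
            return True
    return False
```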

Analysis of event reporting data from the previous system for January 2017 through September 2018, consisting of 9247 reports, was used to establish baselines for comparison. Changes from baseline were tracked after the process change in October 2018, which represents implementation of the new SEMS and is annotated on figures where comparison data were available.21 Analysis of postimplementation event reporting data (October 2018 to March 2021) included 13 979 event reports.

The p-chart (Fig 1) shows an overall increase in event follow-up. The mean rate increased from 60.9% to 93.6% between January 2017 and March 2021. Beginning in November 2019, and again in June 2020, there was an upward shift in the centerline because of 8 data points above the centerline.21 Provider follow-up compliance, a subset of the overall data in Fig 1, was tracked as a measure of engagement. The follow-up centerline for attending and resident physicians was 68.5% and shifted in May 2020 to 83.3%, a 21.6% relative improvement (p-chart, Fig 2). The mean time needed to submit an event report, a balancing measure for increased reporting, decreased from 16.6 to 10.9 minutes between 2017 and 2020 (X-bar chart, Fig 3), a 5.7-minute reduction per automated reports from the previous and new systems.

FIGURE 1

Events with completed follow-up p-chart.
FIGURE 2

Events with complete provider follow-up p-chart.
FIGURE 3

Event entry time in minutes X-bar chart.

The average number of event entries per month (c-chart, Fig 4) increased from 440 to 478 (8.6%) from implementation through March 2021. However, from April to June 2020, the coronavirus disease 2019 (COVID-19) pandemic reduced overall patient volume at our organization and, with it, the number of reports submitted, decreasing average monthly entries and representing special cause variation. These data points were omitted from the calculations for the centerline and control limits.21 When more normal operations resumed in July 2020, event entries began returning to pre-COVID levels (c-chart, Supplemental Fig 8).
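A c-chart of this kind uses the mean monthly count as the centerline, with limits 3 standard deviations (the square root of that mean) above and below it. The sketch below shows how identified special-cause months can be excluded from the centerline and limit calculations, as was done for the COVID-19 disruption; the monthly counts in the example are made up for illustration and are not study data.

```python
# Minimal sketch of a c-chart centerline and 3-sigma limits, with identified
# special-cause months (eg, a pandemic disruption) excluded from the
# calculation. The monthly counts below are illustrative, not study data.

import math


def c_chart(counts, excluded=None):
    """Return (centerline, lcl, ucl); `excluded` lists indices omitted from the calc."""
    excluded = set(excluded or [])
    used = [c for i, c in enumerate(counts) if i not in excluded]
    c_bar = sum(used) / len(used)
    sigma = math.sqrt(c_bar)
    return c_bar, max(0.0, c_bar - 3 * sigma), c_bar + 3 * sigma


monthly_entries = [452, 461, 470, 488, 310, 295, 305, 480, 492]  # illustrative
centerline, lcl, ucl = c_chart(monthly_entries, excluded=[4, 5, 6])
print(round(centerline, 1), round(lcl, 1), round(ucl, 1))
```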

FIGURE 4

Event reports entered c-chart.

Anonymous submissions were tracked in the new SEMS as an indicator of psychological safety associated with event reporting (p-chart, Supplemental Fig 9). The centerline started at 12.1%, with downward shifts in December 2019 (9.9%) and June 2020 (5.4%), when 8 data points were consecutively below the centerline, representing a 55.3% reduction.

In summary, our redesigned SEMS was associated with increased event entry and, ultimately, event-related action item follow-up. The submission process became simpler and more consistent, allowing staff to anticipate the time and information needed to report. When team members saw value placed on reporting through the SEMS, they felt safe and wanted to report more, as demonstrated by the decline in anonymous reporting and increase in overall entries. Actionable data elements from the SEMS allow the organization to align strategic goals, identify high-risk areas of concern, monitor zero harm initiatives, and track patient outcomes.5,6,18 This information is compiled and fed into our annual organizational strategic planning process. The ongoing SEMS-mediated learning is key to organizational agility and supports the journey to becoming a high reliability organization.22

Event follow-up documentation allows the health system to learn from every event.14,18,23 By constructing an efficient SEMS, we experienced improvements across all disciplines in identifying, reporting, addressing, and eliminating risks, thus making our workplace safer.4,24 By integrating JC principles and providing closed-loop communication of completed follow-up, we created trust in the event reporting process, which is evident in the decline of anonymous reporting. Making the submission process simpler also made it more predictable and reliable, thereby reducing barriers to reporting caused by time constraints and competing priorities.

Timely event review and follow-up, including problem solving, implementing mitigating solutions, and direct feedback to the submitter, are vital in sustaining a reporting culture.5,6 Submitters need to see organizational changes resulting from their event reporting. At our organization, these changes were shared broadly at the daily safety brief, safety coach meetings, and other system and leadership meetings. Health systems must also recognize team members' efforts to identify and report events and opportunities for improvement.24 Creating a climate of safety requires the leadership team to model behaviors that demonstrate patient safety as a priority and coach frontline staff to speak up for safety.5–7,9 Adequate event follow-up, which can now be reviewed by submitters, reflects leadership's appreciation of frontline staff's commitment to safety and results in long-term culture changes within the organization.5

There is debate as to whether a SEMS by itself can improve culture and organizational learning. Most hospital SEMS depend upon voluntary reporting rather than advanced automated incident monitoring systems, like those found on an automotive assembly line, which provide alerts when defects are noted in real time. Nonetheless, a SEMS is important for organizational learning when information is disseminated transparently with shared responsibility and accountability. There are multiple approaches to organizational learning. Defect mitigation often requires modification of organizational values and norms to optimize learning.8,25–27

Senge28 described 5 key disciplines that need to converge to innovate learning organizations: systems thinking, personal mastery, mental models, building shared vision, and team learning. We have tried to instill these disciplines in our organization in an attempt to build the ideal LHS by removing the punitive elements of event reporting and replacing them with learning opportunities. As team member psychological safety is assured, safety reporting increases in frequency. This results in identification of more learning and improvement opportunities, which then reinforces the organizational value placed on event reporting and restarts the cycle of positively impacting the organization's safety culture. Our organization is still on the journey to becoming a learning system, facilitated by our robust SEMS.

There were several limitations. Our SEMS depends upon voluntary reporting, which captures only a fraction of the safety events that are otherwise detectable through chart reviews or automatic trigger systems. The relationship between the frequency of SEMS reports and objective measures in health systems is unclear.29  The ability to monitor and measure user logins for the sole purpose of event review is not available from the SEMS vendor.

Definitions for ambulatory encounters are not fully established, so we used the broader definition as adopted by the SPS Ambulatory Safety workgroup as of August 2020, which will likely change as this work evolves.

The overall quality of the event follow-up documentation was not assessed as part of this paper but will be a future goal. A new analytic platform was implemented for building dashboards customizable to the quality initiatives of a clinical division at the local department level. These dashboards permit regular visual management of safety event data and will become integral to risk identification.

The creation of a robust SEMS supports a culture of reporting and allows for local problem solving and learning to occur from every reported event. Barriers to reporting are reduced when staff can anticipate how much time and information are needed to facilitate the review and follow-up of safety events. By responding to these events in a timely and effective manner, local leadership demonstrates their commitment to safety and quality, continuous learning, and improvement.30,31 This behavior encourages a transparent culture, where acceptance and consideration of all viewpoints are normalized, assuring psychological safety and providing staff with confidence that their concerns will be met with timely follow-up.30,31 A health system becomes safer by learning from these events and implementing cyclical, systematic changes to improve the safety of care delivery.18 The importance of having a solid organizational safety culture that supports the journey to becoming an LHS, and eventually a high reliability organization, cannot be overstated.

We thank Mr Christopher Mangum, Mr Robert Sepanski, and Dr Arno Zaritsky for their review and guidance with documenting these efforts. We thank Ms Susan Berman, Ms Yvette Conyers, Ms Elizabeth Martinez, Ms Elizabeth Rogers, Ms Angela Ortiz, and Ms Erin Smith, along with many other subject matter experts across the organization, for their contributions to building the safety event management system forms and conditional branching. We thank Ms Karen Mitchell, Dr Christopher Foley, and Dr Robert Kelly for providing fundamental support during implementation as agents of change management. We thank Dr Lloyd Provost and Dr Robert Lloyd for providing invaluable statistical process control chart guidance.

FUNDING: No external funding.

CONFLICT OF INTEREST DISCLOSURES: The authors have indicated they have no financial relationships relevant to this article to disclose.

Ms Dawson conceptualized and managed the implementation of the new system, designed the data collection instruments, carried out the initial analyses, conceptualized and drafted the initial manuscript, reviewed and revised the manuscript, and approved the final manuscript as submitted;

Ms Saulnier reviewed supporting data analyses and interpretation, critically reviewed and revised the manuscript, and approved the final manuscript as submitted; Mr Campbell supported the implementation of the new system, critically reviewed supporting data analyses and interpretation, reviewed and revised the manuscript, and approved the final manuscript as submitted; Dr Godambe supported the implementation of the new system, conceptualized and critically reviewed supporting data analyses and interpretation, conceptualized and critically reviewed and revised the manuscript, and approved the final manuscript as submitted.

1. Bates DW, Singh H. Two decades since To Err Is Human: an assessment of progress and emerging priorities in patient safety. Health Aff (Millwood). 2018;37(11):1736–1743
2. Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. 1999
3. Pronovost PJ, Berenholtz SM, Goeschel CA, et al. Creating high reliability in health care organizations. Health Serv Res. 2006;41(4 Pt 2):1599–1617
4. Crandall KM, Almuhanna A, Cady R, et al. 10,000 good catches: increasing safety event reporting in a pediatric health care system. Pediatr Qual Saf. 2018;3(2):e072
5. Louis MY, Hussain LR, Dhanraj DN, et al. Improving patient safety event reporting among residents and teaching faculty. Ochsner J. 2016;16(1):73–80
6. Szymusiak J, Walk TJ, Benson M, et al. Encouraging resident adverse event reporting: a qualitative study of suggestions from the front lines. Pediatr Qual Saf. 2019;4(3):e167
7. Alingh CW, van Wijngaarden JDH, van de Voorde K, Paauwe J, Huijsman R. Speaking up about patient safety concerns: the influence of safety management approaches and climate on nurses’ willingness to speak up. BMJ Qual Saf. 2019;28(1):39–48
8. Stavropoulou C, Doherty C, Tosey P. How effective are incident-reporting systems for improving patient safety? A systematic literature review. Milbank Q. 2015;93(4):826–866
9. Paradiso L, Sweeney N. Just culture: it’s more than policy. Nurs Manage. 2019;50(6):38–45
10. ASRS. (Pub. 60) ASRS: the case for confidential incident reporting. 2001
11. ASRS. ASRS program briefing. 2020. Available at: https://asrs.arc.nasa.gov/docs/ASRS_ProgramBriefing.pdf. Accessed July 30, 2021
12. Vincent C. Incident reporting and patient safety. BMJ. 2007;334(7584):51
13. de Vos MS, Hamming JF, Chua-Hendriks JJC, Marang-van de Mheen PJ. Connecting perspectives on quality and safety: patient-level linkage of incident, adverse event and complaint data. BMJ Qual Saf. 2019;28(3):180–189
14. Friedman CP, Allee NJ, Delaney BC, et al. The science of learning health systems: foundations for a new journal. Learn Health Syst. 2016;1(1):e10020
15. Reason JT. Managing the Risks of Organizational Accidents. Ashgate; 1997
16. Boysen PG II. Just culture: a foundation for balanced accountability and patient safety. Ochsner J. 2013;13(3):400–406
17. Meadows S, Baker K, Butler J. The incident decision tree: guidelines for action following patient safety incidents. In: Henriksen K, Battles JB, Marks ES, Lewin DI, eds. Advances in Patient Safety: From Research to Implementation. 2005
18. Noritz G, Boggs A, Lowes LP, Smoyer WE. “Learn from every patient”: how a learning health system can improve patient care. Pediatr Qual Saf. 2018;3(5):e100
19. Keefer P, Helms L, Warrier K, Vredeveld J, Burrows H, Orringer K. SAFEST: use of a rubric to teach safety reporting to pediatric house officers. Pediatr Qual Saf. 2017;2(6):e045
20. Edmondson AC. The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth. John Wiley & Sons, Inc; 2018
21. Provost LP, Murray SK. The Health Care Data Guide: Learning from Data for Improvement. Jossey-Bass; 2011
22. Weick KE, Sutcliffe KM. Managing the Unexpected: Sustained Performance in a Complex World. 3rd ed. John Wiley & Sons, Inc; 2015
23. Smoyer WE, Embi PJ, Moffatt-Bruce S. Creating local learning health systems: think globally, act locally. JAMA. 2016;316(23):2481–2482
24. Chassin MR, Loeb JM. High-reliability health care: getting there from here. Milbank Q. 2013;91(3):459–490
25. Argyris C, Schön DA. Organizational Learning: A Theory of Action Perspective. Addison-Wesley Publishing Company; 1978
26. Argyris C, Schön DA. Organizational Learning II: Theory, Method and Practice. Addison-Wesley Publishing Company; 1996
27. Garvin DA. Building a learning organization. Harv Bus Rev. 1993;71(4):78–91
28. Senge PM. The Fifth Discipline: The Art and Practice of the Learning Organization. Penguin Random House, LLC; 2006
29. Howell AM, Burns EM, Bouras G, Donaldson LJ, Athanasiou T, Darzi A. Can patient safety incident reports be used to compare hospital safety? Results from a quantitative analysis of the English National Reporting and Learning System data. PLoS One. 2015;10(12):e0144107
30. Conway J, Federico F, Stewart K, Campbell MJ. Respectful management of serious clinical adverse events
31. National Steering Committee for Patient Safety. Safer together: a national action plan to advance patient safety. Available at: www.ihi.org/SafetyActionPlan. Accessed November 18, 2020
