In 2009, the Association of Pediatric Program Directors (APPD) Longitudinal Educational Assessment Research Network (LEARN), a national educational research network, was formed. We report on an evaluation of the network after 10 years of operation, reviewing program context, input, processes, and products to measure its progress in performing educational research that advances the training of future pediatricians. Historical changes in medical education shaped the initial development of the network. APPD LEARN now includes 74% (148 of 201) of US pediatric residency programs and has recently incorporated a network of pediatric subspecialty fellowship programs. At the time of this evaluation, APPD LEARN had approved 19 member-initiated studies and 14 interorganizational studies, resulting in 23 peer-reviewed publications, numerous presentations, and 7 archived shareable data sets. Most publications focused on how and when interventions work rather than whether they work, had high scores for reporting rigor, and included organizational and objective performance outcomes. Member program representatives had positive perceptions of APPD LEARN’s success, with most highly valuing participation in research that impacts training, access to expertise, and the ability to make authorship contributions to presentations and publications. Areas for development and improvement identified in the evaluation include adopting a formal research prioritization process, making infrastructure changes to support educational research that includes patient data, and expanding educational outreach within and outside the network. APPD LEARN and similar networks contribute to high-rigor research in pediatric education that can lead to improvements in training and thereby in the health care of children.
Collaborative networks have emerged as powerful facilitators of medical education research, which has often been limited by small sample sizes, lack of generalizability, and inequitable access to research resources.1,2 The APPD formed LEARN in 2009 to conduct educational research in the graduate training of pediatricians.3 APPD LEARN’s mission is “to conduct meaningful educational research that advances the training of future pediatricians by developing and promoting participation and collaboration in research by program directors for the purpose of improving the health and well-being of children.” The mission incorporates 6 broad strategic activities: conducting network research studies, maintaining a study repository including shareable data, developing program directors professionally as education researchers, consulting on study design and analysis, communicating and disseminating network activities within and outside APPD, and coordinating research collaborations with other organizations.
During 2019 to 2020, APPD LEARN conducted a 10-year evaluation process to characterize the achievements and needs of the network and make recommendations for the network’s next decade of operation. We designed the evaluation around APPD LEARN’s mission-based logic model adopted in 2016 (Fig 1). Logic models are designed to structure the development, execution, and evaluation of organizational programs: for each of the network’s strategic activities, the logic model outlines anticipated inputs (necessary resources), outputs (measurable products of the activity), outcomes to be achieved, and aspirational program impacts.4 The goal of this article is to describe APPD LEARN’s early successes and challenges and to provide insight to others seeking to establish research networks.
Mission-based logic model of Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network (APPD LEARN), May 2016.
Formation or Evolution
Two major changes in graduate medical education (GME) motivated several of the earliest studies conducted by APPD LEARN: duty hours restrictions5 and the implementation of the Accreditation Council for Graduate Medical Education’s (ACGME) Next Accreditation System, including twice-yearly reporting of resident competence (milestones).6,7 More broadly, during APPD LEARN’s first 10 years, medical education in the United States has increasingly focused on competency-based education,8–10 trainee burnout and resilience,11–13 and diversity, equity, and inclusion concerns.14 These contextual developments have been reflected in network activities (discussed later).
In addition, the emergence of medical education research networks themselves has shaped APPD LEARN’s mission and structure. There was already an active research network, CORNET,15 studying the impact of resident education in pediatric continuity practices on patient outcomes, and CORNET leadership provided uniquely useful consultation to APPD during APPD LEARN’s formation. To prevent duplication of effort with CORNET, APPD LEARN oriented its mission away from patient outcomes and focused solely on educational outcomes in contexts not limited to continuity clinics. Similarly, because APPD had an existing system for conducting surveys of residency program directors,16 APPD LEARN focused on studies in which residents are the target. In 2013, APPD LEARN assisted the Council of Pediatric Subspecialties, the APPD Fellowship Directors’ Executive Committee, and the American Board of Pediatrics (ABP) in the formation of the Subspecialty Pediatrics Investigator Network (SPIN), an educational research network for fellowship programs.17 At first, APPD LEARN provided research and data infrastructure to SPIN to support its initial studies of pediatric subspecialty entrustable professional activities (EPAs). In 2020, recognizing the economies of shared infrastructure and interest in examining outcomes across training periods, SPIN became a subnetwork of APPD LEARN. APPD LEARN’s original residency program network was renamed the Pediatric Residency Investigators in Medical Education subnetwork.
Current Resources
Network Sites
By 2019, 74% of pediatric residency programs (148 of 201 US programs) had joined APPD LEARN, indicating an intention to participate in future studies. Of these, 114 (77%) had participated in at least 1 network study, and 76 (51%) had participated in >1 study. As of this report, membership has increased to 157 residency programs in the Pediatric Residency Investigators in Medical Education subnetwork; SPIN has engaged over 650 fellowship program directors (of ∼870 at the time) in at least 1 survey.17
Funding
The network is funded by APPD through membership dues and other general revenue. Stable and consistent funding is an enormous advantage for a research network, because it allows the network to focus on conducting studies rather than seeking funding, and enables the network to choose projects to support based on their scientific value. However, network studies have also sometimes been funded in whole or part through grants with subawards to APPD, or through contracts between partner organizations and APPD. Granting and partnering organizations have included the ABP and the ABP Foundation, the National Board of Medical Examiners, Association of American Medical Colleges, the Josiah Macy Foundation, the Council on Medical Student Education in Pediatrics, APPD’s own special projects grant program, and local institutional grants to investigators.
Human Resources
The network’s current director is a PhD medical education researcher. Choosing a network director who is not a pediatric program director (or even a physician) has pros and cons. A nonclinician network director may have useful expertise in educational research and fewer conflicting pressures on their time than a busy clinician-educator; a nonclinician director’s effort may also be less expensive, allowing the network to obtain more of the director’s time. On the other hand, a nonclinician director must closely consult with pediatric program directors to ensure that network studies address practical program needs and take notice of the regular cycles associated with GME program applications, admissions, and transitions.
Since 2012, the network has employed a part-time program manager through Degnon Associates, Inc. (McLean, VA), APPD’s association-management company, and sometimes a research assistant. Degnon Associates also provides oversight and support as needed for financial accounting, legal review, and other organizational needs. Historically, the program manager has focused on meeting coordination, development of study site institutional review board (IRB) kits, assistance with site questions, building and fielding online survey instruments, data cleaning, and data archiving.
Since inception, APPD LEARN has had 3 major standing committees. The advisory committee provides guidance to the director, sets policies for network activities and resources, and develops calls for research proposals. It includes nonvoting representatives to facilitate coordination between APPD LEARN and other APPD initiatives or closely aligned organizations (eg, ABP). The proposal review committee reviews member-initiated study proposals. The educational development committee is charged with identifying and developing educational opportunities for network members, including conducting member surveys for needs assessment and evaluation purposes.
The SPIN subnetwork’s director is an MD fellowship program director and education researcher. SPIN’s steering committee, which includes representatives from each of the accredited pediatric subspecialties with certification primarily by the ABP, is now a fourth standing committee that promulgates policies for its subnetwork, reviews proposals involving multiple subspecialties, and assists in study recruitment of fellowship programs by subspecialty.
Information Technology and Communications
The network’s information technology resources include a data repository system,18 a contract with LimeSurvey GmbH (Hamburg, Germany) to provide the online survey platform, electronic forums for communicating with members, a membership database, and a section of the association Web site for providing network information and guidance. The network’s primary member communication vehicles are its Web site, an electronic mailing list of member program representatives (most often the program director), the monthly APPD newsletter, and presentations at the APPD annual spring and fall meetings.
A key component of the network is its research identifier (ID) algorithm, which enables programs and learners to generate a secure perpetual data collection ID on demand.3 Data submitted to the network are identified only by this data collection ID, so network personnel are unable to identify learners; data archived or shared outside the network employ a reencrypted data-sharing ID that prevents those who can generate data collection IDs from reidentifying learners.
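Reference 3 details the published algorithm; as a rough illustration of the two-layer design, a keyed-hash sketch in Python is shown below. This is a minimal sketch under assumed keys, identifier fields, and normalization rules, not the network’s actual implementation:

```python
import hashlib
import hmac

# Illustrative only: the keys, identifier fields, and normalization rules
# below are assumptions; the network's actual algorithm is described in
# reference 3.
SITE_KEY = b"key-distributed-to-member-programs"  # hypothetical
ARCHIVE_KEY = b"key-held-only-by-network-staff"   # hypothetical


def data_collection_id(last_name: str, birth_date: str) -> str:
    """Derive a stable, non-reversible ID from personal identifiers.

    Sites compute this locally, so raw identifiers never reach the
    network, and normalizing the inputs keeps the ID perpetual across
    data submissions.
    """
    normalized = f"{last_name.strip().lower()}|{birth_date.strip()}"
    return hmac.new(SITE_KEY, normalized.encode(), hashlib.sha256).hexdigest()


def data_sharing_id(collection_id: str) -> str:
    """Derive a data-sharing ID by re-keying the collection ID.

    Because ARCHIVE_KEY is never distributed to sites, someone who can
    generate collection IDs cannot reproduce the sharing ID and thereby
    reidentify learners in archived or shared data sets.
    """
    return hmac.new(ARCHIVE_KEY, collection_id.encode(), hashlib.sha256).hexdigest()
```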
Activities and Contributions
Review and Implementation of Network Studies
APPD LEARN accepts proposals in 3 review cycles each year. APPD LEARN encourages applicants to discuss their projects in advance with the director and seeks to provide comprehensive consultation and guidance in areas including study design, sampling and sample size estimation, survey or measurement design, best practices for collecting demographic data, planning quantitative and qualitative analyses, and approaches to study group coordination. Preproposal discussion has also sometimes allowed the network to match up investigators with similar interests to produce stronger collaborative proposals, as well as to identify mentoring opportunities for investigators seeking to develop additional skills (eg, in statistical analysis).
From 2012 to 2020, 27 member-initiated studies were reviewed and 19 approved. Before 2020, proposals were reviewed by the proposal review committee. Since 2020, recognizing the need for wider expertise, proposals have been reviewed by at least 1 proposal review committee member and additional ad hoc external reviewers before discussion by the full committee. Reviewers score proposals as excellent, very good, good, fair, or poor and provide feedback organized around 3 criteria: significance of the research question to pediatric GME, quality of the research plan, and feasibility or appropriateness for the network. Proposals that receive at least 1 “good” score are discussed by the committee, which makes holistic recommendations to the director, who makes the final determination of which studies to approve on the basis of the review criteria and network resources. Proposals are most commonly declined because they address a research question of narrow interest or propose to enroll all residents from all network programs (rather than a sample suitable for inference).
In addition to member-initiated studies, APPD LEARN has engaged in 14 interorganizational studies, which are discussed later. Figure 2 shows the cumulative number of member-initiated proposals approved and declined as well as collaborations. APPD LEARN supports approved studies by developing recruitment and IRB materials, building and fielding data collection instruments, conducting data analyses, and contributing to publications and presentations as needed by the study investigators.
Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network cumulative number of member-initiated proposals approved and declined as well as collaborations, 2012 to 2020.
Data Sharing
The network archives shareable data from its studies along with codebooks, site kits, IRB approvals, and manuscripts. Less formal data sharing among network projects is also common. For example, several studies have adopted processes for new investigators to affiliate, suggest secondary questions, and receive subsets of study data. Investigators have also collaborated by using one study’s programs as a recruitment population for another study or by linking data to answer new questions involving different variables collected in each study.
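Because every study keys its records by the same perpetual research ID, this kind of linkage reduces to a simple join. A hypothetical Python sketch follows; the file and column names are invented for illustration, not the network’s actual schema:

```python
import pandas as pd

# Invented example data: file and column names are assumptions.
burnout = pd.read_csv("study_a_burnout.csv")        # learn_id, burnout_score
milestones = pd.read_csv("study_b_milestones.csv")  # learn_id, milestone_rating

# An inner join keeps only learners who participated in both studies,
# combining variables that neither study collected on its own.
linked = burnout.merge(milestones, on="learn_id", how="inner")
print(f"{len(linked)} learners linked across the 2 studies")
```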
Member Professional Development
Beyond study participation, member development activities have included opportunities to serve on network committees and lead educational workshops held at the APPD annual spring and fall meetings that provide research skill training to other members in areas including formulating research questions, study design, and survey research. The network director and program manager regularly consult with investigators and provide guidance in all stages of study proposal, design, human subjects protections, conduct, analysis, and dissemination. In member surveys conducted in 2016 and 2019, members indicated they highly valued participation in the network, including participating in research that will impact training, access to expertise, and the ability to contribute to authorship of publications and presentations.
Research Guidance
The network has developed and published on its Web site guidance on best practices for study group management and authorship in studies with a large number of investigators. For example, guidance refers to the International Committee of Medical Journal Editors authorship criteria, encourages study principal investigators to provide paper and presentation coauthorship opportunities to all participating site investigators, explains the use of group authors and PubMed’s distinction between authors and collaborators, and offers suggestions for how nonauthor investigators can represent their study participation in their curricula vitae.
The network has also developed documents to facilitate IRB approval for network studies, including model language for IRB protocols, a letter describing the network’s data security provisions, and a model data use agreement. Institutions’ own data use agreements are rarely suitable because they do not anticipate perpetual data storage and reuse. However, APPD LEARN has had considerable success in proposing its model agreement to institutions.
Interorganizational Collaborations
The network has engaged in successful collaborations with other medical education organizations, including organizations engaged in undergraduate medical education and specialties other than pediatrics. Table 1 shows interorganizational studies and the network’s role(s) in each. Many of these collaborations include financial contributions from the partner organizations to help support the network and defray study costs. Nearly all partners provide a perpetual license for the network to reuse the deidentified data collected during the study. In addition, ACGME and ABP provide the network with milestone ratings and in-training exam scores, respectively, identified by APPD LEARN IDs, to facilitate linking those data into network studies when relevant.
TABLE 1. Collaborative Studies Conducted by the Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network, 2012–2020

| Starting Year | Project | Partner(s) | Study Design | Data Collection and Management | Data Analyses |
| --- | --- | --- | :---: | :---: | :---: |
| 2012 | Pediatric Milestones Assessment Pilot | NBME | • | • | • |
| 2014 | Subspecialty EPAs cross-sectional study | ABP, SPIN | • | • | • |
| 2014 | Pediatric Milestones Assessment Collaborative (PMAC) Module 1 | ABP, NBME | • | • | • |
| 2015 | Education in Pediatrics Across the Continuum (EPAC) | AAMC | • | | |
| 2015 | General pediatric EPAs study | ABP | • | • | • |
| 2016 | PMAC Module 2 | ABP, NBME | • | • | • |
| 2017 | Fellowship program director survey on EPAs | ABP, SPIN | • | • | • |
| 2017 | Subspecialty EPAs longitudinal | ABP, SPIN | • | • | • |
| 2017 | PMAC Module 3 | ABP, NBME | • | • | • |
| 2017 | Surgical EPAs | ABS | • | • | |
| 2018 | Milestones and in-training examinations | ABP, ACGME | • | • | • |
| 2019 | Impact of pediatric boot camps | COMSEP | • | • | |
| 2020 | Predictors of Success: From Medical Student to Pediatric Resident | COMSEP | • | • | |
| 2020 | EPA or milestone crosswalk study | ABP | • | • | • |
AAMC, Association of American Medical Colleges; ABS, American Board of Surgery; COMSEP, Council on Medical Student Education in Pediatrics; NBME, National Board of Medical Examiners; SPIN, Subspecialty Pediatrics Investigator Network (before incorporation into APPD LEARN in 2020).
Dissemination
At the time of this evaluation, investigators of studies that had completed data collection and analyses had published 23 papers, given 31 oral or platform presentations, and presented 35 scientific posters based on these studies. Published articles reflect high levels of scientific rigor, important outcomes, and visibility. For example, network studies were more likely than typical medical education studies to ask research questions about how and when interventions work rather than whether they work (56% vs 12%19; P < .001 by χ2), more completely reported their methods and findings than typical studies (Medical Education Research Study Quality Instrument scores 14.1 ± 2.4 vs 9.95 ± 2.34,20 t(230) = 7.8, P < .001), and often studied high-level educational outcomes21,22 such as “organizational impact” (56% of APPD LEARN studies) or “objective judged performance” (an additional 18%). Most articles have already been cited at least once (median citations 6, interquartile range 2–10). As a whole, network publications currently have an h-index of 8 using Google Scholar citation data; that is, there are at least 8 publications with at least 8 citations each.23 The network’s data repository currently contains materials related to 13 completed studies, including 7 complete and deidentified data sets with corresponding codebooks and data documentation.
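As a concrete illustration of the metric, the h-index is computable directly from a list of per-article citation counts. The counts in this minimal sketch are invented for illustration, not the network’s actual citation data:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    # In a descending list, the condition holds for a prefix of ranks;
    # counting the qualifying ranks yields that prefix length, which is h.
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)


# Invented counts for 10 hypothetical papers: 8 papers have at least
# 8 citations each, so h = 8.
print(h_index([12, 10, 9, 8, 8, 8, 8, 8, 3, 2]))  # -> 8
```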
Discussion
From 2009 to 2020, APPD LEARN established itself as a productive educational research network, supporting both member-initiated and collaborative studies that span a range of research questions of importance to pediatric GME. Results have been published and presented in competitive venues and have been positively received by the field. Through this process, APPD LEARN has developed and promulgated valuable guidance and other infrastructure for medical education network research.
Network membership has expanded and now includes SPIN, a subnetwork of subspecialty fellowship programs. In addition, several collaborative studies have successfully included medical students as participants. As a result, APPD LEARN is well-positioned to conduct research that extends across the continuum of medical education.
Challenges and Lessons Learned
Despite, or in some cases because of, the network’s success, we identified several areas for further attention within APPD LEARN as well as for others developing education research networks.
Research Prioritization
Because network resources are limited, APPD LEARN cannot accept every proposal. Accepting a proposal of narrow interest today may reduce the resources available for projects of greater interest in the future. APPD LEARN and similar networks should develop methods to prioritize projects in the areas most important to their stakeholders, such as workforce equity or impacts on child health.24,25
Human Subjects Protections
APPD LEARN’s research ID system facilitates data linking and increases the engagement of program directors at participating programs, because they are responsible for generating encrypted research IDs based on resident personal identifiers. APPD LEARN itself never receives personal identifiers, and studies are typically found to be exempt by the IRBs at participating sites. Ironically, although IRBs may (and in some cases must) rely on a single central IRB’s approval for nonexempt research, each participating site in an exempt study must obtain its own IRB’s determination of exemption. Although changes proposed to the Common Rule in 2015 would have allowed institutions to let investigators determine exemption on their own on the basis of a decision tool26 (as in some other countries), these changes were not adopted in the final rule.27 Education research networks need to weigh the value of research IDs against fully anonymous research (which may not require active participation of program directors and thus may require IRB exemption at only the principal investigator’s or network’s institution).
Data Stewardship
A key function of the network is the perpetual archiving and sharing of data. Although most participating programs have not required formal data sharing agreements, some hospital-based programs have, often out of liability concerns more properly applicable to research involving protected health information or to projects involving only 2 sites. For example, these agreements nearly always assume that data will be destroyed, rather than preserved, at the end of the study; some demand that the institution approve manuscripts before submission. Given the impracticality of negotiating different terms with dozens of institutions, each contributing a fraction of the data in the final deidentified data set, APPD LEARN has developed a model data use agreement template that licenses the network to use site data in compiled research data sets without impediment.
To date, the network has had some notable successes in linking data across studies to conduct ancillary research. However, although the network has developed terms and conditions to permit data sharing, no data have yet been shared beyond study investigators. Because principal investigators want sufficient time to present and publish their findings before making data available to others, there has been some reluctance to apply a “one-size-fits-all” embargo period. APPD LEARN is considering a process of annually querying investigators who have not yet shared their network study data about when they will be prepared to do so.
Before data sharing, networks need to consider how to mitigate the possibility of reidentification of learners through demographic information. In part, this safeguard includes appropriate outgoing data-sharing terms and conditions, which APPD LEARN has already developed. Other protections include sharing limited data sets28 or conducting some analyses in-house. Although APPD LEARN has applied each of these approaches on an ad hoc basis, future work will include a study of suitability and effectiveness of a set of standard approaches.
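One common in-house screen, shown here as a Python sketch with assumed column names rather than as the network’s actual procedure, is a k-anonymity check that flags demographic combinations shared by too few learners:

```python
import pandas as pd


def small_cells(df: pd.DataFrame, quasi_ids: list[str], k: int = 5) -> pd.DataFrame:
    """Return combinations of quasi-identifiers shared by fewer than k rows.

    Any demographic cell smaller than k is a reidentification risk and a
    candidate for coarsening (eg, collapsing categories) or suppression
    before the data set is shared.
    """
    counts = df.groupby(quasi_ids, dropna=False).size().reset_index(name="n")
    return counts[counts["n"] < k]


# Hypothetical usage; these column names are assumptions, not APPD LEARN's schema:
# risky = small_cells(shared_data, ["gender", "race_ethnicity", "region"], k=5)
```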
APPD LEARN currently does not collect or maintain any protected health information. However, the use of patient data in the assessment of trainees is at the cutting edge of medical education research.29
Although the network can collect patient care data aggregated at the learner level (reported by programs), it does not currently conduct research with patients as the unit of analysis. When patient-level outcomes are of interest, networks need to develop standards and approaches tailored to collecting patient data safely, such as those used by the Agency for Healthcare Research and Quality to certify patient safety organizations.
Educational Activities
A network goal is continual development of member knowledge and educational research skills. Although the network has contributed to member development through guidance in the research activities themselves, it has only rarely identified or conducted formal educational activities. APPD LEARN is considering several ways to improve in this area, including asking or requiring investigators to host an educational session focused on their experiences in designing, proposing, conducting, or disseminating their projects; holding an annual session on available data sets that offers a warm introduction to data other members could use for secondary analysis, along with a tutorial on analytic methods; and coordinating more closely with other professional development programs established in APPD.
Diversity, Equity, and Inclusion
Based on APPD’s mission and values of diversity, equity, and inclusion, APPD LEARN is seeking to diversify network participation and increase equity and inclusion in medical education and medical education research. This goal includes diversifying the educational research network governance and study investigators and coauthors through encouraging and supporting studies led by program directors from historically underrepresented groups. Likewise, the network will encourage research studies that invite the active participation and coproduction of underrepresented residents and studies that address questions of concern to advancing equity and inclusion in training.
Conclusions
Educational research networks, like clinical research networks, facilitate multisite and longitudinal research that can lead to faster and more widespread adoption of effective innovations that can improve child health; for APPD LEARN, these are innovations in the training of pediatricians in the United States who will care for children. Through APPD LEARN, pediatrics educators have participated in collaborative and wide-ranging scholarship that strengthens both the research process and the pediatric training experience. APPD LEARN’s model and outcomes can help inform future research networks in their own development.
Acknowledgments
We thank Patricia B. Mullan, PhD, for her assistance in the initial design of the evaluation, and Rebecca J. Fiala and Amy Binns-Calvey for developmental editing of pre-publication versions of the manuscript.
Dr Schwartz conceptualized and designed the study, conducted data analyses, made substantial contribution to collection and interpretation of the data, and drafted the initial manuscript; Ms King conceptualized and designed the study, made substantial contribution to collection and interpretation of the data, and critically reviewed and revised the manuscript for important intellectual content; Drs Mink, Turner, Abramson, and Blankenburg, and Ms Degnon made substantial contributions to the design of the study and interpretation of the data, and critically reviewed and revised the manuscript for important intellectual content; and all authors approved the final manuscript as submitted and agree to be accountable for all aspects of the work.
This manuscript was first presented as a formal report to the Association of Pediatric Program Directors as “Now We Are 10: Evaluation of the Association of Pediatric Program Directors Longitudinal Educational Assessment Research Network’s First Decade” at the Fall 2021 (virtual) APPD meeting, October 2021 (https://www.appd.org/wp-content/uploads/2021/06/APPD-LEARN-10-year-Evaluation.pdf).
FUNDING: Internal funding for this project was provided by the Association for Pediatric Program Directors (APPD). The funder did not participate in the work.
CONFLICT OF INTEREST DISCLOSURES: Dr Schwartz serves as Director of APPD Longitudinal Educational Assessment Research Network through a contract from APPD to his institution; Ms King was an employee of Degnon Associates, APPD’s management company, during the time of this study; and Ms Degnon is an owner and employee of Degnon Associates. The other authors have indicated they have no potential conflicts of interest to disclose.
Abbreviations
- ABP: American Board of Pediatrics
- ACGME: Accreditation Council for Graduate Medical Education
- EPA: entrustable professional activity
- GME: graduate medical education
- ICMJE: International Committee of Medical Journal Editors
- ID: identifier
- IRB: institutional review board
- LEARN: Longitudinal Educational Assessment Research Network
- SPIN: Subspecialty Pediatrics Investigator Network