BACKGROUND AND OBJECTIVES:

The American Board of Pediatrics (ABP) certifies that general and subspecialty pediatricians meet standards of excellence established by their peers, immediately after training and over the course of their careers (ie, Maintenance of Certification [MOC]). In 2015–2016, the ABP developed the Maintenance of Certification Assessment for Pediatrics (MOCA-Peds) as an alternative assessment to the current proctored, closed-book general pediatrics (GP) MOC examination. This article is 1 of a 2-part series examining results from the MOCA-Peds pilot in 2017.

METHODS:

We conducted quantitative and qualitative analyses with 5081 eligible pediatricians who registered to participate in the 2017 pilot; 83.4% (n = 4238) completed a quarter 4 survey and/or end-of-year survey (January 2018) and comprised the analytic sample.

RESULTS:

The majority of pediatricians considered the MOCA-Peds to be feasible and acceptable as an alternative to the proctored MOC GP examination. More than 90% of respondents indicated they would participate in the proposed MOCA-Peds model instead of the examination. Participants also offered recommendations to improve the MOCA-Peds (eg, enhanced focus of questions on outpatient GP, references provided before taking questions); the ABP is carefully considering these as the MOCA-Peds is further refined.

CONCLUSIONS:

Pilot participant feedback in 2017 suggested that the MOCA-Peds could be implemented for GP starting in January 2019, with all 15 subspecialties launched by 2022. Current and future evaluations will continue to explore feasibility, acceptability, and learning and practice change as well as sustainability of participation.

What’s Known on This Subject:

The Maintenance of Certification Assessment for Pediatrics (MOCA-Peds) is a Web-based knowledge assessment developed by The American Board of Pediatrics as an alternative to an examination at a secure test center. Both the model and Web-based platform were developed with extensive input from practicing pediatricians in 2015–2016.

What This Study Adds:

This study is 1 of 2 articles in which authors report results from the MOCA-Peds 2017 pilot conducted with >5000 pediatricians. We examine the feasibility and acceptability of the MOCA-Peds for generalists and subspecialists maintaining their general pediatrics certification and summarize their recommendations.

The American Board of Pediatrics (ABP), 1 of 24 boards affiliated with the American Board of Medical Specialties (ABMS), provides certification for general and subspecialty pediatricians after completion of training and throughout their careers through its Maintenance of Certification (MOC) program. In 2013, Hawkins et al,1 on behalf of the ABMS, published an article describing the MOC’s theoretical basis. Continuing certification aims to address gaps in the quality of medical care related to the exponential increase in new medical knowledge, knowledge lost over time, and gaps and delays in practice change in response to new knowledge.2–5 In their article, Hawkins et al1 highlighted the role of the ABMS and its member boards in continually examining their programs to “assure the public and the profession that they are meeting expectations, are clinically relevant, and provide value to patients and participating physicians, and to refine and improve them as ongoing research indicates.”

In 2015, the ABP hosted a Future of Testing Conference to consider changes to its assessment programs. Presentations highlighted the role of assessments both in quantifying a priori medical knowledge and in increasing opportunities to refresh previous knowledge and learn new information.6 Coupled with emerging theories on how adult physicians learn7,8 and technological advances in assessment,9–12 these insights led the ABP Board of Directors to design and pilot an alternative to the existing proctored, closed-book MOC examination.13 The Maintenance of Certification Assessment for Pediatrics (MOCA-Peds) was modeled after the American Board of Anesthesiology’s (ABA) recently piloted assessment, termed MOCA-Minute,14,15 and was envisioned as delivering multiple-choice questions to pediatricians quarterly through a Web-based interface, with immediate feedback after each answer to encourage learning.

In 2015–2016, the ABP partnered with RTI International (RTI), a nonprofit research institution, to obtain input from certified pediatricians, through user panels and focus groups, on the proposed MOCA-Peds model to be tested in 2017–2018 and on early prototypes of the Web-based platform. The model had 3 primary goals: maintain production of a valid assessment of medical knowledge, incorporate opportunities for learning, and improve the experience for pediatricians participating in continuing certification.16 User panels and focus groups conducted in 2016 recommended that the ABP focus on offering opportunities for learning that incorporated flexibility and accessibility. Proposed features included (1) 80 total multiple-choice questions per year, delivered quarterly through a Web-based platform (with an alternative mobile interface); (2) flexible use of resources (eg, books, Internet); (3) immediate feedback on each question (eg, rationale, references, peer benchmarking); and (4) review of completed questions and answers through a question history page.

User panels and focus groups also provided feedback on the Web-based platform to ensure the usability of the interface. These groups raised concerns about the potential stresses associated with a timed, Web-based assessment conducted continuously over a pediatrician’s career. Policies put in place to address these concerns consisted of (1) learning objectives for each year’s questions to permit studying if desired, (2) a consistent time allocation to answer each question (5 minutes), and (3) opportunities to improve retention of information by receiving 2 questions for each learning objective. (For additional details, see https://www.abp.org/mocapeds.)

A joint RTI-ABP research team proposed a formative evaluation process in 2017–2018, with initial piloting of the model and Web-based platform in 2017 followed by refinement of both in 2018. Because one-fifth of subspecialists maintaining at least 1 subspecialty also took the general pediatrics (GP) MOC part 3 examination in 2014–2016, the ABP was committed to developing an alternative GP assessment that would be feasible and acceptable for both general pediatricians and subspecialists maintaining their GP certification. In addition, the ABP hoped a similar model might work for the MOC examinations for its 15 subspecialties. With this article, we thus address the following questions. Is the MOCA-Peds feasible and acceptable to both general pediatricians and subspecialists as an alternative to the GP MOC examination? Will this same model be feasible and acceptable as an alternative for subspecialty MOC examinations? In the partner article, “Pediatrician Perspectives on Learning and Practice Change in the MOCA-Peds 2017 Pilot,” the authors describe in more detail participant reports related to learning and practice change.

The research team employed elements from the context, inputs, processes, and products evaluation model17 and the intervention mapping model,18 which are both formal frameworks for program planning and design, production, implementation, and evaluation. These approaches are focused on the measurement of practical aspects of implementation to ensure success, including feasibility and acceptability, with iterative refinement and testing based on feedback from program developers (the ABP) and end users (pediatricians). We defined feasibility and acceptability as whether pediatricians can easily participate in the MOCA-Peds as designed (eg, technological capacity, time) and whether the MOCA-Peds meets pediatricians’ expectations (eg, satisfaction, quality, intent to use in the future).

After review by the ABP’s Research Advisory Committee and an exempt designation by the RTI Institutional Review Board, all pediatricians (n = 6814) eligible to take the GP MOC examination in 2017 were contacted to participate in the pilot and enrolled via a registration survey. In total, 5081 pediatricians enrolled. As part of participation, pediatricians were sent surveys on elements under development after completing each quarter’s questions; 2017 MOCA-Peds participants were also randomly selected to participate in focus groups to (1) provide feedback on the results of the quarterly surveys and (2) review any recommended changes to the Web-based platform before implementation of those changes. In January 2018, a voluntary end-of-year survey was administered to those pediatricians who completed the 2017 pilot and met the passing standard.

Of the 5081 pilot participants, 4238 responded to the quarter 4 and/or voluntary end-of-year survey and comprised the analytic sample for this article; 62.2% (n = 2634) completed both surveys, 32.6% (n = 1382) completed the quarter 4 survey only, and 5.2% (n = 222) completed the end-of-year survey only. Available demographic characteristics (sex, age, medical school type, certification status) were examined for differences between the analytic sample and participants who did not complete these surveys. No significant differences were observed (data not shown; P > .05).

The ABP’s Certification Management System (CMS) provided baseline demographic characteristics (age, sex, medical school type, certification status [general pediatricians versus subspecialists]); the registration survey captured additional characteristics identified in focus groups that might impact the feasibility of the MOCA-Peds, specifically clinical hours worked, comfort with technology, and library access for articles. Surveys after quarters 1 to 3 and user panel and focus group results are not reported in this article. The quarter 4 survey targeted participants’ experience over the entire pilot year and included domains related to feasibility and acceptability (Table 1). The end-of-year survey garnered feedback on the 5-year model proposed to be implemented in 2019, which specified (1) alignment of the MOCA-Peds with the 5-year MOC cycle, reducing confusion with different examination and MOC deadlines; (2) automatic dropping of the 4 lowest quarters in an individual’s 5-year cycle to accommodate work and/or life events; and (3) a dedicated fifth year in the MOC cycle as an optional year to take the proctored MOC examination should an individual not meet the MOCA-Peds passing standard in years 1 to 4. In addition, it included 1 open-ended item: “What one change would you make to MOCA-Peds to provide more value to your clinical practice?” The quarter 4 and end-of-year surveys also investigated subspecialty issues, specifically (1) the feasibility and acceptability of the GP version of the MOCA-Peds for subspecialists and (2) the MOCA-Peds platform as an alternative to the MOC examination for 15 subspecialties.
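To make the scoring implication of the “drop the 4 lowest quarters” feature concrete, the following is a minimal sketch in Python; the ABP’s actual scoring algorithm is not specified in this article, so the simple mean over retained quarters, the score scale, and the example values are assumptions for illustration only.

    # Illustrative sketch only: the ABP's actual scoring rules are not
    # described in this article. Assumes a simple mean over the quarterly
    # scores retained after automatically dropping the 4 lowest quarters.
    def cycle_score(quarterly_scores, n_dropped=4):
        """Average the quarterly scores remaining after dropping the
        n_dropped lowest quarters in an individual's cycle."""
        if len(quarterly_scores) <= n_dropped:
            raise ValueError("Need more quarters than the number dropped.")
        retained = sorted(quarterly_scores)[n_dropped:]  # drop the lowest 4
        return sum(retained) / len(retained)

    # Example: a missed quarter (scored 0) and the 3 weakest quarters are
    # dropped, accommodating work and/or life events across years 1 to 4.
    scores = [0, 62, 70, 68, 85, 88, 90, 83, 87, 91, 89, 86, 92, 84, 88, 90]
    print(round(cycle_score(scores), 1))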

Sample characteristics were calculated by using data from the ABP’s CMS and the registration survey. Response scales for key variables of interest were collapsed into categories (eg, responses “strongly agree” and “agree” were recoded to “agree,” and the responses “strongly disagree” and “disagree” were recoded to “disagree”). χ2 tests were conducted to determine statistically significant differences between general pediatricians and subspecialists. SPSS version 25 (IBM SPSS Statistics, IBM Corporation) was used for all quantitative analyses.19 
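As a concrete illustration of this analytic step, a minimal sketch in Python follows; the published analyses were conducted in SPSS version 25, so this is not the authors’ code, and the column names and data are hypothetical.

    # Hypothetical sketch of the recoding and chi-square comparison described
    # above; the published analyses used SPSS version 25. Data are invented.
    import pandas as pd
    from scipy.stats import chi2_contingency

    df = pd.DataFrame({
        "cert_status": ["general", "subspecialist", "general", "general",
                        "subspecialist", "general", "subspecialist", "general"],
        "response": ["strongly agree", "disagree", "agree", "agree",
                     "strongly disagree", "agree", "disagree", "strongly agree"],
    })

    # Collapse the response scale: "strongly agree"/"agree" -> "agree", etc.
    collapse = {"strongly agree": "agree", "agree": "agree",
                "strongly disagree": "disagree", "disagree": "disagree"}
    df["response_collapsed"] = df["response"].map(collapse)

    # Chi-square test for differences between general pediatricians
    # and subspecialists.
    table = pd.crosstab(df["cert_status"], df["response_collapsed"])
    chi2, p, dof, expected = chi2_contingency(table)
    print(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")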

Qualitative responses to the open-ended question regarding 1 recommended change were evaluated by using thematic analysis techniques.20 After an initial review of open-ended responses, a set of 4 broad categories was identified. A second round of thematic analysis involved reviewing the open-ended text to ensure that the existing code assigned to a response was appropriate and examining themes with relatively few counts to determine if they could be reclassified. The third and final round of analysis was focused on narrowing the scope of the existing categories to 3 categories and determining if there was a need for subcategories. The first round of analyses was conducted independently by 2 of the authors (M.G.O., V.D.); this was followed by a review of the coding by both authors and then the second round of coding by the same authors. For the final round of coding, a third author (L.K.L.) reviewed the coding, and where there were differences, the authors worked to adjudicate the coding to ensure consistency. In this process, 9 subcategories were identified, including counts and sample quotes, providing a more specific classification of responses.
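Although the adjudication itself was a human process, the bookkeeping step of identifying where 2 coders disagreed can be illustrated with a minimal sketch; the theme labels and response identifiers below are hypothetical, not the study’s actual codebook.

    # Hypothetical sketch: flag responses where the 2 independent coders
    # assigned different themes so they can be adjudicated for consistency.
    codes_by_response = {
        "r001": ("positive feedback", "positive feedback"),
        "r002": ("question content", "process improvement"),
        "r003": ("process improvement", "process improvement"),
    }

    for rid, (coder_a, coder_b) in codes_by_response.items():
        if coder_a != coder_b:
            print(f"{rid}: A={coder_a!r} vs B={coder_b!r} -> adjudicate")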

General pediatricians made up the majority of the analytic sample (73.9%); the remainder were subspecialists. Ages ranged from 35 to 75 years; the majority were female (66.7%) and American medical graduates (78.0%). With respect to feasibility, the average hours worked per week varied, with the largest group working 40 to 50 hours/week (32.8%). The majority displayed a high comfort level with technology, and just more than half (56.2%) had access to an academic library. These results, in particular the widespread access to and comfort with technology, support the feasibility of the MOCA-Peds for diplomates (Table 2).

Quarter 4 survey respondents agreed or strongly agreed (hereafter “agreed”) that questions aligned with the learning objectives (88.2%), assessed clinical judgment (81.8%), and were relevant to the practice of GP (81.5%) and to their specific practice setting (59.1%). For most, the time to answer questions was sufficient (78.4%). Differences were noted by certification status, with general pediatricians reporting a higher rate of agreement than subspecialists for all survey items except for question relevance to GP, for which subspecialists reported higher agreement (Table 3).

Acceptability of the MOCA-Peds as an alternative assessment was also measured on the quarter 4 survey (n = 4016) (Table 4). Approximately three-quarters (73.7%) agreed that the MOCA-Peds was an adequate assessment of fundamental GP practice knowledge, and 79.6% agreed that it helped them stay current in GP; most were satisfied with it as a replacement for the current part 3 examination (93.1%), would prefer to participate in the MOCA-Peds (88.7%), and were likely to recommend the MOCA-Peds to a friend (92.8%). Both general pediatricians (71.4%) and subspecialists (90%) perceived the MOCA-Peds as feasible for subspecialists to maintain their GP knowledge.

With respect to potential anxiety associated with continuous assessment, the majority (88.7%) agreed that they felt less anxiety participating in the MOCA-Peds compared with the proctored examination, and their anxiety about the MOCA-Peds decreased during the pilot as their comfort with the program increased (81.1%).

When asked about their personal future plans for participation (data not shown), 96.7% of general pediatricians and 94.9% of subspecialists planned to participate in the MOCA-Peds to maintain their GP certificate. Almost all subspecialists (95.3%) also planned to participate to maintain their subspecialty certificate.

The end-of-year survey also queried respondents (n = 2856) about key features of the proposed 5-year model. Topics included (1) the ability to complete 20 questions per quarter, (2) learning promotion through tailored receipt of repeated questions marked as relevant or incorrectly answered, (3) inclusion of featured readings (eg, current guidelines and/or review articles) as an additional mechanism for staying up to date, (4) automatic dropping of the 4 lowest quarters during years 1 to 4, and (5) maintenance of year 5 as an optional year to take the proctored examination without loss of certification for those who did not pass the MOCA-Peds or chose not to participate. Overall, the majority of participants (>70% for each item) expressed agreement with each of these features. No differences were noted by certification type (Table 5).

The 3 themes identified in the qualitative data from the end-of-year survey are shown in Table 6, including a brief description, illustrative quotes, and the count and percentage of participants who endorsed each subtheme. The first theme, endorsed by 21% of respondents, was focused on neutral or positive feedback (ie, no suggestion for improvement, no requested change, or general positive feedback). The second theme, endorsed by 20% of respondents, was focused on question content improvement (ie, more focus on general practice, outpatient care, and common diagnoses; suggested formatting changes). The third theme, endorsed by 13% of respondents, addressed process improvement (ie, more time for questions and requests for access to reference materials to guide study before taking questions to improve the efficiency and focus of preparation).

Overall, the feasibility of the MOCA-Peds was high. Most participants had a moderate or higher comfort with technology. Differences in access to an academic library, however, confirmed the ABP’s decision to provide access to the featured readings directly when they are not available in the public domain.

In addition, acceptability of the MOCA-Peds was strikingly positive, with >75% agreement from both general pediatricians and subspecialists for most items on the quarter 4 and end-of-year surveys. One exception to this high agreement was the item “questions were relevant to my practice” (Table 3), with which 68.2% of general pediatricians agreed compared with 33.5% of subspecialists. This contrasted with agreement of 79.8% and 86.4%, respectively, with “questions were relevant to general pediatrics” (Table 3), for which subspecialists gave the higher endorsement. However, the item “MOCA-Peds is an adequate assessment of the fundamental knowledge used in everyday general pediatrics practice” (Table 4) revealed close to 75% agreement in both groups (74.3% of general pediatricians; 72.0% of subspecialists). This suggests that the questions measure the components of GP as perceived by both general pediatricians and subspecialists but may not as closely mirror an individual’s practice.

Although burden and test anxiety were significant concerns raised in user panels and focus groups in 2016, responses to these surveys indicated relatively low levels of anxiety and a sense that participation in the MOCA-Peds was manageable. Recommendations proposed by user panels and focus groups in 2016 appear to have addressed the majority of concerns about preparing for and participating in a continuous assessment process (eg, provision of learning objectives, repeat questions, consideration of work and/or life events, ability to take the proctored examination in year 5 without loss of one’s certification). Recommendations from this evaluation in 2017 included greater access to resources and references to study before starting the MOCA-Peds questions, in addition to the featured readings. For the 2018 pilot, the ABP incorporated several article-based questions and is continuing this practice in 2019.

Another recommendation was focused on improving the relevance of questions for an individual’s practice. Several factors will help to address this issue in future iterations of the MOCA-Peds. First, the MOCA-Peds platform allows pediatricians to enter comments about individual questions. These comments are reviewed by general pediatricians, who in turn provide feedback to the pediatricians serving as question writers; this review may result in the modification or deletion of a question. Second, starting in 2019, question delivery is tailored by using machine learning strategies on the basis of an individual’s previous ratings of each question’s relevance to his or her practice as well as the confidence he or she reported when answering incorrectly.21 The MOCA-Peds will also continue to include less common clinical presentations and critical material that will help pediatricians stay informed of important guidelines. Last, in 2019, the ABP also launched the inclusion of questions on emerging public health topics (eg, measles).
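This article does not describe the internals of the ABP’s recommender system (reference 21), but as a rough, assumed illustration of the idea of tailored delivery, a heuristic might prioritize repeat questions on objectives a pediatrician rated as relevant or answered incorrectly with high confidence:

    # Toy illustration only: not the ABP's actual recommender. Prioritizes
    # learning objectives rated as practice-relevant and those answered
    # incorrectly with high reported confidence.
    def priority(relevance, answered_correctly, confidence):
        # relevance and confidence are assumed to be scaled to [0, 1]
        confident_miss = (0 if answered_correctly else 1) * confidence
        return 0.5 * relevance + 0.5 * confident_miss

    history = [
        {"objective": "asthma management", "relevance": 0.9,
         "correct": False, "confidence": 0.8},
        {"objective": "rare metabolic disease", "relevance": 0.2,
         "correct": True, "confidence": 0.6},
    ]
    queue = sorted(history, reverse=True,
                   key=lambda h: priority(h["relevance"], h["correct"],
                                          h["confidence"]))
    print([h["objective"] for h in queue])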

The time limit for questions was raised as a concern both in 2016 and in the 2017 pilot. Because the MOCA-Peds remains a summative assessment that indicates whether a pediatrician meets standards of excellence, the ABP must balance the functionality of the MOCA-Peds as a learning tool with its assessment function.16 The ABP felt it was necessary to impose some time limit on the MOCA-Peds as an assessment but chose not to vary the time by perceived item difficulty, as originally proposed, instead providing a consistent 5-minute window for each item. Approximately three-quarters of respondents felt that this 5-minute window was sufficient. In addition, data from quarter 3, detailed in the pilot evaluation report available on the ABP Web site, suggest that 63.3% of respondents found 5 minutes insufficient for only 0 to 2 of 20 questions, and another 22.7% found the time insufficient for 3 to 5 of 20 questions; the time spent per question in 2017 averaged 1 minute, 54 seconds.22 At this time, the ABP plans to continue with the 5-minute window for all questions.

Limitations of this research include the use of self-report by participants who completed the quarter 4 survey and/or end-of-year survey and the resulting exclusion of those who did not register for the pilot, did not answer the MOCA-Peds questions through quarter 4, or did not meet the passing standard. Although there were no demographic differences between respondents and nonrespondents, the potential for bias remains. The research team is investigating reasons for nonparticipation and partial participation and welcomes feedback. In addition, the ABP made the decision to model its assessment after the ABA’s longitudinal assessment, MOCA-Minute. Other longitudinal assessment tools that are primarily article-based are being developed and piloted through other ABMS boards23; the impact of these different models of assessment on lifelong learning and practice improvement is not known. Issues related to feasibility for physicians maintaining certifications across different boards with different approaches also were not addressed in this study but are actively being examined by the ABMS. Last, we did not recruit a cohort to take both the proctored examination and the MOCA-Peds, which would have permitted direct comparisons by the same individuals.

In this article, we review survey results from the MOCA-Peds 2017 pilot. On the basis of these results, in 2019 the ABP launched a modified, operational version of the MOCA-Peds as an approved alternative to the current MOC examination in GP and in 3 subspecialties (child abuse pediatrics, pediatric infectious diseases, pediatric gastroenterology). Three key priorities continue to be (1) creating a valid knowledge assessment to ensure that diplomates meet standards of medical knowledge established by their peers, (2) supporting the lifelong learning of physicians across their careers, and (3) improving the MOC experience. The ABP seeks feedback from participants as the MOCA-Peds is rolled out for all general pediatricians and subspecialists and welcomes input at mocapeds@abpeds.org.

We thank the multiple stakeholders who provided feedback to help improve the MOCA-Peds model and its Web-based and mobile platforms: the >11 000 pediatricians who engaged in the 2017–2018 pilot; the members of the ABP’s multiple committees, boards, and task forces; and the ABP’s Family Leadership Advisory Group. We also thank the ABA for collaborating with us in the development and evaluation of the model and platform. We appreciate the monumental efforts of the pediatrician volunteers and staff at the ABP in launching the pilot. Last, we thank Brenda Nuncio, Valerie Haig, Virginia Moyer, Laura Brooks, David Nichols, and members of the ABP Research Advisory Committee for their editorial review.

Drs Leslie and Olmsted, Mr Turner, and Ms Smith contributed to the conception, design, and evaluation of the Maintenance of Certification Assessment for Pediatrics pilot and its formative evaluation, the acquisition of the data, analysis, and interpretation of the data, and drafted the initial manuscript; Dr Dounoucos contributed to the analyses and interpretation of the qualitative data and drafted the initial manuscript; Dr Althouse contributed to the conception, design, and evaluation of the Maintenance of Certification Assessment for Pediatrics pilot and its formative evaluation; and all authors reviewed and revised the manuscript, approved the final manuscript as submitted, and agree to be accountable for all aspects of the work.

FUNDING: The evaluation of the 2017 Maintenance of Certification Assessment for Pediatrics pilot was funded by The American Board of Pediatrics (ABP) Foundation, a nonprofit type 1 supporting organization to the ABP.

COMPANION PAPER: A companion to this article can be found online at www.pediatrics.org/cgi/doi/10.1542/peds.2019-2305.

ABBREVIATIONS: ABA, American Board of Anesthesiology; ABMS, American Board of Medical Specialties; ABP, American Board of Pediatrics; CMS, Certification Management System; GP, general pediatrics; MOC, Maintenance of Certification; MOCA-Peds, Maintenance of Certification Assessment for Pediatrics; RTI, RTI International

References

1. Hawkins RE, Lipner RS, Ham HP, Wagner R, Holmboe ES. American Board of Medical Specialties maintenance of certification: theory and evidence regarding the current framework. J Contin Educ Health Prof. 2013;33(suppl 1):S7–S19
2. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–2645
3. Bornmann L, Mutz R. Growth rates of modern science: a bibliometric analysis based on the number of publications and cited references. J Assoc Inf Sci Technol. 2015;66(11):2215–2222
4. Custers EJ, Ten Cate OT. Very long-term retention of basic science knowledge in doctors after graduation. Med Educ. 2011;45(4):422–430
5. Hawkins RE, Irons MB, Welcher CM, et al. The ABMS MOC part III examination: value, concerns, and alternative formats. Acad Med. 2016;91(11):1509–1515
6. The American Board of Pediatrics. Future of testing conference. 2015. Available at: https://www.abp.org/content/future-testing-conference. Accessed May 17, 2019
7. Schumacher DJ, Englander R, Carraccio C. Developing the master learner: applying learning theory to the learner, the teacher, and the learning environment. Acad Med. 2013;88(11):1635–1645
8. Kerfoot BP, Shaffer K, McMahon GT, et al. Online “spaced education progress-testing” of students to confront two upcoming challenges to medical schools. Acad Med. 2011;86(3):300–306
9. Nickson CP, Cadogan MD. Free open access medical education (FOAM) for the emergency physician. Emerg Med Australas. 2014;26(1):76–83
10. McLean SF. Case-based learning and its application in medical and health-care fields: a review of worldwide literature. J Med Educ Curric Dev. 2016;3:JMECD.S20377
11. Wolff M, Wagner MJ, Poznanski S, Schiller J, Santen S. Not another boring lecture: engaging learners with active learning techniques. J Emerg Med. 2015;48(1):85–93
12. Harden RM. A new vision for distance learning and continuing medical education. J Contin Educ Health Prof. 2005;25(1):43–51
13. The American Board of Pediatrics. Cognitive expertise exam (part 3). Available at: https://www.abp.org/content/cognitive-expertise-exam-part-3. Accessed April 19, 2018
14. Sun H, Zhou Y, Culley DJ, et al. Association between participation in an intensive longitudinal assessment program and performance on a cognitive examination in the Maintenance of Certification in Anesthesiology Program®. Anesthesiology. 2016;125(5):1046–1055
15. Macario A, Harman AE, Hosansky T, et al. Evolving board certification - glimpses of success. N Engl J Med. 2019;380(2):115–118
16. Leslie LK, Olmsted MG, Turner AL, et al. MOCA-Peds: development of a new assessment of medical knowledge for continuing certification. Pediatrics. 2018;142(6):e20181428
17. Stufflebeam DL, Coryn CLS. Evaluation Theory, Models, and Applications. 2nd ed. San Francisco, CA: Jossey-Bass; 2014
18. Bartholomew LK, Parcel GS, Kok G. Intervention mapping: a process for developing theory- and evidence-based health education programs. Health Educ Behav. 1998;25(5):545–563
19. IBM SPSS Statistics [computer program]. Version 25. Armonk, NY: IBM Corporation; 2017
20. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101
21. The American Board of Pediatrics. ABP’s Qunbar to receive award for MOCA-Peds recommender system. 2019. Available at: https://www.abp.org/news/qunbar-award-moca-peds-recommender-system. Accessed May 17, 2019
22. Turner A, Leslie L, Furter R, Haig V. 2017 Pilot Summary: Summary of the 2017 MOCA-Peds Pilot, Use Patterns, Feasibility, Acceptability, and Scoring. Chapel Hill, NC: The American Board of Pediatrics; 2018
23. American Board of Medical Specialties. CertLink delivers longitudinal assessment online. Available at: https://www.abms.org/initiatives/certlink/. Accessed May 7, 2019

Competing Interests

POTENTIAL CONFLICT OF INTEREST: Drs Leslie and Althouse and Mr Turner are employees of The American Board of Pediatrics (ABP). Drs Dounoucos and Olmsted and Ms Smith are employees of RTI International, an international nonprofit research firm with which the ABP contracted to conduct this evaluation.

FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.