The American Board of Pediatrics (ABP) certifies that general and subspecialty pediatricians meet standards of excellence established by their peers. Certification helps demonstrate that a general pediatrician or pediatric subspecialist has successfully completed accredited training and fulfills continuous certification requirements (Maintenance of Certification [MOC]). One current component of the MOC program is a closed-book examination administered at a secure testing center (ie, the MOC Part 3 examination). In this article, we describe the development during 2015–2016 of an alternative to this examination, termed the Maintenance of Certification Assessment for Pediatrics (MOCA-Peds). MOCA-Peds was conceptualized as an online, summative (ie, pass/fail), continuous assessment of a pediatrician’s knowledge that would also promote learning. The system would deliver a set number of multiple-choice questions each quarter, with immediate feedback on answers, rationales clarifying correct and incorrect responses, references for further learning, and peer benchmarking. Questions could be answered at any time within the quarter, in any setting with Internet connectivity, and on any device. As part of the development process in 2015–2016, the ABP actively recruited pediatricians to serve as members of a yearlong user panel or single-session focus groups, and refinements to MOCA-Peds were made on the basis of their feedback. MOCA-Peds is being actively piloted with pediatricians in 2017–2018. The ABP anticipates launching MOCA-Peds in January 2019 for General Pediatrics, Pediatric Gastroenterology, Child Abuse Pediatrics, and Pediatric Infectious Diseases, with launch dates for the remaining pediatric subspecialties between 2020 and 2022.

Since 1934, the American Board of Pediatrics (ABP) has certified general pediatricians and pediatric subspecialists on the basis of standards of excellence developed by their peers.1 Certification status has been used by various stakeholders (eg, the public, health care providers, hospital credentialers, licensing boards, and insurers) to determine whether a pediatrician meets these standards. In 2000, recognizing the rapid change in medical knowledge, the American Board of Medical Specialties (ABMS) and its 24 member boards implemented the Maintenance of Certification (MOC) program. Its goal was to demonstrate that a physician was actively engaged in staying current with new knowledge and, consequently, in making changes in practice.2 Currently, this process includes a periodic assessment of a physician’s medical knowledge via a proctored, multiple-choice examination administered at a secure testing center (ie, the MOC Part 3 examination).3

In May 2015, the ABP invited >80 leaders in assessment, psychometrics, medical education, pediatric practice, and technology to participate in a conference focused on improving the MOC Part 3 examination.4 Similar to other certifying boards, the ABP had viewed its examinations as summative (ie, pass or fail) assessments of a pediatrician’s current knowledge and clinical judgment. However, the ABP was exploring multiple potential changes to the MOC examination, including the following 2 technological enhancements: (1) administration of the examination at home or work with remote proctoring through one’s computer and (2) the incorporation of reference material to mimic access typically available in practice.

Evidence presented at the conference5 on the exponential increase in new medical knowledge,6 the loss of physician knowledge over time,7–9 contemporary theories regarding self-assessment10–12 and adult learning,13–17 and technological advances in both assessment and education provided additional, compelling stimuli for change. Two presentations in particular helped shift the ABP further toward reconsidering the proctored multiple-choice examination. In the first, Cees van der Vleuten,18 an internationally recognized leader in medical education and assessment, challenged the ABP’s assumption that examinations conducted by certifying organizations should focus only on the assessment of learning with a summative measure. Dr van der Vleuten18 encouraged the ABP and other ABMS member boards to consider whether examinations can also encourage learning. In the second, members of the American Board of Anesthesiology (ABA) described their 2014 pilot of an innovative approach to their MOC program. This pilot, termed MOCA-Minute, consisted of brief multiple-choice questions distributed via weekly e-mail, with immediate feedback provided after each question was answered.19

These presentations initiated new conversations about alternative approaches to the ABP’s MOC examination. In June 2015, the ABP Board of Directors unanimously agreed that the ABP should develop and pilot an online, continuous assessment model consisting of multiple-choice questions delivered on a periodic basis over a set interval of time, similar to MOCA-Minute. The assessment, termed the Maintenance of Certification Assessment for Pediatrics (MOCA-Peds), would eliminate the need to go to a testing center, permit the use of resources, and incorporate learning opportunities.

To translate this vision into an acceptable, sustainable, and psychometrically sound approach, the ABP established several teams: (1) the MOCA-Peds Executive Team, whose purpose was to make overarching business and policy decisions; (2) the MOCA-Peds Work Group Team, composed of staff from several ABP departments, which converted business and policy decisions into daily operations; (3) the MOCA-Peds Task Force, composed of primary care pediatricians, which developed questions, wrote rationales, and identified references for questions; and (4) a research team, which would lead the ABP’s efforts to engage pediatricians who were maintaining certification in the development of MOCA-Peds.

A launch date for a MOCA-Peds pilot in general pediatrics was set for January 2017. To prepare for the pilot, the ABP planned to develop an initial model during Fall 2015 and engage pediatricians in refinement of the model and in the development of Web-based and mobile platforms through a yearlong user panel and single-session focus groups in 2016 (see Table 1 for time line). The ABP contracted with an internationally recognized research organization, RTI International (RTI), to bring additional evaluative rigor to the engagement process with pediatricians and to proactively address any potential concerns about bias in data collection or analyses. In this article, we discuss the development of the MOCA-Peds model and platform in 2015–2016. Additional information about MOCA-Peds is available at https://www.abp.org/mocapeds.

TABLE 1

Time Line for the Development of the MOCA-Peds Model in 2015–2016

Activity | Time Line
Future of testing conference | May 2015
Board of directors decision to develop MOCA-Peds | June 2015
Public announcement to develop and pilot MOCA-Peds | June 2015
Internal planning regarding initial program model | June to December 2015
First iteration of KDD developed | November 2015
Approval for stakeholder engagement approach proposed by the ABP-RTI Research Team | September to November 2015
IRB submission | December 2015
E-mail request for pediatrician participation to 72 353 pediatricians | January 2016
System development | January to December 2016
Random selection of user panel and focus group participants from >3000 respondents | January to February 2016
Face-to-face, daylong user panel meetings | February to March 2016
User panel virtual meetings | April to November 2016
Focus group meetings | April to November 2016
Review of communication materials by user panel and focus groups | July to November 2016
β-Testing of Web-based platform | September to December 2016
Registration and preparation for pilot launch | September to December 2016
Pilot launch | January 2017

IRB, institutional review board.

From the outset, the ABP recognized the inherent difficulty in designing a single tool that functioned as both a summative assessment of general pediatric knowledge and an opportunity for learning. In addition, MOCA-Peds would need to be compatible with busy pediatricians’ lives. The ABP chose to capture these 3 requirements in a key driver diagram (KDD) to guide the development of MOCA-Peds (see Fig 1). KDDs20 are visual tools commonly employed in program evaluation and quality-improvement projects to demonstrate the relationships among (1) the overall aim(s) of a project, (2) the key drivers (domains) hypothesized to contribute directly to achieving the aim, (3) the secondary drivers (components) related to those domains, and (4) the specific change strategies or activities linked to each component, which the ABP would incorporate into MOCA-Peds and test during the 2017–2018 MOCA-Peds pilot. KDDs function as “living” documents that guide the work as a project is planned, implemented, and modified; components and change strategies may evolve over time.
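
To make the KDD hierarchy concrete, the following minimal Python sketch models a KDD as a nested data structure and flattens it for review. The field names and the single example entry are illustrative assumptions drawn from Table 2, not the ABP’s actual tooling.

# Minimal, hypothetical sketch of a KDD as a nested structure; field names
# and the example entries are illustrative, not the ABP's actual tooling.
kdd = {
    "aim": "Improve child health outcomes via a high-quality alternative "
           "to the current MOC examination",
    "key_drivers": [
        {
            "domain": "Craft a psychometrically sound, continuous assessment",
            "components": [
                {
                    "name": "Deter inappropriate test-taking behavior",
                    "change_strategies": [
                        "Deliver questions in random order",
                        "Set time limits for each question",
                    ],
                },
            ],
        },
    ],
}

def iter_strategies(diagram):
    """Yield (domain, component, strategy) triples for review or reporting."""
    for driver in diagram["key_drivers"]:
        for component in driver["components"]:
            for strategy in component["change_strategies"]:
                yield driver["domain"], component["name"], strategy

for row in iter_strategies(kdd):
    print(" | ".join(row))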

FIGURE 1

MOCA-Peds conceptual KDD. Change strategies are displayed in Table 2.


For MOCA-Peds, the overarching goal and aim were clear: to improve child health outcomes through the development of a high-quality alternative to the current MOC examination. The following 3 key domains were identified in Fall 2015: (1) crafting a psychometrically sound, continuous assessment of medical knowledge; (2) incorporating opportunities for learning; and (3) improving the MOC experience. Often, KDDs incorporate aims that specifically state a target audience, end date, and measure(s) of success. For the purposes of the development of the initial MOCA-Peds model, the ABP chose to use the KDD as a conceptual tool, with formal aim statements crafted later as part of the evaluation.

To identify potential components and change strategies, the teams (1) consulted with experts in the medical education and assessment fields, (2) conducted a literature review on adult learning, (3) researched assessment approaches used in other professional disciplines and countries, and (4) closely examined the ABA’s MOCA-Minute19 to consider the applicability to pediatrics of different elements of their program (eg, 1-minute questions, quarterly question delivery, 120 questions per year, rationale after each question, and scoring approach).

These efforts led to an initial conceptualization in Fall 2015 of the MOCA-Peds model, including its components and identified change strategies. Below we discuss the initial KDD conceptualized in 2015. We then describe the methodologies employed to gather pediatrician input and the resulting changes to the MOCA-Peds’ components and change strategies. Table 2 delineates initial components and change strategies identified in 2015 as well as modifications based on pediatrician feedback in 2016.

TABLE 2

Key Drivers (Domains), Secondary Drivers (Components), and Change Strategies in the Initial MOCA-Peds Model in 2015 and After Pediatrician Feedback in 2016

Primary Driver or Domain in Initial Model in 2015 | Secondary Drivers or Components in Initial Model in 2015 | Change Strategies in Initial Model in 2015 | Modifications or Additions to MOCA-Peds Based on Pediatrician Feedback in 2016
Craft a psychometrically sound, continuous assessment of medical knowledge | Ensure the assessment is valid, relevant, and reliable | Implement a continuous assessment that keeps pace with rapid medical knowledge | NA
| | Administer more questions over the lifetime of a pediatrician’s career | NA
| | Provide opportunity for pediatricians to give feedback on individual questions | NA
| Ensure answers to questions reflect an individual pediatrician’s knowledge and clinical judgment | Require personal authentication during log-in | Dropped
| | Use MOCA-Peds code of conduct | Modified: language clearer and more user-friendly
| Deter inappropriate test-taking behavior | Deliver questions in random order for different individuals using the platform | NA
| | Establish different time limits for each question on the basis of difficulty | Modified: all questions have same 5-min time limit
| | Conduct ongoing data analyses to identify potential security threats | NA
Incorporate opportunities for learning | Provide tools for self-reflection on personal knowledge gaps identified during the assessment | Provide immediate feedback on answers to individual questions | NA
| | Display personal performance on a dashboard | Modified: display enhanced
| | Offer benchmark comparisons with peers | NA
| | Provide a question history page to review previous questions | Modified: display enhanced
| | Distribute similar questions derived from the same learning objectives later in the year to reinforce learning | NA
| | Provide opportunity to rate clinical relevance of each question to personal practice | Added: present personalized aggregation of relevance ratings
| | Ask pediatricians to rate their confidence in their answers | Added: present personalized aggregation of confidence ratings
| Provide tools for keeping up-to-date with medical knowledge | Permit access to references when answering questions | NA
| | Include up-to-date rationale and references | NA
| | Deliver questions randomly from entire content outline as opposed to questions grouped by content area | NA
| | Not considered in the 2015 initial model | Added: release learning objectives (ie, question topics) for the upcoming year to permit review
| | Not considered in the 2015 initial model | Added: include questions that incorporate practice recommendations from recently published guidelines and emerging public health topics in the 2017–2018 pilot
Improve the MOC experience | Incorporate flexibility and choice in the assessment process and how one learns | Deliver questions quarterly with choice within the quarter about when to answer questions | NA; because of differences in opinions among focus group and user group participants regarding quarterly versus twice yearly or yearly delivery of questions, the ABP chose to maintain quarterly delivery
| | Provide Web-based and mobile platforms | Modified: display enhanced
| | Offer choice of questions weighted to inpatient, outpatient, or mixed settings | NA
| | All pediatricians would participate in MOCA-Peds | Continue to offer the standard proctored examination as an alternative to MOCA-Peds
| Combine continuous certification requirements | Not considered in the 2015 initial model | Added: recommend a 5-y cycle to align with MOC cycle as opposed to a 3-y summative assessment cycle as originally envisioned
| | Not considered in the 2015 initial model | Added: provide MOC Part 2 credit
| Accommodate for life events | Not considered in the 2015 initial model | Added: drop 4 lowest quarters once MOCA-Peds goes live in 2019
| Address consequence of loss of credentials with new system | Not considered in the 2015 initial model | Added: MOCA-Peds questions delivered in years 1–4; year 5 available to take proctored examination if the test taker does not meet passing standard
| Reduce anxiety about changing to a new summative assessment system | Not considered in the 2015 initial model | Added: provide learning objectives (also above) and provide orientation materials (eg, videos and sample questions)

NA, not applicable.

Developing a psychometrically sound assessment of general pediatric knowledge is complex and multifaceted, and the ABP employs several strategies in pursuing that goal. First, for general pediatrics, the ABP regularly conducts a rigorous practice analysis with practicing general pediatricians21 to update the General Pediatrics Content Outline, a publicly available and comprehensive list of medical knowledge topics that board-certified pediatricians are expected to know.22 Practice analysis, which consists of surveys of practitioners about their practice patterns combined with panel review, is widely recognized as the primary methodology for establishing a certification examination’s content validity and relevance.23–26 Second, examination questions, each of which must align with the content outline, are written by practicing pediatricians. Third, questions are chosen to represent the various categories identified during the practice analysis and covered within the content outline. Fourth, committees of pediatricians review each question for accuracy and relevance. Fifth, before scoring an individual’s examination, the psychometric properties of each question are examined, and questions that perform poorly are dropped. Last, the ABP works with a separate panel of practicing pediatricians in a well-established process known as standard setting27 to set the passing standard, or cut score, that reflects the minimum level of knowledge required for board certification.

Because continuous assessments have not been commonly used by certifying organizations, the ABP committed to using this same rigorous approach for MOCA-Peds. In addition, several change strategies were theorized to improve the validity, reliability, and relevance of MOCA-Peds compared with the examination, including (1) implementation of a continuous assessment of the general pediatrician that keeps pace with rapid medical knowledge expansion (ie, enhanced validity), (2) administration of questions over the lifetime of a pediatrician’s career (ie, increased reliability), and (3) provision of opportunity for pediatricians to give feedback on individual questions (ie, improved relevance).

High-stakes certification examinations have also typically been proctored in secure testing environments to help address the other identified components in this domain: ensuring that answers to questions reflect an individual pediatrician’s knowledge and clinical judgment and deterring inappropriate test-taking behavior. Requiring a continuous assessment of the general pediatrician to be administered in a proctored environment, even if remote, would help address these concerns but would be difficult to implement. The ABP instead identified the following 2 change strategies designed to authenticate the identity of the test taker: (1) require personal authentication during log-in and (2) use a MOCA-Peds code of conduct. To prevent examination content from being shared publicly, which would invalidate MOCA-Peds as a summative assessment of general pediatric knowledge, the following 3 additional change strategies were identified: (1) delivery of questions in random order, (2) establishment of time limits for each question, and (3) ongoing data analyses to identify security threats.

The second domain is focused on the following 2 components: providing tools for self-reflection on personal knowledge gaps identified during the assessment and providing tools for keeping up-to-date with medical knowledge.

In anecdotal feedback regarding the ABA’s MOCA-Minute and the current MOC Part 3 examination, physicians had previously identified several potential change strategies to encourage self-reflection, such as (1) provide immediate feedback on answers to individual questions, (2) display personal performance on a dashboard, (3) offer benchmark comparisons with peers, (4) provide a question history page that would permit review of previously answered questions, and (5) distribute similar questions derived from the same learning objective later in the year. The latter mimicked, to the degree possible, the reinforced learning benefits demonstrated with spaced-education progress testing (ie, repeating questions on the same topic area spaced over time).16,28 On the basis of the literature,11,12,14 the ABP proposed the following 2 additional change strategies: (1) provide the opportunity for pediatricians to rate the clinical relevance of each question to their personal practice and (2) ask pediatricians to rate their confidence in an answer before seeing whether the answer was correct. These would offer the opportunity for an individual pediatrician to identify questions rated with high relevance or confidence but answered incorrectly.

Keeping up-to-date with current medical knowledge was also seen as important for learning. The following 2 change strategies were proposed: (1) permit access to resources when answering questions and (2) include an up-to-date rationale for the correct answer and references for each question. In addition, varying the topical content of individual questions in an examination, as opposed to grouping questions by content area (eg, cardiology), has been shown to improve performance on subsequent testing and, by proxy, knowledge.29–31 Therefore, the ABP chose a third change strategy: to deliver questions randomly drawn from the entire content outline as opposed to questions sequentially grouped by content area, as sketched below.
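
As an illustration of this third strategy, a quarter’s questions might be drawn at random across the whole content outline rather than delivered area by area. The sketch below is a hypothetical implementation; the ABP’s actual item-selection rules are not described in this article.

import random

def draw_quarterly_questions(question_bank, n_questions, seed=None):
    """Draw a quarter's questions at random from the entire content outline.

    `question_bank` maps content areas (eg, "cardiology") to lists of
    question IDs. Drawing from the pooled bank, rather than sequentially by
    area, varies topical content across the quarter. Hypothetical sketch.
    """
    rng = random.Random(seed)
    pool = [q for questions in question_bank.values() for q in questions]
    return rng.sample(pool, min(n_questions, len(pool)))

# Example with a toy bank of three content areas.
bank = {"cardiology": ["C1", "C2"], "neurology": ["N1"], "endocrine": ["E1", "E2"]}
print(draw_quarterly_questions(bank, 3, seed=42))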

The third domain emphasized improving the MOC experience. Mindful of the potential impact of a continuous assessment of general pediatric knowledge, the ABP sought to incorporate flexibility and choice into MOCA-Peds through the following change strategies: (1) deliver questions quarterly with choice within the quarter about when to answer questions, (2) provide both Web-based (ie, used on a Web browser on a laptop or desktop) and mobile (ie, phone or tablet) platforms, (3) allow pediatricians to select a practice profile (inpatient, outpatient, or mixed) and deliver a small percentage of profile-specific questions, and (4) continue to offer the standard proctored examination as an alternative to MOCA-Peds, if preferred. In addition, the ABP realized it would need to add flexibility for the individual to accommodate life events and combine continuous certification requirements where possible. Accordingly, the ABP prioritized seeking pediatrician feedback in 2016 on these 2 areas.

The ABP-RTI Research Team determined that the MOCA-Peds development process in 2016 and the 2017–2018 pilot would be based on published evaluation approaches that incorporate engagement with both program developers (ie, the ABP) and end users (ie, pediatricians), starting with the initial stages of planning the MOCA-Peds model and platform and continuing through implementation. The team used key components of the Context, Input, Process, and Product evaluation model32 and the Intervention Mapping model,33 both formal frameworks for program design, development, and implementation commonly employed in the evaluation field. Both frameworks outline steps that evaluators take specifically to (1) assess needs, problems, opportunities, and constraints before program design; (2) incorporate end user (ie, pediatrician) feedback in multiple waves of user testing; (3) prioritize program refinements on the basis of that feedback, taking into account any design constraints; and (4) focus on practical aspects of implementation to ensure success (eg, pretesting messaging, addressing unintended consequences, and assessing sustainability by review of costs, feasibility, and acceptability to pediatricians). The research team also worked with the ABP’s information technology group to incorporate human-centered design approaches34 (ie, approaches that consider intuitive design, usability, and acceptability throughout the system development process) in developing the MOCA-Peds Web-based and mobile platforms.

After review by the ABP’s Research Advisory Committee and the RTI Institutional Review Board, RTI staff extended to all currently certified pediatricians (n = 72 353) the opportunity to participate in stakeholder engagement groups through an open-call e-mail in January 2016. In the call, pediatricians were asked about their interest in participating in an ongoing user panel that would provide feedback on multiple topics over the course of 2016 or in single-session focus groups that would target specific topics. A total of 3291 pediatricians responded and completed an online interest form associated with the call by January 15, 2016. To facilitate sample selection, all respondents were assigned a unique number. The research team then drew random samples across demographic characteristics (eg, location, age, sex, and generalist vs subspecialist status) to ensure that participants represented a broad cross section of the pediatric community maintaining General Pediatrics certification.35 Selected participants were contacted by RTI and asked to serve on the user panel or in a focus group on the basis of their indicated interest and time availability. A total of 597 pediatricians were ultimately invited to participate, with 153 participating in a focus group or user panel (Table 3).
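
The article does not spell out the exact sampling design, but the selection step can be pictured as a stratified random draw over the respondent pool, as in this hypothetical sketch; the strata and per-stratum counts are assumptions, and the ABP-RTI team’s actual design may have differed.

import random
from collections import defaultdict

def stratified_sample(respondents, stratum_of, n_per_stratum, seed=2016):
    """Randomly select up to n_per_stratum respondents from each stratum.

    `respondents` is a list of records; `stratum_of` maps a record to a
    stratum label (eg, a tuple of location, sex, and generalist status).
    Hypothetical sketch; the ABP-RTI design is not specified in the text.
    """
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for person in respondents:
        by_stratum[stratum_of(person)].append(person)
    invited = []
    for members in by_stratum.values():
        rng.shuffle(members)
        invited.extend(members[:n_per_stratum])
    return invited

# Example: stratify jointly by practice type and sex.
pool = [{"id": i, "practice": p, "sex": s}
        for i, (p, s) in enumerate([("generalist", "F"), ("generalist", "M"),
                                    ("subspecialist", "F")] * 4)]
print(len(stratified_sample(pool, lambda r: (r["practice"], r["sex"]), 2)))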

TABLE 3

Pediatrician Engagement in MOCA-Peds’ Development in 2016

Focus Groups (Randomly Selected Participants)
User Panel (Randomly Selected Participants) | In-person Focus Groups at 2016 Pediatric Academic Societies Meeting | Virtual Focus Groups Time 1 | Virtual Focus Groups Time 2 | Virtual Focus Groups Time 3 | In-person Focus Group of Local Pediatricians | In-person Usability Testing by Local Pediatricians | Totals
No. sessions 24a 41 
Attendance status         
 Invited 40 136 201 94 63 35 28 597 
 Participated 37 32 32 22 13 153 
Participant summary         
 Specialty         
  General pediatrician 20 11 28 15 12 102 
  Pediatric subspecialist 17 21 — 51 
 Sex         
  Male 12 12 12 54 
  Female 25 20 20 16 99 
 Medical school graduate type         
  American medical school 28 24 30 16 11 123 
  International medical school — 30 
 Age, y         
  ≤40 10 40 
  41–50 13 12 14 54 
  51–60 14 47 
  ≥61 — — 12 

—, No pediatricians in this category.

a Includes a kickoff (in-person) and a series of 5 follow-up meetings (virtual).

The user panel was designed to provide continuous feedback throughout the development process; its members participated in a daylong, face-to-face orientation early in 2016 and then in 5 additional 90-minute virtual meetings over the rest of the year. At the orientation, user panel members reviewed the goals, the proposed designs, and the policies associated with the testing and implementation of MOCA-Peds and provided their personal perspectives on the initial components and change strategies. In the subsequent virtual meetings, they provided feedback as change strategies were refined or new change strategies were identified. Panel members also participated in repeated rounds of feedback on the layout and components of the Web-based platform. Last, panel members were asked to help revise communication materials (eg, demonstration videos, webinars, newsletters, ABP Web site notices, and sample questions) that the ABP was preparing to invite participation in the 2017–2018 pilot. Online surveys after each meeting provided an additional opportunity to offer feedback. User panel participants received an honorarium of $500 as well as reimbursement for travel costs.

In contrast to the user panel, focus groups were designed as single-session, 90-minute virtual or face-to-face (ABP offices, hospitals, or national meetings) meetings. Their purpose was to gather a broad understanding of pediatricians’ reactions to the proposed MOCA-Peds model, address specific topics that arose during development, and test the Web-based platform. Participating pediatricians received an honorarium of $75 in the form of a gift card after their participation.

Pediatricians who participated in the 41 meetings of the user panel and focus groups provided a wide range of feedback on the proposed model, Web-based platform, communications, and policies. For each meeting, the research team compiled written notes of the questions, answers, and observations. The notes were then analyzed by using thematic analysis techniques36 to identify the main themes in the feedback. Themes, along with illustrative quotes, were compiled into brief reports, distributed to the other MOCA-Peds teams after each of the panel meetings or focus groups, and summarized across the entire year in a single qualitative data set. The number of focus groups was predetermined, and the data were reviewed after each focus group to determine if topic saturation had been achieved (ie, the point at which no new themes are identified).37,38 Although the total number of focus group participants was low, the research team determined that the discussions typically reached saturation after 2 to 3 virtual meetings within each round of focus groups, particularly as these themes were echoed by user panel members.
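
As a toy illustration of a saturation check (the research team’s actual procedure was a qualitative judgment, not an automated test), one could track whether recent sessions contribute any new themes:

def reached_saturation(theme_sets, window=2):
    """Return True if the last `window` sessions introduced no new themes.

    `theme_sets` is a chronological list of the theme labels coded in each
    session. Hypothetical sketch; saturation in practice is a qualitative
    judgment, not a purely mechanical test.
    """
    seen = set()
    new_per_session = []
    for themes in theme_sets:
        new_per_session.append(len(set(themes) - seen))
        seen.update(themes)
    return len(new_per_session) >= window and all(
        n == 0 for n in new_per_session[-window:]
    )

# Example: the third and fourth sessions add nothing new.
sessions = [{"burden", "anxiety"}, {"anxiety", "usability"},
            {"burden"}, {"usability", "anxiety"}]
print(reached_saturation(sessions))  # True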

Other, informal stakeholder engagement activities were conducted throughout 2016. For instance, key ABP committees, North Carolina chief residents and program directors, and the 14 ABP subboard committees (totaling >200 additional pediatricians) reviewed the development of the MOCA-Peds model, the user panel and focus group themes, and the Web-based and mobile platforms. Themes resulting from the user panel and focus groups resonated with pediatricians participating in these other meetings. The research team judged that the feedback obtained from user panel members, focus groups, and internal meeting attendees was representative of the majority of pediatricians maintaining certification and that additional discussions with pediatricians were not indicated.

Major themes for improvement of the model focused primarily on the following 5 topical areas: learning, life accommodation, burden, test anxiety, and usability (see Table 4 for pediatrician feedback, example quotations for each theme, and the ABP’s response to the feedback). Although participants varied in their needs and concerns about certification, they were for the most part positive about the ABP’s plan to develop an alternative to the current proctored examination. Participants encouraged the ABP to promote learning whenever possible because this would provide added value, and a number of suggestions were offered on how to balance security and learning. One adopted recommendation to promote learning was that the ABP release learning objectives in advance of each year of MOCA-Peds to allow time to study for those who were interested. This also addressed the concerns of those who were anxious about the change to a continuous online system and of individuals who struggled with test anxiety. Participants also recommended that the ABP develop introductory materials before the MOCA-Peds pilot launch in 2017. An additional change to promote learning based on pediatrician feedback was to include questions on recently published guidelines or emerging public health issues (eg, the Zika epidemic).

TABLE 4

Themes From User Panel and Focus Group Participants in 2016 Regarding MOCA-Peds

Theme and/or Subtheme | Description | Key Quotes | Resolution and/or Response
Learning
 Promotion of learning | Pediatricians would be better served if learning is prioritized over assessment more so than in the current MOC Part 3 model. | “Currently board certification is not something I do to improve myself; I do it to keep my job.” “If the board is focused on lifelong learning, doing questions continually with immediate feedback is a much better way to accomplish it than taking a test every 10 years and not knowing what you even got right or wrong.” | A summative assessment is a necessary part of the ABP’s mission; however, multiple opportunities for learning are being incorporated (see Table 2).
 Question feedback | Pediatricians expressed frustrations with the lack of feedback that follows the proctored examination. Not receiving feedback on incorrect and correct items was a limitation. Most agreed having this would allow them to identify gaps in their knowledge. | “If you get immediate feedback, you can use that information to understand why one is right and the other is wrong.” “[We would be] theoretically learning for life, not for the test.” | MOCA-Peds provides immediate feedback to support lifelong learning.
 Study and/or preparation | Pediatricians often study extensively (eg, between 1 and 6 mo) for the current proctored examination. | “They study at least an hour a day during the week and part or all of a day on the weekends.” | Question topic areas (learning objectives) are being released before beginning questions to focus study for those who wish to prepare.a
 No collaboration | Pediatricians pointed out that physicians are always collaborating in the field and discussing topics, which is largely seen as a positive approach to solving issues in a clinical setting. | “For those of us who work with other people, we are always discussing clinical problems we encounter. Why is this a bad thing? Why not have discussions about a question and how to approach it? If we are looking at how adults learn, this is how we work together.” | Because providing a summative assessment of an individual is part of the ABP’s mission, collaboration is currently restricted.
Life accommodation
 Accessibility | Pediatricians liked that users can access it from any computer and do not have to “rearrange” their lives to complete this requirement. Pediatricians responded positively to the elimination of the need to go to a secure testing center. | “It’s less disruptive to clinical practice.” “I applaud these changes. Anything that keeps me out of a testing center is worthwhile.” | MOCA-Peds can be accessed anywhere and at any time, and it does not require going to a secure testing center.
 Flexibility | Pediatricians liked the flexibility of being able to complete requirements on their own time in small “bites.” | It “fits in with the workflow of a practicing pediatrician.” | Questions can be completed at a time convenient for the pediatrician.
Burden and test anxiety
 Time limit | Pediatricians questioned the purpose of the time limit. Many felt untimed questions would have a significant and positive impact on pediatricians. Pediatricians questioned how different time limits from 1 to 5 min were proposed for questions. | “The 5-min time limit makes it so that you are just trying to look [the answer] up and are only concerned with finding the answer and not learning.” | Time limit kept because a primary domain is a secure, summative assessment of the individual. However, 5 min was chosen for all questions as opposed to varying the time from 1 to 5 min.a
 Continuous testing | Some were concerned about the burdens of continuous assessment and question fatigue, particularly in light of variable clinical, research, and educational demands throughout the year. | MOCA-Peds will “always be over your head, and you are never quite done.” | 2 policies implemented to allow for time off while participating (see text).a
 Cost (money) | Pediatricians did not want a cost increase with MOCA-Peds. Others asked if the payment schedule would change to a quarterly or yearly format to better match the new MOCA-Peds schedule. | “Have you determined how much this will cost yet? I hope that it will not be more than what I pay now.” | No additional costs to participate in MOCA-Peds and MOC annual payment option implemented.a
 Test anxiety | Some pediatricians reported getting very anxious about having to take tests, which leads to lost sleep, decreased productivity in their practice, time away from family, etc. They expressed concerns about having to take questions that could cover the entire field and that they might not be prepared for. | “It isn’t – oh my God I have 4 hours to vomit everything I know.” “I really like that I can do the questions at my convenience. Although I always passed the exams, I have test anxiety, and this is very easy for me since I read and do questions regularly, and this is just an extension of that type of learning.” | Learning objectives provided in advance of each year and question feedback provided immediately after each question.
 Anxiety | Pediatricians indicated that the more they learned about MOCA-Peds, the more feasible it seemed. They thought anxious colleagues’ perceptions would change with experience. They recommended providing introductory materials before starting to help decrease anxiety. | “The more we’ve spoken about this the more I think it is doable; I can see this is not that big of a deal now.” | Introductory materials produced.a
Usability
 Interface | Pediatricians from the user panel iteratively gave feedback on a number of wireframes and mockups, ultimately leading to a β test. | The platform “looked good and was simple and easy to understand.” “Oh! So, you wouldn’t have to click on those links, and you could go back and review later! I like it!” “It [the platform] looks cleaned up; there’s not too much information anymore.” | Iterative implementations of user panel’s recommendations as moved from wireframe to prototype.a
 Technology capabilities | Pediatricians noted they felt the questions should mirror reality as much as possible (ie, include videos, images, and graphics) to really assess competency in pediatric medicine and align with what physicians actually see in practice. | “Life is not still images.” | ABP has included some images in current pilot and is considering ways to further implement media content.
 Experience | In user testing, pediatricians reported feeling significantly less anxious after trying the platform out and realizing that it was “easier than expected” to navigate and use. | “This is so much better than [an electronic medical record]!” | Usability testing focused on making the system easy to use and understand.
a Details are further explained in the text.

Concerns were raised about how MOCA-Peds would take into account life events (eg, major work deadlines, medical issues, deployment overseas, and family crises) and promote flexibility and choice; recommendations were made about how to accommodate different types of needs. As a result, the ABP decided to automatically drop the lowest 4 quarters within each MOC cycle for all participants. Some participants, particularly those in rural communities, described the consequences to patients if they were to lose certification and requested that the ABP provide time for an individual to take a proctored examination if he or she did not meet the passing standard for MOCA-Peds. In response, the ABP determined that questions would only be delivered during the first 4 years of the 5-year MOC cycle; the fifth year would be available for a pediatrician to study and take the proctored MOC examination if needed.
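
To make the arithmetic of this accommodation concrete, the sketch below drops the 4 lowest quarterly scores before averaging; the scoring scale and the aggregation rule are hypothetical assumptions, since the ABP’s actual scoring model is not described in this article.

def cycle_score(quarterly_scores, n_dropped=4):
    """Average quarterly scores after discarding the n_dropped lowest.

    Hypothetical illustration of the "drop 4 lowest quarters" policy; the
    ABP's actual scoring model is not specified in this article.
    """
    if len(quarterly_scores) <= n_dropped:
        raise ValueError("need more quarters than the number dropped")
    kept = sorted(quarterly_scores)[n_dropped:]
    return sum(kept) / len(kept)

# 16 quarters of question delivery over years 1-4 of a 5-y cycle; the two
# skipped quarters (0.0) plus the 0.55 and 0.60 quarters are dropped.
scores = [0.80, 0.75, 0.0, 0.90, 0.85, 0.70, 0.0, 0.88,
          0.92, 0.60, 0.78, 0.83, 0.55, 0.87, 0.79, 0.81]
print(round(cycle_score(scores), 3))  # 0.823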

In general, pediatricians highlighted the complexity of the current MOC system and asked for more simplification and overlap of MOC activities. Consequently, the ABP selected a 5-year assessment period for MOCA-Peds to match the standard 5-year MOC cycle instead of the initially proposed 3-year cycle. Additionally, the ABP approved providing some MOC Part 2 (ie, lifelong learning) credit for this activity. Regarding the security of MOCA-Peds, participants understood the need for it but asked for simplicity in the user’s experience. The ABP agreed to (1) remove the need for 2-stage authentication, (2) develop a user-friendly MOCA-Peds code of conduct, and (3) set a standard time limit for all questions (5 minutes) instead of varying the time from 1 to 5 minutes per question on the basis of perceived difficulty. There was no consensus regarding the optimal question delivery time frame (eg, quarterly or twice yearly), and the ABP chose to maintain a quarterly schedule.

Participating pediatricians also proposed numerous changes to enhance the layout, visual design, and functions of the MOCA-Peds Web-based and mobile platforms. Participants first reviewed initial wireframe concepts (ie, a simple representation of the platform and how each element would work) and later functional prototypes (ie, working versions of the platform); eventually they used a β-system (see Fig 2). On the basis of recommendations, the ABP made changes to the platform design, added hover-over help messages, updated communication about the system to new users, created system notifications and status messages, and developed scoring reports to help users understand MOCA-Peds (see Fig 2 for examples). Participants also requested that future iterations of MOCA-Peds take advantage of technological advances and include visuals (eg, photos, videos, and auditory clips) that simulate information pediatricians consider in their clinical decision-making.

FIGURE 2

Original MOCA-Peds wireframe and first β-version. A, Wireframe version. B, β-version.


Finally, the ABP implemented numerous policies around the rollout of MOCA-Peds as a direct response to pediatrician feedback. For those participating in the pilot in 2017–2018, the ABP determined that it would defer the MOC Part 3 secure examination requirement for each year of participation and that no participant would lose certification solely by failing the MOCA-Peds pilot. The ABP chose to keep the cost of MOCA-Peds neutral, maintaining fees at the same level as in previous years and offering an annual payment option as an alternative to the current every-5-years payment model. Although the ABP had initially conceptualized MOCA-Peds as a replacement for the secure Part 3 examination, it agreed to the participants’ recommendation to offer MOCA-Peds as an alternative, allowing pediatrician choice. Last, the ABP reasserted the importance of ongoing evaluation of MOCA-Peds to address concerns regarding the evidence base for ABP MOC programs.

On the basis of this developmental work in 2015–2016, the ABP is piloting the first iteration of MOCA-Peds in 2017–2018. For the pilot, the research team is using a formative evaluation approach that permits continued opportunity for pediatrician feedback on MOCA-Peds; questions about feasibility, acceptability, and impact on learning and clinical practice will be the primary focus. The research team will also investigate issues related to subspecialists and MOCA-Peds. In January 2019, the ABP plans to launch MOCA-Peds as a formally approved alternative to the current MOC Part 3 examination in general pediatrics and in 3 subspecialties (Child Abuse Pediatrics, Pediatric Infectious Diseases, and Pediatric Gastroenterology), with the remaining subspecialties available by 2022.

The ABP is committed to improving its processes and providing additional value to the pediatrician, above and beyond providing documentation to the public that an individual pediatrician meets its standards of excellence. The planned launch of MOCA-Peds in January 2019 as an alternative to the MOC Part 3 examination is occurring during a period of heightened discussion and review of the initial and continuing certification processes for all 24 boards associated with ABMS. This vigorous dialogue includes physicians, insurers, hospitals, practices, specialty societies, state medical boards, patients and families, and consumer organizations. In recognition of feedback from these groups, other ABMS member boards are also developing continuous assessment tools similar to MOCA-Minute and MOCA-Peds. To continue this dialogue, the ABP welcomes any and all feedback as it seeks to improve its processes.

     
ABA: American Board of Anesthesiology
ABMS: American Board of Medical Specialties
ABP: American Board of Pediatrics
KDD: key driver diagram
MOC: Maintenance of Certification
MOCA-Peds: Maintenance of Certification Assessment for Pediatrics
RTI: RTI International

Dr Leslie contributed to the conception, design, and implementation of the Maintenance of Certification Assessment for Pediatrics (MOCA-Peds) and its formative evaluation and to the acquisition, analysis, and interpretation of the data, and she drafted the initial manuscript; Drs Althouse, Dwyer, Olmsted, and Mr Turner contributed to the conception, design, and implementation of MOCA-Peds and its formative evaluation and to the acquisition, analysis, and interpretation of the data, and they drafted components of the manuscript; Dr Carraccio contributed to the conception, design, and implementation of the MOCA-Peds program; and all authors reviewed and revised the manuscript, approved the final manuscript as submitted, and agree to be accountable for all aspects of the work.

FUNDING: Funded by the American Board of Pediatrics (ABP), with the exception of the research and evaluation component, which was funded by the ABP Foundation, a nonprofit supporting organization to the ABP.

We acknowledge the dedicated pediatricians who participated in the development of the model in 2016 and the >11 000 pediatricians who have been or are currently engaged in the pilot in 2017–2018, all of whom have provided critical feedback to help improve the MOCA-Peds model and the Web-based and mobile platforms. We also thank the members of the ABP’s multiple committees, boards, and task forces, including the Family Leadership Advisory Group, for their feedback on MOCA-Peds as well as their attention to MOCA-Peds within the context of the ABP’s mission to the public. We appreciate the willingness of the American Board of Anesthesiology to share with us many of the details of their initial pilot program, MOCA-Minute, the code for their technological platform, and the strengths and challenges of their current iteration. We thank Andrew Bradford, Valerie Haig, Virginia Moyer, Marshall Land, and David Nichols for their editorial review. Last, we thank the staff members at the ABP who worked tirelessly to implement the pilot program within a narrow 18-month window. The MOCA-Peds model and platforms are immeasurably improved because of the input of all these individuals.

1. Pearson HA, Finberg L. The American Board of Pediatrics, 1933-2008. Chapel Hill, NC: The American Board of Pediatrics; 2008

2. American Board of Medical Specialties. Standards for the ABMS program for Maintenance of Certification (MOC). 2014. Available at: www.abms.org/media/1109/standards-for-the-abms-program-for-moc-final.pdf. Accessed April 19, 2018

3. The American Board of Pediatrics. Cognitive expertise secure exam (part 3). Available at: https://www.abp.org/content/cognitive-expertise-exam-part-3. Accessed April 19, 2018

4. The American Board of Pediatrics. Future of testing conference summary. Available at: https://abpedsfoundation.org/fotc-2015/. Accessed March 29, 2018

5. The American Board of Pediatrics. Future of testing conference presentations. Available at: https://abpedsfoundation.org/fotc-2015/#presentations. Accessed March 29, 2018

6. Bornmann L, Mutz R. Growth rates of modern science: a bibliometric analysis based on the number of publications and cited references. J Assoc Inf Sci Technol. 2015;66(11):2215–2222

7. Custers EJFM. Long-term retention of basic science knowledge: a review study. Adv Health Sci Educ Theory Pract. 2010;15(1):109–128

8. Custers EJ, Ten Cate OT. Very long-term retention of basic science knowledge in doctors after graduation. Med Educ. 2011;45(4):422–430

9. D’Eon MF. Knowledge loss of medical students on first year basic science courses at the University of Saskatchewan. BMC Med Educ. 2006;6:5

10. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094–1102

11. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77(6):1121–1134

12. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(suppl 10):S46–S54

13. Rohrer D, Pashler H. Recent research on human learning challenges conventional instructional strategies. Educ Res. 2010;39(5):406–412

14. Schumacher DJ, Englander R, Carraccio C. Developing the master learner: applying learning theory to the learner, the teacher, and the learning environment. Acad Med. 2013;88(11):1635–1645

15. Ryan RM, Deci EL. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am Psychol. 2000;55(1):68–78

16. Kerfoot BP, Shaffer K, McMahon GT, et al. Online “spaced education progress-testing” of students to confront two upcoming challenges to medical schools. Acad Med. 2011;86(3):300–306

17. Kerfoot BP, Lawler EV, Sokolovskaya G, Gagnon D, Conlin PR. Durable improvements in prostate cancer screening from online spaced education: a randomized controlled trial. Am J Prev Med. 2010;39(5):472–478

18. Southgate L, van der Vleuten CPM. A conversation about the role of medical regulators. Med Educ. 2014;48(2):215–218

19. Sun H, Zhou Y, Culley DJ, Lien CA, Harman AE, Warner DO. Association between participation in an intensive longitudinal assessment program and performance on a cognitive examination in the Maintenance of Certification in Anesthesiology Program®. Anesthesiology. 2016;125(5):1046–1055

20. Institute for Healthcare Improvement. Driver diagram. Available at: www.ihi.org/resources/Pages/Tools/Driver-Diagram.aspx. Accessed March 29, 2018

21. Dwyer A, Althouse L. Validity evidence for the general pediatrics board certification examinations: a practice analysis. J Pediatr. In press

22. The American Board of Pediatrics. General pediatrics content outline. 2017. Available at: https://www.abp.org/sites/abp/files/pdf/gp_contentoutline_2017.pdf. Accessed December 1, 2017

23. American Educational Research Association; American Psychological Association; National Council on Measurement in Education. Standards for Educational and Psychological Testing. Washington, DC: AERA Publications; 2014. Available at: www.aera.net/Publications/Books/Standards-for-Educational-Psychological-Testing-2014-Edition. Accessed December 1, 2017

24. Raymond M. Job analysis, practice analysis, and the content of credentialing examinations. In: Lane S, Raymond M, Haladyna T, eds. Handbook of Test Development. New York, NY: Routledge; 2016:144–164

25. Henderson J, Smith D. Job/practice analysis. In: Knapp J, Anderson L, Wild C, eds. Certification: The ICE Handbook. Washington, DC: The Institute for Credentialing Excellence; 2009:123–148

26. National Commission for Certifying Agencies. Standards for the Accreditation of Certification Programs. Washington, DC: Institute for Credentialing Excellence; 2014

27. Buckendahl CW, Davis-Becker S. In: Cizek G, ed. Setting Performance Standards: Foundations, Methods, and Innovations. 2nd ed. New York, NY: Routledge; 2012

28. Schuwirth LWT, van der Vleuten CPM. The use of progress testing. Perspect Med Educ. 2012;1(1):24–30

29. Richland LE, Bjork RA, Finley JR, Linn MC. Linking cognitive science to education: generation and interleaving effects. In: Proceedings of the Twenty-Seventh Annual Conference of the Cognitive Science Society. Stresa, Italy; 2005:1850–1855

30. Rohrer D. Interleaving helps students distinguish among similar concepts. Educ Psychol Rev. 2012;24(3):355–367

31. Birnbaum MS, Kornell N, Bjork EL, Bjork RA. Why interleaving enhances inductive learning: the roles of discrimination and retrieval. Mem Cognit. 2013;41(3):392–402

32. Stufflebeam DL, Coryn CLS. Evaluation Theory, Models, and Applications. 2nd ed. San Francisco, CA: Jossey-Bass; 2014

33. Bartholomew LK, Parcel GS, Kok G. Intervention mapping: a process for developing theory- and evidence-based health education programs. Health Educ Behav. 1998;25(5):545–563

34. International Organization for Standardization. Ergonomics of human-system interaction — part 210: human-centered design for interactive systems. 2010. Available at: https://www.iso.org/obp/ui/#iso:std:iso:9241:-210:ed-1:v1:en. Accessed November 11, 2017

35. Patton M. Qualitative Evaluation and Research Methods. Beverly Hills, CA: Sage; 1990

36. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101

37. Morgan DL. Focus groups. Annu Rev Sociol. 1996;22(1):129–152

38. Hancock ME, Amankwaa L, Revell MA, Mueller D. Focus group data saturation: a new approach to data analysis. Qual Rep. 2016;21(11):2124–2130

Competing Interests

POTENTIAL CONFLICT OF INTEREST: Drs Leslie, Carraccio, Dwyer, and Althouse and Mr Turner are all employees of the American Board of Pediatrics (ABP) and receive salary compensation for their work at the ABP. No additional support was provided for the research described and/or manuscript production. Dr Olmsted is an employee of RTI International, a research organization contracted through the ABP Foundation to assist in the design and conduct of the stakeholder engagement activities described in this article and the subsequent evaluation of Maintenance of Certification Assessment for Pediatrics. RTI International received funds for these activities and for time spent drafting and editing the manuscript.

FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.