BACKGROUND AND OBJECTIVES:

This article is the second of a 2-part series examining results regarding self-reported learning and practice change from the American Board of Pediatrics 2017 pilot of an alternative to the proctored continuing certification examination, termed the Maintenance of Certification Assessment for Pediatrics (MOCA-Peds). Because of its design, MOCA-Peds has several learning advantages compared with the proctored examination.

METHODS:

Quantitative and qualitative analyses were conducted with 5081 eligible pediatricians who registered to participate in the 2017 pilot; 83.4% (n = 4238) completed a quarter 4 survey and/or the end-of-year survey (January 2018) and composed the analytic sample.

RESULTS:

Nearly all (97.6%) participating pediatricians said they had learned, refreshed, or enhanced their medical knowledge, and of those, 62.0% had made a practice change related to pilot participation. Differences were noted on the basis of subspecialty status, with 68.9% of general pediatricians having made a practice change compared with 41.4% of subspecialists. The 1456 open-ended responses about participants’ most significant practice change ranged widely, including both medical care content (eg, “care for corneal abrasions altered,” “better inform patients about. . .flu vaccine”) and nonspecific content (eg, providing better patient education, using evidence-based medicine, increased use of resources in regular practice).

CONCLUSIONS:

As a proctored examination alternative, MOCA-Peds positively influenced self-reported learning and practice change. In future evaluation of MOCA-Peds and other medical longitudinal assessments, researchers should study ways to further encourage learning and practice change and their sustainability.

What’s Known on This Subject:

Maintenance of Certification Assessment for Pediatrics is a Web-based knowledge assessment developed by the American Board of Pediatrics as an alternative to an examination at a secure test center. Both the model and Web-based platform were developed with extensive input from practicing pediatricians in 2016.

What This Study Adds:

This article is the second of 2 articles in which results are reported from the Maintenance of Certification Assessment for Pediatrics 2017 pilot conducted with over 5000 pediatricians. We review learning and clinical practice change resulting from participation in this new longitudinal assessment format for continuing certification.

Pediatricians participating in continuing certification (also referred to as Maintenance of Certification [MOC]) have a broad range of attitudes toward the program and whether it brings value to participants. Whereas initial certification demonstrates a candidate has met training requirements and achieved a minimum threshold of medical knowledge and skills, continuing certification requires that pediatricians (1) meet professionalism standards, (2) participate in lifelong learning through self-assessment, (3) maintain current clinical knowledge through periodic assessment, and (4) advance practice through quality improvement.1,2  Meeting professionalism standards and participating in periodic assessments ensure that pediatricians continually meet external thresholds for practice, whereas self-assessment and quality improvement activities are intended to address knowledge and practice gaps as medical knowledge and practice change over time.

Despite these intended functions, anecdotal feedback, focus group analyses,3 published commentaries,4,5 and surveys6 suggest that some practicing pediatricians perceive continuing certification to be of limited value even while agreeing with its underlying goals. Many voiced that MOC was a “hoop to jump through” and that the proctored examination was “degrading” and promoted “cramming” rather than learning. In response, the American Board of Pediatrics (ABP) and other American Board of Medical Specialties (ABMS) member boards have begun implementing several changes to the certification process. The largest recent change has been to pilot test alternatives to the proctored examination that seek to increase relevance to clinical practice and opportunities for learning.7

In 2015, the ABP launched its efforts to update the certification process by hosting the Future of Testing Conference. Topics covered included adult learning theory,8–10 the testing effect (ie, memory for a topic is often enhanced when one must retrieve the relevant information during testing),11,12 and the importance of immediate feedback in encouraging learning.13 Importantly, several internationally recognized experts proposed an assessment framework that would simultaneously measure current medical knowledge and provide opportunities for refreshing or learning new information.14 Further literature review on spaced education progress testing (ie, longitudinally receiving questions on the same topic) demonstrated how repeated questions on similar content over time could both increase learning and change practice (eg, improve appropriate prostate-specific antigen screening).15–17 Given the desire to incorporate learning into continuing certification as a value-added, meaningful process, the ABP’s Board of Directors approved piloting a longitudinal assessment in 2017 to 2018 as a proctored examination alternative.

At that time, only the American Board of Anesthesiology (ABA) had launched a longitudinal assessment program, MOCA Minute18; there was little published research on how a longitudinal assessment framework might influence physician learning and subsequent practice change. Maintenance of Certification Assessment for Pediatrics (MOCA-Peds) was modeled after MOCA Minute, with modifications suggested by pediatrician input through focus groups and user panels conducted in 2015 to 2016.3 Questions would be delivered over the course of a year in quarterly increments and could be completed throughout a quarter or all at once, as determined by the individual pediatrician. Learning opportunities would include (1) the ability to preview learning objectives (ie, 40 specific question topics provided before the beginning of the year)19; (2) access to resources (eg, search engines, clinical decision software) while answering questions; (3) immediate feedback after each question, including the correct answer, rationale for answers, and references; (4) repeated exposure to topics using similar questions; and (5) access to a “questions history” page with previously answered questions and answers.

The 2017 pilot year provided an ideal opportunity to prospectively evaluate outcomes related to learning and associated clinical practice change. We hypothesized pediatricians would report having learned or refreshed their knowledge but expected that a relatively small proportion of participants would indicate practice change as a result of participation. In the partner article “Pediatrician Perspectives on Feasibility and Acceptability of the MOCA-Peds 2017 Pilot,” the authors explore the acceptability and feasibility of MOCA-Peds.20 

The ABP partnered with RTI International to design and conduct the 2017 MOCA-Peds pilot evaluation using a mixed methods approach with quantitative and qualitative analyses. After review by the ABP’s Research Advisory Committee and an exempt designation from the RTI Institutional Review Board, all pediatricians (n = 6814) eligible to take the General Pediatrics MOC Examination in 2017 were recruited to participate in the pilot and enrolled via a registration survey. In total, 5081 pediatricians enrolled. Enrollees were sent surveys after the completion of each quarter’s questions, and some were randomly selected to participate in focus groups (29 participating) or to support ongoing work with a new 2017 user panel (25 participating), neither of which is the focus of this analysis. In January 2018, a voluntary end-of-year survey was administered to pediatricians who completed the 2017 pilot and met the passing standard.

The registration survey was used to enroll pediatricians in the 2017 pilot and to capture participant characteristics (eg, demographics, work setting) and attitudes toward lifelong learning and the MOC process. The quarter 4 survey addressed participants’ overall experience and the use of MOCA-Peds for learning. The end-of-year survey addressed overall satisfaction with MOCA-Peds and its impact on participants’ learning and clinical practice. In addition, respondents who affirmed they had learned, refreshed, or enhanced their medical knowledge received a single open-ended question: “What was the most significant practice change(s) you made as a result of participation in the 2017 pilot?” (see Fig 1). Quarter 1 to 3 surveys covered topics outside the scope of this article and are not addressed here.

FIGURE 1

Survey responses about knowledge and clinical practice change. aGeneral pediatrics and at least 1 subspecialty. bGeneral pediatric versus subspecialty distinctions are shown because of their statistical significance (P < .001) and relevance to the analyses.


The analytic sample (n = 4238) consisted of pediatricians responding to the quarter 4 and/or end-of-year survey, both of which contained questions on learning and practice change; 62.2% (n = 2634) completed both surveys, 32.6% (n = 1382) completed the quarter 4 survey only, and 5.2% (n = 222) completed the end-of-year survey only. Available demographic characteristics (sex, age, medical school type, certification status) were examined for differences between the analytic sample and participants who completed neither survey (ie, nonresponders). No significant differences were observed (data not shown; P > .05) except for age (P < .001): nonresponders tended to be slightly younger, although differences were small and not clinically meaningful.

Demographic characteristics (ie, sex, age, medical school graduate type, and general pediatrician [GP] versus subspecialist), lifelong learning attitudes, and practice variables were evaluated with descriptive statistics. Data from attitude survey response scales were collapsed into categories (ie, “strongly agree” and “agree” were recoded to “agree,” and “strongly disagree” and “disagree” were recoded to “disagree”). χ2 analyses were conducted to determine if there were any significant differences between responders and nonresponders or across key variables of interest, including demographics. In some cases, respondents did not complete all questions; therefore, the number of responses for an item may be lower than reported above. SPSS 25 (IBM SPSS Statistics, IBM Corporation) was used for all quantitative analyses.21 

Of the 1727 participants positively indicating a practice change, 1456 (84.3%) responded to the open-ended question concerning the specific change made; these responses were uploaded into the Web-based software Dovetail 2018, a qualitative analysis tool.22  Using thematic analysis,23  the open-ended text was first read by each reviewer (A.L.T., A.B.) to develop a coding scheme of major themes. Responses were then assigned codes by one reviewer and fully reviewed by a second reviewer. Differences of opinion regarding the code assignments were discussed and resolved during the second review. The coding scheme was revised as needed throughout the process. All authors reviewed and approved the final results and exemplar quotes.

The coding scheme was split into 3 broad categories: (1) type of change mentioned, (2) responses’ association to the 2017 learning objectives19  and the content domain for general pediatrics,24  and (3) practice improvement. Type of change was subdivided into 2 categories: knowledge (eg, respondent indicated “updating” or “refreshing” knowledge) and practice. Coding as knowledge change did not preclude that a practice change was made, only that it was not explicitly stated. Medical content in open-ended comments was coded by learning objective and ABP content domain. A cross-tabulation on the type of change and content domains produced an overall count for each content domain. Finally, practice improvement codes were attributed to comments not relating to a specific medical content area covered in MOCA-Peds’ questions.

The analytic sample was 66.7% female, close to half were 40 to 49 years of age, 77.9% were American medical school graduates, 73.9% were GPs, and 26.1% were maintaining 1 or more ABP subspecialty certifications. With respect to lifelong learning attitudes and practices captured on the registration survey, 97.1% agreed that lifelong learning was a professional responsibility of all physicians, yet only 79.1% routinely made time for self-directed learning. Over two-thirds (69.5%) reported routinely searching computer databases to learn about new developments in pediatrics (Table 1). Bivariate analyses of demographic characteristics by attitudes and practices of lifelong learning yielded no meaningful differences (P > .05, data not shown).

TABLE 1

Analytic Sample Characteristics and Attitudes on Learning and MOCA-Peds (N = 4238)

Sample Characteristics  %
Demographicsa  
 Sex  
  Male 33.3 
  Female 66.7 
 Age, y  
  30–39 20.3 
  40–49 45.6 
  50–59 29.4 
  ≥60 4.7 
 Medical school graduate type  
  American medical school 78.0 
  International medical school 22.0 
 Certification type  
  General pediatrics 73.9 
  General pediatrics and at least 1 subspecialty 26.1 
Baseline lifelong learning attitudes and practices (percentage of strongly agree and agree)b  
 Lifelong learning is a professional responsibility of all physicians 97.0 
 I routinely make time for self-directed learning, even when I have a busy work schedule and other professional and family obligations 78.9 
 I routinely search computer databases to find out about new developments in pediatrics 86.6 
Attitudes about MOCA-Peds’ impact on learning and provision of care (percentage of strongly agree and agree)c  
 MOCA-Peds questions were useful learning tools 86.6 
 MOCA-Peds has helped me to identify my knowledge gaps 77.1 
 The MOCA-Peds program helps me provide better care to my patients 60.1 

N reflects the total number of participants who responded to the 2017 pilot quarter 4 survey and/or the 2017 pilot end-of-year survey.

a Demographics were pulled from the ABP Certification Management System.

b Lifelong learning attitudes were pulled from the 2017 pilot registration data set.

c Attitudes on learning and provision of care were pulled from the quarter 4 survey.

On the quarter 4 survey, 77.1% agreed that pilot participation helped to identify personal knowledge gaps, and over half (60.1%) endorsed that MOCA-Peds helped them to provide better patient care. A majority (86.6%) agreed that MOCA-Peds questions were useful learning tools (Table 1). In each case, results differed slightly by age (P < .001), with younger participants generally indicating higher agreement (eg, 64.2% of those aged 30–39 years said participation led to providing better patient care compared with 51.9% of those aged 60 years and older). GPs and subspecialists also differed in the percentage that indicated that participation in the pilot led to providing better patient care, with 63.3% of GPs and 51.0% of subspecialists indicating improved care (P < .001).

In Fig 1, we display results for the end-of-year survey questions on knowledge and practice change. A total of 97.6% of pediatricians indicated they had learned, refreshed, or enhanced their medical knowledge as a result of the pilot. Of these, 62.0% indicated that a knowledge change had led to at least 1 practice change, and 16.8% planned to make a change moving forward. The remaining 21.2% selected either “No, because my practice area is not general pediatrics focused” or “No, for any other reason.” Significant differences were observed, with GPs more likely than subspecialists to have already made a practice change (P < .001).

Of the 1727 who indicated they made a practice change, 84.3% responded to the open-ended question, “What was the most significant practice change(s) you made as a result of participation in the 2017 pilot?” In most instances, respondents indicated knowledge (26.4%) or practice change(s) (56.7%) but not both, permitting count totals by content domain in Table 2.

TABLE 2

Top 15 Practice and Knowledge Changes by Examination Content Domains (n = 1456) in Descending Order by Counts

Examination Content Domains (% of Questions on Examination)a  Knowledge Change Countb  Practice Change Countb  Exemplar Quotes
Eye, ear, nose, and throat (4) 21 104 “I identified a Kawasaki pt based on review - not common in our practice - potentially lifesaving; that is just one instance. . .” 
“I improved my management of AOM in patients older than 2 years, implementing a period of watchful waiting.” 
“My care for corneal abrasions was altered to be based on the current care guidelines.” 
Preventive pediatrics and well-child care (8) 10 96 “I think the biggest change was feeling more comfortable doing the flu vaccine in patients who reported [an] egg allergy.” 
“I was able to better inform my patients and remind my partners about [the] Flu vaccine and Flu precautions.” 
“I altered some of my antibiotic prescribing practices after reviewing some of the literature.” 
“Fine-tuned diagnosis and management of pediatric hypertension.” 
Mental and behavioral health (5) 65 “Rethinking my workup of Autism spectrum disorders/developmental disabilities.[. . .] hopefully creating a larger differential for ADHD to avoid overdiagnosis [in] children.” 
“I was able to distinguish between ADHD, bipolar, conduct disorders and ODD much [more] easily and refer them appropriately.” 
“[F]ollow up on ADHD patients and [watch] more closely for masked psychiatric disorders such as depression and anxiety.” 
Infectious diseases (7) 61 “Reviewed newer infectious disease guidelines that do actually affect my practice.” 
“I was able to talk more knowledgeably about vaccine-preventable diseases.” 
“I am more proactive in treating influenza patients with oseltamivir.” 
“Some of the questions reminded me to further investigate babies born SGA for possible infectious etiologies of growth concerns.” 
Pulmonology (5) 50 “Treating allergic rhinitis with nasal steroids as a first line treatment instead of anti-histamines.” 
“Review of Asthma guidelines kept me up to date with the asthma protocol in an urgent setting.” 
“Reviewed newest protocols for bronchiolitis, palivizumab, etc.” 
Nephrology, fluids, and electrolytes (4) 42 “I increased my differential with a patient with potential renal issues.” 
“We looked at our UTI management because of one of the questions I encountered on the test and we have to work to improve our evidence-based management of dysuria.” 
Endocrinology (4) 35 “Improved my management of endocrine cases mainly because of the reading after missed answers in this area more than others.” 
“Improved algorithm for evaluation of problems such as short stature.” 
Orthopedics and sports medicine (4) 26 “I did a lot of review with ortho complaints and injury patterns – seemed to be a shortfall for me.” 
“Management of ankle/foot pain (reviewing that) has been pertinent in daily use.” 
“Reviewed evaluation of child with limp.” 
Adolescent care (5) 16 “I was much better able to discuss issues of puberty, both precocious and delayed as well as discuss and diagnose menstrual difficulties.” 
“More aware of STIs, tests, and treatments in adolescent patients.” 
“Understanding contraception choices for teens.” 
Genetics, dysmorphology, and metabolic disorder (3) “Learned more about genetic defects that have been discovered since I trained.” 
“Being aware of possible critical newborn illnesses, such as galactosemia, again.” 
“More attention to family history and possible genetic disorders.” 
Urology and genital disorders (3) “I have changed the way I approach testicular exams, paying more attention to feature[s] of epididymitis.” 
“Awareness of diagnostic approach to epididymitis.” 
Neurology (5) 10 “Increased awareness and management of Medication Overuse Headache.” 
“I’ve changed the way I treat sinusitis, being more conservative with antibiotics.” 
“It has helped me develop an algorithm for evaluating headaches.” 
Child abuse and neglect (4) “Enhanced nonaccidental trauma evaluations.” 
“Increased surveillance for signs of child abuse.” 
Fetal and neonatal care (5) “Applying some of the new resuscitation algorithms during care of critical patients.” 
“Newborn screening teaching changes.” 
Cardiology (4) “Better ability to diagnose cardiac murmurs.” 
“Testing made me aware of deficits in cardiology. Did some extra studying.” 

ADHD, attention-deficit/hyperactivity disorder; AOM, acute otitis media; ODD, oppositional defiant disorder; pt, patient; SGA, small for gestational age; STI, sexually transmitted infection; UTI, urinary tract infection.

a The percentage of questions on the examination is based on examination weightings used at the time of the 2017 pilot. Percent weights are shown in the ABP’s General Pediatrics Content Outline. “Preventive pediatrics and well-child care,” at 8%, represents the area with the most questions in MOCA-Peds.

b Each count represents a respondent; percentages of the total are not shown because a respondent may have indicated multiple medical content areas in one comment and may therefore appear in this table more than once.

In Table 2, we present the 15 most frequently mentioned content domains in descending order by count. Content domains with the largest participant endorsements included eye, ear, nose, and throat; preventive pediatrics and well-child care; and mental and behavioral health. Participants indicated that learning occurred both for problems commonly encountered in practice (eg, flu vaccine) and for rare events that would be important not to miss (eg, a child acutely presenting with Kawasaki syndrome). Across content domains, the 3 most commonly cited learning objectives in which knowledge or practice change occurred were to (1) plan the management of a child with otitis media (n = 107); (2) manage a child with an acute asthma exacerbation (n = 47); and (3) plan the management of a child with influenza (n = 44).

In Table 3, we capture the additional 400 responses on practice changes beyond those associated with a specific content domain. Of the 9 subthemes, the 3 most common were use of evidence-based medicine, increased review of materials or resource use, and improved differential diagnosis. With respect to “evidence-based medicine,” pediatricians indicated that participation in MOCA-Peds exposed them to new material, encouraging the review of guidelines or standard care protocols and their application. Respondents also increased their review of materials or resource use (eg, clinical decision support tools) in practice, updated the types of resources used, or more generally sought to “stay up-to-date.” For the “improved differential diagnosis” category, participants either mentioned a specific area of clinical care or noted that exposure to MOCA-Peds enhanced their overall differential-making process.

TABLE 3

Positive Practice Improvements in Descending Order (n = 1456)

Theme  Exemplar Quotes  Subtheme Counta
Use of evidence-based medicine “[. . .]I do remember being surprised by some guideline changes - as outlined in my reading around the questions and in the answer explanations. I try to incorporate changes, as I am convinced about appropriate changes in the guidelines.” 129 
“Making sure medical decision-making is evidence-based and current.” 
“Checking most current recommendations and practice guidelines.” 
“Refreshing my memory on concepts not frequently used, and staying up-to-date with the latest recommendations / evidence.” 
“Taking time to review national guideline[s].” 
Increased review of materials or resource use “After practicing for 17 years, I have become a bit stuck in my ways. With new pediatricians joining my group I have become more aware of online resources. Using [electronic point-of-care medical resource] during the exam process has helped me find more current clinical treatment guidelines.” 123 
Theme | Exemplar Quotes | Subtheme Count (a)
Use of evidence-based medicine “[. . .]I do remember being surprised by some guideline changes - as outlined in my reading around the questions and in the answer explanations. I try to incorporate changes, as I am convinced about appropriate changes in the guidelines.” 129 
“Making sure medical decision-making is evidence-based and current.” 
“Checking most current recommendations and practice guidelines.” 
“Refreshing my memory on concepts not frequently used, and staying up-to-date with the latest recommendations / evidence.” 
“Taking time to review national guideline[s].” 
Increased review of materials or resource use “After practicing for 17 years, I have become a bit stuck in my ways. With new pediatricians joining my group I have become more aware of online resources. Using [electronic point-of-care medical resource] during the exam process has helped me find more current clinical treatment guidelines.” 123 
“I routinely access clinical guidelines during chart review for conditions that I see less frequently because the MOCA pilot highlighted areas of practice where my knowledge was not as strong.” 
“I had gotten out of the habit of reading to stay current and MOCA-Peds was a great jump-start to get going again.” 
“MOCA-Peds helps you realize what has changed in medicine and put it into practice.” 
“It made me read more and update my resources.” 
“Improved my ability to find resources I need for patient care.” 
“I utilized more articles and research at the bedside for patient care.” 
“Having browsers with quick links to trusted information sites on all computers used for clinical practice.” 
Improvement in differential diagnosis “Given the opportunity to refresh each condition allows me to expand differentials. When you are in practice for a few years, you tend to avoid the broader differential for the most common diagnosis.” 47 
“Think of differential diagnoses that I would not have considered before.” 
“Helped to broaden my differential and update me on guidelines.” 
“Included more in my differential diagnosis when seeing patients.” 
“Improved decision-making skills/differential diagnosis formulation.” 
“Differentials, remembering some zebras in the differential.” 
Better patient education or counseling “The phrase ‘up until 3rd grade kids are learning to read, after 3rd grade kids are reading to learn,’ and this as the basis that learning disabilities involving verbal and visual language processing often arise at this time. I do a lot of behavioral and mental health in my primary care setting, and this has been very helpful as a way to explain this to parents.” 28 
“I provided better and updated education to my patients.” 
“Better patient education.” 
Learning better in this assessment format “By not studying endless hours of minutia for a sit-down exam, I was able to accomplish these questions faster and immediately learn from the discussions and focus my extra time on practical continuing education for my patients.” 19 
“Keeping up to date in real time rather than cramming for a test.” 
Subspecialist keeping up with general pediatrics “It’s really just refreshing general knowledge about things and in reinforcing things that I maybe haven’t thought about in a while. I am a subspecialist, and so there are many gen peds items I don’t really look back on. However, I feel like refreshing my gen peds knowledge makes me a better physician in general.” 19 
“As a practicing subspecialist, I am now more comfortable with general pediatrics.” 
“I am a subspecialist, so it was really helpful to ‘relearn’ some general pediatrics as well as become more up-to-date with general pediatric guidelines.” 
Discussion of topics for improvement with others “Updated information on development, nephrology. We have started practice improvement projects based on it.” 13 
“When I came across clinically relevant information, I shared with my partners at formal clinical discussions.” 
“Reinforced best practices with my partners in group so all [are] doing similar.” 
“Sharing of the knowledge-based information with my partners to effect changes in the group practice. Quality improvement with respect to practice change in management of NICU babies with chronic lung disease.” 
Use of references and resources for teaching trainees “I was able to impart my new-found knowledge on my pediatric resident trainees as several times patients with evidence of AOM presented during this past year. We discussed the guidelines/criteria and managed multiple patients based on this.” 12 
“I used information I gained from the pilot during teaching rounds with the residents.” 
“It was very practical overall, one area was the recognition of new practice guidelines that I was otherwise unaware of, which allowed me to in turn educate my colleagues and resident learners.” 
“Staying more engaged with general pediatrics topics to discuss with residents during morning report.” 
Better patient and family histories “Taking a more-thorough history and physical to make it easier to make an accurate diagnosis.” 10 
“Learned to listen to parent[s] more.” 
“Better counseling and more awareness of who needs further workup.” 
“Better history taking from parents and patients.” 
“Ask more screening questions on histories.” 
(a) Each count represents a respondent; therefore, percentages of counts out of the total are not shown because a respondent may have indicated multiple areas in one comment and may appear in this table more than once.

Although many responses highlighted patient care, some noted that MOCA-Peds had a more general influence on patient interactions through better patient education, counseling, and collection of patient histories. These responses were characterized by statements such as “I provided better and updated education to my patients” and “Learned to listen to parent[s] more.” Although MOCA-Peds limits the discussion of specific questions for security purposes, 2 themes related to sharing ideas with colleagues appeared: (1) “discussion of topics for improvement with others” and (2) “use of references and resources for teaching trainees.”

Results from the MOCA-Peds 2017 pilot demonstrate rich variation in the type and amount of self-reported knowledge and practice change among respondents. Nearly all participants indicated that they learned, refreshed, or enhanced their medical knowledge; many also made 1 or more practice changes as a result of participating in the pilot. Nearly all analyzed comments related to items a pediatrician had direct control over (eg, updating a differential diagnosis, adjusting medication, communicating better with parents and patients), which is not surprising given the complexity of system-wide change efforts.25  Differences in reported practice changes between GPs (68.9%) and subspecialists maintaining their general pediatrics certification (41.4%) were expected given differences in clinical practice, although a small number of subspecialists (n = 19) specifically commented that participation in MOCA-Peds, although focused on general pediatrics, helped them provide better care to their subspecialty patients (Table 3).

Before the pilot, we hypothesized that practice change would occur for a relatively small proportion of participants. We believed that practice change would most likely be linked to medical content and that any recently updated guideline provided in the references would appear in pediatrician comments more often than other topics. The 2017 learning objective most clearly linked to a recently updated guideline26  (ie, clinical management of otitis media) was endorsed most often (n = 107). Whether this objective’s endorsement was due to heightened awareness through MOCA-Peds, the high frequency of otitis media in clinical practice, or coverage in the published literature is less important than the fact that practice change was reported. Beyond specific content, the largest number of other positive practice changes reported (n = 129) was attributed to the theme “following evidence-based medicine.” These positive practice improvements, ancillary to medical content (eg, better patient education or counseling, discussing topics for improvement with others), were largely unanticipated and represent an outcome beyond improved knowledge in a medical domain. Of note, only 69% of MOCA-Peds participants reported regularly using computer databases to find information on pediatric developments before the pilot. Given the high number of comments about increased resource use in regular clinical practice, interacting with MOCA-Peds (which permits time to search varied information sources and links references and/or guidelines to each question) may in itself lead to practice change around evidence-based medicine and resource use.

This study demonstrates that longitudinal assessment tools can have a positive short-term impact on practice improvement. Before MOCA-Peds was developed, only the ABA had evaluated the impact of an online longitudinal assessment for physicians, during its 2014–2015 MOCA Minute pilot, a voluntary, 16-week study in which MOCA Minute was conceptualized as preparation for the secure examination. In that study, 96.7% of participants agreed that “questions were useful learning tools,” and 80.7% agreed that participation helped them “provide better care to patients.”18  The differences in survey responses about MOCA-Peds and MOCA Minute may relate to differences in pilot program structure, participation volume, and survey response rates (81.4% [ABP] versus 57% [ABA]).

MOCA-Peds was developed as an alternative to the proctored examination, and exact comparisons are not possible. However, only 52% (699 of 1345) of non–MOCA-Peds participants (who completed the ABP’s 2017 MOC General Pediatrics Proctored Examination) agreed that their preparation for that examination was “very” or “extremely” valuable as a learning experience (data not shown).27  The only mechanism for learning with the proctored examination is preparatory studying; feedback is limited to a score report delivered several weeks postexamination. In contrast, MOCA-Peds provides learning objectives in advance and offers additional learning opportunities through immediate resources after each question (ie, question rationales, references, and peer performance results). In fact, in 2017, approximately half of MOCA-Peds participants reported that they did not study before answering questions, despite having learning objectives available in advance.28  Additionally, MOCA-Peds moves closer to spaced-education principles by pacing the assessment and learning process over time, whereas the proctored examination occurs at a single point in time.15,29,30  Although MOCA-Peds is not solely a learning platform, early results indicate that it moves beyond pure assessment toward a more formative learning experience that provides value to most pediatricians.

Limitations of this research include a relatively modest response rate (58.8%) for the end-of-year survey and dependence on self-report. However, differences between respondents and nonrespondents were notably small or nonsignificant. Nonetheless, the attitudes reported may reflect a voluntary participant group that may have been biased by the newness of the program. In addition, those who opted out of MOCA-Peds or failed to meet the passing standard were not sent the survey (n = 216), which may have introduced sampling bias. Because the pilot study was focused on the new assessment method, no comparison was attempted with other types of learning activities (eg, self-assessments, continuing medical education). Finally, the evaluation did not objectively measure the level of practice change or whether change endured. These and other questions about the impact of ongoing MOCA-Peds participation will be addressed in future evaluations.

Approximately 6 in 10 pediatricians reported being able to apply knowledge gained from participation in the MOCA-Peds pilot to advance clinical practice. The increased use of electronic data resources and evidence-based medicine by pediatricians may be the most apparent benefit among participants. Additionally, nearly all participants indicated they had learned or refreshed their medical knowledge by participating in the pilot. These results support the notion that assessment and learning can be integrated into a single platform and need not be considered mutually exclusive activities, in turn adding value for practicing physicians.31  Other ABMS member boards are launching longitudinal assessments, and further evaluation is needed to better understand the factors fostering learning and practice change across different models of assessment. As models for continuing certification are proposed and tested, the hope is to provide richer opportunities to guide learning and practice change, ultimately improving care for pediatric patients.

We thank the numerous individuals and groups associated with developing MOCA-Peds and the pilot participants for being willing to take a chance on a new assessment idea for pediatrics. The teams involved in MOCA-Peds extend to ABP staff, ABP committees and task forces, the ABP’s Family Leadership Advisory Group, focus groups, user panels, and more. Over 11 000 pediatricians volunteered in 2017 and 2018, and this pilot and evaluation would not have been possible without their time and commitment to giving feedback. The ABA deserves special credit for leading the entire ABMS board community, including the ABP, in longitudinal assessment. Lastly, we thank Virginia Moyer, David Nichols, Laura Brooks, Brenda Nuncio, and members of the ABP Research Advisory Committee for their editorial review.

Drs Leslie and Olmsted, Ms Smith, and Mr Turner contributed to the conception, design, and evaluation of the Maintenance of Certification Assessment for Pediatrics pilot and its formative evaluation, the acquisition of the data, and the analysis and interpretation of the data and drafted the initial manuscript; Dr Dounoucos and Mr Bradford contributed to the analyses and interpretation of the qualitative data; Dr Althouse contributed to the conception, design, and evaluation of the Maintenance of Certification Assessment for Pediatrics pilot and its formative evaluation; and all authors reviewed and revised the manuscript, approved the final manuscript as submitted, and agree to be accountable for all aspects of the work.

FUNDING: The evaluation of the 2017 Maintenance of Certification Assessment for Pediatrics pilot was funded by the American Board of Pediatrics Foundation, a nonprofit type 1 supporting organization to the American Board of Pediatrics.

COMPANION PAPER: A companion to this article can be found online at www.pediatrics.org/cgi/doi/10.1542/peds.2019-2303.

     
ABBREVIATIONS:

  • ABA: American Board of Anesthesiology
  • ABMS: American Board of Medical Specialties
  • ABP: American Board of Pediatrics
  • GP: general pediatrician
  • MOC: Maintenance of Certification
  • MOCA-Peds: Maintenance of Certification Assessment for Pediatrics

References

1. Nichols DG. Maintenance of certification and the challenge of professionalism. Pediatrics. 2017;139(5):e20164371
2. Hawkins RE, Lipner RS, Ham HP, Wagner R, Holmboe ES. American Board of Medical Specialties Maintenance of Certification: theory and evidence regarding the current framework. J Contin Educ Health Prof. 2013;33(suppl 1):S7–S19
3. Leslie LK, Olmsted MG, Turner A, et al. MOCA-Peds: development of a new assessment of medical knowledge for continuing certification. Pediatrics. 2018;142(6):e20181428
4. Strasburger VC, Greydanus DE, Sigman GS. What should MOC ideally look like? Clin Pediatr (Phila). 2015;54(9):831–832
5. Strasburger VC. Ain’t misbehavin’: is it possible to criticize maintenance of certification (MOC)? Clin Pediatr (Phila). 2011;50(7):587–590
6. Vision Initiative Commission. Continuing board certification: vision for the future. 2018. Available at: https://visioninitiative.org/. Accessed April 26, 2018
7. The American Board of Medical Specialties. What is longitudinal assessment? Available at: https://www.abms.org/initiatives/certlink/what-is-longitudinal-assessment/. Accessed May 7, 2019
8. Rohrer D, Pashler H. Recent research on human learning challenges conventional instructional strategies. Educ Res. 2010;39(5):406–412
9. Ryan RM, Deci EL. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am Psychol. 2000;55(1):68–78
10. Schumacher DJ, Englander R, Carraccio C. Developing the master learner: applying learning theory to the learner, the teacher, and the learning environment. Acad Med. 2013;88(11):1635–1645
11. Carpenter SK. Cue strength as a moderator of the testing effect: the benefits of elaborative retrieval. J Exp Psychol Learn Mem Cogn. 2009;35(6):1563–1569
12. Rowland CA. The effect of testing versus restudy on retention: a meta-analytic review of the testing effect. Psychol Bull. 2014;140(6):1432–1463
13. Pastötter B, Bäuml K-HT. Reversing the testing effect by feedback: behavioral and electrophysiological evidence. Cogn Affect Behav Neurosci. 2016;16(3):473–488
14. The American Board of Pediatrics. Future of testing conference summary. Available at: https://abpedsfoundation.org/fotc-2015/. Accessed March 29, 2018
15. Kerfoot BP, Lawler EV, Sokolovskaya G, Gagnon D, Conlin PR. Durable improvements in prostate cancer screening from online spaced education: a randomized controlled trial. Am J Prev Med. 2010;39(5):472–478
16. Shaw TJ, Pernar LI, Peyre SE, et al. Impact of online education on intern behaviour around joint commission national patient safety goals: a randomised trial. BMJ Qual Saf. 2012;21(10):819–825
17. Schuwirth LWT, van der Vleuten CPM. The use of progress testing. Perspect Med Educ. 2012;1(1):24–30
18. Sun H, Zhou Y, Culley DJ, et al. Association between participation in an intensive longitudinal assessment program and performance on a cognitive examination in the Maintenance of Certification in Anesthesiology Program®. Anesthesiology. 2016;125(5):1046–1055
19. The American Board of Pediatrics. 2017 MOCA-Peds pilot learning objectives. Available at: https://www.abp.org/mocapeds/pilot-learning-objectives. Accessed May 10, 2019
20. Leslie LK, Turner AL, Smith AC, et al. Pediatrician perspectives on feasibility and acceptability of the MOCA-Peds 2017 pilot. Pediatrics. 2019;144(6):e20192303
21. SPSS Statistics for Windows [computer program]. Armonk, NY: IBM Corporation; 2017
22. Dovetail [computer program]. Sydney, Australia: Dovetail Research Party; 2018. Available at: https://dovetailapp.com
23. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101
24. The American Board of Pediatrics. General pediatrics content outline. 2017. Available at: https://www.abp.org/content/content-outlines-0. Accessed May 7, 2019
25. Field B, Booth A, Ilott I, Gerrish K. Using the Knowledge to Action Framework in practice: a citation analysis and systematic review. Implement Sci. 2014;9(1):172
26. Lieberthal AS, Carroll AE, Chonmaitree T, et al. The diagnosis and management of acute otitis media. Pediatrics. 2013;131(3). Available at: www.pediatrics.org/cgi/content/full/131/2/e964
27. Turner A. ABP Internal Analysis. Chapel Hill, NC: The American Board of Pediatrics; 2019
28. Turner A, Leslie L, Furter R, Haig V. Summary of the 2017 MOCA-Peds Pilot, Use Patterns, Feasibility, Acceptability, and Scoring. Chapel Hill, NC: The American Board of Pediatrics; 2018. Available at: https://www.abp.org/sites/abp/files/pdf/moca-peds-research-summary-2017.pdf. Accessed December 13, 2018
29. Kerfoot BP, Shaffer K, McMahon GT, et al. Online “spaced education progress-testing” of students to confront two upcoming challenges to medical schools. Acad Med. 2011;86(3):300–306
30. House H, Monuteaux MC, Nagler J. A randomized educational interventional trial of spaced education during a pediatric rotation. AEM Educ Train. 2017;1(2):151–157
31. Southgate L, van der Vleuten CPM. A conversation about the role of medical regulators. Med Educ. 2014;48(2):215–218

Competing Interests

POTENTIAL CONFLICT OF INTEREST: Drs Leslie and Althouse, Mr Bradford, and Mr Turner are employees of the American Board of Pediatrics. Drs Dounoucos and Olmsted and Ms Smith are employees of RTI International, an international, nonprofit research firm with whom the American Board of Pediatrics contracted to conduct this evaluation.

FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.