BACKGROUND:

In the United States, up to 20% of children experience a mental health (MH) disorder in a given year, many of whom remain untreated. Routine screening during annual well visits is 1 strategy providers can use to identify concerns early and facilitate appropriate intervention. However, many barriers exist to the effective implementation of such screening.

METHODS:

A 15-month quality improvement learning collaborative was designed and implemented to improve screening practices in primary care. Participating practices completed a survey at 3 time points to assess preparedness and ability to promote and support MH issues. Monthly chart reviews were performed to assess the rates of screening at well visits, documentation of screening results, and appropriate coding practices.

RESULTS:

Ten practices (including 107 providers) were active participants for the duration of the project. Screening rates increased from 1% at baseline to 74% by the end of the project. For the 1 practice for which more comprehensive data were available, these screening rates were sustained over time. Documentation of results and appropriate billing for reimbursement mirrored the improvement seen in screening rates.

CONCLUSIONS:

The learning collaborative model can improve MH screening practices in pediatric primary care, an important first step toward early identification of children with concerns. More information is needed about the burden placed on practices and providers to implement these changes. Future research will be needed to determine if improved identification leads to improved access to care and outcomes.

In the United States, up to 20% of children experience a mental health (MH) disorder in a given year,1 causing impairment and requiring intervention.2 Early identification is essential because the prompt recognition and treatment of MH issues can mitigate social and academic problems into adulthood; however, few youth receive this treatment.3,5 Many factors contribute to unmet MH needs, including the workforce shortage of child MH professionals.4,5 Pediatric primary care providers (PCPs) can play a critical role in ameliorating the MH service gap.6 

Routine MH screening during annual well visits, encouraged by the American Academy of Pediatrics (AAP),7,8 is 1 strategy to identify concerns early and facilitate appropriate intervention.2,8,11 However, PCPs face numerous barriers to implementing routine MH screening, including lack of time, insufficient referral resources, inadequate reimbursement,12 and limited partnerships with MH providers.13 Indeed, over half of PCPs report never or rarely using a standardized MH screening tool.14 

Little is written about practical strategies to improve MH screening rates, although the literature suggests that quality improvement (QI) can narrow the gap between recommended best practice and current care. QI has been used to improve early identification of and care for children with autism and adolescents and adults with depression when compared to the standard practice of a onetime educational intervention.15,18 Additionally, many states are successfully using QI to improve access to and quality of MH care.19 

The QI learning collaborative (LC) is a model that supports large-scale practice change by facilitating learning sessions and coaching for groups of practitioners and organizations to develop, test, and implement sustainable improvement strategies.20 QI LCs have been successfully used to target health issues in the pediatric primary care setting21; however, we know of no researchers applying this approach to MH screening. We therefore sought to determine if participation in a QI LC is associated with improvements in a practice’s capacity to address MH issues through routine screening, coding, and documentation.

Our overall aim was to improve annual MH screening rates by using an approved tool for children aged 1 to 18 years. Specifically, we aimed to do the following: (1) increase practices’ preparedness to address MH concerns through such constructs as education and workflow planning, (2) increase to 50% the percentage of annual well-child visits (WCVs) in which an approved MH screening tool is administered, (3) increase to 50% the percentage of WCVs with documentation of MH screening results, and (4) increase to 75% the percentage of WCVs with appropriate Current Procedural Terminology (CPT) coding. A preliminary needs assessment suggested that few providers were regularly conducting MH screening; target aims were set by content experts, taking into consideration published experience in other states.14

In Washington, District of Columbia (DC), an estimated 3000 to 20 000 low-income children have MH needs.22,23 In 2012, the DC Collaborative for Mental Health in Pediatric Primary Care (the DC Collaborative), a multidisciplinary coalition whose primary aim is to increase the integration of MH services into pediatric primary care, was formed. An early priority was to increase rates of MH screening at WCVs for youth aged 0 to 21 years. Leveraging local QI infrastructure and funding from state MH and health agencies, the DC Collaborative designed, implemented, and evaluated a citywide longitudinal QI LC.

The LC was planned as a 9-month project (February–October 2014; “round 1”). Because of demand from practices for ongoing support, the LC was extended for 6 months (January–June 2015; “round 2”). Round 2 was open to both round 1 practices and new practices. No other widespread training on this topic occurred locally from February 2014 to June 2015. Practices were invited to participate via electronic communication through multiple outlets, including membership groups, practice networks, government agencies, and personal outreach. Recruitment materials included information about the DC Medicaid managed care requirement that all members receive an annual MH screen using an approved screening tool and encouraged LC participation as a strategy for implementing this requirement in clinical practice. Practices that provided medical care to children and adolescents were eligible; there were no exclusion criteria. Screening tools approved by the DC Department of Behavioral Health included the Ages and Stages Questionnaire: Social-Emotional24 (3–66 months), the Strengths and Difficulties Questionnaire (SDQ; 2–17 years),25 and the Patient Health Questionnaire-926 (≥18 years).

Participation was free to practices in the greater Washington, DC, metropolitan area; there was no cap on the number of practices that could enroll. Individual participants were given the option to earn American Board of Pediatrics or American Board of Family Medicine Part IV Maintenance of Certification (MOC) credits. Hour-for-hour continuing medical education credits could be earned for each individual project component. Providers could earn MOC credit only if their practice completed all of the following requirements during each of the 2 project rounds: (1) monthly chart audit reports, (2) monthly team meeting reports, (3) monthly team leader calls, and (4) 3 plan-do-study-act (PDSA) reports. Within practices that met these requirements, individual providers could earn MOC credit if they fulfilled the following requirements for each round: (1) complete an audit of 12 charts, (2) attend 5 team meetings, and (3) participate in 6 learning sessions.

DC practices were provided with additional incentives, including the AAP Mental Health Toolkit, the Ages and Stages Questionnaire: Social-Emotional Starter Kit (English and Spanish), MH screening posters (in English and Spanish), and access to on-site consultation from an MH QI coach. This project received exempt status from the Children’s National Health System institutional review board.

The overall LC design was adapted from other MOC projects implemented by Children’s National Health System and informed by the experience that practice coaching was an integral part of the model (Fig 1).21 Within a month of registration, a QI coach offered an on-site visit to review program activities and practice responsibilities. Practices and individual providers were registered with QI TeamSpace (projects.upiq.org), a centralized platform that confidentially manages project content and data collection.

FIGURE 1

Project map. TBD, to be determined.


A freely available MH resource guide was developed (www.dchealthcheck.net). Additional support, including practice coaching by QI and MH experts, general technical assistance, electronic medical record (EMR) support (eg, assistance developing automated smart forms), and scoring aids (eg, transparencies), was provided.

During months 0 to 3, practices used a structured tool to assess their workflow and preparedness for conducting MH screening and developed a screening implementation plan. Practices conducted a baseline chart audit of 30 patients seen for a WCV during the previous 6 months. Beginning in May 2014, practices performed monthly chart audits (minimum 10–15 charts). Charts for audit were randomly selected by the practice sites. Monthly hour-long webinars provided educational content on QI concepts, MH screening and implementation strategies, methods for engaging families, and management of common MH concerns in primary care. Technical assistance and 1 team leader call were provided between rounds 1 and 2 (November–December 2014). Archived webinars, patient handouts, and additional resources were uploaded to QI TeamSpace.

Practices received monthly report cards based on their chart review data. The report included information about their performance in each domain (eg, percentage of chart reviews in which screening was completed) for that month and the project period, screening targets, and distance from that goal. Each practice designed and implemented individualized PDSA cycles at ∼2-month intervals (3 cycles per round) on the basis of their initial practice assessment, the previous month’s chart audit, or other available data (Table 1). Although practices were not required to seek approval for their PDSA cycles, they were encouraged to consult our expert team.

TABLE 1

Sample PDSA Cycles

MH Screening Practice Measure | Example PDSA Cycle

Increase provider and practice readiness to perform annual MH screenings | Design a more efficient workflow to accommodate the addition of new screening tools at well visits.
  Analyze the current workflow for existing developmental screening tools to identify what is working well and areas for improvement. Use observation and interviews. Engage a multidisciplinary team to maximize buy-in.
  Once the new workflow is established, test it on a small scale with existing developmental screening tools and revise as necessary.
  Implement the new workflow clinic-wide. Add MH screening tools for various age groups in stepwise fashion by using the new workflow.

Increase the percentage of annual well visits in which an approved screening tool is used | Initiate annual MH screening for 4- to 10-y-olds with the SDQ.
  Delineate the workflow and purchase additional materials to facilitate the workflow as needed.
  Survey staff after implementation to determine what is working well and what workflow adjustments need to be made.
  Translate the introduction and disclosure statement printed on the screening tools into Spanish. Decide where providers will document in the medical record that the SDQ was completed. Design a workflow for scheduling follow-up for patients with identified MH concerns.

Increase the rates of billing for use of the MH screening tool | Improve billing rates for use of the screening tool.
  Have the project leader talk with individual providers about their billing practices. Provide education about the use and importance of appropriate billing for the screening tool.
  Audit charts before the intervention and 2 wk postintervention.
  Improvement was noted for 3 of 4 providers; repeat the intervention with the fourth provider to assess individual and practice-level barriers.

During monthly team leader calls, deidentified data were used to track progress and compare results. Run charts for practice-specific data and overall LC-wide data were presented and discussed. Practices also shared their experiences with MH screening (including the results of their PDSA cycles, challenges, and successes) to facilitate shared learning.

The AAP Mental Health Practice Readiness Inventory (MHPRI)27 was used to assess practices’ self-reported readiness to address MH issues in 5 areas: community resources, health care financing, support for children and families, clinical information systems and delivery system redesign, and decision support for clinicians.

Chart audit measures, which were selected via expert consensus on the basis of study aims, included the following: (1) MH screening completed by using an approved screening tool, (2) screening scored and documented, and (3) screening billed by using CPT code 96110. Additional components of the chart audit not relevant to the aims included patient age and whether the patient had a previously identified MH issue. Practices were coached on selecting representative, random charts for monthly audits; given that practices selected their own charts, selection was not done in a blinded or randomized fashion.

Practices were coached to implement screening in a stepwise manner in which they started screening a narrow age range of patients and improved workflow processes before expanding. Most practices selected charts for auditing from the age range in which they were targeting screening. As a balancing measure, providers completed a brief survey at the midpoint and end of the project to assess the perceived impact and effectiveness of integrating project activities within practice workflow.

Practices submitted standardized, structured data reporting forms on QI TeamSpace. Data were reviewed for completeness and accuracy both independently and jointly by 3 members of the project team.

Data analyses were performed in Excel. Changes in practice readiness to address MH were examined via practice averages on the 5 domains of the AAP MHPRI reported at pre-, mid-, and end-project time periods. Monthly chart audits examined changes in the percentage of (1) annual WCVs in which an approved screening tool was used compared with the total number of charts reviewed, (2) results documented compared with all instances when a screening tool was used, and (3) appropriate CPT codes used compared with all instances when a screening tool was used. Individual control charts were used to identify process variation and/or trends within overall practice data. Control charts were created by using Statistical Process Control Charts for Excel.
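The individual control charts referenced above follow the standard individuals (XmR) construction, in which control limits are set at the series mean plus or minus 2.66 times the mean moving range. As an illustrative sketch only (the authors used Excel-based software, not this code), the following Python computes those limits for a synthetic series of monthly screening rates; the data values and the function name `xmr_limits` are hypothetical.

```python
# Illustrative sketch of individuals (XmR) control-chart limits.
# The constant 2.66 converts the mean moving range into 3-sigma limits
# for an individuals chart; the monthly rates below are synthetic.

def xmr_limits(rates):
    """Return (center line, lower control limit, upper control limit)."""
    center = sum(rates) / len(rates)
    # Mean moving range: average absolute month-to-month change
    moving_ranges = [abs(b - a) for a, b in zip(rates, rates[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    ucl = center + 2.66 * mr_bar
    lcl = max(0.0, center - 2.66 * mr_bar)  # a screening rate cannot be negative
    return center, lcl, ucl

if __name__ == "__main__":
    monthly_rates = [55, 60, 58, 65, 70, 68, 72, 74]  # % of WCVs screened (synthetic)
    center, lcl, ucl = xmr_limits(monthly_rates)
    print(f"center={center:.1f}%, LCL={lcl:.1f}%, UCL={ucl:.1f}%")
```

Because each practice contributes a single observation per month, the individuals chart is a natural fit; a point outside the control limits, or a sustained run on one side of the center line, signals special-cause variation rather than routine month-to-month noise.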

A retrospective chart review of WCVs completed during and 6 months after the LC was conducted at the largest participating practice to determine sustained effects of the LC. This practice was selected because of its large size and consistency in using EMR smart forms, making data collection more feasible. For this practice, data were pulled for all WCVs of 4 to 10-year-olds. Charts were examined for completion of the SDQ, the approved screening tool for children this age. Data abstraction was completed by using structured screening data fields, so only those screens documented in this way in the chart were included in the analyses.

Of the 19 practices enrolled at any point, 8 were academic health centers, 6 were private practices, 4 were federally qualified health centers (FQHCs), and 1 was an outpatient specialty clinic. Ten practices (6 academic health centers, 2 FQHCs, and 2 private practices) were active participants throughout both rounds of the LC and were included in the analyses. Reasons for exclusion were participation in only 1 LC round (6 practices) and withdrawal from the project (3 practices). Reasons practices withdrew or completed only 1 round included being a single-provider practice, not being a primary care clinic, being located outside of DC, and internal staffing and infrastructure issues. The mean practice size among the 10 practices was 11 providers (range, 3–22), with a total of 107 providers participating.

On average, practices submitted 41.4 charts at baseline (SD = 30.6; median = 31.5) and 20 charts per month (SD = 16.2; median = 15.0), for a total of 2721 charts across the 10 practices. Providers attended an average of 11 practice QI team meetings (SD = 3.6; median = 11.5) and completed a mean of 12.33 continuing medical education hours (SD = 8.4; median = 10) for participating in practice team meetings, team leader webinars, learning sessions, and chart audits.

Results from the self-reported AAP MHPRI indicated mean improvement of 17% across all 5 domains from baseline. Overall, practices saw the most improvement in the health care financing (23% increase) and community resources (20% increase) domains (Tables 2 and 3).

TABLE 2

AAP MHPRI

Domain | Preproject Average (February 2014; n = 10 practices), Mean (SD) | Midproject Average (October 2014; n = 8 practices), Mean (SD) | End of Project Average (June 2015; n = 8 practices), Mean (SD) | Percent Change From Baseline Mean
Community resources | 2.3 (0.34) | 1.8 (0.47) | 1.7 (0.43) | 20
Health care financing | 2.5 (0.53) | 1.7 (0.25) | 1.8 (0.48) | 23
Support for children and families | 2.0 (0.32) | 1.8 (0.40) | 1.5 (0.36) | 17
Clinical information system redesign | 2.5 (0.41) | 2.4 (0.34) | 2.4 (0.11) |
Decision support for clinicians | 2.2 (0.31) | 2.0 (0.52) | 1.7 (0.35) | 17

A readiness score of 3 = We do not do this well; significant change is needed. A score of 2 = We do this to some extent; improvement is needed. A score of 1 = We do this well; substantial improvement is not currently needed.

TABLE 3

Practice Information

Practice | No. of FTE PCPsa | Practice Type | AAP MHPRI (Overall Average): Preproject / Midproject / End of Project | Screen Completed: Preproject % / End of Project % | Screen Scored and Documented: Preproject % / End of Project % | Billing Completed: Preproject % / End of Project %
4–9 Academic health center 2.2 1.7 1.8 73 100 100 
>10 FQHC 2.5 1.9 1.6 86 100 50 
4–9 Academic health center 2.4 1.9 1.9 84 100 100 
>10 Academic health center 2.4 2.1 1.9 100 100 100 
>10 Academic health center 1.9 1.6 1.6 67 100 100 
>10 FQHC 2.2 2.2 1.9 33 88 86 
2–3 Private practice 2.7 2.1 1.8 100 95 95 
4–9 Academic health center 2.4 — — 93b 86b 86b 
4–9 Private practice 2.5 2.4 40 100 67 
4–9 Academic health center 1.9 1.3 — 82 100 100 
Summary 2.3 1.9 1.8 76 97 88 

FTE, full-time equivalent; —, not available.

a Number of full-time equivalent PCPs.

b Because end-of-project data were not available, data were taken from the last month in which data were submitted.

Monthly chart audits demonstrated improvement at project end across all domains assessed (Figs 2–4). Data plotted by using individual x-control charts show significant improvement within the first 3 months, moving toward a more stable rate of improvement among practices by project end. Overall, screening and billing rates improved by 73 and 89 percentage points from baseline, respectively. There was variability in screening patterns across practices (Table 3). Approximately one-quarter (25.5%) of screens were in the elevated range, suggesting an MH concern, compared with 9% at baseline and slightly higher than national trends. Analysis of the comprehensive chart review from the largest participating practice revealed sustained increases in the percentage of MH screens completed and documented in the EMR during the LC, beginning in July 2014, and for 6 months afterward (Fig 5). Data from this practice also elucidated provider follow-up actions when screening indicated a concern, which ranged from management by the pediatric provider (26%) to referral to internal (17%) or external (14%) MH resources. Twenty-five percent of patients were already receiving services of some kind, 10% declined services and/or referral, and 9% were other or not documented.

FIGURE 2

Mean practice reports of MH screening completed by using an approved tool. Avg, average; LCL, lower control limit; UCL, upper control limit.

FIGURE 3

Mean practice reports of MH screens scored and documented. Avg, average; LCL, lower control limit; UCL, upper control limit.

FIGURE 4

Mean practice reports of completed MH screens billed by using 96110 CPT codes. Avg, average; LCL, lower control limit; UCL, upper control limit.

FIGURE 5

Percent of WCVs of children aged 4 to 10 with an SDQ result documented. Practice D (total well visits = 7107; SDQ screens = 4711). Avg, average; LCL, lower control limit; UCL, upper control limit.


Response to the balancing measure survey was poor (18%), limiting its usefulness in assessing the impact of screening on other aspects of practice. Anecdotally, on the basis of follow-up communication with participating practices and the state Medicaid agency, practices’ comfort and efficiency with MH screening appeared to improve over time. For example, providers reported on postproject surveys that they felt “much better able to screen and refer to community resources” and “more comfortable with some basic management in primary care of anxiety and behavior.” In our follow-up with practices that did not complete the LC, barriers to participation appeared to include lack of investment by senior leadership in prioritizing the project, existing challenges in managing patient workflow, and lack of alignment with the project’s focus on primarily DC-based practices.

Overall, the LC was an effective way to support participating practices in their efforts to implement, document, and bill for routine MH screening at WCVs. Substantive improvements were seen across all domains measured and, in at least 1 participating practice, were sustained over time. Additionally, practices demonstrated increased preparedness in several domains to address the MH needs of their patients, as measured by the AAP MHPRI.

Although the LC was associated with improvements in MH screening practices, a greater investment of time and resources was required to achieve this result than is typically dedicated to continuing education in the medical setting. To help offset this investment, practices were offered intensive and ongoing support in a variety of domains, including QI expertise, data analysis, technology support, and content expertise. Given the low baseline screening rates despite providers’ awareness that screening is a best practice, we suggest that intensive support was needed and that a traditional onetime educational approach would not have been adequate. The need for intensive practice engagement likely contributed to the fact that some practices were not able to complete both rounds of the project. This is consistent with findings from a systematic review of LCs, which reported a positive and sustained effect, although impact may be difficult to predict and depends in large part on organizational culture.28 However, for practices that are interested in improving their screening practices and willing to adopt a QI approach to change, these findings should be encouraging.

Although this project did not track reimbursement rates, substantive improvements were seen in CPT code use, which in theory should increase revenue. The 2015 DC Medicaid reimbursement rate was $5.19 for behavioral health screening (96127 CPT code) and $10.30 for developmental screening (96110 CPT code). Of note, there is considerable variability in these rates nationally, with some states lower and some much higher.29 Actual reimbursement could be impacted by any number of factors, including prenegotiated payment agreements and completed submission of claims. A more detailed analysis is needed to determine the actual fiscal impact for individual practices.

As a result of the overall low response rate to the midproject and end-of-project surveys, it was difficult to ascertain the impact that implementing MH screening had on workflow and practice efficiency, which can be a significant barrier to implementation. Practices that most effectively used multidisciplinary teams seemed to have the greatest success. Screening activities continued postproject for most practices, suggesting that they were not unduly burdened. Follow-up qualitative interviews are being conducted with participating PCPs and families who completed MH screens to better understand the impact of screening on workflow and clinical care.

There were several limitations to this project, which may limit its generalizability. First, it was conducted in a locale with QI infrastructure, a screening mandate, and academic resources. Many practices had previously participated in similar projects, so they may have had more comfort and skills with QI methodology than providers in other settings. Chart review was self-reported, and thus accuracy cannot be verified. Participation was voluntary and not randomly selected; therefore, there may be a selection bias, with those who participated being more highly motivated. Additionally, lack of control data from neighboring practices limits our ability to confirm that nonparticipating practices did not make similar improvements. Lastly, we do not have data regarding the proportion of patients with positive MH screen results who received care, outcomes for these patients, or the financial impact of the project.

Despite the limitations, there was evidence for meaningful improvement during the 15-month QI LC, which was sustained in at least 1 practice for 6 months afterward. More information is needed about the burden placed on practices and providers to implement these changes, which will in turn inform recommendations for improvement and additional supports needed by practices. For practices that were uninterested in or unable to fully engage in this project, a more individualized approach may be needed, although there is good reason to suggest a QI approach would be helpful. Additionally, cost-effectiveness analysis and expansion of this approach to other geographic regions will be important to determine long-term feasibility and sustainability. With these early findings, we suggest that the LC model can improve MH screening practices in pediatric primary care, an important first step toward early identification of patients with MH concerns. Future research will also be needed to determine if improved identification leads to improvements in access to MH care and patient outcomes.

     
AAP, American Academy of Pediatrics
CPT, Current Procedural Terminology
DC, District of Columbia
EMR, electronic medical record
FQHC, federally qualified health center
LC, learning collaborative
MH, mental health
MHPRI, Mental Health Practice Readiness Inventory
MOC, Maintenance of Certification
PCP, primary care provider
PDSA, plan-do-study-act
QI, quality improvement
SDQ, Strengths and Difficulties Questionnaire
WCV, well-child visit

Dr Beers conceptualized and designed the study, drafted the initial manuscript, and reviewed all data analyses and revised the manuscript; Dr Godoy and Ms John participated in the study design and implementation, conducted the initial analyses, drafted portions of the manuscript, and reviewed and revised the manuscript; Drs Long, Biel, Anthony, Moon, and Weissman participated in the study design and implementation and reviewed and revised the manuscript; Ms Mlynarski made substantive contributions to the interpretation of the study data, drafted portions of the manuscript, and reviewed and revised the manuscript; and all authors approved the final manuscript as submitted.

FUNDING: Supported by contract RM-014-SAS-165-BY0-DJW from the District of Columbia Department of Behavioral Health and grant CHA.PSMB.CNMC.PGR.M-C.052013 from the District of Columbia Department of Health. The contents of this publication are solely the responsibility of the authors and do not necessarily represent the official views of these agencies.

Support was provided by the Children’s National Health Network. We thank Dr Chaya Merrill for her review and advice.

References

1. Perou R, Bitsko RH, Blumberg SJ, et al; Centers for Disease Control and Prevention (CDC). Mental health surveillance among children–United States, 2005-2011. MMWR Suppl. 2013;62(2):1–35
2. Berger-Jenkins E, McCord M, Gallagher T, Olfson M. Effect of routine mental health screening in a low-resource pediatric primary care population. Clin Pediatr (Phila). 2012;51(4):359–365
3. Dodge KA, Bierman KL, Coie JD, et al; Conduct Problems Prevention Research Group. Impact of early intervention on psychopathology, crime, and well-being at age 25. Am J Psychiatry. 2015;172(1):59–70
4. Steinman KJ, Shoben AB, Dembe AE, Kelleher KJ. How long do adolescents wait for psychiatry appointments? Community Ment Health J. 2015;51(7):782–789
5. Thomas CR, Holzer CE III. The continuing shortage of child and adolescent psychiatrists. J Am Acad Child Adolesc Psychiatry. 2006;45(9):1023–1031
6. Foy JM; American Academy of Pediatrics Task Force on Mental Health. Enhancing pediatric mental health care: report from the American Academy of Pediatrics Task Force on Mental Health. Introduction. Pediatrics. 2010;125(suppl 3):S69–S74
7. American Academy of Pediatrics. Appendix S4: the case for routine mental health screening. Pediatrics. 2010;125(suppl 3):S133–S139
8. Weitzman C, Wegner L; Section on Developmental and Behavioral Pediatrics; Committee on Psychosocial Aspects of Child and Family Health; Council on Early Childhood; Society for Developmental and Behavioral Pediatrics; American Academy of Pediatrics. Promoting optimal development: screening for behavioral and emotional problems. Pediatrics. 2015;135(2):384–395
9. Committee on Psychosocial Aspects of Child and Family Health and Task Force on Mental Health. Policy statement–the future of pediatrics: mental health competencies for pediatric primary care. Pediatrics. 2009;124(1):410–421
10. Jellinek MS. Universal mental health screening in pediatrics: toward better knowing, treating, or referring. J Am Acad Child Adolesc Psychiatry. 2013;52(11):1131–1133
11. Sheldrick RC, Merchant S, Perrin EC. Identification of developmental-behavioral problems in primary care: a systematic review. Pediatrics. 2011;128(2):356–363
12. Ward-Zimmerman B, Cannata E. Partnering with pediatric primary care: lessons learned through collaborative colocation. Prof Psychol Res Pr. 2012;43(6):596–605
13. Pidano AE, Honigfeld L, Bar-Halpern M, Vivian JE. Pediatric primary care providers’ relationships with mental health care providers: survey results. Child Youth Care Forum. 2014;43(1):135–150
14. Kuhlthau K, Jellinek M, White G, Vancleave J, Simons J, Murphy M. Increases in behavioral health screening in pediatric care for Massachusetts Medicaid patients. Arch Pediatr Adolesc Med. 2011;165(7):660–664
15. Kairys SW, Petrova A. Role of participation of pediatricians in the “activated autism practice” program in practicing children with autism spectrum disorders at the primary care setting. Glob Pediatr Health. 2016;3:2333794X16663544
16. Carbone PS, Norlin C, Young PC. Improving early identification and ongoing care of children with autism spectrum disorder. Pediatrics. 2016;137(6):e20151850
17. Asarnow JR, Jaycox LH, Duan N, et al. Effectiveness of a quality improvement intervention for adolescent depression in primary care clinics: a randomized controlled trial. JAMA. 2005;293(3):311–319
18. Wells KB, Sherbourne C, Schoenbaum M, et al. Impact of disseminating quality improvement programs for depression in managed primary care: a randomized controlled trial. JAMA. 2000;283(2):212–220
19. Larner College of Medicine, University of Vermont. National Improvement Partnership Network (NIPN): topic areas/QI initiatives. Available at: www.med.uvm.edu/nipn/topic. Accessed January 7, 2017
20. Delmarva Foundation. Medicaid Managed Care 2013 Annual Technical Report. Washington, DC: Department of Health Care Finance; 2014
21. John T, Morton M, Weissman M, et al. Feasibility of a virtual learning collaborative to implement an obesity QI project in 29 pediatric practices. Int J Qual Health Care. 2014;26(2):205–213
22. Brink R. Improving the Children’s Mental Health System in the District of Columbia. Washington, DC: Children’s Law Center; 2012
23. Kolko DJ, Perrin E. The integration of behavioral health interventions in children’s health care: services, science, and suggestions. J Clin Child Adolesc Psychol. 2014;43(2):216–228
24. ASQ: SE-2. Baltimore, MD: Brookes Publishing Co; 2015. Available at: www.brookespublishing.com/resource-center/screening-and-assessment/asq/asq-se-2/. Accessed March 15, 2016
25. Goodman R, Ford T, Simmons H, Gatward R, Meltzer H. Using the Strengths and Difficulties Questionnaire (SDQ) to screen for child psychiatric disorders in a community sample. Int Rev Psychiatry. 2003;15(1–2):166–172
26. Spitzer RL, Kroenke K, Williams JB. Validation and utility of a self-report version of PRIME-MD: the PHQ primary care study. Primary Care Evaluation of Mental Disorders. Patient Health Questionnaire. JAMA. 1999;282(18):1737–1744
27. American Academy of Pediatrics. Appendix S3: mental health practice readiness inventory. Pediatrics. 2010;125(suppl 3):S129–S132
28. Schouten LM, Hulscher ME, van Everdingen JJ, Huijsman R, Grol RP. Evidence for the impact of quality improvement collaboratives: systematic review. BMJ. 2008;336(7659):1491–1494
29. American Academy of Pediatrics. 2015 Medicaid reimbursement reports. Available at: https://www.aap.org/en-us/professional-resources/Research/research-resources/Pages/2015-Medicaid-Reimbursement-Reports.aspx. Accessed January 7, 2017

Competing Interests

POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.

FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.