Equitable access to coronavirus disease 2019 (COVID-19) screening is important to reduce transmission and maintain in-person learning for middle school communities, particularly in disadvantaged schools. Rapid antigen testing, and at-home testing in particular, could offer substantial advantages over onsite testing from a school district’s perspective, but whether engagement in at-home testing can be initiated and sustained is unknown. We hypothesized that an at-home COVID-19 school testing program would be noninferior to an onsite school COVID-19 testing program with regard to school participation rates and adherence to a weekly screening testing schedule.
We enrolled 3 middle schools within a large, predominantly Latinx-serving, independent school district into a noninferiority trial from October 2021 to March 2022. Two schools were randomized to onsite and 1 school to at-home COVID-19 testing programs. All students and staff were eligible to participate.
Over the 21-week trial, at-home weekly screening testing participation rates were not inferior to onsite testing rates. Similarly, adherence to the weekly testing schedule was not inferior in the at-home arm. Participants in the at-home testing arm tested more consistently during school breaks, and before returning from them, than those in the onsite arm.
Results support the noninferiority of at-home testing versus onsite testing both in terms of participation in testing and adherence to weekly testing. Implementation of at-home COVID-19 screening testing should be part of schools’ routine COVID-19 prevention efforts nationwide; however, adequate support is essential to ensure participation and persistence in regular at-home testing.
In-person school attendance is critical for academic success and student engagement, as well as social and emotional well-being.1 When prevention strategies are not implemented, schools are vulnerable to coronavirus disease 2019 (COVID-19)-related harms, including transmission, staff and student absences, and closures.2,3 Individuals are also vulnerable: 25% of teachers have underlying medical conditions,4 and every infection carries the risk of potentially serious post-COVID-19 conditions,5 including long COVID-19, which is estimated to affect 25% of children, although few of the studies behind this estimate include control groups.6
Schools with layered prevention strategies, including screening testing, have reduced COVID-19 transmission rates,7–9 thereby maximizing in-person instruction days and facilitating continuation of extracurricular activities.10 Accessible testing is important, given existing inequities. Compared with their white counterparts, Black, Hispanic, and Asian children have lower rates of testing but are more likely to be infected and hospitalized with COVID-19.11
Polymerase chain reaction (PCR) testing has been considered the gold standard for COVID-19 diagnosis; however, qualitative PCR, as typically reported to testers, cannot distinguish infectious from noninfectious individuals, a distinction necessary for determining isolation. Rapid antigen tests are a relatively inexpensive and fast alternative that requires significantly less infrastructure. Positive rapid antigen tests strongly correlate with culturable severe acute respiratory syndrome coronavirus 2,12 which is a proxy for infectiousness.
Several school testing paradigms (e.g., screening, symptomatic testing, test-to-stay) have been implemented onsite, offsite, and at home. Screening testing can prevent or limit outbreaks by identifying cases before transmission, particularly when community transmission is high.13 Onsite testing has been the standard, minimizing school absences,14 but is resource-intensive and results in missed class time. Whether at-home testing15 is a feasible alternative to onsite testing remains unknown.
We hypothesized that an at-home COVID-19 school testing program would be noninferior to an onsite program in terms of the participation rate and adherence to weekly middle school screening testing among students and staff. This manuscript explores the consequent findings from our study, Communities Fighting COVID!: Returning our Kids Back to School Safely, which is funded by the National Institutes of Health Rapid Acceleration of Diagnostics – Underserved Populations (RADx-UP) Return to School Initiative.16
Methods
Population
There were 10 middle schools and 1 junior high school eligible for inclusion, all in the Sweetwater Union High School District, a large, independent school district in southern San Diego County, CA, that primarily serves Latinx students. Schools were excluded from possible selection (Fig 1) if they had a lower proportion of students from households receiving Supplemental Nutrition Assistance Program benefits (16% vs 44%) and a lower proportion of underrepresented minority students (80% vs 89%) compared with other schools (3 schools);17 had ongoing school-wide testing programs with a commercial provider (2 schools); had larger (1 school) or smaller (1 school) student populations (for matching purposes) than the remaining schools; or were geographically closer to the remaining schools (1 school). All students and staff affiliated with the selected schools were eligible to participate. The study was approved by the San Diego State University Institutional Review Board, and written informed consent and child assent were obtained.
Design
This was a cluster-randomized noninferiority trial conducted at 3 schools from October 2021 to April 2022. We chose a noninferiority design because at-home testing would be easier to implement than standard onsite testing, and its use would be justifiable if “not worse” than onsite testing. The 3 schools, matched on population size and on racial, ethnic, and socioeconomic characteristics, were randomly assigned to a COVID-19 testing program (onsite or at-home) by a SAS algorithm (SAS Institute, Cary, NC), which drew random numbers of either 1 (onsite testing arm; probability 2 of 3) or 2 (at-home testing arm; probability 1 of 3). Randomization was performed by the biostatistician, who was blinded to the school names and arms during randomization.
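The randomization scheme above amounts to a weighted random draw; the following is a minimal illustrative sketch in Python (a reimplementation for clarity, not the SAS program the study used):

```python
import random

def assign_arm(rng: random.Random) -> str:
    """Draw an arm label with probability 2 of 3 for onsite and 1 of 3
    for at-home, mirroring the random-number scheme described in the text."""
    return rng.choices(["onsite", "at-home"], weights=[2, 1])[0]

rng = random.Random(2021)  # fixed seed so the sketch is reproducible
draws = [assign_arm(rng) for _ in range(30_000)]
onsite_share = draws.count("onsite") / len(draws)
print(f"onsite share over 30,000 draws: {onsite_share:.3f}")  # near 2/3
```

With only 3 schools, a single draw per school decides assignment; the repeated draws here simply confirm the 2:1 weighting.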
Trial enrollment occurred from mid October 2021 through late March 2022, with COVID-19 testing occurring from November 7, 2021 to March 27, 2022; the length of possible trial participation varied from 2 to 21 weeks. We concluded trial enrollment before reaching the target sample size for 1 arm because the final testing program needed to be scaled up to the other district middle schools before the end of the school year. A preliminary analysis of the primary outcomes demonstrated sufficient power (>80%) and supported ending trial enrollment early. Because outcome data collection was complete, ending enrollment early allowed continued testing program enrollment in the 1 arm already receiving the final testing program and allowed enrollment in the final testing program to begin across all the other district middle schools.
Participating students (plus their parent or guardian) and staff completed an online questionnaire via Qualtrics (Provo, UT) at enrollment that collected demographics, including race and ethnicity, gender identity, primary language spoken at home, number of COVID-19 vaccine doses received, and previous COVID-19 diagnosis, as well as other items collected for the RADx-UP consortium18 and for public health reporting purposes. Race and ethnicity questions were the items from the RADx-UP common data elements;18 participants self-reported race and ethnicity, and the major categories reported herein were selected based on the overall racial and ethnic composition of the schools. The questionnaire and all study materials were available in English and Spanish.
School COVID-19 Testing Programs
The onsite COVID-19 testing program was implemented by trained community health workers using the QuickVue SARS Antigen test for health care settings (QuidelOrtho Corporation, San Diego, CA). Once enrolled, students were provided leave slips to exit class weekly at preappointed times for testing; staff were encouraged to test during breaks or free periods and were given priority. First-time testers received a study testing card with their unique study number. Upon arrival, participants completed a symptom and exposure questionnaire using KoBo Collect (Kobo Inc., Cambridge, MA) and self-administered their anterior nares test swab under community health worker supervision. Unless symptomatic, participants returned to class and duties after sample collection. Using a built-in algorithm, those who reported COVID-19 symptoms were flagged by the app and asked to wait in a designated outdoor area for the school nurse. Students were notified of positive results and removed from class to await retrieval by a parent or guardian; staff were notified of positive results and sent home to isolate. During school breaks, onsite participants were provided flyers and encouraged to attend local testing supported through our RADx-UP Communities Fighting COVID! (3U54CA132384-10S1, 3U54CA132384-10S5) program using their study testing card.
The at-home testing program was facilitated by trained community health workers and used consumer QuickVue At-Home OTC COVID-19 Tests (QuidelOrtho Corporation, San Diego, CA). Upon enrollment, participants were instructed to pick up their test kits from study staff at their school. Test kits included a study testing card with unique study number and link to the test reporting web site, 6 2-packs of tests, and tailored instructional handouts. Participants received weekly e-mail and text message reminders and submitted results into a custom test reporting system that collected symptom and exposure information used to outline next steps.19 Participants who registered, but did not pick up initial test kits, as well as those who picked up test kits but did not report results, were contacted by study staff for assistance. Participants were instructed to return for additional tests once they used the majority of their tests. All onsite and at-home test results were reported to the local county health department.
Primary Outcome Measures
The first primary outcome is school participation rate in the testing program, calculated for each of the 21 trial weeks as a function of school population. For the onsite testing arm schools, the school participation rate is the percentage of the schools’ student and staff population that tested onsite at the school each week. For the at-home testing arm school, the school participation rate is the percentage of the school’s student and staff population that reported their at-home test results via the online reporting system each week. For each of the 21 trial weeks, the denominator for this outcome is the schools’ population by arm and the numerator is the participants by arm who tested onsite or reported an at-home test that week. The second primary outcome is adherence to weekly screening testing among enrolled participants. Adherence to weekly testing refers to participants testing each week after enrollment. Participants in both arms were given a 1-week grace period to begin testing to account for enrollment or pick-up occurring after the weekly testing or recommended testing day. To create equivalence between the arms, the 1-week grace period began on the date of test kit pick-up for participants in the at-home testing arm, whereas for onsite arm participants, it was from the date they enrolled. If participants tested during their enrollment week (i.e., the grace period), then that test was included in the data.
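As a worked illustration of this numerator/denominator logic, the sketch below computes weekly participation rates from toy test records (the record format and names are assumptions for illustration only):

```python
from collections import defaultdict

def weekly_participation(test_records, school_population):
    """test_records: iterable of (participant_id, week) pairs, one per
    test completed onsite or reported at home; school_population: total
    students plus staff. Each person counts at most once per week."""
    testers_by_week = defaultdict(set)
    for pid, week in test_records:
        testers_by_week[week].add(pid)  # a set dedupes repeat tests in a week
    return {wk: len(ids) / school_population
            for wk, ids in sorted(testers_by_week.items())}

records = [("s1", 1), ("s2", 1), ("s1", 1), ("s3", 2)]  # s1 tested twice in wk 1
rates = weekly_participation(records, school_population=10)
print(rates)  # {1: 0.2, 2: 0.1}
```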
Outcomes for both hypotheses assessed whether a participant completed a COVID-19 test as part of the study during each week of the trial. For the onsite testing arm, COVID-19 testing was objectively assessed by completion of a COVID-19 test in the presence of study staff, with test results reported within the same KoBo Collect form as the symptom and exposure data. For at-home testing arm participants, weekly COVID-19 testing completion was assessed by self-report through the online results reporting system, which included a signed attestation that the participant completed the test and accurately reported the result. Participants and the study staff implementing and supporting testing and test results reporting were not blinded to school assignment or testing programs.
Statistical Methods
We designed this study to determine if at-home COVID-19 testing is noninferior to onsite testing for participation rates and adherence to weekly testing. With a noninferiority margin of 6%, chosen because of the significant advantages of an at-home COVID-19 testing program in implementation resources and reducing at-school exposures, the necessary sample size was calculated as 324 per arm (type I error rate 0.05, power 80%) for both outcomes.
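The reported 324 per arm is consistent with the standard sample-size formula for noninferiority of two proportions; the sketch below implements that formula, with the assumed common participation proportion supplied as an illustrative input (the protocol’s planning assumptions are not restated here):

```python
from math import ceil
from statistics import NormalDist

def ni_sample_size(p, margin, alpha=0.05, power=0.80):
    """Per-arm n for a 1-sided noninferiority test of 2 proportions,
    assuming both arms share true proportion p."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha)  # 1-sided type I error
    z_b = nd.inv_cdf(power)
    return ceil((z_a + z_b) ** 2 * 2 * p * (1 - p) / margin ** 2)

# worst-case variance occurs at p = 0.5
print(ni_sample_size(p=0.5, margin=0.06))  # 859
```

Smaller assumed proportions yield smaller required samples, since the binomial variance p(1 − p) shrinks as p moves away from 0.5.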
Generalized estimating equations models with a logit link and first-order autoregressive correlation structure were used to account for the time dependence within each participant.20 Models including covariates such as participant type (student, staff), race and ethnicity, primary language spoken at home, gender identity, COVID-19 vaccination status, and prior COVID-19 testing, along with interaction terms, were evaluated. The first outcome, participation rate in testing over time, uses the school population sample of both enrolled and nonenrolled individuals. Models for this outcome therefore included only the demographic covariates of participant type, race and ethnicity, and gender identity, because these demographics were available at the school level; this allowed a nonenrolled sample with demographics closely matching each school’s to be generated and combined with the enrolled sample to create the school population sample. Estimated marginal means (model-predicted probabilities) and standard errors for the outcomes in each study arm, controlling for covariates, were generated for testing the noninferiority hypotheses. Exploratory subgroup analyses were conducted for covariates that showed differences between the arms. Analyses were conducted using SPSS (version 28, IBM Corp., Armonk, NY).
One-sided noninferiority tests for participation rates and adherence proportions between the 2 study arms were conducted, using data from all 21 weeks of the trial. Models 1 and 2 tested the noninferiority of the participation rate for the study arm effect. Model 1 included covariates of race and ethnicity, gender identity, participant type (student, staff), and time per week, whereas Model 2 included the same covariates plus an interaction term between the study arm and participant type. Models 3 and 4 tested for noninferiority of adherence to the weekly testing schedule between the 2 arms. Model 3 included gender identity, race and ethnicity, primary language spoken at home, participant type, and time per week; nonsignificant covariates (vaccination status, prior COVID-19 testing) were removed. Model 4 added the interaction term between the study arm and the participant type covariate. One-sided 95% confidence intervals were constructed for all models. All analyses followed an intention-to-treat principle, and a 6% noninferiority margin was used for all tests.
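Each model’s decision rule reduces to comparing the arm difference against the −6% margin. The sketch below applies that rule to the Model 1 point estimates reported later in Table 3, with an illustrative standard error for the difference (the study’s SEs come from the GEE models, so this does not reproduce the reported Z-statistic):

```python
from statistics import NormalDist

def noninferiority_test(p_home, p_onsite, se_diff, margin=0.06, alpha=0.05):
    """1-sided test of H0: p_home - p_onsite <= -margin.
    Returns (z, 1-sided P value, lower limit of the 1-sided 95% CI)."""
    nd = NormalDist()
    diff = p_home - p_onsite
    z = (diff + margin) / se_diff
    p_value = 1 - nd.cdf(z)
    lower = diff - nd.inv_cdf(1 - alpha) * se_diff
    return z, p_value, lower

z, p_value, lower = noninferiority_test(0.149, 0.145, se_diff=0.02)
# noninferiority is supported when lower > -margin (equivalently, P < alpha)
print(round(z, 2), round(lower, 3), lower > -0.06)
```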
Results
As shown in Fig 1 and Tables 1 and 2, trial enrollment included 264 (199 students, 65 staff; 34.7% of the school’s population) participants in the at-home testing arm and 588 (500 students, 88 staff; 38.6% of the schools’ population) participants in the onsite testing arm. Demographics and COVID-19-related history were mostly similar between arms, with the exceptions of race and ethnicity, language spoken at home, and age. Onsite testing schools had a slightly higher percentage of Latinx participants and a higher proportion of Filipino, Asian, or Asian Pacific Islander participants, which reflects actual demographic differences between those schools. Compared with the onsite testing schools, the at-home testing school had higher percentages of non-Hispanic white participants and of Spanish reported as the primary at-home language. The majority of participants had received 2 COVID-19 vaccine doses, whereas approximately one-third were unvaccinated. The population sample of enrolled and nonenrolled school population members comprised 761 students and staff from the at-home school and 1524 from the onsite schools. The sample was constructed to reflect each school’s overall student and staff race and ethnicity and gender identity compositions;17,21 the arms differed on race and ethnicity (Tables 1 and 2).
Enrolled Sample | At-Home Testing Arm Participants Enrolled in the Trial, n (%) | Onsite Testing Arm Participants Enrolled in the Trial, n (%) | Difference Between Arms | | | |
---|---|---|---|---|---|---|---
 | Students, n = 199 | School Staff, n = 65 | All At-Home, n = 264 | Students, n = 500 | School Staff, n = 88 | All Onsite, n = 588 | P
Race and ethnicity | .002 | ||||||
Latinx | 175 (87.9) | 43 (66.2) | 218 (82.6) | 430 (86.0) | 70 (79.5) | 500 (85.0) | |
Filipino, Asian, or Asian Pacific Islander | 6 (3.0) | 3 (4.6) | 9 (3.4) | 42 (8.4) | 3 (3.4) | 45 (7.7) | |
African American/Black | 2 (1.0) | 1 (1.5) | 3 (1.1) | 6 (1.2) | 0 (0.0) | 6 (1.0) | 
Another race or ethnicity | 9 (4.5) | 5 (7.7) | 14 (5.3) | 9 (1.8) | 1 (1.1) | 10 (1.7) | |
Non-Hispanic white | 7 (3.5) | 13 (20.0) | 20 (7.6) | 13 (2.6) | 14 (15.9) | 27 (4.6) | |
Gender identity | .14 | ||||||
Man | 76 (38.2) | 19 (29.2) | 95 (36.0) | 228 (45.6) | 23 (26.1) | 251 (42.7) | |
Woman | 120 (60.3) | 45 (69.2) | 165 (62.5) | 260 (52.0) | 65 (73.9) | 325 (55.3) | |
Another category (transgender, genderqueer, bigender, agender) | 3 (1.5) | 1 (1.5) | 4 (1.5) | 12 (2.4) | 0 (0.0) | 12 (2.0) | |
Primary language spoken at home | .03 | ||||||
Spanish | 61 (30.7) | 5 (7.7) | 66 (25.0) | 96 (19.2) | 11 (12.5) | 107 (18.2) | |
English | 138 (69.3) | 60 (92.3) | 198 (75.0) | 404 (80.8) | 77 (87.5) | 481 (81.8) | |
Number of COVID-19 vaccine doses | .64 | ||||||
3 | 4 (2.0) | 18 (27.7) | 22 (8.3) | 19 (3.8) | 28 (31.8) | 47 (8.0) | |
2 | 113 (56.8) | 36 (55.4) | 149 (56.4) | 265 (53.0) | 44 (50.0) | 309 (52.6) | |
1 | 5 (2.5) | 2 (3.1) | 7 (2.7) | 20 (4.0) | 2 (2.3) | 22 (3.7) | |
0 | 77 (38.7) | 9 (13.8) | 86 (32.6) | 196 (39.2) | 14 (15.9) | 210 (35.7) | |
Previously tested positive for COVID-19 | .73 | | | | | | 
Yes | 28 (14.1) | 16 (24.6) | 44 (16.7) | 83 (16.6) | 12 (13.6) | 95 (16.2) | |
No | 109 (54.8) | 42 (64.6) | 151 (57.2) | 286 (57.2) | 72 (81.8) | 358 (60.9) | |
Never tested | 60 (30.2) | 6 (9.2) | 66 (25.0) | 125 (25.0) | 3 (3.4) | 128 (21.8) | |
Don’t know or prefer not to answer | 2 (1.0) | 1 (1.5) | 3 (1.1) | 6 (1.2) | 1 (1.1) | 7 (1.2) | 
Positive COVID-19 test(s) during the triala | .60 | ||||||
1 or > | 12 (6.7) | 2 (3.7) | 14 (6.0) | 21 (4.4) | 6 (7.3) | 27 (4.8) | |
0 | 167 (93.3) | 52 (96.3) | 219 (94.0) | 460 (95.6) | 76 (92.7) | 536 (95.2) | |
Age | M = 12.74, SD = 0.73 | M = 42.40, SD = 12.27 | M = 20.04, SD = 14.18 | M = 12.72, SD = 0.63 | M = 42.00, SD = 12.69 | M = 17.10, SD = 11.55 | .001 |
Weeks enrolled or eligible to test | M = 14.54, SD = 5.52 | M = 15.58, SD = 5.31 | M = 14.23, SD = 5.44 | M = 13.94, SD = 5.46 | M = 15.88, SD = 5.00 | M = 14.80, SD = 5.33 | .16 |
COVID-19 weekly tests completed or reported | M = 6.38, SD = 5.71 | M = 8.65, SD = 6.59 | M = 6.94, SD = 6.00 | M = 6.50, SD = 3.53 | M = 4.47, SD = 3.99 | M = 6.20, SD = 3.67 | .03 |
a Among those who tested or reported any test results: excludes 20 students and 11 staff in the at-home arm, and 17 students and 6 staff in the onsite arm.
Population samplea | At-Home Testing School’s Population, n (%) | Onsite Testing Schools’ Population, n (%) | Difference Between Arms | ||||
Students, n = 666 | School Staff, n = 95 | All At-Home, n = 761 | Students, n = 1383 | School Staff, n = 141 | All Onsite, n = 1524 | P | |
Race and ethnicity | < .001 | ||||||
Latinx | 530 (79.6) | 69 (72.6) | 599 (78.7) | 1073 (77.6) | 111 (78.7) | 1184 (77.7) | |
Filipino, Asian, or Asian Pacific Islander | 23 (3.5) | 3 (3.2) | 26 (3.4) | 112 (8.1) | 6 (4.3) | 118 (7.7) | |
African American/Black | 20 (3.0) | 1 (1.1) | 21 (2.8) | 26 (1.9) | 0 (0.0) | 26 (1.7) | |
Another race or ethnicity | 28 (4.2) | 7 (7.4) | 35 (4.6) | 51 (3.7) | 1 (0.7) | 52 (3.4) | |
Non-Hispanic white | 65 (9.8) | 15 (15.8) | 80 (10.5) | 121 (8.7) | 23 (16.3) | 144 (9.4) | |
Gender identity | .43 | ||||||
Man | 324 (48.6) | 30 (31.6) | 354 (46.5) | 705 (51.0) | 39 (27.7) | 744 (48.8) | |
Woman | 339 (50.9) | 64 (67.4) | 403 (53.0) | 666 (48.2) | 102 (72.3) | 768 (50.4) | |
Another category (transgender, genderqueer, bigender, agender) | 3 (0.5) | 1 (1.1) | 4 (0.5) | 12 (0.9) | 0 (0.0) | 12 (0.8) |
a Population sample includes: (a) participants enrolled in the trial before the enrollment end date; (b) 3 students in the at-home arm and 33 students and 2 staff in the onsite arm who enrolled in the 2 weeks after trial enrollment concluded, and thus could not test before the trial end date, but whose collected demographics are used in the population sample; and (c) a sample of nonparticipants generated to complete each school’s remaining population, thereby creating a full population sample that closely reflects each school’s overall race and ethnicity and gender identity composition.
As shown in Fig 1, 20 students (of 199, 10.1%) and 11 staff (of 65, 16.9%) in the at-home arm picked up test kits but did not report any at-home results; 17 students (of 500, 3.4%) and 6 staff (of 88, 6.8%) in the onsite arm never tested onsite. Additionally, among those who reported or tested at least 3 times and had the opportunity to test for at least 7 weeks, 35 students (of 179, 19.6%) and 17 staff (of 54, 31.5%) in the at-home arm and 43 students (of 483, 8.9%) and 30 staff (of 82, 36.6%) in the onsite arm did not test or report a test during the final 4 or more weeks, indicating potential discontinuation of weekly testing participation.
School Participation Rate in the Testing Program
Table 3 shows the estimated marginal participation rates (model-estimated probabilities controlling for covariates) and standard errors for each study arm under the 4 models considered. The estimated marginal participation rate (Model 1), controlling for covariates, shows that 14.9% of the at-home and 14.5% of the onsite school population tested in a given week, on average (P = .003), supporting noninferiority of the participation rates between the 2 study arms. Because there was a significant participation rate difference by participant type (students versus staff) between the study arms, we added an interaction term in Model 2, under which 17.1% of the at-home testing school population and 11.8% of the onsite population tested in a given week, on average (P < .001), again supporting noninferiority of the at-home versus onsite testing arm.
Model | Estimated Participation Rate (SE) for At-Home COVID-19 Testing | Estimated Participation Rate (SE) for Onsite COVID-19 Testing | 95% CI Lower Limit | Z-statistic | P
---|---|---|---|---|---
#1 Participation rate outcomea | 0.149 (0.019) | 0.145 (0.017) | −0.032 | 2.746 | .003 |
#2 Participation rate outcomea (including participant type x arm interaction) | 0.171 (0.021) | 0.118 (0.014) | 0.008 | 4.358 | <.001 |
#3 Adherence to weekly testing outcomeb | 0.394 (0.038) | 0.352 (0.032) | −0.055 | 2.053 | .02 |
#4 Adherence to weekly testing outcomeb (including participant type x arm interaction) | 0.424 (0.039) | 0.304 (0.029) | 0.025 | 3.704 | <.001 |
Only lower limits are provided for confidence intervals because the tests for noninferiority are 1-sided. Sample size for models 1 and 2 is at-home: n = 761, onsite: n = 1524 with longitudinal data for 21 wk representing a total of 47 985 cases. Sample size for models 3 and 4 is at-home: n = 264, onsite: n = 588, total cases across all participants 12 275. CI, confidence interval.
a Generalized estimating equations model-estimated marginal participation and adherence rate, controlling for race and ethnicity, gender identity, participant type, and time.
b Generalized estimating equations model-estimated marginal participation and adherence rate, controlling for race and ethnicity, gender identity, participant type, primary language spoken at home, and time.
Adherence to the Weekly Screening Testing Schedule
In Table 3, Models 3 and 4 show the noninferiority test results for adherence to weekly testing between the 2 study arms. Model 3, controlling for covariates, estimated average adherence to the scheduled weekly COVID-19 testing of 39.4% at home and 35.2% onsite (P = .02), supporting noninferiority of adherence to the weekly testing schedule between the 2 study arms. Model 4 added an interaction term between the participant type covariate and study arm and estimated 42.4% at-home and 30.4% onsite adherence to scheduled weekly testing (P < .001), again supporting noninferiority of adherence to the weekly screening testing schedule between the 2 study arms.
Subgroup Analyses by Students, School Staff
Participant type showed different patterns between the study arms. Figs 2 and 3 illustrate the unadjusted school participation rates and adherence to the weekly testing schedule over the 21 weeks for students and staff by study arm, with fitted linear trend lines. Participation rates among students in the at-home testing arm were close to those in the onsite testing arm (Fig 2), although the at-home arm was more consistent, with students continuing to test during school breaks. For staff, participation rates in the at-home testing arm were higher than in the onsite testing arm in nearly every week. Virtually the same patterns were observed for students and staff in weekly adherence to testing (Fig 3).
Table 4 shows the noninferiority results for participation rates and adherence proportions by participant type. On average, 9.1% of the at-home and 10.5% of the onsite student population tested in a given week (P = .009), supporting noninferiority of the participation rates between the 2 study arms for students. For staff, the model estimated that 28.6% at home and 13.2% onsite tested in a given week (P < .001), indicating noninferiority of the participation rate for the at-home versus onsite staff testing arm.
Model | Estimated Participation Rate (SE) for At-Home COVID-19 Testing | Estimated Participation Rate (SE) for Onsite COVID-19 Testing | 95% CI Lower Limit | Z-statistic | P
---|---|---|---|---|---
#2 Participation rate outcome: studentsa | 0.091 (0.012) | 0.105 (0.012) | −0.0479 | 2.357 | .009 |
#2 Participation rate outcome: staffa | 0.286 (0.040) | 0.132 (0.021) | 0.0857 | 4.870 | <.001 |
#4 Adherence to weekly testing outcome: studentsb | 0.380 (0.036) | 0.400 (0.030) | −0.097 | 0.854 | .20 |
#4 Adherence to weekly testing outcome: staffb | 0.470 (0.054) | 0.222 (0.029) | 0.1473 | 5.025 | <.001 |
Only lower limits are provided for confidence intervals because the tests for noninferiority are 1-sided. Sample size for model 2 is at-home: n = 761, onsite: n = 1524, with longitudinal data for 21 wk representing a total of 47 985 cases. Sample size for model 4 is at-home: n = 264, onsite: n = 588, with a total of 12 275 cases across all participants. CI, confidence interval.
^a Generalized estimating equations model-estimated marginal participation and adherence rates, controlling for race and ethnicity, gender identity, and time.
^b Generalized estimating equations model-estimated marginal participation and adherence rates, controlling for race and ethnicity, gender identity, primary language spoken at home, and time.
For students, model-estimated adherence to weekly testing was 38.0% at-home and 40.0% onsite (P = .20; Table 4), providing insufficient evidence to support a noninferiority conclusion for the adherence proportions between the at-home and onsite testing arms. Notably, the estimated adherence proportions in the 2 arms were similar; the nonsignificant result may reflect the relatively large standard errors (0.036 for at-home and 0.030 for onsite). Using the estimated adherence proportions and standard errors available after the study, together with the other original parameters, we recalculated the power for this subgroup analysis as 0.2143. For staff, estimated adherence to weekly testing was 47.0% at-home and 22.2% onsite (P < .001), indicating noninferiority of adherence to the weekly screening testing schedule for the at-home versus onsite staff testing group.
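The logic of the 1-sided noninferiority tests reported in Table 4 can be sketched as follows. This is a minimal illustration, not the study's analysis: the trial's actual noninferiority margin and the GEE-based standard error of the arm difference are not given in this excerpt, so the `margin=0.05` below and the assumption that the 2 arms' standard errors combine independently are illustrative only.

```python
from statistics import NormalDist

def noninferiority_test(p_new, p_ref, se_new, se_ref, margin):
    """One-sided noninferiority z-test for a difference in proportions.

    H0: p_new - p_ref <= -margin  (new arm inferior by more than the margin)
    H1: p_new - p_ref >  -margin  (new arm noninferior)
    """
    diff = p_new - p_ref
    # Assumes the arm estimates are independent, so SEs combine in quadrature
    se_diff = (se_new ** 2 + se_ref ** 2) ** 0.5
    z = (diff + margin) / se_diff
    p_value = 1.0 - NormalDist().cdf(z)  # upper-tail, 1-sided
    ci_lower = diff - 1.645 * se_diff    # lower limit of the 1-sided 95% CI
    return z, p_value, ci_lower

# Student participation estimates from Table 4; the margin is hypothetical
z, p, lo = noninferiority_test(0.091, 0.105, 0.012, 0.012, margin=0.05)
```

Noninferiority is concluded when z exceeds the 1-sided critical value (1.645 at alpha = .05), or equivalently when the lower confidence limit for the difference lies above the negative margin.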
Discussion
In this study of predominantly Latinx middle schools, intention-to-treat analyses supported our hypotheses that at-home COVID-19 testing was not inferior to onsite testing for both school participation rates and persistence with weekly testing. In post hoc subgroup analyses, there was a clear advantage of at-home testing for school staff. We could not conclude noninferiority in weekly adherence for at-home testing among students, although the subgroup analyses were exploratory and underpowered. Furthermore, although students in the at-home testing arm demonstrated only slightly lower overall adherence, a higher percentage of them never tested and a greater percentage discontinued participation compared with the onsite arm, both of which had a large impact on adherence.
Other school testing programs have shown high variability in participation rates, and have not provided head-to-head assessments of testing strategies or examined the role of at-home testing.3,22 Some have found lower participation among nonwhite students22; we observed the opposite in our study, where the population was predominantly Latinx, all study materials were available in English and Spanish, and staff were bilingual. Although overall participation is important, continued adherence to regular screening is critical for reducing virus transmission.23–26 Some studies have demonstrated high adherence to at-home screening schedules compared with onsite PCR, though not in a kindergarten through 12th grade school setting.27,28 The consistency of adherence in our at-home testing arm was notable given natural changes in broader community testing interest over time through COVID-19 surges and lulls.
At-home testing has clear advantages over onsite testing for those who test, in terms of convenience and identification of infected individuals before their return to campus, thereby avoiding potential in-school exposure and transmission. Participants in the at-home arm of the study were better able to test consistently during school breaks and before returning from extended breaks (Fig 3). Our project also experienced substantial and consistent ongoing participation, despite wide availability of other testing options, indicating that school-based distribution may be preferable to other options (e.g., rapid tests reimbursed through insurance, government distribution of rapid tests through mail, or tests available through community health centers).
This study has some limitations. First, enrollment may have been suppressed, resulting in selection bias, because of a lengthy informed consent form, a 15- to 20-minute questionnaire completed at enrollment, and testing conducted as part of a research study. Second, results may have been under- or over-reported by participants in the at-home arm. Third, because this was a trial to determine which testing modality to scale up to middle schools district-wide, only 3 schools participated. Finally, sample size calculations assumed independent samples without adjusting for potential intraclass correlation within schools, and the post hoc subgroup analyses lacked sufficient power.
Despite these limitations, our study had several strengths. First, our use of rapid tests, focus on historically underserved school communities, and implementation in a region severely impacted by the pandemic were unique. Second, our study was sufficiently powered to examine the primary outcomes. Third, the trial length allowed for significant natural variability in community COVID-19 rates and testing interest. Fourth, vaccinated and unvaccinated students and staff were included. Fifth, we developed a strong research partnership with the school district. Finally, although we did not assess generalizability to other communities, the representativeness of our study sample makes a compelling argument for external validity.
Modeling shows a reduction in COVID-19 transmission when there is high participation in school testing.24 At-home testing removes financial and logistical barriers to program implementation, and our data support it as a viable alternative to onsite school screening testing programs. Our findings suggest that any perceived individual participant burden of at-home testing does not exceed that of onsite testing, perhaps because the benefits of at-home testing outweigh the perceived burden. However, adequate support is essential to ensure availability, participation, and persistence in regular at-home testing.
Conclusions
The use of school-based at-home testing is an acceptable strategy for consistent participation in screening testing within a primarily Latinx school district. Combined with other mitigation strategies, at-home testing may reduce in-school and community COVID-19 transmission. Given ease of use and noninferiority shown in this trial, at-home testing programs with sufficient support structures ensuring engagement and persistence should be implemented nationally for an equity-focused COVID-19 response.
Acknowledgments
We thank Dr. Sonia Lee from NICHD for her guidance and support; school principals, assistant principals, nurses, COVID-19 liaisons, after school program coordinators, custodial staff, and other school staff who supported the testing programs at each school; Layda Galvan for supporting testing program communications; our community health worker study team; Katherine Crockett and Michael Ediau for data management and results reporting support; Cheenee Rose Real, Cynthia Sanchez, and Fatima Ashaq for testing logistics and program implementation support; community health workers from SBCS for assisting with outreach for recruitment; the school district leadership for having the vision to implement these testing programs within the district middle schools; and Erin Campbell, MS, for editorial review and submission of this manuscript. Ms. Campbell did not receive compensation for her contributions, apart from her employment at Duke Clinical Research Institute.
Dr Kiene conceptualized and designed the study, designed the data collection and reporting architecture, oversaw data collection, contributed to data analyses, and drafted the initial manuscript; Dr McDaniels-Davidson conceptualized and designed the study, led partnerships for community engagement, contributed to oversight of data collection, and drafted the initial manuscript; Dr Oren conceptualized and designed the study, contributed to oversight of data collection, and drafted the initial manuscript; Dr Lin conducted power analyses, randomization, led data analyses, contributed to conceptualizing the study design, and contributed to drafting the initial manuscript; Ms Chris curated the data; Mr Snyder designed customized systems supporting data collection, testing, and reporting, and curated the data; Drs Bravo and Moore and Ms Rodriguez, Ms Famania-Martinez, Ms Carbuccia, and Ms Pinuelas-Morineau contributed to study implementation; Ms Arechiga-Romero implemented the testing programs and collected data; and all authors critically reviewed and revised the manuscript, approved the final manuscript as submitted, and agree to be accountable for all aspects of the work.
This trial has been registered at www.clinicaltrials.gov (identifier NCT05150860).
The datasets for this manuscript are not yet publicly available because RADx data have not yet been made available to the public for review and analysis. Requests to access the datasets should be directed to the NIH RADx Data Hub (https://radx-hub.nih.gov/home).
FUNDING: This research was funded, in part, by the National Institutes of Health (NIH) Agreement No. 1OT2HD108112-01. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the NIH. At the investigators' request, NIH provided the Quidel QuickVue OTC and Quidel QuickVue tests used in the study through an existing contract agreement (75N92020C00013) between NIH and Quidel Corporation.
CONFLICT OF INTEREST DISCLOSURES: Corinne McDaniels-Davidson has received compensation as a consultant for Gilead Scientific. In addition, her spouse is employed by QuidelOrtho Corporation and participates in their employee stock purchase program. The other authors have no conflicts of interest relevant to this article to disclose.