Child mobile device use is increasingly prevalent, but research is limited by parent-report survey methods that may not capture the complex ways devices are used. We aimed to implement mobile device sampling, a set of novel methods for objectively measuring child mobile device use.
We recruited 346 English-speaking parents and guardians of children aged 3 to 5 years to take part in a prospective cohort study of child media use. All interactions with participants were conducted through e-mail, online surveys, and mobile device sampling; we used a passive-sensing application (Chronicle) on Android devices and screenshots of the battery feature on iOS devices. Baseline data were analyzed to describe usage behaviors and to compare sampling output with parent-reported duration of use.
The sample comprised 126 Android users (35 tablets, 91 smartphones) and 220 iOS users (143 tablets, 77 smartphones); 35.0% of children had their own device. The most commonly used applications were YouTube, YouTube Kids, Internet browser, quick search or Siri, and streaming video services. Average daily usage among the 121 children with their own device was 115.3 minutes/day (SD 115.1; range 0.20–632.5) and was similar between Android and iOS devices. Compared with mobile device sampling output, most parents underestimated (35.7%) or overestimated (34.8%) their child’s use.
Mobile device sampling is an unobtrusive and accurate method for assessing mobile device use. Parent-reported duration of mobile device use in young children has low accuracy, and use of objective measures is needed in future research.
Previous studies of young children’s mobile device use rely on parent recall or time-use diaries, which may be inaccurate or carry high participant burden. No previous studies in children have harnessed application usage data already collected by mobile devices.
Mobile device sampling (passive sensing for Android and screenshots from iOS devices) is an acceptable and feasible objective method for assessing mobile device use. Parent-reported duration of their child’s mobile device use had low accuracy compared with objective output.
Children’s use of mobile and interactive media has increased rapidly over the past decade.1 Recent estimates reveal that the majority of parents own smartphones,2 on which they allow their children to play games or watch videos. Up to 75% of young children have their own tablets,3 and infants are estimated to start handling mobile devices during the first year of life,1 but research on modern media has been limited by a lack of precise measurement tools.
Research on traditional screen media, such as television, historically used parent recall of child media use duration to test associations with outcomes such as sleep problems, obesity, and externalizing behavior.4 Similarly, studies of the benefits of educational television programming relied on parent recall and content analysis of linear, noninteractive programs.5,6 As the proportion of time that children spend on mobile platforms increases,1 media researchers face the challenge of measuring on-demand, portable, and intermittent mobile device usage.7,8 Participant recall of mobile device use may have low accuracy because exposure occurs in small bursts,8 which are less likely to be remembered than longer interactions,9 and parents may find it difficult to monitor content when children use handheld devices individually.10
Mobile devices collect usage data that could feasibly be harnessed for research studies. Analysis of various data streams (eg, accelerometer, Bluetooth, location) has been used in public health research to predict patterns of human behavior11 but collects more data than is necessary for media use measurement. In a few studies, researchers have used commercially available or researcher-developed prototype applications (apps) to test hypotheses in adults regarding mental health and smartphone use12 or motivations for using different apps,13 but no previous research has applied similar measures to children's devices. Harnessing mobile data from children's devices may provide more accurate data collection with lower participant and researcher burden.
Our objective for the current study was to implement novel mobile device sampling methods in a community-based sample of preschool-aged children to describe their mobile device usage and to compare parent report of child use with mobile device sampling output. We describe the development of this method, important considerations during implementation, and the types of variables that can be generated for research. On the basis of pilot research revealing that most parents recall their own mobile device use inaccurately,14 we hypothesized that most parents would be inaccurate in reporting their child's mobile device use.
Methods
Overall Study Design
The Preschooler Tablet Study is a longitudinal cohort study (Eunice Kennedy Shriver National Institute of Child Health and Human Development grant R21HD094051) in which associations between early childhood digital media use, emotion regulation, and executive functioning are examined. Data were collected through online surveys and e-mail communication with participants, mobile device sampling, and an online time-use diary completed by parents at baseline and at the 3- and 6-month follow-up. Data from the baseline data collection wave (August 2018 to May 2019) are included in the present article. The study was approved by the University of Michigan Institutional Review Board.
Participants
Parents of young children were recruited via flyers posted in community centers, preschools, child care centers, and pediatric clinics in southeast Michigan as well as through our university's online participant registry and social media advertisements. Interested parents who contacted the study team were e-mailed a link to an eligibility questionnaire. Eligibility criteria included the following: (1) the parent was the legal guardian of a 3- to 4.99-year-old child, (2) the parent lived with the child at least 5 days/week, (3) the parent understood English sufficiently to complete questionnaires and provide consent, and (4) the family owned at least one Android or iOS tablet or smartphone. Children did not need to regularly use mobile devices to be included in the study. Exclusion criteria included presence of child developmental delays, use of psychotropic medication, and the child's mobile device being a Kindle or Amazon Fire (n = 43 interested but excluded), because these devices do not use the standard Android operating system.
Because all interactions with the research team were electronic, we anticipated a high rate of attrition. Of 487 parents who consented to take part in the study, 64 (13%) submitted no study data after providing informed consent and receiving electronic reminders.
Survey Measures
After providing online consent for themselves and their child, parents were e-mailed study instructions and a link to online Research Electronic Data Capture15,16 surveys, in which parents reported their child’s age, sex, race and/or ethnicity, preschool or child care enrollment, and prematurity; their own age, sex, educational attainment, marital status, and employment status; and household income and size (from which we calculated the income-to-needs ratio).
Parents then completed an abbreviated version (36 items) of the Media Assessment Qualtrics Survey, which is used to assess child, parent, and household media use practices. In this survey, parents were asked, “Thinking about a typical [weekday or weekend], how much time does your child spend using 1) an iPad, tablet, LeapPad, iTouch, or similar mobile device (not including a smartphone) and 2) a smartphone for things like texting, playing games, watching videos, or surfing the Internet (don’t count time spent talking on the phone)?” Responses were never, <30 minutes, 30 minutes to 1 hour, 1 to 2 hours, 2 to 3 hours, 3 to 4 hours, 4 to 5 hours, and >5 hours. Because mobile sampling included both weekdays and weekends, we created a weighted categorical variable that reflected parent estimates of their child’s usual smartphone or tablet use throughout the week.
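For illustration, this weighting step can be sketched as follows; this is a minimal example rather than the study's actual code, and the minute midpoints, category bounds, and function name are our assumptions.

```python
# Minimal sketch, not the study's code: midpoints, bounds, and the helper
# name are illustrative assumptions.

CATEGORY_MIDPOINTS_MIN = {
    "never": 0,
    "<30 minutes": 15,
    "30 minutes to 1 hour": 45,
    "1 to 2 hours": 90,
    "2 to 3 hours": 150,
    "3 to 4 hours": 210,
    "4 to 5 hours": 270,
    ">5 hours": 330,
}

# Ordered (label, lower bound, upper bound) in minutes/day.
CATEGORY_BOUNDS_MIN = [
    ("never", 0, 0),
    ("<30 minutes", 0, 30),
    ("30 minutes to 1 hour", 30, 60),
    ("1 to 2 hours", 60, 120),
    ("2 to 3 hours", 120, 180),
    ("3 to 4 hours", 180, 240),
    ("4 to 5 hours", 240, 300),
    (">5 hours", 300, float("inf")),
]


def weighted_weekly_category(weekday_resp: str, weekend_resp: str) -> str:
    """Weight the weekday response by 5/7 and the weekend response by 2/7,
    then map the weighted minutes back onto the survey categories."""
    weighted_min = (5 * CATEGORY_MIDPOINTS_MIN[weekday_resp]
                    + 2 * CATEGORY_MIDPOINTS_MIN[weekend_resp]) / 7
    for label, low, high in CATEGORY_BOUNDS_MIN:
        if low <= weighted_min <= high:
            return label
    return ">5 hours"


# Example: weighted_weekly_category("1 to 2 hours", "3 to 4 hours")
# -> "2 to 3 hours" (weighted average of ~124 minutes/day)
```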
Mobile Device Sampling Methods
During eligibility screening, parents indicated what type(s) of mobile device(s), if any, the child regularly had access to or used. If the child used >1 device, we sampled the device used more frequently and asked the parent to avoid letting the child play on other devices that week. We provided device-specific video and visual instructions for tracking on a study Web site (see Mobile Device Sampling Methods: Installation and Data Collection in the Supplemental Information).
Android Devices
Android users were instructed to download a study app, Chronicle, from the Google Play store (Supplemental Figs 1 and 2). The Chronicle app was developed by OpenLattice, Inc, in collaboration with the Comprehensive Assessment of Family Media Exposure Consortium. It queries the Google UsageStatsManager application programming interface (API), which provides app usage data on all Android devices running version 5.0 or later, and transmits these data automatically to the OpenLattice platform. Chronicle was pilot tested on a range of Android devices from June 2018 to July 2018, during which the app was debugged and its accuracy verified by comparing handwritten usage logs with raw output.
In the informed consent document, parents were informed that Chronicle collects only the app name, timestamp, and a masked device identifier; it does not collect personal information (eg, contacts, content of messages, Web sites viewed), and data are stored on a secure server and not shared with third-party companies. After installing Chronicle, parents were e-mailed a unique link routing their app data to the research team on the OpenLattice platform. The app user interface is simple, providing only a timestamp of the last data upload (see Android Mobile Devices in the Supplemental Information), and the app runs in the background with no need for user interaction. Data are collected continually on the device and uploaded every 15 minutes when connected to WiFi. After 9 days, participants were instructed to uninstall Chronicle after confirming that data had been uploaded that day (in case the device had recently been disconnected from WiFi). The study team then exported the Chronicle data file through the Chronicle Web application in comma-separated values (CSV) format and conducted data cleaning and processing steps as described in the Chronicle Data Cleaning Methods section of the Supplemental Information.
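For illustration, the sketch below shows one way such a processing step could be implemented in Python (the language used in this study for processing raw timestamped data). It is not the study's cleaning pipeline; the column names (app_name, timestamp, event_type) and event labels are assumptions about the exported CSV rather than the actual Chronicle schema, which is documented in the Supplemental Information.

```python
# Hedged sketch only: column names and event labels are assumptions, not the
# actual Chronicle export schema.
import pandas as pd


def daily_app_minutes(csv_path: str) -> pd.DataFrame:
    """Pair foreground/background events per app and sum foreground minutes
    per app per calendar day."""
    events = pd.read_csv(csv_path, parse_dates=["timestamp"])
    events = events.sort_values(["app_name", "timestamp"])

    sessions = []
    open_start = {}  # app_name -> timestamp of its most recent foreground event
    for row in events.itertuples(index=False):
        if row.event_type == "Move to Foreground":
            open_start[row.app_name] = row.timestamp
        elif row.event_type == "Move to Background" and row.app_name in open_start:
            start = open_start.pop(row.app_name)
            sessions.append({
                "app_name": row.app_name,
                "date": start.date(),
                "minutes": (row.timestamp - start).total_seconds() / 60,
            })

    # One row per (date, app): total foreground minutes that day.
    return (pd.DataFrame(sessions)
            .groupby(["date", "app_name"], as_index=False)["minutes"].sum())
```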
iOS Devices
For children who used an iPhone or iPad, we asked parents to take a screenshot of the device’s battery page (under “Settings”) 7 days after completing the surveys. Instructions for taking screenshots, including the specific buttons that need to be tapped to visualize app usage over the past 7 to 10 days, were provided via the study Web site (see Apple Mobile Devices in the Supplemental Information; see also Supplemental Fig 3).
When parents sent screenshots that did not follow study instructions, the study team responded by e-mail the same day, offering clarification on screenshot methods and requesting that new ones be sent. However, if screenshots were still incorrectly taken at this point, they were flagged for potential errors and manually inspected before inclusion in final data sets. Research assistants manually entered all screenshot data (app name, number of minutes) into spreadsheets.
Shared Devices
At the end of the sampling period, parents were asked whether the device had been shared with any other family members that week. If the parents responded yes (70.6% of Android users; 61.8% of iOS users), they completed a data form listing the names of the apps their child used that week. We created a subset of data files to include only the apps that children used during the sampling period.
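A minimal sketch of this subsetting step is shown below; it is illustrative only, and the column name and case-insensitive matching rule are assumptions rather than the study's exact procedure.

```python
# Illustrative sketch: keep only the apps the parent listed as used by the
# child during the sampling week. Column name and matching rule are assumed.
import pandas as pd


def child_only_subset(usage: pd.DataFrame, child_apps: list) -> pd.DataFrame:
    """Filter a shared device's usage log to parent-listed child apps,
    matching case-insensitively on the app name."""
    child_set = {name.strip().lower() for name in child_apps}
    return usage[usage["app_name"].str.lower().isin(child_set)]
```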
App Category Coding
We developed a coding scheme to categorize apps on the basis of app store labels (eg, educational, age category) and other common categories such as video chat, YouTube, streaming video, eBooks, and music (see Supplemental Table 4 for the coding scheme; interrater reliability = 0.72–0.94).
Data Analysis
First, for all children with complete mobile device data (n = 346), we analyzed differences in sociodemographic characteristics by operating system and shared or unshared status. We calculated frequencies of the most commonly played apps and the number of different apps played by each child during the sampling week.
For children with their own, unshared mobile Android or iOS device (n = 121), we created summary variables representing each child’s average daily duration of device use, average daily duration of app categories, and average daily duration of specific apps played during the sampling period. We chose not to calculate daily duration from shared mobile devices because of the risk of overestimating duration of apps such as YouTube, Safari, or Netflix, which are commonly used by both children and parents.
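These summary variables can be sketched as follows (assumptions: a table with one row per date, app, and app category containing a minutes column, and a fixed 7-day denominator; the study's exact handling of the sampling window may have differed).

```python
# Hedged sketch of the per-child summary variables; column names and the
# 7-day denominator are illustrative assumptions.
import pandas as pd


def summarize_child(daily: pd.DataFrame, sampling_days: int = 7) -> dict:
    """`daily` holds one row per (date, app_name, category) with a `minutes`
    column; returns average daily minutes overall, by category, and by app."""
    return {
        "avg_daily_minutes": daily["minutes"].sum() / sampling_days,
        "avg_daily_by_category": (daily.groupby("category")["minutes"].sum()
                                  / sampling_days).to_dict(),
        "avg_daily_by_app": (daily.groupby("app_name")["minutes"].sum()
                             / sampling_days).to_dict(),
    }
```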
For children with unshared Android devices (n = 37), whose output provides date and timestamps, we additionally calculated average usage by day of the week, proportion of days the child used the device, and average number of daily pickups. For illustrative purposes, we plotted the average hourly app category usage of 6 child participants (4 with heavy use, 2 with lighter use) to demonstrate diurnal visualizations of mobile device usage.
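As a hedged illustration of how such variables can be derived from timestamped sessions, the sketch below uses a 5-minute inactivity rule to define a pickup; that rule, the column names, and the simplification of ignoring zero-use days in the weekday averages are our assumptions, not the study's definitions.

```python
# Hedged sketch of Android-only metrics from timestamped sessions; the pickup
# rule (a new session after >=5 minutes of inactivity) is an assumption.
import pandas as pd


def android_metrics(sessions: pd.DataFrame, sampling_days: int = 7) -> dict:
    """`sessions` has columns: start (datetime) and minutes (float)."""
    sessions = sessions.sort_values("start").reset_index(drop=True)

    # Proportion of sampled days with any device use.
    days_used = sessions["start"].dt.date.nunique()

    # Pickups: session starts separated from the previous session's end by at
    # least 5 minutes of inactivity (the first session always counts).
    ends = sessions["start"] + pd.to_timedelta(sessions["minutes"], unit="min")
    gaps_min = (sessions["start"] - ends.shift()).dt.total_seconds() / 60
    pickups = int((gaps_min.isna() | (gaps_min >= 5)).sum())

    # Average minutes per occurrence of each day of the week
    # (days with zero use are ignored in this simplified version).
    per_day = sessions.groupby(sessions["start"].dt.date)["minutes"].sum()
    per_day.index = pd.to_datetime(per_day.index)
    by_weekday = per_day.groupby(per_day.index.day_name()).mean().to_dict()

    return {
        "proportion_days_used": days_used / sampling_days,
        "avg_daily_pickups": pickups / sampling_days,
        "avg_minutes_by_weekday": by_weekday,
    }
```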
Finally, we calculated accuracy of parent-reported mobile device use by determining if each child’s average daily usage (based on mobile sampling output) fell within the weighted parent-reported time category. If parent report was inaccurate, we calculated the difference between actual daily usage and the upper or lower bounds of the parent-reported category.
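A minimal sketch of this accuracy check is shown below; the function name is ours, and the category bounds (in minutes/day) are assumed to mirror the survey response options described above.

```python
# Hedged sketch: classify a parent report as accurate, over-, or
# underestimated relative to measured average daily minutes, and compute the
# distance to the nearer category bound when inaccurate.
def report_accuracy(measured_min: float, low: float, high: float) -> dict:
    """`low`/`high` are the bounds (minutes/day) of the weighted
    parent-reported category; use float('inf') for the top category."""
    if low <= measured_min <= high:
        return {"status": "accurate", "error_minutes": 0.0}
    if measured_min < low:  # child used less than the parent reported
        return {"status": "overestimated", "error_minutes": low - measured_min}
    return {"status": "underestimated", "error_minutes": measured_min - high}


# Example: measured 140 min/d against a reported "1 to 2 hours" (60-120 min/d)
# -> {"status": "underestimated", "error_minutes": 20.0}
```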
All processing of raw timestamped data into user logs was performed in Python,17 all mobile device sampling analyses were conducted by using data.table in R 3.5.2,18,19 and analyses of demographics and comparison of parent report with sampling output were conducted by using SAS version 9.4 (SAS Institute, Inc, Cary, NC).20
Results
Of the 423 parents who provided any data, 58 (13%) were excluded because of incomplete mobile device data. Reasons for missing mobile device data included the following: could not (n = 7) or decided not to (n = 2) install Chronicle, <2 days of data appeared on server (usually because of server maintenance; n = 13), failed to send iOS screenshots (n = 20), screenshots were incorrect (n = 4) or blank (n = 4), and the app list for shared devices was not submitted (n = 8). Participants with missing mobile device data had no significant sociodemographic differences compared with included participants. In addition, 19 children were reported to have never used mobile devices at baseline, so mobile device sampling was not performed; these children were more likely to have parents with higher educational attainment (χ2 test; P = .02).
Characteristics of the full sample (N = 346) and the unshared device subsample (n = 121) are shown in Table 1. Participants comprised 126 Android users (35 tablets, 91 smartphones) and 220 iOS users (143 tablets, 77 smartphones). Children with iOS devices were more likely to have higher-income families (2-sample Wilcoxon rank-sum test; P < .0001), married parents (χ2 test; P = .03), and parents with higher educational attainment (χ2 test; P < .0001).
Characteristic | Full Cohort (N = 346) | Unshared Devices (n = 121) |
---|---|---|
Parent | ||
Age, y, mean (SD) | 34.0 (4.6) | 34.2 (4.6) |
Female sex, n (%) | 325 (93.9) | 112 (92.6) |
Education, n (%) | ||
High school or GED or less | 24 (6.9) | 10 (8.3) |
Some college or 2-y degree | 106 (30.6) | 46 (38.0) |
4-y college degree | 92 (26.6) | 32 (26.5) |
Advanced degree | 124 (35.8) | 33 (27.3) |
Married or with a partner, n (%) | 314 (91.0) | 111 (91.7) |
Employment, n (%) | ||
Unemployed | 97 (28.0) | 36 (29.8) |
Part-time job | 69 (19.9) | 21 (17.4) |
Full-time job | 158 (45.7) | 56 (46.3) |
Multiple jobs | 22 (6.4) | 8 (6.6) |
Child | ||
Age, y, mean (SD) | 3.82 (0.53) | 3.86 (0.55) |
Female sex, n (%) | 169 (48.8) | 54 (44.6) |
Race and/or ethnicity, n (%) | ||
White non-Hispanic | 258 (75.0) | 92 (76.7) |
Other | 86 (25.0) | 28 (23.3) |
Only child, n (%) | 63 (18.2) | 21 (17.4) |
Preschool or day care, n (%) | ||
Center based | 218 (65.3) | 77 (67.5) |
Home based | 26 (7.8) | 6 (5.3) |
Stays home with parent or caregiver | 90 (27.0) | 31 (27.2) |
Premature, n (%) | 26 (7.5) | 9 (7.4) |
Household | ||
ITN, n (%)a | ||
ITN ≤ 1.0 | 46 (13.6) | 16 (13.5) |
1.0 < ITN ≤ 2.0 | 68 (20.1) | 18 (15.1) |
2.0 < ITN ≤ 3.5 | 96 (28.4) | 40 (33.6) |
3.5 < ITN ≤ 5.0 | 91 (26.9) | 32 (26.9) |
ITN > 5.0 | 37 (11.0) | 13 (10.9) |
Mobile device usage | ||
Daily duration, min, mean (SD) | — | 115.3 (115.1) |
Daily duration category, h, n (%) | ||
<1 | — | 49 (40.5) |
1–<2 | — | 32 (26.5) |
2–<3 | — | 15 (12.4) |
3–<4 | — | 7 (5.79) |
≥4 | — | 18 (14.9) |
GED, general equivalency diploma; ITN, income-to-needs ratio; —, not applicable.
a ITN of 1 = 100% of the federal poverty level for the family's size; ITN of 2 = 200% of the federal poverty level, etc.
In the full sample, children used between 1 and 85 different apps over the course of the sampling week; the 20 most commonly played apps are listed in Table 2.
App Name (Android Users, n = 126) | n (%) | App Name (iOS Users, n = 220) | n (%) |
---|---|---|---|
YouTube | 69 (54.8) | YouTube Kids | 67 (30.5) |
YouTube Kids | 27 (21.4) | YouTube | 58 (26.4) |
Browser | 27 (21.4) | Netflix | 55 (25.0) |
Quick Search Box | 19 (15.1) | Safari | 43 (19.5) |
Netflix | 15 (11.9) | Photos | 40 (18.2) |
Camera | 12 (9.5) | Camera | 39 (17.8) |
Gallery | 8 (6.3) | Siri | 35 (15.9) |
PBS Kids Games | 7 (5.6) | Prime Video | 18 (8.2) |
Facebook Messenger | 7 (5.6) | Nick Jr | 16 (7.3) |
Children’s Doctor Dentist | 6 (4.8) | FaceTime | 16 (7.3) |
Subway Surfers | 6 (4.8) | PBS Kids Video | 16 (7.3) |
Android Video | 6 (4.8) | PBS Kids Games | 15 (6.8) |
ABC Kids Toddler Tracing Phonics | 6 (4.8) | ABC Mouse | 14 (6.4) |
ABC Mouse | 6 (4.8) | Music | 9 (4.1) |
Pokémon Go | 5 (4.0) | Hulu | 8 (3.6) |
Line and Water | 4 (3.2) | LEGO Juniors | 6 (2.7) |
Hulu Plus | 4 (3.2) | My Talking Tom 2 | 6 (2.7) |
Kick the Buddy | 4 (3.2) | Disney NOW | 6 (2.7) |
Toca Kitchen 2 | 4 (3.2) | Facebook Messenger | 6 (2.7) |
PBS Kids Video | 4 (3.2) | Roblox | 6 (2.7) |
PBS, Public Broadcasting Service.
Average daily usage among the 121 children with their own tablet (n = 100) or smartphone (n = 21) was 115.3 minutes (SD 115.1; range 0.20–632.5) and was similar between Android (117.7; SD 143.2) and iOS (114.2; SD 101.3) users. More than half (59.5%) of children used their device for an average of ≥1 hour/day, including 18 (14.9%) who averaged ≥4 hours/day (Table 1).
Average daily use of the most commonly played apps by children with unshared devices is shown in Table 3; YouTube, YouTube Kids, and streaming video services revealed the highest daily duration, whereas the browser and Quick Search Box or Siri were accessed by a large number of children but used for briefer periods of time.
App Name | Android (n = 37): n | Android: Average Daily Duration (SD), min/d | Android: Range, min/d | iOS (n = 84): n | iOS: Average Daily Duration (SD), min/d | iOS: Range, min/d |
---|---|---|---|---|---|---|
Chrome or Safari | 17 | 0.36 (0.46) | 0.005–1.54 | 30 | 11.5 (25.8) | 0.1–109.7 |
YouTube Kids | 16 | 43.6 (36.2) | 0.07–118.0 | 33 | 112.6 (90.6) | 0.57–357.4 |
Quick Search Box or Siri | 19 | 0.44 (0.75) | 0.002–2.89 | 30 | 0.50 (0.71) | 0.1–2.86 |
YouTube | 15 | 83.7 (99.4) | 0.003–299.5 | 16 | 109.1 (110.5) | 0.2–390.1 |
Netflix | 6 | 112.1 (150.8) | 0.35–341.2 | 28 | 31.8 (33.7) | 0.1–113.2 |
Prime Video | n/a | n/a | n/a | 9 | 19.9 (15.5) | 1.9–46.3 |
Video appa | 6 | 0.83 (1.22) | 0.04–3.14 | 8 | 11.8 (15.1) | 0.1–36.7 |
FaceTime | n/a | n/a | n/a | 8 | 20.3 (45.0) | 0.1–131.1 |
Camera | 9 | 1.16 (1.58) | 0.01–3.75 | 19 | 0.98 (1.13) | 0.1–3.9 |
Photo Gallery | 7 | 3.94 (5.86) | 0.22–16.2 | 6 | 0.64 (0.64) | 0.1–1.71 |
Line and Water | 4 | 2.74 (3.47) | 0.80–7.94 | n/a | n/a | n/a |
Subway Surfer | 4 | 5.38 (4.61) | 1.78–12.1 | 2 | 16.9 (19.6) | 3–30.7 |
Toca Kitchen 2 | 4 | 1.60 (2.43) | 0.03–5.16 | 1 | n/a | n/a |
ABC Mouse | 2 | 1.13 (1.59) | 0.003–2.26 | 7 | 12.3 (16.7) | 0.1–47.1 |
Nick Jr | 2 | 1.95 (0.33) | 1.70–2.16 | 7 | 21.6 (21.6) | 0.1–60 |
n/a, not applicable.
a Includes the Samsung video app and iOS video app.
Among Android users, average pickup frequency was 3.82 per day (SD 5.48), children used devices on most (69.0%) days of sampling (SD 27.1%; range 25%–100%), and duration was longest on Fridays and Saturdays (Supplemental Fig 4). Example data visualizations of average usage of different app categories (eg, educational apps, streaming video) and diurnal patterns for specific participants are available in Supplemental Figs 5 and 6, respectively.
Of 115 participants with unshared devices and complete parent-report data, 41 (35.7%) parents underestimated, 34 (29.6%) were accurate, and 40 (34.8%) overestimated their child’s device use. Accuracy did not vary by operating system (Android 25.7% versus iOS 31.3%; P = .49). For inaccurate reporters, actual usage was on average 69.7 minutes (SD 67.5) above or below the parent-reported category bounds (median 50.7; range 0.86–332.5 minutes). Parents were more likely to overreport when their child’s average usage was <1 hour/day and underreport if their child’s average usage was ≥1 hour/day (χ2 test; P = .001).
Discussion
This is the first study to use an objective form of mobile device–based data collection (a method we term “mobile device sampling”) to examine young children’s tablet and smartphone usage. We found high variability in daily mobile device usage in children with their own smartphones or tablets, with ∼15% of children averaging ≥4 hours per day. The most commonly used apps were YouTube and YouTube Kids, followed by browsers, the camera and photograph gallery, and video streaming services such as Netflix.
Compared with our previous pilot research in which we used passive sensing in parents,14 we had substantially lower rates of missing data when using the Chronicle app for Android and screenshot-based data collection for iOS. However, we had an ∼10% missing data rate for Chronicle, which we are addressing by (1) screening participants to ensure Chronicle compatibility before enrollment, (2) developing new features on the OpenLattice platform to increase stability and reliability of data uploads, and (3) providing in-person installation or phone troubleshooting.
Strengths of this approach include highly reliable data: the usage statistics API is maintained by Google and used by thousands of vendors. Participating parents found the mobile sampling methods highly acceptable and were informed of how their child's data would be collected, handled, and destroyed.
A main limitation of our current app is that it cannot identify the user of shared devices, which is important in early childhood when many children do not have their own devices. However, our subset approach allowed us to generate a list of apps used by children who share mobile devices with family members that can be coded for educational value,21 presence of advertising,22 or age-appropriate content. For example, we documented that preschool-aged children use YouTube (36.7% of our sample), general audience apps such as Cookie Jam and Candy Crush (30.6% of our sample), gambling apps such as Cashman, and violent apps such as Terrorist Shooter, Flip the Gun, and Granny, which are intended for use by teenagers and adults. These findings also have implications for child privacy because general audience apps and platforms may not place restrictions on the data they collect or distribute to third-party advertising companies.23
We found low accuracy of parent-reported mobile device duration compared with mobile sampling output, which is consistent with our previous research in parents.24 Inaccurate parents showed an average error of >60 minutes compared with their child’s actual daily device use. We therefore suggest that mobile device sampling may be an important future data collection tool for pediatric, adolescent, or adult research. For example, by using Chronicle, it is possible to define variables such as the number of checks of specific apps (eg, social media) per hour, usage during time periods when family meals or routines might occur, or overnight usage. At present, timestamped data are not available for iOS, and data transfer from screenshots is labor intensive; development of similar iOS tracking tools will therefore be necessary to fully assess children’s media landscapes. Mobile sampling will need to be used in combination with methods that capture media use on other platforms (eg, television, video game consoles) and other sensors that detect whether the user is awake (eg, Fitbit) or interacting with others (eg, LENA).
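As an illustration of the first point above (defining app-check variables from timestamped output), a per-hour count of checks could be derived roughly as sketched below; this assumes that a "check" is any session start within a given clock hour, and the column names are illustrative.

```python
# Hedged sketch: count "checks" (session starts) of one app per clock hour
# from timestamped session data; the definition of a check is an assumption.
import pandas as pd


def hourly_checks(sessions: pd.DataFrame, app_name: str) -> pd.Series:
    """`sessions` has columns: app_name and start (datetime); returns the
    number of session starts per (date, hour) for the named app."""
    app = sessions[sessions["app_name"] == app_name]
    return app.groupby([app["start"].dt.date, app["start"].dt.hour]).size()
```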
Limitations of our overall study design are worthy of mention. Use of online recruitment allowed for rapid enrollment of multiple families simultaneously because we did not have to schedule study visits, but it also led to higher rates of attrition immediately after enrollment. Our sample was more highly educated and had lower racial and/or ethnic diversity than the general population; future research in non–English-speaking populations is needed once our app interface is updated for other languages. Parents were aware of their child’s mobile device usage being tracked, which may have changed their usage behavior. Children may have used other mobile devices during the sampling period, so our results represent a minimum estimate of their true usage. Our app categorization approach was also limited by the fact that apps commonly disappear from app stores and may no longer appear when searched for several months later.
Conclusions
We describe the development of a novel mobile device sampling method whose implementation allowed us to characterize the smartphone and tablet use behaviors of preschool-aged children. Given the limitations of parent report, such objective measurement tools must be developed and refined so that health research (and evidence-based guidelines) can reflect the complex ways modern media are used.
Dr Radesky conceptualized the data collection methods, designed the cohort study, supervised data collection, drafted the initial manuscript, and reviewed and revised the manuscript; Dr Weeks performed the data analysis and critically reviewed the manuscript for important intellectual content; Ms Ball, Ms Schaller, and Ms Yeo coordinated and conducted data collection, performed application coding, and reviewed and revised the manuscript; Dr Durnez, Mr Tomayo-Rios, and Ms Epstein developed and piloted the passive-sensing data collection methods, contributed to the data analysis, and reviewed and revised the manuscript; Drs Kirkorian, Coyne, and Barr helped develop the passive-sensing data collection methods and reviewed and revised the manuscript; and all authors approved the final manuscript as submitted and agree to be accountable for all aspects of the work.
FUNDING: Funded by Children and Screens: Institute of Digital Media and Child Development Inc for development of the passive-sensing technology and the Eunice Kennedy Shriver National Institute of Child Health and Human Development (grant 1R21HD094051) for the Preschooler Tablet Study. Research Electronic Data Capture and recruitment support was provided through the Michigan Institute for Clinical and Health Research (Clinical and Translational Science Award UL1TR002240). Funded by the National Institutes of Health (NIH).
COMPANION PAPER: A companion to this article can be found online at www.pediatrics.org/cgi/doi/10.1542/peds.2020-1242.
Competing Interests
POTENTIAL CONFLICTS OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.
FINANCIAL DISCLOSURE: Dr Radesky is a consultant for and is on the Board of Directors of Melissa & Doug Toys and receives research support from Common Sense Media; the other authors have indicated they have no financial relationships relevant to this article to disclose.