Video Abstract

BACKGROUND AND OBJECTIVES:

Social robots (SRs) are increasingly present in medical and educational contexts, but their use in inpatient pediatric settings has not been studied. In this study, we aimed to (1) describe the introduction of SR technology into the pediatric inpatient setting through an innovative partnership among a pediatric teaching hospital, a robotics development laboratory, and a computational behavioral science laboratory and (2) present feasibility and acceptability data.

METHODS:

Fifty-four children ages 3 to 10 years were randomly exposed to 1 of 3 interventions: (1) interactive SR teddy bear; (2) tablet-based avatar version of the bear; or (3) plush teddy bear with human presence. We monitored intervention enrollment and completion patterns, obtained qualitative feedback on acceptability of SR use from child life–specialist stakeholders, and assessed children’s positive and negative affect, anxiety, and pain intensity pre- and postintervention.

RESULTS:

The intervention was well received and appeared feasible, with 93% of those enrolled completing the study (with 80% complete parent data). Children exposed to the SR reported more positive affect relative to those who received a plush animal. SR interactions were characterized by greater levels of joyfulness and agreeableness than comparison interventions. Child life specialist stakeholders reported numerous potential benefits of SR technology in the pediatric setting.

CONCLUSIONS:

The SR appears to be an engaging tool that may provide new ways to address the emotional needs of hospitalized children, potentially increasing access to emotionally targeted interventions. Rigorous development and validation of SR technology in pediatrics could ultimately lead to scalable and cost-effective tools to improve the patient care experience.

What’s Known on This Subject:

Social robots (SRs) have been successfully employed with children and adults in outpatient and educational settings and can be effective tools for engagement and stress reduction.

What This Study Adds:

By illustrating the feasibility and acceptability of introducing and studying SRs in the pediatric inpatient hospital setting, we show how SRs can offer an engaging and therapeutically valuable tool to address emotional needs of hospitalized children.

Addressing the emotional needs of hospitalized children is a complex task shared across the pediatric health care team. Key components include managing anxiety, pain, and separation inherent in the hospital experience; educating patients and families about their condition and treatment course; and implementing developmentally appropriate interventions to facilitate coping with stressful procedures. In many hospital systems, certified child life specialists (CLSs) are a focal point of this care. Although these services improve the hospital experience for many patients,1 they are often limited by human resources because staff cannot be with every patient through every aspect of their hospital experience.

Social robots (SRs) offer promise for addressing the current economy-of-scale gap between the emotional needs of medically ill children and the human capital required to meet those needs. They are designed to leverage social and affective attributes to sustain engagement, increase motivation, and facilitate coaching, monitoring, education, and communication.2 SRs, therefore, may also reach children who are less responsive to traditional human interactions3,4 and offer new pathways to comfort.5 They are particularly relevant in domains in which social and emotional supports are critical for positive outcomes, including health care.6 Several SRs have been developed for diverse health care needs, including autism therapy,7 physical rehabilitation,8 and weight management.9 Virtual SRs (eg, tablet- or computer-based agents) have also been deployed in a range of health contexts, primarily with adults. In at least 1 study in which researchers compared the 2 formats of socially assistive technology in a medical context, they found that physical robots were significantly better at sustaining engagement, building trust, establishing working alliances, and creating emotional bonds with users.10 SRs have been used with pediatric patients in outpatient settings to address vaccination-related distress11 and promote medical adherence.12 However, little work has explored how SRs could assist pediatric inpatients with critical and/or chronic health conditions who experience high levels of stress and pain.

To address this gap, our team developed an innovative partnership among a tertiary care academic hospital (Boston Children’s Hospital), SR expertise (Massachusetts Institute of Technology Media Laboratory), and computational behavioral science (Northeastern University) to design, create, and evaluate an SR teddy bear (named “Huggable”) capable of enhancing the emotional experiences of hospitalized children. Long-term goals include mitigating experiences of stress, pain, and isolation as well as fostering positive emotions and engagement. In the current study, we hypothesized that it is feasible and acceptable to families and hospital staff to integrate SR technology into pediatric care. We examined a secondary hypothesis that participants exposed to the SR intervention would show more positive affect, less negative affect, and lower levels of pain and anxiety after the exposure relative to participants in comparison conditions.

This is a feasibility and acceptability trial with preliminary efficacy data, set on pediatric medical and surgical floors, in which a convenience sample of patients was used. Our multidisciplinary design and evaluation team included clinicians (critical care physician, CLS), psychologists (hospital-based clinical and experimental), roboticists, and engineers. We obtained input from patients, families, and medical unit staff (eg, nurses) who were introduced to the robot in hospital room visits and clinical team meetings. The hospital’s institutional review board approved the study, and parents provided written informed consent for children’s participation.

We enrolled 54 medically or surgically hospitalized children ages 3 to 10 years (18 in robot, 17 in avatar, 19 in plush conditions). Inclusion criteria included being English-speaking and admitted to one of several participating general and hematology-oncology floors for >48 hours of hospitalization. We excluded patients with pacemakers (to avoid potential wireless interference) or significant developmental delays and those admitted to nonparticipating floors. Clinical staff preapproved patient participation on the basis of current clinical status (patient awake and alert, not in medical or emotional crisis, etc).

We piloted a between-groups, randomized open trial with 3 conditions: (1) a tele-operated Huggable robot in full physical form (“robot”); (2) a tablet version of a tele-operated Huggable, providing an interactive comparison condition without physical presence (“avatar”); and (3) a static plush teddy bear presented by a CLS to provide a physically comforting but noninteractive comparison (“plush”). To counterbalance conditions demographically, block random assignment was applied. We assigned children ages 3 to 5 years to the younger-age block and children ages 6 to 10 years to the older-age block. Age groups were determined a priori to align with preschool-aged and school-aged developmental stages. Within each age-by-sex block, children were randomly assigned to intervention condition. In Fig 1, we depict study flow.
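The age-by-sex block random assignment described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual allocation code; the data structure and the condition-cycling rule are assumptions.

```python
import random

def block_randomize(children, conditions=("robot", "avatar", "plush"), seed=0):
    """Assign each child to a condition within age-by-sex blocks.

    `children` is a list of dicts with hypothetical keys 'id', 'age', 'sex'.
    Ages 3-5 form the younger block and 6-10 the older block, as in the study.
    """
    rng = random.Random(seed)
    blocks = {}
    for child in children:
        age_block = "younger" if child["age"] <= 5 else "older"
        blocks.setdefault((age_block, child["sex"]), []).append(child)

    assignment = {}
    for members in blocks.values():
        rng.shuffle(members)
        # Cycle through conditions within each block to keep groups balanced.
        for i, child in enumerate(members):
            assignment[child["id"]] = conditions[i % len(conditions)]
    return assignment

# Toy roster (hypothetical participants).
kids = [
    {"id": 1, "age": 4, "sex": "F"}, {"id": 2, "age": 5, "sex": "F"},
    {"id": 3, "age": 7, "sex": "M"}, {"id": 4, "age": 9, "sex": "M"},
    {"id": 5, "age": 3, "sex": "M"}, {"id": 6, "age": 10, "sex": "F"},
]
assignment = block_randomize(kids)
```

Because assignment rotates within each block, two children in the same age-by-sex block cannot land in the same condition until the block exceeds the number of conditions.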

FIGURE 1

Study flow.

For this stage of development, Huggable (both forms) was tele-operated through “Wizard of Oz” methodology,13 in which a human (CLS staff in this case) tele-operated the robot or avatar. Wizard of Oz is a standard approach in the human-robot interaction field to inform the creation of autonomous SRs. Remotely controlling the robot or avatar in a real-world setting allows researchers to collect data and observe behaviors to identify important usage opportunities and necessary capabilities. The Huggable tele-operator could trigger facial expressions and body actions, talk through Huggable in a childlike, pitch-shifted voice, and see and hear participants and their surroundings via camera feed.

A CLS was in the patient’s room during interventions to facilitate interactions. The avatar and robot Huggable engaged participants by conversing about their likes and dislikes, singing nursery rhymes, and playing an “I spy” game. The plush was presented and “puppeteered” by CLSs, who interacted as they typically would clinically, thus providing a human element to this comparison condition. Patients, parents, and any medical staff present were instructed to act as they typically would so that the intervention could be tested in a natural setting, with care routines unfolding as needed. Interventions ended after 30 minutes or sooner if participants appeared ready to end the encounter (eg, because of fatigue). Interventions were videotaped and subsequently transcribed and coded by trained raters to evaluate affect, speech, and engagement.

The robot had a plush bear exterior and used an Android smartphone as its primary computational unit, with the screen depicting digitally animated eyes. The virtual Huggable avatar ran on an Android tablet with identical degrees of freedom and animations. All equipment, including Huggable’s fur, was cleaned between each intervention in accordance with hospital infection control standards.

Additional study equipment included the Q Sensor (Affectiva, Inc) wireless wrist-worn electrodermal activity (EDA) sensor to passively and continuously measure peripheral sympathetic nervous system arousal via skin conductance, skin surface temperature, and motor movements via 3-axis accelerometry.

Video capture was initially achieved through 2 wired Logitech C920 cameras positioned for a wide-angle view. A third camera was mounted underneath the bedside table to capture the patient’s face during interactions. Over time, we transitioned to battery-operated GoPro Hero 3+ cameras that enabled easier setup, wireless monitoring, faster troubleshooting, higher-quality image capture, and easier footage recovery.

Feasibility and Acceptability

Indicators of feasibility included (1) tracking of enrollment and completion rates and (2) analysis of procedural difficulties that emerged during intervention implementation. Acceptability indicators included (1) whether participants found playing with Huggable fun and whether they would want to play with Huggable again (yes or no) and (2) open-ended questions administered to CLS project staff, whose responses were qualitatively analyzed via content analysis to elicit common themes related to perceived benefits and risks of the intervention and suggestions for process improvement. Specific cutoffs for these indicators were not stated a priori; rather, we sought to evaluate overall feasibility and acceptability across multiple indicators.

Patient- and parent-report questionnaires assessed perceived emotional state and current pain intensity. Measures were collected before and immediately after intervention exposure.

Facial Affective Scale

The Facial Affective Scale consists of 9 faces varying by level of overt distress.14 Children are instructed to identify the face that “looks like how you feel inside.”14 We used a modified scoring transformation, scoring faces from 1 to 9, with lower numbers indicating more positive affect. Parent proxy ratings were also obtained.

Positive and Negative Affect Scales for Children

The Brief version of the Positive and Negative Affect Scales for Children (PANAS-C)15 is a 10-item self-report measure of child affect. Children rate the degree to which they currently feel emotions (eg, “nervous”) on a 1 to 5 scale. Higher numbers indicate more emotion for each subscale, Positive and Negative. Reliability and validity data for the measure support its use in school-aged children.15 We administered the Brief PANAS-C to children ages ≥6 years.

State-Trait Anxiety Inventories

The State-Trait Anxiety Inventory (STAI) and State-Trait Anxiety Inventory for Children (STAIC)16 measure general proneness to anxiety (trait anxiety) and anxiety as a transient emotional state (state anxiety). We used the STAIC to assess changes in anxiety with the intervention. The 20 items are rated on a 1 to 4 scale; higher scores indicate more anxiety. Parents completed the STAI trait measure (preintervention only) for inclusion as a covariate. Validity and reliability of both versions are well established and reported in the instrument manuals. The STAIC was originally validated in children ages ≥8 years but has been used in past studies with younger children.17,18

Pain Ratings

To rate pain intensity, younger children used the Faces Pain Rating Scale Revised,19 and older children used a numeric rating scale (NRS). In both scales, a metric of 0 to 10 is employed for rating pain intensity, wherein higher scores indicate higher pain. Both measures are valid and reliable in children.19,20 

Feasibility data were summarized by using descriptive analyses of recruitment and study completion processes along with qualitative thematic analysis of CLS interviews. We used repeated measures analysis of variance techniques (including covariates when appropriate) to analyze self-report data. To analyze participants’ verbal utterances from video coding, we transcribed and coded footage using a Health Insurance Portability and Accountability Act–compliant vendor. Sentiments for total (all individuals in the room) and patient-only utterances were analyzed with IBM Watson’s Tone Analyzer,21 a computerized cognitive-linguistic analysis tool assessing emotional tones (anger, fear, joy, sadness, and disgust) in written text. Analyses were focused on scoring average joy, agreeableness, and sadness. Joy and sadness scores were calculated for total utterances. Agreeableness scores were calculated for patient-only utterances. When analyzing agreeableness, 18 participants (all hematology-oncology patients) were removed for failing to produce enough utterances (minimum 100) for reliable analysis. We also analyzed intervention duration and extent of patients’ physical movement.22 Generalized linear modeling approaches were used for analyses, with predictor variables contrast coded as ordered values [−1 0 1], for robot, avatar, and plush.
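With the ordered contrast coding [−1 0 1], the group comparison reduces to a linear trend test across the 3 conditions. A minimal sketch, using made-up sentiment scores standing in for the Tone Analyzer output:

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant joy scores; real values came from coded transcripts.
scores = {
    "plush":  [0.31, 0.28, 0.35, 0.30],
    "avatar": [0.42, 0.39, 0.45, 0.41],
    "robot":  [0.55, 0.50, 0.58, 0.52],
}
# Ordered contrast codes for the three conditions.
codes = {"plush": -1, "avatar": 0, "robot": 1}

x = np.array([codes[cond] for cond, vals in scores.items() for _ in vals])
y = np.array([v for vals in scores.values() for v in vals])

# A model with one ordered numeric predictor is a linear trend test:
# a positive slope means scores rise monotonically from plush to robot.
result = stats.linregress(x, y)
print(f"slope={result.slope:.3f}, p={result.pvalue:.4f}")
```

A significant positive slope here corresponds to the robot > avatar > plush pattern reported for joy and agreeableness; the sadness result corresponds to a negative slope under the same coding.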

EDA recorded via Q Sensor was used to measure physiologic and emotional arousal. Engagement levels during the last third of the interaction period were annotated for comparison with children’s EDA responses, wherein the rate and detrended SD of nonspecific skin conductance responses were analyzed to indicate arousal levels.
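The EDA arousal indicators described above (rate of nonspecific skin conductance responses and SD of the detrended signal) might be computed roughly as follows. The sampling rate, peak-prominence threshold, and synthetic trace are illustrative assumptions, not the study's actual processing pipeline.

```python
import numpy as np
from scipy import signal

def scr_arousal_metrics(eda, fs=8.0, prom_thresh=0.02):
    """Arousal indicators from a skin conductance trace (hypothetical parameters).

    eda: 1-D array in microsiemens; fs: sampling rate in Hz.
    Returns (nonspecific SCR rate per minute, SD of the detrended signal).
    """
    eda = np.asarray(eda, dtype=float)
    # Remove the slow tonic drift so phasic responses stand out.
    detrended = signal.detrend(eda)
    # Count local peaks exceeding a prominence threshold as nonspecific SCRs.
    peaks, _ = signal.find_peaks(detrended, prominence=prom_thresh)
    minutes = len(eda) / fs / 60.0
    return len(peaks) / minutes, float(np.std(detrended))

# Synthetic 1-minute trace: slow linear drift plus three phasic bumps.
fs = 8.0
t = np.arange(0, 60, 1 / fs)
trace = 2.0 + 0.005 * t
for onset in (10, 30, 50):
    trace += 0.1 * np.exp(-((t - onset) ** 2) / 2.0)
rate, sd = scr_arousal_metrics(trace, fs=fs)
```

On this synthetic trace the three phasic bumps yield an SCR rate of about 3 responses per minute; real EDA would first need artifact handling for movement, which the accelerometer channel supports.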

In Tables 1 and 2, we present information on study feasibility, including participation rates and reasons for missing data. In Fig 2, we show length of intervention by condition (robot: mean = 26.4 minutes, SD = 17.0; avatar: mean = 21.7, SD = 11.4; plush: mean = 7.8, SD = 5.1).

TABLE 1

Feasibility Data: Numbers of Participants Providing Complete Data at Each Time Point

n (%)
Deemed eligible by study staff and approved by clinical team: 68 (—)
Consented to participate: 54 (79)
Child completed at least a portion of study measures at baseline: 50 (93)
Parent completed at least a portion of study measures at baseline: 43 (80)
Child completed at least a portion of study measures postintervention: 50 (93)
Parent completed at least a portion of study measures postintervention: 44 (81)

Two participants consented to the study but later withdrew because they did not feel up to participating. —, not applicable.

TABLE 2

Feasibility Data: Reasons for Missing Data Points

Reason for Data Loss: No. Participants Impacted, n (%)
Video recording problems: 11 (20)
Q Sensor removed early or refused entirely: 9 (17)
Child unable or unwilling to complete surveys: 13 (24)
Parent unable or unwilling to complete surveys: 5 (9)
Technical problems with robot (eg, speaker not working): 4 (8)
Child scared of Huggable: 1 (2)
FIGURE 2

Total intervention time by group.


Initial feasibility challenges we identified included minimizing clinical workflow impact and overcoming clinical staff reservations. Equipment had to be installed and removed for every experimental session without disturbing families. We installed most devices and wires on a mobile cart to minimize in-room setup time. However, this had the negative side effect of sometimes compromising video capture because of constraints of the physical environment and/or patient movement. Various wireless devices and lead-lined walls of some units created interference in the heavily used hospital wireless signal, causing periodic but significant delays or malfunctions in Huggable operation. To allay staff concerns, we avoided scheduling interactions at times when clinicians needed patient access (eg, rounds procedures).

Regarding acceptability, 93% of child participants endorsed that meeting the SR was “fun,” and 92% expressed a desire to play with the SR again. Group sizes were too small for statistical comparison, but acceptability of the robot was similar to that of the avatar (94% for fun, 92% for playing again) and greater than that of the plush (75% for fun, 63% for playing again). Content analysis of qualitative CLS interviews highlighted the benefits to hospitalized children of introducing an SR into the hospital setting and elucidated areas for future development. Several themes arose from these interviews to inform our subsequent efforts; prevalent themes are summarized in Table 3.

TABLE 3

Themes From Child Life Staff’s Involvement in Huggable Intervention

Theme | Example Comments
Technology provides ability to connect with patients in a different way I would say in general, children made a social connection to Huggable robot, often displaying an emotional attachment or empathy, such as giving hugs and offering him a pillow to rest on when he was tired. 
 Some patients really seemed to connect on a social/emotional level with Huggable (both robot and tablet). Many patients did not want Huggable to leave or asked if Huggable could come back at a later time to “meet my mother” or another family member. Patients often asked about Huggable’s age, gender, likes/interests, and family situation. 
Technological demands or glitches detracted from experience What made me feel the most worried about the tablet and robot interactions was the concern of whether the tech/equipment/sound would work throughout the interaction. 
 In their current states, the Huggable robot and the Huggable tablet require a significant amount of time and coordination for the setup and the involvement of 2 child life specialists and a technician for each interaction. 
 The setup time often took longer than expected, and a few patients and families were left waiting (some families became visibly upset). 
 Patients and families could often hear the operator from inside the patient’s room. This is distracting and also loses the “magic” of the interaction. 
Individual and situational differences are major influences on patients’ responses to interventions The interactions seemed to be the least effective if the child was tired, very ill, and/or had too many things happening in the room (eg, staff coming in and out and other distractions, such as the television being on, food being on the table, etc). 
 Much of the impact [of the intervention] was determined by the individual patient, the health care team involved with patient, culture of specific unit, overall health care experience, and illness acuity. 
 Children who responded best were ages 5–8 years old, verbal, and feeling well enough to engage in primarily verbal social interaction and game play. 
 It was helpful when the parent/caregiver in the room allowed the patient to take the lead in the interaction with the robot and not try to “take over” the interaction/experience. 
Potential future directions [A fully autonomous] Huggable might be able to provide encouragement/support to patients to participate in a medical task or daily activities (eg, physical therapy exercises). Huggable could also help gather information about patients’ current mood/feelings or pain levels. 
 Repeated exposures to Huggable would be especially helpful for patients who are away from their parents/caregivers/families for extended periods of time. 
 Children may feel more comfortable opening up/trusting a robot versus sharing with a hospital staff member. 

In Table 4, we list descriptive results from self-report questionnaires, and in Table 5 we summarize participant diagnoses and reasons for hospitalization, included as covariates where relevant. Data conformed to assumptions of normality and were found appropriate for parametric statistical tests. Overall, children reported statistically significantly greater positive affect (preintervention Brief PANAS-C–positive mean = 3.48, postintervention mean = 3.89; P < .01), and parents reported lower perceptions of children’s pain (preintervention NRS mean = 1.71, postintervention mean = 0.81; P < .01) and more positive child affect (Facial Affective Scale preintervention mean = 3.12, postintervention mean = 2.58; P < .05) after the intervention, regardless of condition. There was a statistically significant group effect on positive affect, with children exposed to the robot reporting greater positive affect relative to those in the plush condition (robot group postintervention Brief PANAS-C–positive mean = 4.57, plush group mean = 3.39; P < .05). No other statistically significant group differences were found on self-report.
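With a single within-subject factor (pre versus post), the repeated measures comparison reduces to a paired t test on matched scores. A toy sketch with hypothetical values, not the study data:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post Brief PANAS-C positive-affect scores for matched children.
pre  = np.array([3.2, 3.6, 3.1, 4.0, 3.4, 3.5, 3.8, 3.0])
post = np.array([3.9, 4.1, 3.5, 4.6, 3.6, 4.0, 4.3, 3.4])

# Paired test: each child serves as their own control across time points.
t_stat, p_value = stats.ttest_rel(post, pre)
```

The study's actual models additionally included covariates (eg, diagnosis, parent trait anxiety), which a plain paired t test does not accommodate.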

TABLE 4

Variable Descriptive Statistics

Variable | Measure Range | Pretest n | Pretest Mean (SD) | Posttest n | Posttest Mean (SD)
Positive affect,a Brief PANAS-C positive | 1–5 | 26 | 3.48 (1.09) | 25 | 3.89 (1.15)
Negative affect,a Brief PANAS-C negative | 1–5 | 26 | 1.27 (0.50) | 24 | 1.19 (0.27)
Facial Affective Scale, child report | 1–9 | 49 | 2.33 (2.09) | 45 | 1.73 (1.37)
Facial Affective Scale, parent report | 1–9 | 47 | 3.17 (1.58) | 54 | 1.06 (4.20)
Anxiety,a STAIC | 1–4 | 28 | 2.27 (0.31) | 28 | 2.35 (0.30)
Parent trait anxiety, STAI | 1–4 | 54 | 1.88 (0.74) | 54 | 1.81 (0.63)
Pain score, child report, NRS or FPRS-R | 0–10 | 53 | 0.85 (1.79) | 49 | 1.16 (2.15)
Pain score, parent report, NRS | 0–10 | 46 | 1.65 (2.26) | 54 | 1.00 (4.11)

FPRS-R, Faces Pain Rating Scale Revised.

a Only administered to children in the older-age group.

TABLE 5

Information on Participants’ Medical Diagnosis and Reason for Admission

Intervention | Diagnoses | % | Reasons for Admission | %
Robot Leukemia 47 Medical oncology treatment 41 
 Other cancer 18 Infection treatment 24 
 Congenital anatomic abnormality 24 Surgical repair 18 
 SCD HSCT 12 
 Trauma BMT 
     
Avatar Leukemia 47 Medical oncology treatment 65 
 Other cancer 29 Surgical repair 18 
 Congenital anatomic abnormality 12 BMT 12 
 Appendicitis  Pain crisis 
 SCD   
     
Plush Leukemia 37 Surgical repair 42 
 Other cancer 11 Infection treatment 26 
 Appendicitis 16 Medical oncology treatment 21 
 Infection 16 HSCT 11 
 Congenital anatomic abnormality 11   
 Ulcerative colitis   
 Trauma   

BMT, bone marrow transplant; HSCT, hematopoietic stem cell transplant; SCD, sickle cell disease.

We examined group differences in behavioral responses. Joy scores of total utterances showed a statistically significant increasing trend across the 3 conditions (robot > avatar > plush; P = .003), as did agreeableness scores of patient utterances (robot > avatar > plush; P = .001). Sadness scores of total utterances showed a statistically significant decreasing trend across conditions (robot < avatar < plush; P = .026). (See Figs 3–5.)

FIGURE 3

Joy scores of total utterances (including all individuals in room) by condition. The joy scores of total utterances showed a statistically significant trend of increase over the three experimental conditions (robot > avatar > plush; ** P = .003).

FIGURE 4

Agreeableness scores by patients across conditions. The agreeableness scores of the utterances made by patients in bone marrow transplant and surgical units revealed a statistically significant trend of increase over the three experimental conditions (robot > avatar > plush; ** P = .001).

FIGURE 5

Sadness scores of total utterances (across all individuals in room) by condition. The sadness scores of total utterances showed a statistically significant trend of decrease over the three experimental conditions (robot < avatar < plush; * P = .026).


There was a reduction in participants’ physiologic arousal over time across all groups (average video-coded engagement in final third of the interaction period correlated negatively with skin conductance [EDA] response rate; r = −0.446, P = .037). Because of missing data, cell sizes were too small for valid comparisons by group.

In the context of shared desire among chronically ill children, families, and providers to leverage technology for more personalized pediatric care, we sought in this study to examine feasibility and acceptability of SR technology in the inpatient setting. Traditional methods for addressing emotional aspects of pediatric hospitalization typically have shown positive but short-lived and small effects.23,26 Our innovative partnership between pediatric clinicians and engineering and technology experts sought to explore whether advanced SR technology can expand the emotional comfort we can offer hospitalized children.

The solid participation rate (79%) demonstrates feasible enrollment and patient and family enthusiasm for the technology. Rates of completed data, although excellent overall (93%), highlight several challenges, such as discomfort with the Q Sensor, inability to complete postintervention questionnaires, and insufficient verbal output for speech analysis. Because this is the first known study in which SR technology has been explored in an inpatient setting with ill children, there are no comparable studies against which to evaluate our success. Previous work with SR technology and children outside the laboratory has included small samples,27 been designed around observation in public settings,28 or imposed few or no study-specific requirements, such as questionnaires or additional monitoring procedures.11 Our rate of incompletion due to technical failures appears similar to or better than that in previously published studies,29 which is particularly promising given the added logistic challenges of the inpatient hospital setting. Patients found the SR engaging and expressed a desire to play with it again, demonstrating acceptability. In-depth interviews with CLSs highlight that CLSs valued the opportunities this technology provides to connect with patients in new ways. They also underscore the individual differences that likely make each child’s and family’s interaction with the SR unique, an important area to attend to in future research.

Through our team's combined expertise, we implemented some creative problem-solving to address the feasibility of integrating SR technology into the clinical environment. Reflecting on this process may be useful for others considering introducing SRs in similar settings. Examples included tethering devices directly to a router and using a high-quality wireless access point to boost and direct signals, overcoming heavy demand on the wireless network and transmission disruptions caused by environmental features such as lead-lined walls. Although staff were at times wary of this new technology, we found ways to increase buy-in. Unfortunately, avoiding times when clinicians required patient access limited our ability to assess the effects of the SR on patient response to stressful procedures in vivo. We intend to explore this in future work. Fortunately, as clinicians observed patients and families having positive interactions, they appeared to grow more supportive of SR presence.

Not surprisingly, the medical issues facing our sample required adaptations to approaches developed in the Massachusetts Institute of Technology team's previous experiences with children.30–32 Fatigue may have contributed to the brevity of some interventions, influenced engagement levels and speech output, or limited completion of postintervention self-reports. Because we focused on capturing immediate affective responses in our pilot, we could not defer data collection. We continue to explore optimal outcome measures and postintervention assessment timing. The team also had to be sensitive to difficulties tolerating monitoring devices such as the wrist-worn EDA biosensor. Despite our team's experience using these technologies successfully in healthy children, we recognized that children in the inpatient setting must tolerate many monitors and invasive procedures. Because hospitalized children have limited control over what is happening to them,33,34 at times we chose to sacrifice research data to avoid children feeling coerced or experiencing added discomfort.

Overall, preliminary findings suggest that hospitalized children may benefit from SR technology, as evidenced by increases in reported positive affect after exposure to the robot relative to the other interventions, along with greater expressed joyfulness and agreeableness among children in the robot condition. Decreased EDA reactivity across groups toward the end of interventions tentatively indicates that the "quality" of engagement or attention often waned over time. Because of missing EDA data points, the sample size was not sufficient to reliably investigate this effect at the group level; further study is needed to determine whether an SR could foster more sustained engagement. Recent advances in biometric data collection approaches, such as the use of laboratory-grade ambulatory monitors,35 may help in this regard. These results augment our previously published findings that children who interact with a robot show a greater increase in physical movement over the course of an intervention and have longer verbal exchanges relative to other conditions.22 These emotional, physical, and verbal outcomes are all positive factors that could contribute toward better and faster recovery in hospitalized children.36,37 

All technologies have different affordances and strengths. For instance, a wearable device is suitable for seamless measurement of physiologic data without affecting users' attention or behavior, whereas screen-based devices offer an interface to display various types of visual information. Our results, consistent with more than 30 published studies,38 indicate that physically present, embodied robots engage humans socioemotionally better than virtual avatars do. This makes SRs an interesting and relevant technology in the pediatric care context in comparison with other devices. Recent neuroscientific findings reveal that interpersonal interaction with a trusted ally can mitigate perceptions of physical pain and increase comfort.39 At Boston Children's Hospital and many institutions, a multidisciplinary team provides emotional support and pain management, with CLSs playing a prominent role. However, CLSs cannot be with every patient continuously. Tools such as SRs can increase reach and ultimately provide continuous monitoring and assessment of children's variable emotional states throughout the hospital experience to enhance comfort and care.

Our feasibility and preliminary outcome data and insights will inform the development of a fully autonomous SR, over a projected time frame of 2 to 3 years, to scale the application of SR technology that augments human capacity and enhances the positive emotional experience of patients. Although developing emotional intelligence that can fully understand how people feel and behave is a long-term goal, interactive technologies and devices are starting to be widely used by the general population, with some already targeting health care applications (eg, KidsMD on Amazon Echo and Mabu [http://www.cataliahealth.com]). Thus, we believe it is feasible to build a fully autonomous SR for pediatric patients within this time frame. In future studies, our technical focus will be on improving our affective computing algorithms to better characterize and monitor children's affective states, coupled with computational methods for the SR to learn how to effectively support and engage each child through dialogue and playful activities, in concert with clinical staff.

It is important to note the limitations of this pilot study. Within the real-world clinical setting, a between-subjects design was an important first step, but it did not control for individual patient variability or expose all patients to each intervention. In future within-subjects studies, we can explore whether SRs are consistently more effective than nontechnical interventions for all hospitalized children, under specific circumstances (eg, times of greater fatigue), or for patient subgroups (eg, by age or health condition). Of note, our sample was weighted toward oncology patients; in future studies, researchers should assess the use of SRs in large groups of patients with diverse medical conditions. CLS facilitators were not blinded in this design, potentially biasing interactions. Our small sample size limited power and generalizability, and factors such as current physical state may have influenced patient responses. Finally, opportunities exist to further expand our understanding of the impact of SR technology through in-depth interviews with families, as we have shown in previous work.40 

Despite these limitations, our experiences underscore significant opportunities to further develop an engaging and therapeutically valuable SR platform for children with chronic physical illnesses. Our hope is for rigorous development and validation of SR technology in pediatrics to result in scalable, cost-effective tools to reduce length of stay and readmissions. More importantly, this technology can offer entirely new paradigms to optimize emotional experiences, outcomes, and quality of life for children and families as they navigate the health care journey.

Dr Logan conceptualized and designed the study, conducted initial analysis, drafted the manuscript, and critically reviewed the manuscript for important intellectual content; Dr Weinstock and Prof Breazeal conceptualized and designed the study, oversaw implementation, and critically reviewed the manuscript for important intellectual content; Dr Goodwin, Dr Heathers, and Ms Jeong designed data collection instruments and technological components of the project, oversaw data collection and analysis, and reviewed and revised the manuscript; Ms O’Connell and Mr Smith-Freedman contributed to project design and implementation, collected data, and reviewed and revised the manuscript; and all authors approved the final manuscript as submitted and agree to be accountable for all aspects of the work.

FUNDING: No external funding.

We thank the research team, including Alex Ahmed, Laurel Anderson, Melissa Burke, Jimmy Day (video production), Pauline Dimaano, Katie Fitzpatrick, Suzanne Graca, Honey Goodenough, Katherine Jamieson, Brianna Jehl, Nicole Stenquist, Taylor Turrisi, and Tessa Wihak for contributions to this work.

CLS: child life specialist

EDA: electrodermal activity

NRS: numeric rating scale

PANAS-C: Positive and Negative Affect Scales for Children

SR: social robot

STAI: State-Trait Anxiety Inventory

STAIC: State-Trait Anxiety Inventory for Children

1. Wilson JM; American Academy of Pediatrics Child Life Council and Committee on Hospital Care. Child life services. Pediatrics. 2006;118(4):1757–1763 [PubMed]

2. Tapus A, Mataric MJ, Scasselati B. The grand challenges in socially assistive robotics. IEEE Robot Autom Mag. 2007;14(1):35–42

3. Scassellati B. How social robots will help us to diagnose, treat, and understand autism. In: Thrun S, Brooks R, Durrant-Whyte H, eds. Robotics Research. Vol 28. Heidelberg, Germany: Springer; 2007:552–563

4. Cabibihan J-J, Javed H, Ang M, Aljunied S. Why robots? A survey on the roles and benefits of social robots in the therapy of children with autism. Int J Soc Robot. 2013;5(4):593–618

5. Meltzoff AN, Kuhl PK, Movellan J, Sejnowski TJ. Foundations for a new science of learning. Science. 2009;325(5938):284–288 [PubMed]

6. Okamura AM, Mataric MJ, Christensen HI. Medical and health-care robotics. IEEE Robot Autom Mag. 2010;17(3):26–37

7. Kim ES, Newland E, Paul R, Scassellati B. A robotic therapist for positive, affective prosody in high-functioning autistic children. In: Presented at the 2008 International Meeting for Autism Research (IMFAR); May 15, 2008; London, United Kingdom

8. Matarić MJ, Eriksson J, Feil-Seifer DJ, Winstein CJ. Socially assistive robotics for post-stroke rehabilitation. J Neuroeng Rehabil. 2007;4:5 [PubMed]

9. Kidd C, Breazeal C. Robots at home: understanding long-term human-robot interaction. In: Proceedings from the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS; September 22–26, 2008; Nice, France: 3230–3235

10. Kidd C; ProQuest Dissertations Publishing. Designing for long-term human-robot interaction and application to weight loss. 2008. Available at: http://search.proquest.com/docview/304355818. Accessed February 1, 2008

11. Beran TN, Ramirez-Serrano A, Vanderkooi OG, Kuhn S. Reducing children's pain and distress towards flu vaccinations: a novel and effective application of humanoid robotics. Vaccine. 2013;31(25):2772–2777 [PubMed]

12. Blanson Henkemans OA, Bierman BPB, Janssen J, et al. Using a robot to personalise health education for children with diabetes type 1: a pilot study. Patient Educ Couns. 2013;92(2):174–181 [PubMed]

13. Riek L. Wizard of oz studies in HRI: a systematic review and new reporting guidelines. J Hum Robot Interact. 2012;1(1):119–136

14. McGrath PA. Pain in Children: Nature, Assessment, and Treatment. New York, NY: Guilford Press; 1990

15. Laurent J, Catanzaro SJ, Joiner TE, et al. A measure of positive and negative affect for children: scale development and preliminary validation. Psychol Assess. 1999;11(3):326–338

16. Spielberger CD, Gorsuch RL, Lushene RE. State-Trait Anxiety Inventory: Self-Evaluation Questionnaire. Palo Alto, CA: Consulting Psychologists Press; 1970

17. Bringuier S, Dadure C, Raux O, Dubois A, Picot M-C, Capdevila X. The perioperative validity of the visual analog anxiety scale in children: a discriminant and useful instrument in routine clinical practice to optimize postoperative pain management. Anesth Analg. 2009;109(3):737–744 [PubMed]

18. Kurnatowski P, Putyński L, Łapienis M, Kowalska B. Physical and emotional disturbances in children with adenotonsillar hypertrophy. J Laryngol Otol. 2008;122(9):931–935 [PubMed]

19. Hicks CL, von Baeyer CL, Spafford PA, van Korlaar I, Goodenough B. The Faces Pain Scale-Revised: toward a common metric in pediatric pain measurement. Pain. 2001;93(2):173–183 [PubMed]

20. von Baeyer CL, Spagrud LJ, McCormick JC, Choo E, Neville K, Connelly MA. Three new datasets supporting use of the Numerical Rating Scale (NRS-11) for children's self-reports of pain intensity. Pain. 2009;143(3):223–227 [PubMed]

21. Mostafa M, Crick T, Calderon AC, Oatley G. Incorporating emotion and personality-based analysis in user-centered modelling. In: Bramer M, Petridis M, eds. Research and Development in Intelligent Systems XXXIII. New York, NY: Springer; 2016:383–389

22. Jeong S, Breazeal C, Logan D, Weinstock P. Huggable: impact of embodiment on promoting verbal and physical engagement for young pediatric inpatients. In: Proceedings from the 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); August 28–September 1, 2017; Lisbon, Portugal: 121–126

23. Barrera ME, Rykov MH, Doyle SL. The effects of interactive music therapy on hospitalized children with cancer: a pilot study. Psychooncology. 2002;11(5):379–388 [PubMed]

24. Chapman LM, Morabito D, Ladakakos C, Schreier H, Knudson MM. The effectiveness of art therapy interventions in reducing Post Traumatic Stress Disorder (PTSD) symptoms in pediatric trauma patients. Art Ther. 2002;18(2):100–104

25. Holden EW, Deichmann MM, Levy JD. Empirically supported treatments in pediatric psychology: recurrent pediatric headache. J Pediatr Psychol. 1999;24(2):91–109 [PubMed]

26. Tsai C-C, Friedmann E, Thomas SA. The effect of animal-assisted therapy on stress responses in hospitalized children. Anthrozoos. 2010;23(3):245–258

27. Coninx A, Baxter P, Oleari E, et al. Towards long-term social child-robot interaction: using multi-activity switching to engage young users. J Hum Robot Interact. 2016;5(1):32–67

28. Sabanovic S, Michalowski MP, Simmons R. Robots in the wild: observing human-robot social interaction outside the lab. In: Proceedings from the 9th IEEE International Workshop on Advanced Motion Control; March 27–29, 2006; Istanbul, Turkey: 596–601

29. Kory Westlund JM, Jeong S, Park HW, et al. Flat vs. expressive storytelling: young children's learning and retention of a social robot's narrative. Front Hum Neurosci. 2017;11:295 [PubMed]

30. Kory J, Breazeal C. Storytelling with robots: learning companions for preschool children's language development. In: Proceedings from the 23rd IEEE International Symposium on Robot and Human Interactive Communication; August 25–29, 2014; Edinburgh, Scotland: 643–648

31. Gordon G, Spaulding S, Kory Westlund JM, et al. Affective Personalization of a Social Robot Tutor for Children's Second Language Skills. Phoenix, AZ: AAAI; 2016:3951–3957

32. Breazeal C, Harris PL, DeSteno D, Kory Westlund JM, Dickens L, Jeong S. Young children treat robots as informants. Top Cogn Sci. 2016;8(2):481–491 [PubMed]

33. Coyne I. Children's experiences of hospitalization. J Child Health Care. 2006;10(4):326–336 [PubMed]

34. Sourkes B. Armfuls of time: the psychological experience of the child with a life-threatening illness. Med Princ Pract. 2007;16(1):37–41

35. Burns A, Greene BR, Mcgrath MJ, et al. SHIMMER™ – a wireless sensor platform for noninvasive biomedical research. IEEE Sens J. 2010;10(9):1527–1534

36. Chiles JA, Lambert MJ, Hatch AL. The impact of psychological interventions on medical cost offset: a meta-analytic review. Clin Psychol Sci Pract. 1999;6(2):204–220

37. Saravay SM, Steinberg MD, Weinschel B, Pollack S, Alovis N. Psychological comorbidity and length of stay in the general hospital. Am J Psychiatry. 1991;148(3):324–329 [PubMed]

38. Li J. The benefit of being physically present: a survey of experimental works comparing copresent robots, telepresent robots and virtual agents. Int J Hum Comput Stud. 2015;77:23–37

39. Coan JA, Schaefer HS, Davidson RJ. Lending a hand: social regulation of the neural response to threat. Psychol Sci. 2006;17(12):1032–1039 [PubMed]

40. The New York Times. A talking teddy bear practicing in the pediatric hospital. New York Times. June 3, 2015. Available at: https://www.nytimes.com/2015/06/04/technology/huggable-robot-therapeutic-value-hospitals.html. Accessed July 19, 2018

Competing Interests

POTENTIAL CONFLICT OF INTEREST: The authors have indicated they have no potential conflicts of interest to disclose.

FINANCIAL DISCLOSURE: The authors have indicated they have no financial relationships relevant to this article to disclose.