Department of Biomedical Engineering Posters and Presentations

Music-Based Emotion and Social Interaction Therapy for Children with Autism Using Interactive Robots

Poster Number

110

Document Type

Poster

Publication Date

3-2016

Abstract

The purpose of this research is to develop and evaluate a multisensory robotic therapy system that stimulates the emotional and social interactivity of children with autism. Past studies have shown that robots excel at singling out and “articulating” individual emotions to autistic children compared with humans. Conversely, humans can sometimes display multiple emotions at once, along with body movements that contradict their facial cues; this can make it difficult for an autistic child to distinguish the intended emotion and can result in sensory overload. Moreover, because studies have shown strong connectivity among the neural domains for emotion, music, and motor skills, this research integrates music into the learning environment to observe whether, and how, it helps children relate body movements and gestures to specific emotions. Our interactive robotic framework involves two robots: Darwin Mini, a humanoid robot that displays dynamically varied body movements and gestures, and Romo, an iPhone-based rover robot that displays facial cues corresponding to specific emotions. Testing will begin by seating the child with each robot separately as it displays cues corresponding to specific emotions while music plays in the background to help the child retain the correlations. The child will then watch as each robot is guided through a maze containing sections that would normally provoke sensory overload in hearing, smell, taste, sight, and balance scenarios. The child can then see which emotions each robot uses to react to these scenarios and, ideally, mimic the robots' responses in real-life situations. To assess the effectiveness of the robots' interaction with the children, the child's emotional state will be monitored throughout the interaction using a Kinect-based motion detection system and a speech analysis system, as shown in Figure 1. These systems will analyze speech patterns and motion sequences to determine both the child's level of engagement with the system and the child's emotional state. Knowing the engagement level and emotional state also allows the system to modify the emotions being displayed if the child appears to be in distress, helping to alleviate that distress. The analysis components are still in development. Looking forward, the system will be made fully autonomous and deployed to clinics to begin trials with autistic children.
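The adaptive loop described above (monitor engagement and emotional state, then adjust the displayed emotion when the child is in distress) could be sketched as follows. This is a minimal illustration under stated assumptions, not the poster's actual implementation: the names (ChildState, read_sensors, select_emotion, display) and the thresholds are hypothetical placeholders, and the stubbed sensor function stands in for the Kinect-based motion analysis and speech analysis components, which the abstract notes are still in development.

```python
# Minimal sketch of the adaptive emotion-display loop described in the
# abstract. All names and thresholds are hypothetical placeholders; real
# Kinect motion-sequence and speech-pattern analysis would replace the stub.

from dataclasses import dataclass
import random  # stands in for real sensor pipelines in this sketch


@dataclass
class ChildState:
    engagement: float  # 0.0 (disengaged) to 1.0 (fully engaged)
    distress: float    # 0.0 (calm) to 1.0 (high distress)


def read_sensors() -> ChildState:
    """Placeholder for the Kinect-based motion detection and speech
    analysis systems; returns random values purely for illustration."""
    return ChildState(engagement=random.random(), distress=random.random())


def select_emotion(state: ChildState, planned: str) -> str:
    """If the child appears distressed, override the planned cue with a
    calming one; if disengaged, try a positive cue; otherwise proceed."""
    if state.distress > 0.7:
        return "calm"   # switch the robots to a soothing cue
    if state.engagement < 0.3:
        return "happy"  # attempt to re-engage the child
    return planned      # continue the scripted emotion sequence


def display(emotion: str) -> None:
    """Placeholder for commanding Darwin Mini's gestures, Romo's facial
    cues, and the matching background music for this emotion."""
    print(f"Robots display '{emotion}' with matching music")


if __name__ == "__main__":
    scripted_sequence = ["happy", "sad", "surprised", "angry", "calm"]
    for planned in scripted_sequence:
        state = read_sensors()
        display(select_emotion(state, planned))
```

In a deployed version, the scripted sequence would correspond to the emotion-teaching and maze scenarios described above, with the loop running continuously so the displayed cue can change mid-session.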

Creative Commons License

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.


Comments

Presented at: GW Research Days 2016

