RoboAICon2023: Human-like robots that engage with autistic youngsters during therapy thanks to personalised machine learning.

Autism affects how people see, hear, and experience the world, which in turn shapes how they interact with others. As a result, communication-focused activities can be very difficult for children with autism spectrum conditions (ASCs), and it can be challenging for therapists to engage them in these activities during educational therapy.

Recently, therapists have started incorporating humanoid robots into therapy sessions to address this problem. However, current robots cannot interact with children autonomously, which is essential for improving therapy. Their deployment is further complicated by the fact that people with ASCs have unusual and varied ways of expressing their thoughts and feelings.

Researchers on the EU-funded EngageME project have developed a personalised machine learning framework for robots used in autism therapy. As they explain in their article published in "Science Robotics", the framework enables robots to automatically recognise children's engagement and affect while interacting with them.

A customised strategy 

To make this exciting advance, the project partners concluded that no single solution works for all children with ASCs. They therefore tailored their framework to each child using demographic information, behavioural assessment results, and other child-specific features. This inventive architecture allowed the robots to automatically adjust their interpretation of a child's responses by accounting for individual and cultural differences.
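The idea of layering per-child adjustments on top of a shared model can be illustrated with a minimal sketch. All names, weights, and calibration values below are hypothetical and purely for illustration; they are not from the EngageME framework itself.

```python
# Hypothetical sketch: personalising a shared engagement model per child.
# The cue names, weights, and per-child parameters are illustrative only.

def shared_model(features):
    """A stand-in 'group-level' predictor: a fixed weighted sum of cues."""
    weights = {"gaze": 0.5, "vocal": 0.3, "gesture": 0.2}
    return sum(weights[k] * v for k, v in features.items())

def personalise(child_baseline, child_scale):
    """Wrap the shared model in a per-child calibration layer, loosely
    analogous to adjusting interpretations for individual differences."""
    def model(features):
        return child_baseline + child_scale * shared_model(features)
    return model

# Two children showing identical cues can yield different readings.
cues = {"gaze": 0.8, "vocal": 0.4, "gesture": 0.6}
model_a = personalise(child_baseline=0.1, child_scale=1.0)
model_b = personalise(child_baseline=0.3, child_scale=0.5)
print(model_a(cues))  # 0.74
print(model_b(cues))  # 0.62
```

The point of the sketch is only the structure: a common predictor plus a lightweight per-child layer, so the same observed behaviour can be interpreted differently for different children.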

Developing machine learning and AI [artificial intelligence] that works well for autism is particularly difficult, since conventional AI techniques require large amounts of data that are similar for each category being learned. In autism, where heterogeneity rules, these conventional approaches fall short, according to co-author Prof. Rosalind Picard in an article published on "MIT News."

Robot-assisted therapy

The research team tested its model with 35 children from Serbia and Japan. The children, who ranged in age from 3 to 13, interacted with the robots for 35 minutes at a time. By changing the colour of their eyes, the tone of their voice, and the position of their limbs, the humanoid robots could convey a variety of emotions, including anger, fear, happiness, and sadness.

As it interacted with a child, the robot recorded audio and video capturing the child's tone of voice and vocalisations as well as their facial expressions, gestures, and head pose. A monitor worn on the child's wrist also supplied the robot with information about body temperature, heart rate, and skin sweat response. These data were then used to extract the child's various behavioural cues, which were fed into the robot's perception module.
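A perception module like the one described has to combine cues from several modalities, and a wearable sensor can drop out mid-session. The sketch below shows one common way to handle this: concatenating per-modality cue vectors with a presence mask. The cue names and vector layout are assumptions for illustration, not the paper's actual feature set.

```python
# Hypothetical sketch of fusing multimodal cues into one feature vector
# for a perception module; cue names and slot sizes are illustrative.

SLOT = 3  # assumed number of cues per modality in this toy example

def fuse_cues(audio, video, physio):
    """Concatenate per-modality cue lists, flagging a missing modality
    (e.g. the wrist monitor slips off) with zeros plus a mask bit."""
    feature_vector = []
    for modality in (audio, video, physio):
        if modality is None:
            # modality missing: zero-fill its slot, set mask bit to 0
            feature_vector.extend([0.0] * SLOT + [0.0])
        else:
            # modality present: append its cues, set mask bit to 1
            feature_vector.extend(modality + [1.0])
    return feature_vector

audio = [0.2, 0.7, 0.1]   # e.g. pitch, energy, vocalisation rate
video = [0.9, 0.3, 0.5]   # e.g. smile intensity, head pose, gestures
physio = None             # wrist monitor dropped out this frame
vec = fuse_cues(audio, video, physio)
print(len(vec))  # 12: (3 cues + 1 mask bit) per modality
```

The mask bits let a downstream model learn to discount a modality when it is absent rather than mistaking zero-filled values for real measurements.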

Using deep learning algorithms, the robot then estimated the child's affect and engagement from the extracted behavioural cues. These estimates were used to adapt how the robot interacted with the child in subsequent therapy sessions. Experts in the field also watched video footage of the sessions. The robots' perceptions agreed with the experts' assessments of the children's responses with a correlation of 60%, a level of agreement greater than that achieved between the human specialists. The study's findings suggest that trained robots could one day play an important role in autism therapy.
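The 60% figure is a correlation between the robot's estimates and the experts' ratings. As a reminder of what that measures, here is a minimal Pearson correlation computed on made-up ratings; the numbers are illustrative and not data from the study.

```python
# Hypothetical sketch: measuring agreement between a robot's engagement
# estimates and human expert ratings via Pearson correlation.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative ratings on a 0-1 engagement scale (not real study data)
robot  = [0.2, 0.5, 0.8, 0.4, 0.9]
expert = [0.3, 0.4, 0.7, 0.5, 0.8]
print(round(pearson(robot, expert), 2))
```

A value near 1 means the robot's estimates rise and fall with the experts' ratings; the study's reported 60% correlation would correspond to a coefficient of about 0.6 on this scale.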

EngageME (Automated Measurement of Engagement Level of Children with Autism Spectrum Conditions during Human-robot Interaction) aims to equip robots with this critical information in order to personalise therapies and make human-robot interaction more engaging and natural.

Reference:

https://cordis.europa.eu/project/rcn/200926/en
