RoboAICon2023: Robots that read emotions

Robots are becoming more adept at reading human emotions and thoughts simply by "looking" at people's faces. This advance could one day lead to more emotionally responsive machines that can recognise changes in a person's physical or mental health.

Researchers at Case Western Reserve University say they are advancing the artificial intelligence (AI) that powers interactive video games and that will soon improve the next generation of customised robots likely to coexist with humans.

And the Case Western Reserve robots are doing it in real time.

Kiju Lee, the Nord Distinguished Assistant Professor in mechanical and aerospace engineering at the Case School of Engineering, and graduate student Xiao Liu have developed robots that can recognise human emotions almost instantly and with 98 percent accuracy. Other researchers had obtained similar accuracy before, but their robots often responded too slowly.

Even a three-second pause can feel awkward, according to Lee. Interpreting someone's emotions from facial expressions or body language alone is hard enough for humans, and even harder for robots. Unfortunately, the layers of technology needed to do it, including video capture, also slow down the response.

By adding two pre-processing video filters to a pair of existing programmes, Lee and Liu cut the response time and enabled the robot to classify emotions based on more than 3,500 variations in human facial expression.
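The article does not say which two filters were used, but the general idea of lightweight pre-processing ahead of an existing classifier can be sketched as follows. This Python example assumes OpenCV, a Haar-cascade face detector, and a 48×48 grayscale input size, all of which are illustrative choices rather than details from the researchers' system.

```python
import cv2
import numpy as np

# Hypothetical pre-processing stage: the article does not name the two filters,
# so this sketch assumes (1) face detection and cropping and (2) grayscale
# normalisation plus resizing, applied before a frame reaches the classifier.
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def preprocess_frame(frame_bgr, target_size=(48, 48)):
    """Crop the largest detected face and return a normalised grayscale patch."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face found; skip this frame
    # Filter 1: keep only the largest detected face region
    x, y, w, h = max(faces, key=lambda box: box[2] * box[3])
    face = gray[y:y + h, x:x + w]
    # Filter 2: resize and scale pixel values to [0, 1] for the classifier
    face = cv2.resize(face, target_size).astype(np.float32) / 255.0
    return face
```

Doing this kind of cropping and downscaling before classification keeps the per-frame workload small, which is one plausible way to achieve the near-real-time response the article describes.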

Even that scarcely captures the full range of human facial expression, according to Lee. Humans can register more than 10,000 distinct expressions, and each person has their own way of revealing many of those emotions.

But once the data are fed into the software and categorised, "deep-learning" computers can process those enormous amounts of information.

Thankfully, the seven emotions that account for the majority of human expressions (neutral, happiness, anger, sadness, disgust, surprise and fear) can be easily distinguished from one another, even allowing for regional and cultural differences.
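The article does not describe the network itself, but a deep-learning classifier over these seven categories could look roughly like the following PyTorch sketch. The architecture, layer sizes and 48×48 grayscale input are assumptions for illustration, not the model Lee and Liu built.

```python
import torch
import torch.nn as nn

# Label set taken from the article; the network below is a generic
# illustration, not the model described by Lee and Liu.
EMOTIONS = ["neutral", "happiness", "anger", "sadness",
            "disgust", "surprise", "fear"]

class EmotionCNN(nn.Module):
    """Small convolutional classifier for 48x48 grayscale face crops."""
    def __init__(self, num_classes=len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Usage: a preprocessed face patch (see the earlier sketch) becomes a
# 1x1x48x48 tensor; the argmax over the logits picks one of the seven labels.
model = EmotionCNN().eval()
face = torch.rand(1, 1, 48, 48)  # placeholder for a real preprocessed face crop
with torch.no_grad():
    label = EMOTIONS[model(face).argmax(dim=1).item()]
```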

Combined with advances made by dozens of other researchers in AI, the work by Lee and Liu, presented at the 2018 IEEE Games, Entertainment, and Media Conference, could lead to a wide range of applications, according to Lee.

The two are also working on another machine-learning-based approach to recognising facial emotions, which has so far reached over 99 percent accuracy with even higher computational efficiency.

One day, a personal robot might be able to notice significant changes in a person through daily interaction, even spotting early signs of depression, for instance.

The robot could be trained to spot it early and offer simple interventions, such as music and video, to people who need social therapy, according to Lee. This could be especially helpful for older adults who may be experiencing depression or personality changes associated with ageing.

Through a collaboration with Ohio Living Breckenridge Village, Lee plans to explore the use of social robots for social and emotional support of older adults. Residents there will interact with a user-friendly, socially interactive robot and help evaluate the accuracy and reliability of the embedded algorithms.

Another future possibility is a social robot that learns the subtler facial changes seen in people with autism and helps "train" humans to recognise emotions in one another.

These social robots will take some time to become widespread in the United States, according to Lee. "However, this is already starting to happen in countries like Japan, where there is a strong culture surrounding robots. In any case, emotionally intelligent robots will coexist with us in the future."
