New study: Robots might be able to recognize human emotions
As artificial intelligence, or AI, moves further into a future we once only dreamed of, we have to find ways to coexist with our robot counterparts.
Robots are shaping up to adapt to our nature in ways that, until now, only other humans could. Soon, we might have robots advanced enough to feel the tension in a room.
Researchers from Warwick Business School, University of Plymouth, Donders Centre for Cognition at Radboud University in the Netherlands and the Bristol Robotics Lab at the University of the West of England conducted a study to see if robots could recognize human emotion.
Together, they published their findings in Frontiers in Robotics and AI, with results that perhaps shouldn't come as a surprise these days.
It's hard to say whether the initial idea came from science fiction or plain common sense. Either way, machines and AI have become such a regular part of daily life that this step seems like a natural progression. First, though, the researchers had to figure out how robots would recognize emotions.
They first observed how humans determine emotion. We can recognize excitement, anger or even boredom from the way people talk, move or the expressions they make. The idea is that robots could learn similarly, by accounting for tone of voice and body movements and understanding what these things mean.
As scientists, the researchers couldn't simply assume that everyone perceives emotion the same way. To conduct the study properly, they first filmed many different pairs of children interacting with a robot and a touchscreen computer.
Next, they showed these videos to 284 participants and asked them to gauge the moods of the children and whether they were cooperative. To mix things up, some participants viewed the same scenes, but with stick figures mimicking the actions instead of the children.
Both groups labeled the children's emotions consistently, suggesting the ratings were reliable. The researchers then trained AI on this labeled data to identify feelings and interactions in the clips.
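The general recipe the researchers followed, training a model on human-rated clips so it can label new ones, can be sketched in a few lines. Everything below is a hypothetical illustration: the feature names (speech energy, amount of movement), the toy data, and the simple nearest-centroid classifier are my own stand-ins, not the study's actual features or model.

```python
# Hypothetical sketch of learning emotion labels from human-rated clips.
# Features and labels are invented for illustration only.
from collections import defaultdict

def train_centroids(features, labels):
    """Compute one centroid (average feature vector) per emotion label."""
    sums = defaultdict(lambda: None)
    counts = defaultdict(int)
    for vec, label in zip(features, labels):
        if sums[label] is None:
            sums[label] = list(vec)
        else:
            sums[label] = [a + b for a, b in zip(sums[label], vec)]
        counts[label] += 1
    return {label: [v / counts[label] for v in total]
            for label, total in sums.items()}

def predict(centroids, vec):
    """Assign the label whose centroid is closest (squared Euclidean distance)."""
    def dist(center):
        return sum((a - b) ** 2 for a, b in zip(center, vec))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Toy data: [speech_energy, movement_amount] per clip, labeled by human raters.
clips = [([0.9, 0.8], "excited"), ([0.2, 0.1], "bored"),
         ([0.8, 0.9], "excited"), ([0.1, 0.2], "bored")]
centroids = train_centroids([f for f, _ in clips], [l for _, l in clips])
print(predict(centroids, [0.85, 0.7]))  # → excited
```

The point of the sketch is the workflow, not the model: human judgments become training labels, and the machine generalizes those labels to behavior it hasn't seen before.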
AI is already playing a role in customer service platforms to make everyone's experience as easy as possible. Recognizing how the customer is feeling, whether happy, angry or something else, can help the AI do a better job and, ideally, leave the customer in a better mood. However, this ability to recognize emotions doesn't just benefit humans.
Robots that make deliveries often get attacked or vandalized for a multitude of reasons. If a robot can sense it's in danger, it can leave the situation rather than allow itself to be attacked. In these scenarios, it's also helpful for a robot to understand the distinction between backing away and leaving a scene entirely. Either way, almost everyone can benefit from a robot understanding emotions.
This whole study may sound like we're suddenly in the middle of a science fiction novel. Once AI can recognize human emotion and interact with humans on the same wavelength, we'll likely teach it to understand even more about humanity, and perhaps to replicate our actions.
The future of AI is bright and full of unknowns for now, but we're taking it one step at a time. That first step starts with a fundamental understanding of each other and better communication between humans and machines.
Kayla Matthews is a senior writer at MakeUseOf and a freelance writer for Digital Trends. To read more from Kayla, visit her website productivitybytes.com.