RMIT researchers have developed technology that can detect emotion in human speech, enabling more natural conversations with robots.
Current voice-activated technology used in virtual assistants is limited by its inability to decipher human emotion, causing it to give irrelevant responses or miss the point of a conversation entirely.
But a team of researchers from RMIT University’s School of Engineering, led by Associate Professor Margaret Lech, has developed technology that gives machines the capability to detect and respond to human emotions.
“There’s always an emotional context when we talk to people and we understand it, but machines don’t understand this,” Lech said.
“When we call an automatic call centre, for example, people get very frustrated because they talk to the machine and it does not understand that they are sad, they are anxious, that they want things to be done quickly.”
Lech said that current machines don’t understand the emotions associated with a task, leading people to prefer dealing with a human instead.
“There is no way to explain certain things to a machine, including those subtle cues that we can express through emotions when we talk to each other,” she said.
Lech and her team have spent 11 years creating new machine learning techniques that allow technology to understand human emotions from speech signals, and to analyse and predict patterns of emotional interactions in human conversations.
Equipped with these capabilities, voice-activated devices can now understand both the linguistic and emotional content of speech, and provide appropriate responses. They can recognise seven emotional states: anger, boredom, disgust, fear, happiness, sadness and neutral.
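To illustrate the general idea of classifying speech into a fixed set of emotional states, here is a deliberately simplified sketch. It is not the RMIT team's method: the two acoustic features (mean energy and zero-crossing rate), the per-emotion centroid values, and the nearest-centroid rule are all invented for illustration. Real systems learn far richer features and models from labelled speech corpora.

```python
import math

# The seven emotional states reported in the article.
EMOTIONS = ["anger", "boredom", "disgust", "fear",
            "happiness", "sadness", "neutral"]

# Hypothetical per-emotion centroids in a toy (energy, zero-crossing-rate)
# feature space. In a real system these would be learned from labelled data.
CENTROIDS = {
    "anger":     (0.80, 0.30),
    "boredom":   (0.20, 0.10),
    "disgust":   (0.40, 0.20),
    "fear":      (0.60, 0.35),
    "happiness": (0.70, 0.25),
    "sadness":   (0.25, 0.08),
    "neutral":   (0.45, 0.15),
}

def extract_features(samples):
    """Two toy acoustic features: mean signal energy and zero-crossing rate."""
    energy = sum(s * s for s in samples) / len(samples)
    crossings = sum(1 for a, b in zip(samples, samples[1:])
                    if (a >= 0) != (b >= 0))
    return (energy, crossings / (len(samples) - 1))

def classify(samples):
    """Assign the emotion whose centroid is nearest in feature space."""
    feats = extract_features(samples)
    return min(CENTROIDS, key=lambda e: math.dist(feats, CENTROIDS[e]))

# A loud, rapidly oscillating synthetic signal lands nearest "anger",
# while a quiet, slowly varying one lands nearest "sadness".
loud = [1.26 * math.sin(2 * math.pi * 0.15 * i) for i in range(200)]
quiet = [0.70 * math.sin(2 * math.pi * 0.04 * i) for i in range(200)]
print(classify(loud), classify(quiet))
```

The sketch only shows the shape of the pipeline: map a speech signal to a feature vector, then map the feature vector to one of the seven labels. The research described here replaces both steps with learned models trained on emotional speech.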
Lech said that emotional recognition technology in machines will unlock wider applications and benefits.
“People will accept machines more, they will trust machines, they will have the feeling that the machine really understands them and can help them better,” she said.
“People, especially the elderly, will not be so reluctant to use automatic call centres. Then we can employ machines, for example, robots as companions. An older person may like actually talking to a machine and hear that the machine can laugh with her, can sympathise, and understand her feelings.
“It could also be good if used for kids’ toys. Children will interact with robotic toys that can talk emotionally, so children will learn more about emotions.”