The emotional computer – future of human-robot interaction
There is no doubt that future computers and robots will need to understand us better to make interacting with them more pleasant and productive. Some examples we have already written about are the Kismet robot from MIT and the FEELIX GROWING project built on Aldebaran's Nao robot platform. A team of researchers at the University of Cambridge is exploring the role of emotions in human-computer interaction.
“We’re building emotionally intelligent computers, ones that can read my mind and know how I feel”, said Professor Peter Robinson, leader of the research team in the Computer Laboratory at the University of Cambridge. “Computers are really good at understanding what someone is typing or even saying. But they need to understand not just what I’m saying, but how I’m saying it.”
When people talk to each other, they express their feelings through facial expressions, tone of voice and body posture. They do this even when they are interacting with machines. These hidden signals are an important part of human communication, but computers ignore them. Brain-machine interfaces (BMI) can overcome this barrier, but they require the user to wear a headset. One example is using a BMI to let the Roomba robot sense its owner's emotions.
The research team is collaborating closely with Professor Simon Baron-Cohen's team at the University's Autism Research Centre. Because those researchers study the difficulties that some people have in understanding emotions, their insights help address the same problems in computers.
Facial expressions are an important way of understanding people's feelings. One system tracks features on a person's face, calculates the gestures that are being made and infers emotions from them. It gets the right answer over 70% of the time, which is as good as most human observers. Other systems analyze speech intonation to infer emotions from the way something is said, while others analyze body posture and gestures.
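To make the described pipeline concrete, here is a minimal Python sketch of the same idea: track facial feature points, derive gesture measurements from them, and map those measurements to an emotion label. This is not the Cambridge team's actual system; the landmark layout, thresholds and labels are illustrative assumptions, and a real system would use a trained classifier over many video frames rather than hand-written rules.

```python
# Sketch of a facial-expression inference pipeline (illustrative assumptions only):
# raw landmark positions -> coarse gesture features -> inferred emotion label.

from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]          # (x, y) position of one tracked facial feature


@dataclass
class GestureFeatures:
    brow_raise: float                # vertical gap between brow and eye landmarks
    mouth_open: float                # vertical gap between upper and lower lip
    lip_corner_pull: float           # horizontal spread of the mouth corners


def extract_gestures(landmarks: List[Point]) -> GestureFeatures:
    """Turn raw landmark positions into coarse facial-gesture measurements.
    The index order below assumes a hypothetical six-landmark layout."""
    brow, eye = landmarks[0], landmarks[1]
    upper_lip, lower_lip = landmarks[2], landmarks[3]
    left_corner, right_corner = landmarks[4], landmarks[5]
    return GestureFeatures(
        brow_raise=eye[1] - brow[1],
        mouth_open=lower_lip[1] - upper_lip[1],
        lip_corner_pull=right_corner[0] - left_corner[0],
    )


def infer_emotion(g: GestureFeatures) -> str:
    """Very rough rule-based mapping from gestures to an emotion label."""
    if g.lip_corner_pull > 0.6 and g.mouth_open < 0.2:
        return "happy"
    if g.brow_raise > 0.5 and g.mouth_open > 0.4:
        return "surprised"
    if g.brow_raise < 0.1:
        return "concentrating"
    return "neutral"


# Dummy frame: six tracked landmarks with normalised image coordinates.
frame = [(0.40, 0.30), (0.40, 0.45), (0.50, 0.70), (0.50, 0.75),
         (0.35, 0.72), (0.98, 0.72)]
print(infer_emotion(extract_gestures(frame)))   # prints "happy" for this dummy frame
```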
“Even in something as simple as a car we need to know if the driver is concentrating or confused, so that we can avoid overloading him with distractions from a mobile phone, the radio, or a satellite navigation system”, said Ian Davies, one of the research students in Professor Robinson’s team.
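As a toy illustration of the in-car idea Davies describes, the sketch below holds back non-essential messages (phone, radio, sat-nav prompts) when the estimated driver state suggests confusion or heavy load. The state categories and the notification policy are assumptions for illustration, not the team's actual design.

```python
# Toy policy: suppress non-critical distractions based on an estimated driver state.
# States and priorities are invented for illustration.

from enum import Enum, auto


class DriverState(Enum):
    RELAXED = auto()
    CONCENTRATING = auto()
    CONFUSED = auto()


def allow_notification(state: DriverState, priority: str) -> bool:
    """Only safety-critical messages always get through; everything else
    depends on how loaded the driver appears to be."""
    if priority == "safety":
        return True
    if state is DriverState.RELAXED:
        return True
    if state is DriverState.CONCENTRATING:
        return priority == "navigation"   # keep route guidance, defer the rest
    return False                          # CONFUSED: suppress all non-critical messages


# Example: an incoming phone call is deferred while the driver looks confused.
print(allow_notification(DriverState.CONFUSED, "phone_call"))   # False
print(allow_notification(DriverState.CONFUSED, "safety"))       # True
```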
Merely understanding emotions is not enough. Professor Robinson wants computers to express emotions as well, whether they are cartoon animations, or physical robots.
PhD student Tadas Baltrušaitis, another team member, works on animating figures to mimic a person’s facial expressions, while fellow PhD candidate Laurel Riek is experimenting with a robotic head modeled on Charles Babbage, which appears in the video above. Although it is helpful for demonstrating emotions, the robotic head gives off the same creepy sense as MDS Nexi and other emotion-expressing robots.
“Charles has two dozen motors controlling ‘muscles’ in his face, giving him a wide range of expressions”, Robinson said. “We can use him to explore empathy, rapport building, and co-operation in emotional interactions between people and computers.”
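One plausible way a motorised head like Charles could be driven is to treat each expression as a set of target positions for the face motors and interpolate towards them so the face changes smoothly. The sketch below assumes invented motor names and values; the real head's control interface is not described in the article.

```python
# Hedged sketch: each expression is a dictionary of target motor positions (0..1),
# and the controller blends the current pose towards the target a little per tick.

EXPRESSIONS = {
    "smile":    {"lip_corner_left": 0.8, "lip_corner_right": 0.8, "brow_inner": 0.2},
    "surprise": {"brow_inner": 0.9, "brow_outer": 0.9, "jaw": 0.7},
    "neutral":  {"lip_corner_left": 0.0, "lip_corner_right": 0.0,
                 "brow_inner": 0.0, "brow_outer": 0.0, "jaw": 0.0},
}


def blend(current: dict, target: dict, step: float = 0.25) -> dict:
    """Move each motor a fraction of the way towards its target, so the face
    changes gradually rather than snapping between expressions."""
    motors = set(current) | set(target)
    return {m: current.get(m, 0.0) + step * (target.get(m, 0.0) - current.get(m, 0.0))
            for m in motors}


pose = EXPRESSIONS["neutral"].copy()
for _ in range(4):                      # four control ticks towards a smile
    pose = blend(pose, EXPRESSIONS["smile"])
print({m: round(v, 2) for m, v in sorted(pose.items())})
```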