
Robots don't cry, do they? How machines are getting emotional

When a drunk 60-year-old Japanese man was arrested last month for attacking an emotion-reading robot, the Japan Times reported that he claimed he did not like the attitude of a shop assistant, so he took it out on the robot by kicking it.

The robot in question is called a Pepper and was working in a SoftBank branch in the city of Yokosuka in Greater Tokyo. Although the Pepper is not actually a ‘working’ robot, at least not in the sense that it undertakes jobs such as cleaning or stacking shelves, it does have an important role in greeting customers.

The Pepper is designed as a ‘social robot’, at least according to its creator Aldebaran, part of the SoftBank group. Its whole reason for existence is to be communicative and, well, social: reading people’s emotions and reacting accordingly. Thankfully for Kiichi Ishikawa, it didn’t have a retaliation function.

There are no reports as to whether or not the Pepper actually tried to defuse the situation between the drunken Mr. Ishikawa and the shop assistant. Perhaps it came out with something even more annoying in an attempt to placate Mr. Ishikawa but didn’t have the emotional understanding to leave it alone? And that’s the point, isn’t it?

Professor Stephen Hawking offered an apocalyptic vision of artificial intelligence (AI) last year, and Elon Musk and Bill Gates were among the many others who warned of its threat in an open letter entitled Research Priorities for Robust and Beneficial Artificial Intelligence, presented at the recent International Joint Conference on Artificial Intelligence in Buenos Aires. In the shadow of those warnings, emotional intelligence technologies have to a certain extent been sneaking under the radar.

While the focus has been on the potential for destructive machine learning, we have failed to really grasp the potential of machines designed to read complex human emotion and react positively.

According to Patrick Levy Rosenthal, founder and CEO of EmoShape, a London- and Paris-based business that has developed a silicon-based ‘emotion processing unit’ (EPU), this is not so far from current reality. EmoShape is currently trialling its EPU in a bespoke AI home control console, where commands are learned and tagged to emotions detected via voice patterns as well as facial expressions.

Happy face, sad face

The EPU chip models eight primary human emotions, both positive and negative, to enable machines to ‘feel’ pain or pleasure, says Levy Rosenthal.

“When you have pain and pleasure you can create desire,” he says, “as the machine will try and avoid the pain and promote the pleasure.”

The EPU will detect human emotion and respond accordingly, he says, adding that there are various levels of happiness. The machine will get happier if it believes the human is happy.
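EmoShape has not published how the EPU weighs these signals, so the following is purely an illustrative sketch of the pain/pleasure dynamic Levy Rosenthal describes (every name and number below is invented): an agent scores each candidate action by predicted pleasure minus predicted pain and picks the one it expects will please most.

```python
# Illustrative sketch only: a toy agent that picks whichever action it
# predicts will maximise pleasure and minimise pain, loosely mirroring
# the pain/pleasure dynamic described above. All values are invented.

def choose_action(predictions):
    """predictions maps action name -> (expected_pleasure, expected_pain),
    each on a 0.0-1.0 scale. Returns the action with the best net score."""
    return max(predictions, key=lambda a: predictions[a][0] - predictions[a][1])

appraisals = {
    "greet_customer":  (0.8, 0.1),   # predicted to please
    "ignore_customer": (0.1, 0.6),   # predicted to annoy
    "retaliate":       (0.2, 0.9),   # predicted to cause the most 'pain'
}

print(choose_action(appraisals))  # -> greet_customer
```

The point of the toy model is simply that once pain and pleasure are numbers, "desire" falls out of the arithmetic: the agent gravitates towards whatever it predicts will score best.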

So would this work with machine learning robots?

“I believe that the best way to control machines is through emotions,” he says. “It’s technically impossible to program a machine to not kill a human.”

Levy Rosenthal argues that using emotion to teach AI about being happy or sad, for example, would go some way to creating a machine that would choose the path that would most likely please humans. This would also create stronger bonds between humans and their machines.

“We tend to be scared by animals that have no emotional attachment to us,” he adds, although not all humans elect to live with fluffy pets, let alone pleasant and happy people.

He admits it is complex and that there is a long way to go, but the premise is there, at least when dealing with everyday devices, if not the robotic humanoid of the future. So how accurate is the technology currently, and are there many false positives?

“There is about an 80% accuracy rate but there are always false positives,” he replies. “Sometimes the face tracking will see a pattern on the wall and think it’s a smiling face. Also people use words that may seem negative but they are in fact used in a positive context.”
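EmoShape's actual classifier is proprietary, but the second kind of false positive Levy Rosenthal mentions is easy to reproduce with a naive keyword-based sentiment scorer; the toy word lists below are invented for illustration.

```python
# Toy keyword sentiment scorer, illustrating why words that look negative
# in isolation can mislead: it has no notion of context or idiom.

NEGATIVE = {"bad", "terrible", "kill", "hate"}
POSITIVE = {"good", "great", "love", "happy"}

def naive_sentiment(text):
    words = text.lower().replace("!", "").replace(".", "").split()
    score = sum((w in POSITIVE) - (w in NEGATIVE) for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# A human reads both of these as enthusiastic; the scorer flags them as hostile.
print(naive_sentiment("That was not bad at all!"))
print(naive_sentiment("I could just kill for one of those!"))
```

Real systems layer context, negation handling and facial or vocal cues on top of this kind of signal, which is how an 80% accuracy figure becomes plausible while false positives never quite disappear.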

Emotion analytics

While emotional robots may still have a lot of growing up to do, the use of emotional analytics in classrooms is just around the corner. In January 2016 Lenovo Software will officially release its AirClass system (under its Stoneware brand) at CES in Las Vegas.

AirClass will feature an emotional analytics function to monitor student reaction to lessons, which in theory will help with attention span and behaviour management. It’s driven by technology increasingly used to monitor shoppers or gauge emotional reactions to adverts.

San Diego-based Emotient sells itself as a tool for emotion detection and sentiment analysis, claiming “emotions drive spending.” It’s driven by an impressive research team led by Dr. Marian Bartlett, who pioneered machine learning approaches to facial expression detection in the mid-1990s with Paul Ekman and Terrence Sejnowski.

So what does this tell us? Emotion technology will be big business, worth around $10bn in five years, at least according to a Crone Consulting report cited in Bloomberg. It’s not surprising.

For advertisers and retailers this is great data. For consumers, maybe less so. So where will this lead? Will we learn to hide our emotions more to protect ourselves from arbitrary analysis and judgement, or will the machines have that angle covered too? It’s difficult not to see this spinning out of control, but maybe there will be some benefits. Perhaps emotional robots can help kids with low self-esteem or confidence issues, even autism? And then, of course, sell them stuff they know will make them happy.


Marc Ambasna-Jones

Marc Ambasna-Jones is a UK-based freelance writer and media consultant and has been writing about business and technology since 1989.


