The Quest for Emotionally Intelligent Computing


by Anna Kaminsky


Emotional Intelligence (EI) is an individual's ability to perceive, understand, and manage their emotions. Emotional intelligence helps us understand ourselves and maintain relationships with others.

EI involves four basic steps: perceiving emotions, reasoning with emotions, understanding emotions, and managing emotions. Consider a situation where you, upon meeting your friend, congratulate him on a promotion he received at work. In response, your friend just smiles. This emotional response might confuse you. To understand his emotions, you need to work out whether it was a happy smile or a sarcastic one.

Most humans can perceive the difference between a happy smile and a sarcastic one, or between a genuine laugh and a forced one. Once the distinction is made, they consider the emotion and decide on an appropriate response. Because perceived emotions can carry a variety of meanings, we need contextual knowledge to filter out the unnecessary cues and signals the brain is receiving. A forced laugh may be caused by the person's attitude toward you (typically our first thought), or it may be the result of stress the person is facing at home or at work.

After you have interpreted the other person’s emotions, your job is to effectively manage your own emotions and respond appropriately.

We are not born with emotional intelligence; it is something we learn from our parents and from interactions with our environment. If humans can learn emotional cues, machines should be able to as well, right? While most non-autistic humans can easily distinguish between various emotional states, building a machine capable of interpreting emotions is much more challenging. In her lecture "Emotional Intelligence, Technology and Autism", MIT Professor Dr. Rosalind Picard demonstrates how poor software systems are at recognizing the emotions of the individuals using them.

To illustrate her point, Dr. Picard presents an analogy to the annoying "Assistant" feature available in some commercial software and websites: imagine you are focused on an important deadline. While you are busy, a colleague walks into your office and interrupts you. Even though you show subtle annoyance, he ignores it and offers useless advice that annoys you further. Even when you state your emotional state outright, he chooses to ignore it. Your patience is tested to the limit, and you ask him to leave. He does so by winking and doing a little dance for you.

When facing a tight deadline, individuals tend to be annoyed by random bursts of excitement from others. Likewise, they will be annoyed by software that displays unsolicited "useful tips" while they are working.

There has been significant research on technologies which consider and understand the attitudes and mental states of their users. Researchers have called this Affective Computing. Computers do not have an inherent understanding of human emotions, and it is the aim of Affective Computing to create software which is capable of interpreting the emotional cues which humans are so proficient at interpreting.

The foundation of Affective Computing is the Facial Action Coding System (FACS). Paul Ekman and Wallace V. Friesen created FACS in 1978, and the system can code nearly any human facial expression. Individual facial movements are coded as Action Units (AUs), and combinations of AUs feed the decision-making process that recognizes basic emotions. FACS can therefore be used to distinguish a genuine laugh from a forced one: facial cues are mapped onto their AUs and interpreted as Happiness, Fear, Anger, Disgust, Contempt and Surprise.
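To make the idea concrete, here is a minimal sketch of AU-based emotion recognition. It assumes an upstream detector that already outputs the set of active FACS Action Units; the AU-to-emotion combinations are simplified, illustrative rules in the spirit of commonly cited EMFACS-style mappings, not a complete coding scheme.

```python
# Illustrative sketch: mapping detected FACS Action Units to basic emotions.
# Assumes an upstream detector already provides the set of active AUs.
# The AU combinations below are simplified, EMFACS-style examples.

EMOTION_RULES = {
    "Happiness": {6, 12},          # cheek raiser + lip corner puller (Duchenne smile)
    "Sadness":   {1, 4, 15},       # inner brow raiser + brow lowerer + lip corner depressor
    "Surprise":  {1, 2, 5, 26},    # brow raisers + upper lid raiser + jaw drop
    "Fear":      {1, 2, 4, 5, 20, 26},
    "Anger":     {4, 5, 7, 23},
    "Disgust":   {9, 15},          # nose wrinkler + lip corner depressor
}

def classify(active_aus):
    """Return the most specific emotion whose required AUs are all present."""
    matches = [(len(req), emo) for emo, req in EMOTION_RULES.items()
               if req <= active_aus]
    return max(matches)[1] if matches else "Neutral / unknown"

# A posed ("forced") smile often shows AU12 without AU6, so it fails the
# Happiness rule, while a genuine smile (AU6 + AU12) satisfies it.
print(classify({12}))      # -> Neutral / unknown (posed smile)
print(classify({6, 12}))   # -> Happiness (genuine smile)
```

In a real system the rules would be replaced by a trained classifier, but the structure is the same: facial cues become AUs, and AU patterns become emotion labels.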

Developing emotionally intelligent interfaces has not been an easy task. Facial expressions are as complex and confusing as the mindsets behind them, and both are difficult to interpret. If an individual is constantly thinking about his exam results, he is more likely to show worry. The thoughts going through a person's mind are often projected onto the face, so facial expressions change from moment to moment to reflect the current state of mind. Interpreting these changing expressions is a difficult task for both humans and computers.

To show that recognizing emotion requires more than facial expression analysis, Dr. Picard carried out an experiment based on the work of Dr. Hillel Aviezer, a professor at the Hebrew University of Jerusalem. Dr. Aviezer studied the malleability of emotional perception, working with six basic emotions: joy, anger, fear, sadness, disgust and surprise. Dr. Picard showed two pictures with facial expressions of anger and disgust: the man with the angry expression was throwing a punch, while the man with the disgusted expression was holding a dirty garment.

When the same photos were shown with the facial expressions swapped, the audience did not study the facial cues of the individuals in the pictures; instead, they looked at the punch and the dirty garment to reach their verdict. To interpret emotions, it is therefore necessary to take into account the environmental cues surrounding the expression.
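One simple way to capture this in software is to fuse a face-only estimate with a context-based estimate. The sketch below is purely illustrative: the probabilities are hypothetical, and in practice they would come from a facial expression model and a separate body/scene context model.

```python
# Illustrative sketch: fusing facial and contextual evidence per emotion.
# All numbers are hypothetical; real values would come from trained models.

def combine(face_probs, context_probs, face_weight=0.5):
    """Weighted fusion of two per-emotion probability estimates."""
    emotions = face_probs.keys() | context_probs.keys()
    fused = {
        e: face_weight * face_probs.get(e, 0.0)
           + (1 - face_weight) * context_probs.get(e, 0.0)
        for e in emotions
    }
    return max(fused, key=fused.get)

# The Aviezer-style swap: a "disgust" face paired with a punching posture.
face = {"Disgust": 0.7, "Anger": 0.3}
context = {"Anger": 0.9, "Disgust": 0.1}   # body cue: throwing a punch
print(combine(face, context))              # -> "Anger": the stronger contextual signal wins
```

The point is not the particular weighting, but that the final judgment, like the audience's, depends on the context as much as on the face.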

Research has made noticeable progress and produced new software systems that understand how users feel and take their emotional state into account. As this technology evolves, it will become more adept at recognizing emotions and will make software more useful and less frustrating.


Image: http://www.flickr.com/photos/63015897@N02/9351886108