
Smartphones Can Already Understand Your Voice Commands

Smartphones can already understand your voice commands, but imagine if they tried to read your emotions as well.

What if you asked your phone for details of movies showing at your local cinema and it replied:

Well, you look a bit sad. A rom-com should cheer you up! I'll find local listings now.

Such emotion-aware technology may seem a long way off – but not if Apple or any number of other recent start-ups (Affectiva, Emotient, nViso and Realeyes) have anything to say about it.

Your face is a window to your emotions – so why not capitalise on it (and commercialise it)?

What your face says

The premise is simple – a smartphone will encode a user's facial expressions using the built-in camera. The device will then infer the emotional state of the user and modify its response accordingly:

- feeling sad? Maybe it's a good time to show the funny advertisement
- feeling angry? Perhaps the difficulty level of a game should be lowered
- feeling happy? Show a shopper the products he bought the last time he was happy.
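
To make the proposal concrete, here is a minimal sketch of that pipeline in Python. Every name in it (classify_expression, adapt_response) is a hypothetical placeholder rather than any vendor's actual camera or emotion-recognition API, and the classifier is a stub standing in for a trained vision model.

```python
# A minimal sketch of the pipeline described above. All names are
# hypothetical placeholders, not any vendor's real SDK.

def classify_expression(frame: bytes) -> str:
    """Stub for a facial-expression classifier run on a camera frame.

    A real system would use a trained vision model; this stub simply
    returns a fixed label so the example is runnable.
    """
    return "sad"

def adapt_response(expression: str) -> str:
    """Map an inferred expression to an app behaviour, as in the list above."""
    if expression == "sad":
        return "show_funny_advertisement"
    if expression == "angry":
        return "lower_game_difficulty"
    if expression == "happy":
        return "show_previous_happy_purchases"
    return "no_change"

frame = b""  # placeholder for a frame captured by the front-facing camera
print(adapt_response(classify_expression(frame)))  # -> show_funny_advertisement
```

In a real system the classifier would be a trained model and the returned actions would hook into the app's own logic; the stub above simply mirrors the three scenarios listed.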

The logic of such technology draws support from a popular belief that we can "read" the emotional states of others by looking at their faces (test yourself on this body language quiz). With just a glance we are often quick to judge whether someone is afraid or disgusted, happy or surprised.

This belief is supported by studies conducted in the mid-20th century by US psychologist Paul Ekman, which showed that people are able to match faces with emotional content (words that describe emotions, stories about emotions).

While this idea has permeated the scientific literature and popular belief, new research calls these findings, and the general ability to "read" emotions from faces, into question.

What is she feeling? The answer isn't as simple as just reading her face. Credit: Flickr/Andrew Imanaka, CC BY

Ekman-style emotion recognition findings (or the lack thereof) don't actually speak to whether someone's internal emotional state is accurately reflected on the face.

Are you lying to me?

It is possible that people might be able to see happiness in a smile and anger in scowling eyebrows, but that those faces are not consistently made when people experience those emotions. On that front the evidence is exceedingly weak, despite the idea – popularised in the television drama Lie to Me – that truth and deception are leaked onto the face.

Take smiling. It's common sense that we smile when we are happy. But we also smile when we are embarrassed or frustrated.

We are less likely to smile when we are alone, even if we are truly happy. And we sometimes make other faces when we are purportedly happy, as Olympic medallists and other elite athletes often do.

Even when people are making very extreme faces, it can be difficult to tell emotional states apart based on the face alone. Facial displays don't appear to correspond to emotions in specific or unique ways as robustly as previously thought.

All of this research flies in the face (pun intended) of what most of us think we know about perceiving emotion in others. So how is it that we are capable of knowing what someone else feels?

Current thinking in emotion research suggests that multiple sources of information are used when we name the emotion we see in a face.

In addition to the cues on the face, information from posture, the voice, the context, and our own past experiences is incorporated into our judgements of others' emotions.

While it may seem like we are simply reading our friend's face when we deem that she is happy, we're doing much, much more than that. And, to check, we would want to ask her how she is feeling.

So is it possible for a smartphone to accurately know when we're filled with pride or trepidation by snapping a picture (or even recording video) of our faces?

Emotion science says "no" – at least for now. Given the evidence, new technologies attempting to use facial displays to infer the emotional states of users are based on a rather shaky premise. On this front, smart devices are not very smart.

Apple appears to be hedging its bets. The US Patent Office is considering an application from Apple detailing an algorithm that integrates device usage patterns (such as switching between applications, websites browsed) and psychophysiological measures (such as heart rate and blood pressure) with facial displays to infer a user's emotional states.
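
The application's method isn't spelled out here, but the general idea of combining several signal types can be illustrated with a simple weighted fusion. The weights, labels and scores below are invented purely for illustration and do not reflect Apple's actual algorithm.

```python
# An illustrative fusion of the three signal types the patent application
# reportedly combines: device-usage patterns, psychophysiological measures,
# and facial-expression scores. Weights and scores are made up.

def fuse_signals(usage_score: float,
                 physio_score: float,
                 face_score: float,
                 weights: tuple[float, float, float] = (0.4, 0.3, 0.3)) -> float:
    """Return a combined 'negative affect' estimate in [0, 1]."""
    w_usage, w_physio, w_face = weights
    return (w_usage * usage_score
            + w_physio * physio_score
            + w_face * face_score)

# Example: rapid app switching (0.8), elevated heart rate (0.7),
# and a scowling facial display (0.6) yield a fairly high estimate.
estimate = fuse_signals(usage_score=0.8, physio_score=0.7, face_score=0.6)
print(round(estimate, 2))  # -> 0.71
```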

Such an algorithm would likely do a better job than the camera alone, but we are a long way from producing unequivocal scientific evidence that supports that ability.

But we need to ask ourselves whether we want our devices attempting to perceive our emotions at all, even if the scientific evidence suggests they can't yet do so accurately.

Do we ever want devices "reading" our emotions or should emotion perception be left to beings that can themselves feel emotions?

Source: http://phys.org/news/2014-03-smartphone-emotions.html