Improve your user engagement metrics with emotional analytics

Find out how new emotion analytics algorithms are bringing feeling, emotional connection, and engagement to modern apps.

The next big thing in AI, emotional intelligence, could give apps a competitive edge

Most app development focuses on what an application is supposed to do. However, there is growing interest in crafting applications that respond to how we feel by weaving in emotion-aware signals gleaned from faces, voices, and text. At the O’Reilly Design Conference in San Francisco, Pamela Pavliscak, CEO of SoundingBox, explored how applications that sense our emotions are already changing the way we relate to technology and to other people.

Cameras are already being used to capture emotional expressions on faces, microphones can analyze the emotional tone of conversations, and sentiment analysis techniques make sense of how people are feeling on social media. In addition, new edge devices with sensors gather data about our heartbeat, brain waves, and the electrical conductivity of our skin to make even more nuanced assessments of emotion. This was once the realm of self-hackers and high-budget marketing teams, but now it is starting to go mainstream. Pavliscak said, “With Feelings 2.0, we are told we will get a new wave of technology that can read emotions and this will lead to a more emotional connection.”
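
To give a flavor of the text side of this, here is a minimal sentiment analysis sketch using NLTK's VADER analyzer. The sample posts are invented, and a production pipeline for social media would be far more involved.

```python
# Minimal sentiment analysis sketch using NLTK's VADER analyzer.
# The sample posts are invented; a real pipeline would pull from a
# social media API and handle languages, sarcasm, emoji, and scale.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

posts = [
    "I love how smooth the new update feels!",
    "The app keeps crashing and support never answers.",
]

analyzer = SentimentIntensityAnalyzer()
for post in posts:
    scores = analyzer.polarity_scores(post)  # neg/neu/pos plus a compound score
    print(f"{scores['compound']:+.2f}  {post}")
```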

Apple, for instance, recently acquired Emotient, which analyzes emotional expressions in the face. This could help bring emotion recognition capabilities to iOS. In addition, both Microsoft and IBM now offer a suite of emotion analytics capabilities as part of their cloud offerings.
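
For a sense of what these cloud offerings look like from the developer's side, here is a hedged Python sketch of calling a hosted emotion-recognition REST endpoint. The endpoint URL, header name, and response shape are assumptions modeled on the era's Microsoft Emotion API; check the vendor's current documentation for the real contract.

```python
# Hedged sketch of calling a hosted emotion-recognition REST API.
# The endpoint, header, and response fields are assumptions modeled on the
# Project Oxford-era Emotion API; consult the vendor docs before relying on them.
import requests

API_KEY = "YOUR-SUBSCRIPTION-KEY"   # placeholder credential
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"

def recognize_emotions(image_url: str) -> list:
    """Send a public image URL and return per-face emotion scores."""
    response = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": API_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    return response.json()  # typically a list of faces, each with a 'scores' dict

if __name__ == "__main__":
    for face in recognize_emotions("https://example.com/customer-photo.jpg"):
        scores = face["scores"]
        top = max(scores, key=scores.get)  # strongest emotion for this face
        print(f"dominant emotion: {top} ({scores[top]:.2f})")
```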


Who is using emotional analytics?

A number of companies have developed APIs for recognizing emotions expressed in faces. These techniques are based on the research of Paul Ekman, who identified universal emotion patterns expressed across all cultures, a model popularized by the movie Inside Out. The first generation of these tools used cameras in stores to anonymously analyze the emotional impact of new products. Now developers are starting to use facial expression analytics to improve gameplay.
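
As an illustration of how a game loop might consume those facial-expression scores, the sketch below adapts difficulty from per-frame emotion estimates. Everything here is hypothetical: read_emotion_scores() stands in for whichever face-analytics SDK a developer has wired up, and the thresholds are arbitrary.

```python
# Hypothetical sketch: adapting game difficulty from facial emotion scores.
# read_emotion_scores() stands in for a real face-analytics SDK callback;
# the frustration/boredom thresholds are arbitrary illustration values.
import random

def read_emotion_scores() -> dict:
    """Placeholder for an SDK call returning per-frame emotion estimates (0-1)."""
    return {"joy": random.random(), "anger": random.random(),
            "surprise": random.random(), "neutral": random.random()}

def adjust_difficulty(current_level: int, scores: dict) -> int:
    """Nudge difficulty down when the player looks frustrated, up when bored."""
    if scores["anger"] > 0.6:                          # visible frustration
        return max(1, current_level - 1)
    if scores["neutral"] > 0.7 and scores["joy"] < 0.2:  # disengaged expression
        return current_level + 1
    return current_level

level = 3
for frame in range(5):                                 # pretend game loop
    level = adjust_difficulty(level, read_emotion_scores())
    print(f"frame {frame}: difficulty level {level}")
```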

Developers are also starting to incorporate emotional sensing into a new generation of physical devices. Microsoft ran a research project called MoodWings, which alerted users to stress through the flapping of butterfly wings. A group of MIT researchers developed the pplkpr app to analyze, and suggest responses to, people who stress you out. The app correlates changes in heart rate variability with data about email and social media interactions. These early implementations can be enlightening but not always useful. Pavliscak found, for example, that interactions with her husband made her both the happiest and the angriest.
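
For a sense of the kind of signal an app like pplkpr works with, here is a rough sketch that computes a standard heart rate variability measure (RMSSD) over inter-beat intervals and groups it by contact. The interval data and contact tags are invented, and the real app's pipeline is certainly more sophisticated.

```python
# Rough sketch of the pplkpr idea: compute a heart rate variability measure
# (RMSSD) per interaction and group it by contact. All data here is invented.
from math import sqrt
from statistics import mean
from collections import defaultdict

def rmssd(intervals_ms):
    """Root mean square of successive differences between inter-beat intervals."""
    diffs = [b - a for a, b in zip(intervals_ms, intervals_ms[1:])]
    return sqrt(mean(d * d for d in diffs))

# (contact, inter-beat intervals in milliseconds recorded during the interaction)
interactions = [
    ("coworker", [812, 805, 820, 798, 810, 815]),
    ("spouse",   [640, 720, 655, 790, 610, 760]),
    ("friend",   [830, 828, 835, 832, 829, 831]),
]

hrv_by_contact = defaultdict(list)
for contact, intervals in interactions:
    hrv_by_contact[contact].append(rmssd(intervals))

# An app in the pplkpr mold would correlate shifts in a measure like this
# with email, calendar, and social media logs to flag stressful contacts.
for contact, values in sorted(hrv_by_contact.items()):
    print(f"{contact:>9}: mean RMSSD {mean(values):6.1f} ms")
```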

These tools are still in their early stages. Developers and designers have a lot of problems to solve before emotional analytics techniques are integrated into compelling and engaging apps. Part of the problem lies in developing a richer vocabulary for describing emotions. Companies and app developers all want to create products that bring happiness, but what does this really mean? Pavliscak observed, “In English we don’t have that many words for happiness. In my own research of 1,000 people, what turned out to be happy was really complicated. It includes a lot of emotions.”

Some kinds of “happiness” and delight can drive people away in the wrong context. Pavliscak described one study of a retirement planning site that looked at the impact of whimsical mascots, which annoyed seniors more concerned with making sound financial decisions than with laughing. In the long run, Pavliscak believes we might start weaving emotional intelligence into applications, but this will require richer models of how we feel in order to have a meaningful impact.

http://www.theserverside.com/feature/Emotional-analytics-a-new-approach-to-user-engagement