Affective Interfaces (AI) is a startup with a powerful emotion recognition technology that can support marketers and brands in their go-to-market strategies. It provides insight and analytics into the emotional responses of prospective users and consumers to the future products, campaigns and other content they want to test. CEO Jai Haissman walked us through the technology.
Emotions Are Universal
We've presented image recognition technologies here before, but this one is several steps beyond: it works in real time and focuses on facial movements as expressions of emotion. The technology at AI stems from Paul Ekman's research on emotion recognition. Facial expressions are universal and recognizable across cultures and genders. Faces are made up of muscles, and there is only a finite number of combinations of contractions they can perform; in other words, there is a limited set of emotions that humans can show on their faces, some of them more socially relevant than others.
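The idea of a limited set of muscle combinations can be made concrete with Ekman's Facial Action Coding System, which labels individual muscle movements as numbered "action units" (AUs). Here is a minimal sketch of how detected AUs might map to basic emotions; the AU codes follow Ekman's published combinations, but the mapping table and function are hypothetical illustrations, not Affective Interfaces' actual implementation.

```python
# A few well-known action-unit combinations from Ekman's FACS and the
# basic emotions they are associated with. This table is illustrative.
AU_TO_EMOTION = {
    frozenset({6, 12}): "happiness",          # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",         # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",     # raised brows, widened eyes, jaw drop
    frozenset({4, 5, 7, 23}): "anger",
    frozenset({9, 15}): "disgust",            # nose wrinkler + lip corner depressor
    frozenset({1, 2, 4, 5, 20, 26}): "fear",
}

def classify_emotion(active_aus):
    """Return the best-matching basic emotion for a set of detected AUs."""
    active = set(active_aus)
    best, best_overlap = "neutral", 0
    for combo, emotion in AU_TO_EMOTION.items():
        overlap = len(combo & active)
        # Require at least half of the combination's AUs to be present.
        if overlap > best_overlap and overlap * 2 >= len(combo):
            best, best_overlap = emotion, overlap
    return best

print(classify_emotion({6, 12}))        # a smile with raised cheeks -> "happiness"
print(classify_emotion({1, 2, 5, 26}))  # raised brows, jaw drop -> "surprise"
```

A production system would of course score AUs continuously from video frames rather than treat them as binary, but the finite-combinations insight is the same.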
Facial expressions of emotion can therefore be decoded and analyzed through Affective Interfaces' tools, and consumer sentiment and engagement can be tracked at any point of the purchasing funnel through the data AI collects. Because emotions are universal, the AI platform can be deployed worldwide: it provides consistent metrics across markets and shows how a product, piece of content or campaign will impact prospective users and consumers in different regions, regardless of cultural considerations.
Here's a screenshot of what the interface looks like, with a happy emotion detected.
Concretely, the mechanism behind AI resembles social media in that emotions immediately turn the link between a respondent and a prospective campaign, product or piece of content into a relationship, rather than plain, passive absorption of data on the respondent's part. Say a prospective user is asked to watch an ad that a brand wants to test before releasing it to the general public. Instead of collecting metrics after that 'absorption' phase, the company gets data in real time, while the respondent is actually watching the ad and reacting to it.
Watch this video that Affective Interfaces sent us, showing respondents reacting to an ad preview. Look at the emotional peak at the end of the trailer, when the brand association comes up with a specific trigger. The original sound is left on so you can see the full system at work.
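Finding a peak like the one in that video is straightforward once you have a per-frame emotion stream. A minimal sketch, assuming hypothetical timestamps and intensity scores (these names and values are ours, not AI's):

```python
def find_emotional_peak(frames):
    """frames: list of (timestamp_seconds, intensity in [0, 1]).
    Returns the (timestamp, intensity) pair with the highest intensity."""
    return max(frames, key=lambda f: f[1])

# Scores sampled once per second during a 10-second ad preview; the spike
# at the end models the moment the brand association appears on screen.
stream = [(t, 0.2) for t in range(8)] + [(8, 0.6), (9, 0.9)]
peak_time, peak_score = find_emotional_peak(stream)
print(peak_time, peak_score)  # -> 9 0.9
```

This is the core difference from post-viewing surveys: the signal is timestamped, so a reaction can be attributed to a specific second of the ad.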
With such qualitative data collected from facial expressions of emotion, it becomes possible to analyze consumers' purchasing behavior and intent. It lets brands understand, for example, why a prospective buyer left a site before checking out a full shopping cart. Down the line, it also provides companies with reliable metrics on their brand awareness in any given market.
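The cart-abandonment use case boils down to comparing emotion scores across session outcomes. A minimal sketch with made-up data (the session records, the `frustration` field and the function are hypothetical illustrations):

```python
from statistics import mean

# Toy sessions: whether the shopper completed checkout, plus an average
# frustration score inferred from their facial expressions during the session.
sessions = [
    {"converted": True,  "frustration": 0.1},
    {"converted": True,  "frustration": 0.2},
    {"converted": False, "frustration": 0.7},
    {"converted": False, "frustration": 0.8},
]

def avg_frustration(sessions, converted):
    """Average frustration across sessions with the given outcome."""
    return mean(s["frustration"] for s in sessions if s["converted"] == converted)

print(round(avg_frustration(sessions, converted=False), 2))  # -> 0.75
print(round(avg_frustration(sessions, converted=True), 2))   # -> 0.15
```

A gap like this between abandoned and completed sessions is the kind of signal a brand could act on, for instance by simplifying the step of the funnel where frustration spikes.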
Remote And Asynchronous
Here's how AI collects its data: once a company or brand has determined the target demographics, the SaaS only requires respondents to sit in front of their own screen and webcam to record the tests. Most of the time, the recording takes place from the comfort of their own homes, as the protocol requires that they isolate themselves from external distractions, biases and influences (the presence of other people would affect their facial responses, due to social interaction). The analysis is then performed on the footage, which gives companies great flexibility: there is no need to set up an on-site operation, even when the market being researched is on the other side of the globe.
Variety Of Applications
AI has just finished a study testing emotional responses for Procter & Gamble. But beyond the consumer goods market, AI's technology has many other potential applications. Think entertainment and, for instance, gaming: games that would track and factor in players' emotional responses to determine their course. That isn't what the company is working on just yet; "this is down the road," Haissman told us. Or think of movies, and how the technology could make it possible to crowdsource casting and engage future viewers. These are only a couple of examples out of a full array of potential applications, which include security, human resources and many more. One point Haissman insisted upon, though, is ethics: AI is intent on participants' awareness and explicit approval in any study it launches.
At this stage, Haissman told us, the company is readying for its first round of funding. Our sources tell us it is also in talks with Stanford University about a collaboration.
What do you think?