For years, marketers have used focus groups and consumer surveys to find out what people think of campaigns and advertising, but a growing number of agencies say that those answers are rarely, if ever, accurate.
One possible solution is facial-recognition tech. Omnicom’s data group, Annalect, spent Super Bowl weekend with a group of 134 people who watched Super Bowl television ads in a lab in downtown Manhattan. Prototype software randomized the order of the commercials, and a camera photographed viewers’ faces every three seconds. “Feature extraction” on the photos gave the researchers an estimated age, gender and mood for each face.
So, for example, ads that registered as “happy” prompted tightened eyelids and raised cheeks; “off-putting” ads prompted narrowed brows and visible protrusions of the tongue.
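To make the mapping concrete, here is a toy sketch of how facial cues like those the article describes could be turned into a coarse mood label. Annalect has not published its software, so the function name, the cue intensities and the thresholds below are all invented for illustration, not a description of the real system.

```python
# Illustrative only: a toy rule-based mood guess from facial-cue
# intensities (each scored 0.0 to 1.0 by some upstream face model).
# The cues loosely follow the ones the article mentions; the
# thresholds and labels are hypothetical.

def guess_mood(cheek_raise: float, lid_tighten: float,
               brow_narrow: float, tongue_show: float) -> str:
    """Return a coarse mood label from four facial cues."""
    if cheek_raise > 0.5 and lid_tighten > 0.5:
        return "happy"        # raised cheeks plus tightened eyelids
    if brow_narrow > 0.5 or tongue_show > 0.5:
        return "off-putting"  # narrowed brows or a visible tongue
    return "neutral"

# One frame captured every three seconds could be scored like this:
print(guess_mood(0.8, 0.7, 0.1, 0.0))  # happy
print(guess_mood(0.2, 0.1, 0.9, 0.0))  # off-putting
```

Production systems typically replace hand-written rules like these with a trained classifier, but the input-to-label shape of the pipeline is the same.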
It’s a next step into behavioral data, say Annalect researchers. “We’ve spent several years getting access to granular behavioral data so we can see who is impacting rational and emotional decision-making,” said Slavi Samardzija, chief analytics officer at Annalect.
Using emotional reactions to gauge ad performance is an idea that resurfaces every so often, and other agencies have made moves in this direction.
WPP’s Mediacom has been working with emotion-measurement company Realeyes to measure how people feel when they watch videos.
For Annalect, which compared the results of its Super Bowl test with the USA Today Ad Meter, there were marked differences. Mountain Dew’s disturbing “PuppyMonkeyBaby” struck a nerve on emotional responses but ranked very low on the Ad Meter, where people are asked to rank ads on a scale of 0 to 10.
But skeptics wonder if clients will be willing to pull a multimillion-dollar campaign based on a few facial twitches. Samardzija said the idea is to use the facial-recognition tech to create content. “The behavioral insight is provided to inspire creativity,” he said.
There is a need for more sophisticated data, to be sure. “I absolutely see a place for visual object recognition to measure campaign responses,” said Jeff Tan, vp, director of strategy at Posterscope, which is experimenting with facial recognition to measure effectiveness of outdoor advertising. “Ad-blocking and viewability concerns are key challenges we are seeing in the digital industry. The net effect is that all media including out-of-home location advertising is under increasing scrutiny for measurable outcomes.”
Heineken chief marketing officer Nuno Teles said that a major focus for his brand is figuring out how well ads do, and in real time. The company tracks its ads in digital and on TV to test them with different segments, but while it tends to measure metrics like ad completion and view time, Teles said the idea of non-verbal cues is becoming “increasingly important.”
Of course, using facial-recognition tech to measure outdoor advertising has to overcome scaling and standardization challenges, not to mention city regulations and public unease over the idea of using cameras to capture people’s reactions, said Tan.
And on the creative side of the equation, more data can lead to more changes — and more work. “I already design not one but two, three, a hundred pieces of creative for programmatic to serve people,” said Martin Agency associate creative director David Byrne. Extra work aside, Byrne said he would welcome more ways to figure out earlier in the process whether people like an ad. The idea of facial recognition is appealing because traditional ways of asking people questions don’t always work. “Data is not reliable because they try to answer in ways they think you want to hear,” he said.
Client budgets are another issue, though. “I’ve worked with people who spent $2 million to shoot an ad that tested badly, and they’ll put it back on the shelf,” said Byrne. “But budgets being what they are, that’s not true any more. Most brands will not be willing to do that.”
The post Agencies are using emotional reactions to gauge ad effectiveness appeared first on Digiday.