If the first principle of advertising is to get seen, then the second is that the ad must be felt. Successful advertising depends on getting both right. But while the first is intuitively easy to understand – what is unseen cannot affect anyone – the second may be less obvious. In business, we are trained to think rationally and to set emotions aside. Or as the neuroanatomist Jill Bolte Taylor puts it, “We live in a world where we are taught from the start that we are thinking creatures that feel. The truth is, we are feeling creatures that think.” Breakthroughs in brain science over recent decades have shown that emotions drive all human action. Making customers feel is thus key to gaining their attention, committing the ad to memory, and ultimately encouraging them to buy our products. Ad testing with traditional market research, which relies on conscious and rational thinking, therefore often fails to capture the whole picture. And that gap between data and reality may translate into costly failures. Luckily, there are methods to address this problem and bridge the gap.
Thanks to technological advances, it is now possible to implicitly measure human responses to video advertising of all kinds – TV commercials and product placements, social media video ads, or YouTube pre-rolls. Facial coding and webcam eye tracking are two easy-to-use methods for measuring those two crucial principles of advertising. Used in combination, they turn the otherwise elusive powers of attention and emotion into unambiguous data – and they deliver that data quickly and cost-effectively. Here, let me show you what we found out about a couple of Tobii Dynavox video ads in less than 48 hours, while spending just a few hundred dollars.
Nowhere is the war for attention more intense than in social media. So, to make sure the two video ads for Tobii Dynavox would hit the mark, we tested them with Sticky by Tobii Pro. Each participant saw just one of the videos, and while they watched it we tracked their attention and emotions via their webcam. Afterwards, we also asked how they liked the ads.
When the results came back, the survey answers showed overwhelmingly positive responses to the ads. Almost all respondents (over 90%) said their overall impression was ‘very’ or ‘somewhat positive’. Likeability is a key indicator of advertising effect, so this told us one important thing: the ad concept worked once we got people to stop and watch it.
However, it is one thing for people to like a video after viewing it to the end in a testing environment where we have already secured their attention. But is it engaging enough to catch and hold people’s attention in the wild? That is why we also tested the ads with implicit research methods.
Turning to the emotional data, we saw that sadness was the dominant emotion. And while that might raise concerns in many other cases, here it was exactly what we were looking for. In the context of assistive technology, sadness indicates that the videos manage to evoke compassion and move viewers enough to earn their attention.
Drilling deeper into the data, we could also see peaks of joy at the moment the video shows the happiness that results from using the assistive technology. And eye tracking showed that almost all participants looked at the product name at the same moment joy peaked, leaving viewers with a positive brand connection.
Overall, the data tells us we have a winning concept with a high probability of succeeding in the marketplace. We also got clues as to how to edit later versions for potentially even higher engagement. We know from earlier tests that minor editing can make a huge difference. A beer brand commercial, for example, got an earlier and stronger joy response and 40% higher overall emotional engagement just by trimming a 30-second spot to 20 seconds (Sticky by Tobii Pro research, 2017).
Summing up the findings reminds me of the old parable in which blind men describe an elephant by each feeling one part of the animal’s body: while each man’s description is correct for the part he touches, together they describe entirely different animals.
Similarly, in research, focusing on results from a single data source runs the risk of missing the bigger picture. In our test, for example, relying solely on survey responses would have left us in the dark about whether the ads would engage viewers enough in the wild. Adding implicit research provided that crucial validation in an unambiguous way.
In conclusion, ad testing with webcam eye tracking and facial coding enables all campaign stakeholders – from creative teams to media buyers – to get a broader, richer, and more accurate view of how their advertising performs.
Author: Magnus Linde is an experienced Market Researcher and Analyst with 20+ years’ experience, currently running the insight agency Manolima. Magnus has also co-founded the Implicit Academy, a network dedicated to spreading awareness and knowledge of behavioral economics, neuromarketing, and implicit market research, with a special focus on eye tracking, facial coding, EEG, and IAT.