Blind in my left eye, blind in the right

Data is driving a revolution in marketing, providing a new level of insight into customer behavior and allowing real-time review of what works and what doesn’t. But the art of asking people their opinion is not dead, and it should remain an important part of the insight-gathering process.

Marketing decisions have always been made on the basis of imperfect information. As a result, we’ve always been somewhat blind to what is really going on. In the past, our decisions were primarily based on customer research backed up by secondary sources and some business performance data. This combination gave us insights based mostly on opinion along with some insights into behavior. The problems with this have always been twofold.

  1. What people say and what they do are often different.
  2. Old-fashioned ways of measuring performance tend to lag by weeks, often months.

You could say that we’ve been making big decisions with one eye blind.

Fast forward to today, and a different picture emerges. We now have advanced computing power combined with the ability to track people’s real behavior, and to do it all in real time. The impact is profound. We no longer need to ask people what they think because we can track what they do. And we no longer have to wait weeks and months; we can see performance in seconds, minutes, hours, and days. It’s an intoxicating mix.

But if we’re not careful, these data-driven decisions risk becoming just as one-eye-blind as our research-driven decisions used to be.

Recently I was talking to a friend about some work they’ve been doing. Their brand is very successful and has an enviable data capability. Everything they do, every media placement, every message, every offer is tweaked to the max using data to test, analyze and iterate.

Their challenge is that because of their extreme focus on data, they stopped asking questions of their customers. After all, why ask questions that cost money to ask, take time to get responses, and might provide a biased answer when we can observe real behaviors instead?

Well, the reason is that while they’d tweaked and optimized and created an incredible performance marketing machine, they hadn’t positioned their brand optimally for growth. Without realizing it, they’d limited their potential market by optimizing to data from a fairly narrow segment of customers. As a result, they’ve now found a series of high-value segments to whom their brand does not currently appeal. Segments they haven’t focused on because these customers don’t interact with them, and as a result have never appeared within their data model. It was only by conducting a program of initially unrelated customer research that they were able to uncover these groups of customers they were otherwise blind to. Customers that are now critical in enabling their next stage of growth.

And while this is a good example, I don’t think it’s an isolated one. I see many forward-thinking, data-driven companies that are very comfortable with the quantitative precision of their data, and very uncomfortable with the relative fuzziness of research, particularly qualitative research. However, if we are to avoid being permanently blind in one eye (albeit the other eye), I think we need to get comfortable with both. Using data to test, optimize, iterate, and understand behavior. And using research to explore, empathize with, and understand opinion.

Moving forward, the ability to match and make sense of behavioral insights derived from data alongside opinion-based insights derived from research will have to become a core competence of the marketing function. And, in the short term, it is likely to be a major source of difference between high-performing brands and their lower-performing peers.

Paul Worthington