In defence of the less defensible: why we need thick data, too


Over brunch last Sunday, my friend delighted in walking me through her Google Maps timeline. Using GPS, it tracked our previous day’s movements to the minute. It mapped our exact route to the bouldering wall, and then to lunch, and finally to another friend’s house (including a small detour to re-caffeinate). The timeline was a precise record of our day, recognising each place we visited and how long we were there. It even named my apartment building. Pushing any discomfort about surveillance aside, my researcher brain kicked in as I thought about all the possible uses for this little timeline writ large. Like so many others, I was captivated by the potential of big data.

But are we caught up in a state of magical thinking where big data is concerned?

Qualitative research offers valuable human insight – but it risks being pushed aside

Before my friend’s visit, I’d spent the Friday at Nudgestock, Ogilvy’s annual festival of behavioural science. It was a day of lively explorations and thought-provoking talks, delving into behavioural applications that ranged from improving diversity to increasing taxes to curating personalised playlists. I found myself nodding along enthusiastically to Tricia Wang’s talk about the necessity of considering big data alongside a quantifiably smaller, but more human, counterpart: thick data.

The world of big data has exploded in the digital age. From the websites we visit to the physical movements we make, our digital footprint grows daily. As researchers, we’re particularly charmed. Big data, with its dizzying robustness, makes our conclusions powerful. Let’s be honest: it is something the C-suite execs understand, and that lends credence to our conclusions.

However, Wang argues that our infatuation with big data has devalued the field of customer insights. Thanks to big data’s lure, we have stopped paying attention to the skills of qualitative researchers and ethnographers: people who observe people. In some organisations, insight has become a box to tick rather than a valuable source of learning. We eschew human stories in favour of hard facts, conflating access to the consumer with understanding of the consumer. But perhaps a fear of ‘smaller’, thicker data is actually harming our understanding – and, in turn, our ability to grow.

In discounting qualitative insight, we risk a world optimised only for a narrow interpretation of human experience: that which can be quantified and charted.

Decision-makers love quantitative data because it stands up to scrutiny

As an integrated researcher, I’m not unfamiliar with hearing, “this qual is nice, but do we have any quant to support it?” Even the language of the question implies that human data is weak and flimsy; in need of ‘support’. Tricia Wang calls this quantification bias: a valuing of the measurable over the immeasurable. This means that we might discount key insights, which could quite literally make or break a business, in our pursuit of the easily quantifiable.

Later in the day, Gerd Gigerenzer spoke about the paralysing effects of defensive decision-making, a strategy that protects the individual at the expense of real progress. Option A might feel instinctively ‘right’. However, if the decision-maker cannot identify an adequate defence should it go wrong, it may be quietly discarded in favour of an (inferior) Option B or C.

I wondered whether there was a connection to be made here. Our clients feel pressure to make the ‘right’ decision, and what better to justify a decision than a solid statistic? It is safe. It is defensible. If they are questioned, they can point their fingers at the numbers and claim, “well, this data says…”. Quantitative data removes the burden of proof from their shoulders.

Qualitative research, however – particularly ethnography – does quite the opposite. By its very nature, it relies on human interpretation. Given its small base sizes and more abstract analysis, it feels less defensible than a hard dataset. And that makes people wary.

True insight comes from integrating depth and scale

An old boss of mine used to enjoy the phrase, “rubbish in, rubbish out.” At its core, it means: our findings are only as good as the instrument used to capture them. Whilst we might spend hours crafting a flawless survey, it can’t address a problem nobody has considered. Nor can it access individuals’ stories, or emotions, or the rich context of all those who come into contact with your brand. As Rory Sutherland himself says, “All big data comes from one place: the past.” The past is a place of known knowns – but what about those things we don’t know?

Qualitative research has an image problem that we need to challenge in order to capitalise on holistic insight. Yes, we may ‘only’ have spoken to eight people – but in that time we saw the flicker of frustration as they struggled with your packaging, or watched them misunderstand your proposition entirely, or listened as they casually revealed a key priority that might otherwise never have made it into a survey as a metric.

Qual provides the colourful stories we can weave throughout a giant scaffold of data. It is the meat on the towering skeleton, fleshing out the bones that tell only half the story. To build the most complete picture, we need to take a holistic approach to understanding people in all their complexity. By all means, examine the reams of information gathered from GPS tracking – but don’t forget the human journeys at the heart of it all. Even if that means learning to let go of the desire for the defensible at all costs.

Annabel Gerrard