By ICG Member Nick Bonney: Deep Blue Thinking
It seems we can’t turn on a TV cookery show these days without hearing the word ‘provenance’. It was nearly 15 years ago that celebrity chef Gordon Ramsay reared and then slaughtered 27 turkeys to make sure his children understood where the food on their Christmas table came from, and these days we are bombarded more than ever by messages of locally grown, organic or free range.
It seems rather counter-intuitive, then, that the market research industry appears, if anything, to be heading back to the fast food heyday of the 1980s: we want our numbers, we want them quick, we want them processed and we don’t want to ask too many questions about where they came from. In an environment where brands are doing everything they can to leverage their first-party data, market research quite often seems to be swimming in the opposite direction.
This has been brought into sharp focus for me recently across a range of briefs where the first stage has, quite rightly, been to mine existing research data before investing any further budget in new research. Very few clients have tables to hand these days, let alone raw respondent-level data, and frighteningly, when that data is interrogated there are often some curious anomalies. I was discussing the disappearance of data tables from the client-side world with a senior buyer last week and it seems that, rather than a conscious decision, clients have often simply got out of the habit of asking for them.
The counter-argument, of course, is that this is because client-side research teams are quite rightly spending more time upstream: business partnering with marketing teams and focusing on building influence with senior stakeholders. But to some extent that misses the point. To use my food analogy again, if you’re serving food in your restaurant, shouldn’t you know where the ingredients are being sourced from?
From my perspective, the benefits of getting hold of tables or raw data are actually quite clear-cut and, interestingly, they accrue to all parts of the client/agency value chain.
Firstly, from a client perspective, the act of asking for them demonstrates attention to detail, so don’t be surprised if agencies place more focus on data quality and on a well-thought-out set of analyses when they know their client is going to be looking at the output. If one chef is first down the fish market, looking at the quality of the catch, and the second just waits for the delivery later in the day, who do you think will get the best produce?
Secondly, from an agency perspective, rather than being seen simply to provide data as a commodity, it gives the agency licence to be more insightful and punchy in debriefs. Surely there’s more permission to challenge the research buyer who wants endless charts ‘just in case’ if they already have all the figures they need to hand in a well-constructed set of data tables?
Finally, and most importantly, it starts to focus research budgets on the areas which can really make an impact. With every new IPA Bellwether Report that emerges, we see a gloomy picture for market research spend. Extracting greater value from previous projects is a far more cost-efficient process than commissioning another new piece of work. This does create challenges for an agency world increasingly focused on topline growth, but if we genuinely want to build longer-term partnerships between agency and client, surely playing the long game is in everyone’s interest here?
Quite rightly, data is often seen as a commodity, but that doesn’t mean we shouldn’t treasure it. There’s a lot of loose talk about ROI on research spend; surely one of the simplest ways we have to maximise return is to ensure every piece of data we collect works as hard as it can?