
Media Playground 2013: The Data Debate

The final debate at last week’s Media Playground has perhaps prompted the biggest reaction. Already Richard Marks of Research the Media has issued a call to action regarding ‘big data’, and Rhys McLachlan has likewise urged the industry to completely rethink the ‘buy’ or ‘bid’ in RTB following insights from the event.

Indeed, McLachlan, director of corporate and business development at Videology, aired a number of outspoken thoughts during one of the liveliest panels of the day.

“Let’s start by rocking the boat,” he said. “Data is not new. It’s as old as the clay drawings on the walls of a caveman’s home; as old as when we first started recording the world around us. It’s simply the case today that we have so much more data – so it’s really about how we now organise it to find useful insights.”

Adam Pace, managing director, Annalect Markets, was quick to agree, helping to set the scene for one of the most complex issues facing media. “Data – and how we use it – has spiralled out of control,” he told the room of 200 delegates at what was the sixth consecutive Media Playground, this year hosted at RBS in central London.

So how do we improve?

“We absolutely must make sense of it and turn it into useful information,” Pace said. But before we get to that, let’s slow down and ask what, exactly, is going on here.

The term ‘big data’ did not take long to rear its head during the session and was always sure to dominate any panel looking seriously at audience measurement in the digital age. But what is really causing the growing kerfuffle in the research community?

Well, it seems that data sets so large and complex that they are difficult to process by any traditional means can also mask effective interpretation and, ultimately, value. And if advertisers get that wrong, then the whole thing comes crashing down.

“Growing fragmentation of media samples is forcing us to require bigger data, and now we have hybrid views,” said Nick North, head of innovation, audience measurement, GfK. Essentially, he said, we’ve over-complicated things; we have so much to measure – often now in real time – that it’s becoming very difficult to extract any real meaning.

Dominic Mills – who stated last month that he is deeply sceptical of big data (insisting that it does not even deserve capitalisation) – was, without bias, chairing the event. He told the panel that he’d often heard big data described as “liquid gold”. Was there any truth in this?

“Gold is in short supply. Data is not,” McLachlan dead-panned.

So really, what can we do? Elsewhere we have researchers calling for a more controlled use of ‘big data’ and, further still, a complete rethink of the term itself (‘Deep Data’ appears more appropriate to some).

“We need to evolve as agencies,” said Catherine Becker, CEO, adconnection. “We need to work out how to be much more accurate.”

McLachlan agreed. “If [a consumer is] hit with five messages and they haven’t responded, then your message is wrong…but that is the point of what we can now do: calibrate.”

So it’s about handling the scale of the data effectively; learning to extrapolate and find meaning in the chaos of the big numbers?

“It does have the potential to be fantastic,” added North – but we don’t quite appear to be there just yet. Perhaps we should remember a previous MediaTel debate on the future of media research in which Andrew Bradford, vice president of client consulting at Nielsen, suggested “we just need to make mistakes and treat this as a test and learn environment.”

Whatever happens, big data is unlikely to go away, and the research community will need to get its head around how best to steer the rest of the industry along the right path if it is to be used responsibly and to best effect.

Privacy

“There is a genuine concern,” North said, “that we’ll be exploited by advertisers and people are worried about privacy…if we don’t get it right, then we’ll never get people to share [data] with us.”

“It’s an ethics game, but this isn’t Orwellian,” McLachlan said. “It’s about efficiency and effectiveness. It’s about making better advertising for everyone: clients, brands and the end user.”

Mills then asked: who actually owns the data? The mobile phone company? The brand? The ISP? Or, heaven forbid, the consumer?

“I actually foresee the reclaiming of one’s data as an interesting part of the future,” said North – but none of the panel could really give an answer to this question.

Perhaps we’ll see more of what is happening in the US, where people sell their data, content to have their lives monitored for a fee – but then what isn’t for sale these days? It certainly highlights one key point though: the sheer value of information in today’s media industry.

Small, but perfectly formed

The session ended with some confrontation, illustrating, perhaps, the differing views across the market. McLachlan expressed a view he has now become famous for – that the BARB panel is an expensive survey that has had its day now that real-time audience data is available.

“I find it perplexing,” he said, “that a multi-billion pound advertising system is based on a sample of just 5,400 homes.”

Lynne Robinson, IPA research director, who was sitting in the audience, jumped in. She believes, quite firmly, that BARB-style trading data remains essential and can still be used alongside online/server data – cautioning that not all (big) data is good data.

It’s about quality, not quantity, she insisted, having the last word of the day but certainly not the last word we’ll hear on that matter.
