Festival of Media April 2017

On the subject of data

Looking at what the media world has had to say recently about data indicates there will be some lively discussion at the upcoming Festival of Media in Rome.

Before heading to the event, I thought it would be interesting to look at what some of the speakers have said about the role of data in the media industry over recent months.

Rob Norman, GroupM’s Chief Digital Officer, makes it clear in the group’s recent white paper that data is key to delivering useful, customer-centric advertising.

“Useful advertising is a function of relevance which in turn is a function of time, place, context, cognitive targeting and creation, and action-ability. All of those points of relevance are directed by data.”

This raises a question for digital publishers: can they help make more useful advertising, and raise the value of their digital content, by providing more data? Amazon’s VP of Global Advertising Sales, Seth Dallaire, is in no doubt about the power of the data Amazon holds, based on a recent article in Campaign.

“Amazon’s behavioural data can challenge or disprove a brand’s notions of their own customers.”

If the value of data is well understood on both the buy side and the sell side, what could possibly go wrong? We have some clues from Gerry D’Angelo, in an Ebiquity interview he gave just before moving to P&G as Global Media Director: “In today’s media environment, the level of complexity driven by ad tech is straining relationships between advertisers and media agencies.”

And it appears that this issue has not gone unnoticed by media agencies, as expressed by Sam Phillips, CMO of Omnicom Media Group, when recently asked by Campaign, “If you could change one thing about the industry, what would it be?”

Sam responded: “Making data complexity more comprehensible. We're working on it.”

Since the advertising industry must get to grips with ever more data sources at ever greater scale, this is not an issue that is going away anytime soon.

I am looking forward to engaging in the discussion at the Festival of Media, as we are convinced that the existing technology for creating insights and value from complex, extreme-scale data is broken.

More info about big data analysis at extreme scale

GeoSpock provides sub-second big data analysis tools to companies ingesting billions or even trillions of data points per day. We deliver improvements to business planning, and insight-driven optimisations, with our real-time visualisation and analysis tools. A step-change in the technology used to index and retrieve information drives value from extreme data sets where existing technology hits barriers. The tools can be stand-alone or can plug into your existing dashboard, whilst offering the prospect of reducing data storage charges by a third.

World-leading Cambridge technology is at the heart of our infin8 indexing engine. Our entirely new approach to storing and retrieving data increases customers’ data analysis capabilities 1000-fold over existing technologies.

Check out our products to find out more.

Back to GeoSpock Blog