Digital data holds promise but it is an uphill battle

Working out a suitable marketing mix, getting to know customers better via touchpoints like websites, mobile apps and social media channels, and having better control over spending owing to newer attribution models are all strong reasons for digging deeper into digital data, but it isn’t easy.

Achieving real-time insight from vast amounts of unstructured data is nothing short of a roller coaster ride, but for those who persevere it is worth the effort. Here EyeforTravel’s Ritesh Gupta talks to Suneel Grover, senior solutions architect, SAS, about measuring the value of digital data and why attribution is still tough.

EFT: The focus today is on making every ad dollar count via initiatives such as attribution. How can digital data prove the value of digital marketing?

SG: There are three general approaches to attribution:

a. Last (or first) click attribution

b. Rules-based attribution (i.e. equally weighted)

c. Algorithmic attribution (i.e. statistical), the recommended best practice

Why is attribution difficult, and how can digital data help? Until recently, most digital behavioural data was sourced from web analytics, social media, search, display, and email. In most cases, this data was trapped in sub-channel silos. For the better part of the past decade, the vendors that supported this space typically owned the data and provided SaaS user-experience offerings to the digital marketing community. Digital data has always been a big data challenge, and these vendors’ offerings focused on analytically immature services limited to endless reports, summarisations, and reactive BI (Business Intelligence). Although valuable, those services never offered a path towards predictive analytics and advanced data mining. If your organisation doesn’t own the data, you have nothing to feed into your data-mining tool of choice. This doesn’t help when an executive asks the question: why?

Recently, there has been a groundswell in the integrated marketing and omni-channel marketing communities to push digital data into the predictive analytics arena. I am seeing organisations in every business vertical imaginable striving to make progress and generate more value from their customer digital data streams.

To deliver on the promise of digital attribution, you must have access to granular, detailed digital data that can be prepared, fed into and processed by statistical modelling technology to tell the marketer, with accurate weighting, what is influencing the consumer to purchase or convert. Was it the first impression via search? Was it the third impression via email? Was it an interaction during their second visit to our website, together with the display ad that surfaced on their social media profile page? What was more influential? What was less? Data-driven evidence, in combination with your team’s subject-matter expertise, is our recommended best practice for the digital attribution challenge, and for proving digital marketing’s value.
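To make the three approaches concrete, here is a minimal, self-contained Python sketch over a handful of hypothetical customer journeys. The channel names and journey data are invented for illustration, and the `removal_effect` function only gestures at the intuition behind algorithmic attribution; production techniques (Markov chains, Shapley values, regression models) are far more involved and are not described in this interview.

```python
from collections import defaultdict

# Toy journey data: (ordered touchpoints, converted?).
# Channels and journeys are hypothetical examples.
journeys = [
    (["search", "email", "display"], True),
    (["search", "display"], True),
    (["email"], False),
    (["social", "search", "email"], True),
    (["display", "social"], False),
]

def last_click(journeys):
    """Approach (a): all credit to the final touch before conversion."""
    credit = defaultdict(float)
    for path, converted in journeys:
        if converted:
            credit[path[-1]] += 1.0
    return dict(credit)

def linear(journeys):
    """Approach (b): rules-based, credit split equally across touches."""
    credit = defaultdict(float)
    for path, converted in journeys:
        if converted:
            for channel in path:
                credit[channel] += 1.0 / len(path)
    return dict(credit)

def removal_effect(journeys):
    """Approach (c), sketched only: how many conversions would be lost
    if a channel vanished from every journey? Real algorithmic
    attribution is statistically richer than this toy heuristic."""
    channels = {ch for path, _ in journeys for ch in path}
    baseline = sum(1 for _, converted in journeys if converted)
    effect = {}
    for ch in channels:
        survived = sum(1 for path, converted in journeys
                       if converted and ch not in path)
        effect[ch] = baseline - survived  # conversions at risk
    total = sum(effect.values())
    return {ch: v / total for ch, v in effect.items()}  # normalised shares

if __name__ == "__main__":
    print("last-click:", last_click(journeys))
    print("linear:    ", linear(journeys))
    print("removal:   ", removal_effect(journeys))
```

Comparing the outputs side by side shows why the choice of model matters: last-click over-credits whichever channel happens to close, while the removal-effect view rewards channels whose absence would actually cost conversions.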

EFT: How can one aptly measure the value of digital data?

SG: By advancing your analytic capabilities beyond business intelligence, reporting and retrospective analysis, and progressing towards data mining, forecasting, optimisation and proactive analysis. The ultimate goal is to leverage prescriptive analytics to make informed decisions supported by data-driven evidence. The value of digital data (or any other source, for that matter) will become clearer if you are achieving quantifiable improvements towards your business goals.

EFT: What exactly does it take to process vast amounts of digital data?

SG: I will try to keep this at a high level. Many database companies continue to make strides in processing large volumes of data (whether that data is digitally sourced or comes from another origin). But processing data can take a variety of paths, and gleaning value from data requires analytics, in my opinion. Thus, running data through a summary report versus processing it through a predictive model with multiple algorithms requires different thinking about how to optimise the supporting architecture to provide answers. For those interested in leveraging predictive analytics and exploratory data mining on large volumes of digital data, here are three recommended approaches:

a. In-memory processing

b. In-database processing

c. Grid processing

Although each processing methodology is unique, and each comes in different flavours for various challenges and levels of sophistication, the end goal is the same: to run computationally intensive advanced analytics on very large amounts of data efficiently enough to support a limited window of time for making an informed decision and/or execution.
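The partition-and-combine idea that underpins in-database and grid processing can be sketched in a few lines of standard-library Python. This is a toy illustration, not a description of any SAS architecture: it fits a simple linear regression by computing sufficient statistics per chunk, as each grid node or in-database pass might, and then merging the partials, so no single step ever needs the full dataset in memory. The simulated data and its underlying y = 2x + 1 relationship are invented for the example.

```python
import random

random.seed(42)

def chunks(n_chunks=100, chunk_size=1000):
    """Simulate a data source too large to hold in memory by streaming
    chunks. Hypothetical relationship: y = 2.0 * x + 1.0 + noise."""
    for _ in range(n_chunks):
        xs = [random.uniform(0, 10) for _ in range(chunk_size)]
        ys = [2.0 * x + 1.0 + random.gauss(0, 0.5) for x in xs]
        yield xs, ys

def partial_stats(xs, ys):
    """Per-partition sufficient statistics for simple linear regression.
    Each node (or each in-database pass) computes this over its slice."""
    return (len(xs), sum(xs), sum(ys),
            sum(x * x for x in xs),
            sum(x * y for x, y in zip(xs, ys)))

def combine(stats):
    """The coordinator only has to add the partials, then solve."""
    n = sx = sy = sxx = sxy = 0.0
    for cn, csx, csy, csxx, csxy in stats:
        n += cn; sx += csx; sy += csy; sxx += csxx; sxy += csxy
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

slope, intercept = combine(partial_stats(xs, ys) for xs, ys in chunks())
print(f"fitted: y = {slope:.2f}x + {intercept:.2f}")  # close to y = 2x + 1
```

The design point is that `partial_stats` is embarrassingly parallel while `combine` is cheap, which is exactly why summary-style computations distribute easily; model-fitting steps that cannot be decomposed this way are what push organisations towards in-memory or grid architectures.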

EFT: What do you make of the role of big data in enhancing the efficacy of various attribution models?

SG: Attribution models thrive on data detail, and the more detail the better. If explainable (i.e. white-box) attribution models become more accurate for the organisation, executives with decision authority will pay attention. Close attention.
