The rush into digital transformation and the use of digital marketing channels have produced a surge in observational data, almost 80% of which is unstructured. In this environment, understanding your customer, or getting a 360° view, takes on a new meaning. Many have questioned whether traditional approaches such as surveys will remain necessary given the availability of big data. In a recent article, Gordon Wyner of Marketing Strategies International suggests that “observational data will take an increasing share of market research from surveys and instruments”. He argues that more accurate results and better outcomes are the only way alternatives to digital observational data can compete in this new world.

Traditional marketing intelligence systems have always been challenged to provide deeper insights and better predictions of customers' future behaviors using customer surveys, observations, and experiments. They have wrestled with selection bias and non-response bias, and with the limited ability of traditional approaches to reduce risk and guarantee success for new products, product/portfolio changes, marketing/message optimization, and pricing/margin maximization. With more computing power available than ever before, many companies have rushed to unleash new artificial intelligence/machine learning algorithms on historical secondary/observational data, claiming deep insights and predictive capabilities. The groundswell of unstructured data from consumers on social/digital media is generating a new form of analytics built on more sophisticated semantic engines for natural language understanding (using tools like IBM Watson, google XXX, among others).

Traditionally, market research, analytics, and marketing intelligence systems were measured on ROI by how well they helped reduce the risk of making a bad decision. However, core to reducing risk, or understanding the ROI of data and analytics, must also be the ability of the approach to provide clarity around making the right decisions to support better customer acquisition, retention, and growth. Actionability, or clarity of action, can only come from whether data (primary or secondary) helps decision makers improve the signal-to-noise ratio (SNR). SNR has long been used by engineers when designing electronic equipment: SNR = Signal/Noise, or simply μ/σ, and it is often measured in decibels. The Rose criterion, more specifically, defines the minimum threshold SNR required to clearly identify a signal of any type, for example the amplitude or frequency of an audio signal. In statistics, the inverse ratio σ/μ is known as the “coefficient of variation.” Said another way, a higher SNR gives more weight to the case for taking action, and to managerial decisions that support customer-centric value systems.
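To make the SNR idea concrete, here is a minimal sketch in Python. The metric values below are hypothetical illustrations, not OSG data: it computes SNR as μ/σ (the inverse of the coefficient of variation) and in decibels for two toy "engagement" series, one stable and one noisy.

```python
import math

def snr(values):
    """Signal-to-noise ratio as mean/std-dev (mu/sigma)."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((x - mean) ** 2 for x in values) / n
    return mean / math.sqrt(variance)

def snr_db(values):
    """The same ratio expressed in decibels: 20 * log10(mu/sigma)."""
    return 20 * math.log10(snr(values))

# Hypothetical weekly engagement scores for two metrics
stable = [10.0, 10.2, 9.8, 10.1, 9.9]   # small spread around the mean
noisy  = [10.0, 4.0, 16.0, 2.0, 18.0]   # same mean, large spread

print(round(snr(stable), 1))  # 70.7 -> clear signal, safe to act on
print(round(snr(noisy), 1))   # 1.6  -> signal buried in noise
```

Both series have the same mean, yet the stable one yields an SNR roughly 44× higher; by the logic above, only the first would clear a reasonable actionability threshold.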

SNR can only shrink with such complex systems and unstructured data sources as voice-to-text (VTT), where massive investments are being poured into Amazon Alexa, Apple's Siri, Google Home, IBM Watson, and others. The convenience of collecting VTT and creating big data may overshadow the actual value of the data and the ability to extract a high SNR from it. Innovation in VTT continues, as customer convenience remains a source of differentiation, so the value of this data needs to be assessed carefully. Other emerging technologies create even more noise in the data, particularly camera-based algorithms that read facial muscle movements.

How do we make sense of this? Will Gordon Wyner's prediction hold true? Like Wyner, we believe new methods will have to emerge that balance the convenience of voice and video data, data from social media, and other sources, which may carry more noise driven by the non-representativeness of the responding population and the quality of the data collected, against the convenience they offer the consumer. These are innovative ways to engage customers faster and more naturally, but the new types of data and intelligence they collect must pass the SNR thresholds required to drive actionability and, in turn, business outcomes.

OSG's AI-based big data analytics platform, Dynamo, drives a new 21st-century vision: combining cognitive and behavioral ways of analyzing data, improving the SNR of marketing intelligence, and giving managers the guidance to take action and deliver outcomes.

This is why a retail customer of OSG delivered a $100M increase in revenues, based on a 3% increase in customer engagement, while using Dynamo. This unique big data analytics platform combines unstructured and structured data, marries cognitive and behavioral signals, and drives analysis that delivers the right outcomes. We understand not only what people do or say but also why, and we help create actionable decisions. To conclude, at OSG we think that tomorrow's surveys will increasingly be verbal, recorded as both structured and unstructured responses to devices such as Amazon Alexa and Echo, and may include recordings of facial expressions, while Dynamo will bring a better understanding of how to help change customer behavior by bringing the “what” and the “why” together for each customer.

Technology helps us observe customer behavior; the right triggers can help shape it.

We hope this information has been interesting and valuable to you. Please, feel free to share it with colleagues and other people in your network. We welcome discussing this topic further with you and understanding your specific challenges.

OSG Steps to Success

OSG is a “catalyst” that helps our clients be the best at decoding their customers’ decisions. Our clients have seen a minimum 20% improvement in customer engagement by implementing smart insights delivered using our behavioral analytics products.