Josh Prismon and Matt Beck came up next to walk through a 5-step plan for Big Marketing – marketing in the era of Big Data.
- Admit you have a problem
- Stop interrupting and start understanding
- Create a personal dialogue guided by analytics
- Continuously improve your decisions
- Push out by building network effects
The context for this is the death of what they call “interruption marketing” – generating demand by interrupting people with telemarketing, direct mail, email and other broadcast techniques. Some techniques no longer work as well as they once did but some can be adapted to this new personalized marketing – social followers open emails at 3x the rate, individualized offers generate 20x the response and triggered emails are 2x as effective as scheduled ones.
So 1: Admitting you have a problem.
Many companies are basically yelling about their latest product every so many days without any idea what customers want, without any sense of long-term engagement due to ongoing amnesia (forgetting what you said to them previously), and with no closed loop for feedback. These same financial services companies often run their risk business completely differently. Acknowledging this leads to the next point.
2. Stop interrupting and start understanding
Consumers are more likely to share data with you if they feel like you are using that data to help them. The reality is that a company knows a lot about its customers, everything from registration and browsing history to transactions and payments. The company also has lots of ways to connect with customers who use the website, the mobile app or the call center, for instance. And the company can use analytics and experience to engage with relevant offers and pricing. All of this can be measured using well-defined and knowable KPIs. Challenges that must be addressed include silos that keep data from being turned into a cohesive view. Companies should also not underestimate the power of asking customers what they want!
Data about your customers is everywhere including transactions, geolocation data from the mobile app, ATM transactions, online or call center interactions and much more. Transaction history is critical because it is so core to what a customer is doing but the richness of data sources can and should be leveraged. Leveraging all this data requires dealing with Big Data that has massively increased volume, velocity and variety.
- The volume of new data that must be organized, and especially the volume of data about which you know little, is a critical issue for handling Big Data and is driving companies to adopt Hadoop.
- The velocity of Big Data means you need to handle streaming data, do streaming analytics and deliver fast responses for customers, and that’s driving the NoSQL products focused on specific kinds of data storage use cases.
- The variety of Big Data includes all the old data you have and adds new kinds of data to it. Companies want to keep their investment in SQL-based data and extend it, which leads to NewSQL technologies like Vertica, Vectorwise and Impala.
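The first challenge above – breaking down silos into a cohesive view of the customer – can be sketched in a few lines. This is a hypothetical illustration, not anything shown in the talk; the silo and field names are made up:

```python
# Hypothetical sketch: merging siloed customer records into one view.
# Silo names and fields are illustrative only.

def unified_customer_view(customer_id, silos):
    """Merge per-silo records for one customer into a single dict."""
    view = {"customer_id": customer_id}
    for silo_name, records in silos.items():
        record = records.get(customer_id, {})
        # Prefix each key with its silo so the source stays traceable.
        for key, value in record.items():
            view[f"{silo_name}.{key}"] = value
    return view

silos = {
    "registration": {42: {"email": "a@example.com"}},
    "transactions": {42: {"last_purchase": "2013-04-01"}},
    "web": {42: {"pages_viewed": 17}},
}

view = unified_customer_view(42, silos)
```

Even a toy version like this makes the point: the value is not in any one silo but in the combined view keyed on the customer.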
3. Create a personal dialogue guided by analytics
Central to guiding a dialogue is a sense of not just what a customer will respond to but when they will respond. This is different from more traditional campaigns where the timing was fixed. To respond to this FICO has been developing Time to Event Propensity models, a form of discrete hazard modeling adapted for marketing. These new analytics require more data and have to be produced in very large numbers. This means storing data for a particular analytic task in a simple, scalable format and then automating the production of these new analytics at scale. Thousands of possible predictive attributes are created, the best attributes are selected automatically, and thousands of models are created, managed and updated. Then the models and all the necessary metadata about the attributes are automatically deployed.
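To make the discrete hazard idea concrete, here is a minimal sketch of how such a model answers the “when” question. The coefficients and features are invented for illustration; FICO’s actual Time to Event Propensity models are of course far more elaborate:

```python
import math

# Minimal sketch of a discrete-time hazard model for response timing.
# All coefficients below are made up for illustration.

def hazard(t, features, weights, bias=-3.0):
    """P(customer responds in period t | no response before t),
    modeled with a logistic function of time and customer features."""
    score = bias + 0.1 * t + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-score))

def response_curve(features, weights, horizon=12):
    """Probability the response lands in each period, derived from
    the hazards: P(T = t) = h(t) * prod over s < t of (1 - h(s))."""
    survive = 1.0
    probs = []
    for t in range(1, horizon + 1):
        h = hazard(t, features, weights)
        probs.append(survive * h)
        survive *= 1.0 - h
    return probs
```

The output is a per-period response probability curve rather than a single score, which is exactly what lets a campaign target not just who to contact but when.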
In addition, companies must be able to pull all their data together into a conversation – take all the specific interactions with a customer as well as new data sources from outside and combine them into a timeline – what they call event sourcing. Snapshots get calculated at the moments when their state changes, including predictive analytics. This kind of timeline has good scalability and performance characteristics, and as you have more data the value increases.
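The event-sourcing pattern described here can be sketched as an append-only timeline with derived snapshots. This is a generic illustration of the pattern, not FICO’s implementation; the event types and fields are assumptions:

```python
from dataclasses import dataclass, field

# Sketch of event sourcing: each interaction is an immutable event
# on a per-customer timeline, and a snapshot of derived state is
# recalculated whenever an event changes it. Names are illustrative.

@dataclass
class CustomerTimeline:
    events: list = field(default_factory=list)
    snapshots: list = field(default_factory=list)

    def append(self, event):
        self.events.append(event)
        # Recompute derived state at this moment and keep it.
        self.snapshots.append(self._snapshot())

    def _snapshot(self):
        total_spend = sum(e.get("amount", 0) for e in self.events
                          if e["type"] == "purchase")
        return {"event_count": len(self.events),
                "total_spend": total_spend}

timeline = CustomerTimeline()
timeline.append({"type": "web_visit", "page": "/offers"})
timeline.append({"type": "purchase", "amount": 120})
# timeline.snapshots[-1] == {"event_count": 2, "total_spend": 120}
```

Because events are only ever appended, this structure scales well, and adding a new data source just means appending its events to the same timeline – which is why the value grows as more data flows in.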
4. Continuously improve your decisions
This is simply applying the proven techniques of A/B testing and champion/challenger testing to improve the quality of decisions over time.
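A champion/challenger setup is simple enough to sketch directly: route a small share of decisions to the challenger strategy, track outcomes for both, and promote the challenger if it wins. The traffic split and the naive comparison below are illustrative assumptions, not the talk’s specifics:

```python
import random

# Minimal champion/challenger sketch. The 10% challenger share and
# the bare win-rate comparison are illustrative simplifications.

def choose_strategy(challenger_share=0.1, rng=random):
    """Route a small fraction of decisions to the challenger."""
    return "challenger" if rng.random() < challenger_share else "champion"

class Experiment:
    def __init__(self):
        self.trials = {"champion": 0, "challenger": 0}
        self.successes = {"champion": 0, "challenger": 0}

    def record(self, strategy, success):
        self.trials[strategy] += 1
        if success:
            self.successes[strategy] += 1

    def rate(self, strategy):
        t = self.trials[strategy]
        return self.successes[strategy] / t if t else 0.0

    def challenger_wins(self):
        # In practice this would be a proper significance test,
        # not a raw comparison of rates.
        return self.rate("challenger") > self.rate("champion")
```

Run continuously, this loop is what turns one-off decisions into decisions that improve over time.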
5. Push out by building network effects
Adding more data to the event stream continuously improves the analytics that can be built from it. Doing this incrementally – breaking down silos and integrating the data from them one at a time – works and adds value steadily. That said, it is critical to show the business value of each step and to iterate quickly.