(Coverage) Salesforce Bolsters Einstein AI With Heavy-Duty Data Management

Through its acquisition of Krux, Salesforce is combining its artificial intelligence (AI) layer with deeper data management in Salesforce Marketing Cloud.

Customer Relationship Management and Data Management come together in a delicious way.

Today at its Salesforce World Tour stop in New York, the company began to roll back the curtain on how its AI and data layers will work together. Salesforce announced new AI, audience segmentation, and targeting features for Marketing Cloud based on its recent acquisition of data management platform Krux. The company’s new Marketing Cloud features, available today, add more data-driven advertising tools and an Einstein Journey Insights dashboard for monitoring end-to-end customer engagement in everything from e-commerce to email marketing.

Salesforce unveiled its Einstein AI platform this year, baking predictive algorithms, machine and deep learning, as well as other data analysis features throughout its Software-as-a-Service (SaaS) cloud. Einstein is essentially an AI layer between the data infrastructure underneath and the Salesforce apps and services on top. The CRM giant is no stranger to big money acquisitions, most recently scooping up Demandware for $2.8 billion and making a play for LinkedIn before Microsoft acquired it. The Krux acquisition gives Salesforce a new, data-driven customer engagement vector.

“We’re working to apply AI to all our applications,” said Eric Stahl, Senior Vice President of Marketing Cloud. “In Marketing Cloud, Krux now gives us the ability to do things like predictive journeys to help the marketer figure out which products to recommend. We can do complex segmentation, inject audiences into various ad networks, and do large-scale advertising informed by Sales Cloud and Service Cloud data.”

As Salesforce and Krux representatives demonstrated how Krux fits into Marketing Cloud, the data management platform acted more like a business intelligence (BI) or data visualization tool than a CRM or marketing platform. Chris O’Hara, head of Global Data Strategy at Krux, talked about the massive quantities of data the platform manages, including an on-demand analytics environment of 20 petabytes (PB)—the entire Internet Archive is only 15 PB.

[Screenshot: Krux data pattern analysis]

“This is our idea of democratizing data for business users who don’t have a PhD in data science,” said O’Hara. “You can use Krux machine-learned segments to find out something you don’t know about your audience, or do a pattern analysis [screenshot above] to understand the attributes of those users that correlate greatly. We’re hoping to use those kinds of signals to power Einstein and do things like user scoring and propensity modeling.”

The Einstein Journey Insights feature is designed to analyze “hundreds of millions of data points” to identify an optimal customer conversion path. In addition to its Krux-powered Marketing Cloud features, Salesforce also announced a new conversational messaging service called LiveMessage this week for its Salesforce Service Cloud. LiveMessage integrates SMS text and Facebook Messenger with the Service Cloud console for interactions between customers and a company’s helpdesk bots.

The more intriguing implications here are what Salesforce might do with massively scaled data infrastructure like Krux beyond the initial integration. According to O’Hara, in addition to its analytics environment, Krux also processes more than 5 billion CRM records per month and 4.5 million data capture events every minute, and maintains a native device graph of more than 3.5 billion active devices and browsers per month. Without getting into specifics, Salesforce’s Stahl said there will be far more cross-over between Krux data management and Einstein AI to come. In the data-plus-AI equation, the potential here is exponential scale.

CMOs and CIOs need to be more aligned

A survey of both senior marketing and IT professionals has revealed that there are significant differences between these two core business functions in their perception of organizational priorities and the quality of digital infrastructure. Governance frameworks to ensure better alignment between the CMO and CIO are often lacking.

The Backbone of Digital report, freely available from ClickZ (registration required), has also found that, compared to their colleagues in marketing, IT professionals have a much rosier view of the customer experience their companies are delivering across digital channels.

Below I have outlined in more detail three key findings from the research, which is sponsored by communications infrastructure services company Zayo.

IT pros have an exaggerated view of the quality of their companies’ current infrastructure

According to the research, 88% of IT respondents describe their company’s infrastructure as ‘cutting-edge’ or ‘good’, compared to only 61% of marketing-focused respondents, a massive difference of 27 percentage points.

The research also looks at the ability of tech infrastructure to deliver across a range of marketing communications channels, with IT respondents and marketers both asked to rate performance.

Both marketers and IT professionals felt that the best engagement and experience is delivered on desktop, cited as ‘excellent’ or ‘good’ by 71% and 93% of these groups respectively, with the other channels (mobile website, mobile app, desktop display, mobile display, social and push messaging) trailing behind.

[Figure: Zayo communications infrastructure survey results]

Across the board it is evident that those working in IT have a much more optimistic view of how well they are delivering across the full gamut of digital channels than their marketing counterparts.

It seems likely that those working in more customer-facing departments, i.e. marketers, are generally more aware of deficiencies impacting customer experience, which can adversely affect business performance and brand reputation (and often their own bonuses).

A lack of co-operation is undermining excellence in digital delivery

Just 19% of marketers strongly agreed with the statement “marketing and IT work closely together to ensure the best possible delivery of product/service”, and only 11% strongly agreed that they “have a clear governance framework to ensure that CIOs/CTOs and CMOs work together effectively”, suggesting a lack of alignment around marketing and IT business objectives.

This compares to 45% of IT professionals who strongly agreed that “marketing and IT work closely together to ensure the best possible network performance”, and a similar percentage (46%) who strongly agreed that they “have a clear governance framework to ensure that front-end business applications and back-end infrastructure work together effectively”.

While there are differing perceptions about the extent of marketing and IT co-operation, the report concludes that business objectives need to be much better aligned to ensure closer harmony across these core business functions. If a framework to facilitate this is not put in place at the top of the organization, it becomes exponentially more difficult to implement lower down.

Speed of data-processing is crucial – real-time means real-time

Marketers are increasingly aware that the proliferation of data sources at their disposal is only of use to their businesses if they can analyze that information at high speed and transform it into the kind of intelligence that manifests itself as the most relevant and personalized messaging or call to action for any given site visitor.

According to Mike Plimsoll, Product and Industry Marketing Director at Adobe:

“A couple of years ago the marketing leaders at our biggest clients typically expected that data could be processed within 24 hours and that was fine.

“Now when we talk to our clients the expectation is that data is processed instantly so that when, for example, a customer engages with them on the website, the offer has been instantly updated based on something they’ve just done on another channel. All of a sudden ‘real-time’ really does mean ‘real-time’.”

The ability to harness ‘big data’ has become a pressing concern for IT departments as their colleagues in marketing seek to take advantage of both structured and unstructured data and to ensure the requisite speeds for real-time optimization of targeting, messaging and pricing.

More than half of IT respondents (56%) said that the ability to manage and optimize for big data was currently a ‘very relevant’ topic for their organization, in addition to 37% who said it was ‘quite relevant’.

[Figure: Zayo big data survey results]

According to Chris O’Hara, Head of Global Data Strategy at Krux Digital:

“Today, consumers that are used to perfect product recommendations from Amazon and movie recommendations from Netflix expect their online experiences to be personal, email messages to be relevant, and web experiences customized.

“Delivering good customer experience has the dual effect of increasing sales lift, and also reducing churn by keeping customers happy. Things like latency, performance, and data management are all part and parcel of delivering on that concept.”

Please download our Backbone of Digital research which, in addition to the survey of marketing and IT professionals, is also based on in-depth interviews with senior executives at a number of well-known organizations.

Dynamic Real Time Segmentation

The term “real time” is bandied about in the ad technology space almost as heavily as the word “programmatic.”

Years later, the meaning of programmatic is finally starting to be realized, but we are still a few years away from delivering truly real-time experiences. Let me explain.

Real-Time Programmatic

The real-time delivery of targeted ads basically comes down to user matching. Here is a common use case: A consumer visits an auto site, browses a particular type of minivan, leaves the site and automatically sees an ad on the very next site he or she visits. That’s about as “real-time” as it gets.

How did that happen? The site updated the user segment to include “minivan intender,” processed the segment immediately and sent that data into a demand-side platform (DSP) where the marketer’s ID was matched with the DSP’s ID and delivered with instructions to bid on that user. That is a dramatic oversimplification of the process but clearly many things must happen very quickly – within milliseconds – and perfectly for this scenario to occur.
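
To make the moving parts a bit more concrete, here is a minimal sketch of that hand-off in Python. Everything in it is hypothetical: the Dmp and Dsp classes and the id_match_table stand in for real systems and are not any vendor’s actual API; the point is only the sequence of tag fire, segment update, ID match, and bid instruction.

```python
# Illustrative sketch only: toy DMP and DSP objects standing in for real systems.
import time


class Dmp:
    """Toy data-management platform: tracks segment membership per DMP user ID."""

    def __init__(self):
        self.segments = {}  # dmp_user_id -> set of segment names

    def add_to_segment(self, dmp_user_id, segment):
        self.segments.setdefault(dmp_user_id, set()).add(segment)


class Dsp:
    """Toy demand-side platform: matches IDs and records who to bid on."""

    def __init__(self, id_match_table):
        self.id_match_table = id_match_table  # dmp_user_id -> dsp_user_id
        self.bid_targets = {}                 # dsp_user_id -> segments to bid on

    def receive_segment(self, dmp_user_id, segment):
        dsp_user_id = self.id_match_table.get(dmp_user_id)
        if dsp_user_id is None:
            return  # no ID match: this user cannot be targeted in this DSP
        self.bid_targets.setdefault(dsp_user_id, set()).add(segment)


def on_page_view(dmp, dsp, dmp_user_id, segment):
    """The tag fires on the auto site: update the segment and push it downstream."""
    start = time.perf_counter()
    dmp.add_to_segment(dmp_user_id, segment)     # "minivan intender"
    dsp.receive_segment(dmp_user_id, segment)    # match IDs, queue bid instruction
    return (time.perf_counter() - start) * 1000  # elapsed milliseconds


dmp = Dmp()
dsp = Dsp(id_match_table={"dmp-123": "dsp-abc"})
elapsed_ms = on_page_view(dmp, dsp, "dmp-123", "minivan intender")
print(dsp.bid_targets)         # {'dsp-abc': {'minivan intender'}}
print(f"{elapsed_ms:.3f} ms")  # in-process this is sub-millisecond; real systems add network hops
```

In practice each step in that sequence is a network call into a different company’s infrastructure, which is exactly where the milliseconds go.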

Rocket Fuel, Turn and other big combo platforms have an advantage here because their data-management platform (DMP) and DSP are already integrated, so they don’t need to match users across separate systems. As long as marketers put their tags on their pages and stay within the confines of a single execution system, this type of retargeting gets close to real time.

However, as soon as the marketer wants to target that user through another DSP or in another channel, user matching comes back into play. That means pushing the “minivan intender” ID into a separate system, and the “real-time” nature of marketing starts to break down. That’s a big problem because today’s users move quickly between channels and devices and are not constrained by the desktop-dominated world of 10 years ago.

User matching has its own set of challenges, from a marketer’s ability to match users across their devices to how platforms like DMPs match their unique IDs to those of execution platforms like DSPs. Assuming the marketer has mapped the user to all of his or her device IDs, which is a daunting challenge, the marketer’s DMP has to match that user as quickly as possible to the execution platform where the ads are going to be targeted and run.

Let’s think about how that works for a second. Let’s say the marketer has DMP architecture in the header of the website, which enables a mom to be placed in the “minivan” segment as soon as the page loads. Once the segment is processed, it must be sent immediately to the DSP. Now the DSP has to add that user (or bunch of users) to its “minivan moms” segment. If you picture the internet ID space as a big spreadsheet, what is happening is that all the new minivan moms are added to the DSP’s big existing table of minivan moms so they are part of the new targeting list.

Some DSPs, such as The Trade Desk, TubeMogul and Google’s DBM, do this within hours or minutes. Others manage this updating process nightly by opening up a “window” where they accept new data and process it in “batches.” Doesn’t sound very “real-time” at all, does it?
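
The contrast between those two approaches is easy to see in a toy example. The segment_table dictionary below plays the role of the DSP’s big existing table of minivan moms; the streaming and batch functions are invented for illustration and do not describe how any particular DSP is built.

```python
# Toy contrast between streaming segment updates and a nightly batch window.
from datetime import datetime, timedelta

segment_table = {"minivan moms": {"dsp-001", "dsp-002"}}  # existing targeting list
incoming = [("dsp-003", "minivan moms"), ("dsp-004", "minivan moms")]


def streaming_update(table, events):
    """Apply each event as it arrives: the user becomes targetable within moments."""
    for dsp_user_id, segment in events:
        table.setdefault(segment, set()).add(dsp_user_id)


def batch_update(table, events, window_opens_at, now):
    """Hold events until the nightly window opens, then apply them all at once."""
    if now < window_opens_at:
        return list(events)  # still queued: not yet targetable
    streaming_update(table, events)
    return []


streaming_update(segment_table, incoming)
print(sorted(segment_table["minivan moms"]))
# ['dsp-001', 'dsp-002', 'dsp-003', 'dsp-004']

queued = batch_update({}, incoming,
                      window_opens_at=datetime.now() + timedelta(hours=8),
                      now=datetime.now())
print(len(queued))  # 2: these users sit in the queue until the window opens
```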

While many DMPs can push segments in real time, the practical issue remains whether all the addressable channels a marketer wants to target can “catch” that data and make it available. The good news is that the speed at which execution channels process data is increasing every day as older ad stacks are re-engineered with real-time back-end infrastructure. The bad news is that until that happens, things like global delivery management and message sequencing across channels will remain overly dependent upon how marketers choose to provision their “stacks.”

The Future Is Dynamic

Despite the challenges in the real-life execution of real-time marketing, there are things happening that will put the simple notion of retargeting to shame. Everything we just discussed depends on a user being part of a segment. I probably exist as a “suburban middle-aged male sports lover with three kids” in a variety of different systems. Sometimes I’m an auto intender and sometimes I’m a unicorn lover, depending on who is using the family desktop, but my identity largely remains static. I’m going to be middle aged for a long time, and I’m always going to be a dad.

But marketers care about a lot more than that. The beer company wants to understand why sometimes I buy an ice-cold case of light beer (I’m about to watch a football game, and I might drink three or four of them with friends) and why other times I buy a six-pack of their craft-style ale (I’m going to have one or two at the family dinner table).

The soda company is competing for my “share of thirst” with everything from coffee to the water fountain. They want to know what my entry points are for a particular brand they sell. Is it their sports drink because I’m heading to the basketball court on a hot day, or is it a diet cola because I’m at the baseball game? The coffee chain wants to know whether I want a large hot coffee (before work) or an iced latte macchiato (my afternoon break).

This brings up the idea of dynamic segmentation: Although I am always part of a static segment, the world changes around me in real time. The weather changes, my location changes, the time changes and the people around me change constantly. What if all of that dynamic data could be constantly processed in the background and appended to static segments at the moment of truth?

In a perfect world, where the machines all talked to each other in real time and spoke the same language, this might be called real-time dynamic segmentation.
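
As a purely hypothetical sketch of what that could look like, the snippet below keeps a static profile fixed and derives extra segment labels from whatever context is available at decision time. The signal names, thresholds and segment labels are all invented for illustration.

```python
# Sketch of "real-time dynamic segmentation": static profile + momentary context.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class UserProfile:
    user_id: str
    static_segments: set = field(default_factory=set)  # changes rarely, if ever


def dynamic_segments(profile, context):
    """Combine the static profile with whatever the world looks like right now."""
    derived = set(profile.static_segments)
    if context["temperature_f"] >= 85:
        derived.add("hot-day")                                   # sports-drink moment
    derived.add("morning" if context["hour"] < 11 else "afternoon")  # coffee vs. iced latte
    derived.add(f"near:{context['location']}")
    return derived


me = UserProfile("user-42", {"suburban dad", "sports lover"})
now = datetime(2016, 8, 31, 15, 0)
context = {"temperature_f": 91, "hour": now.hour, "location": "basketball court"}
print(sorted(dynamic_segments(me, context)))
# ['afternoon', 'hot-day', 'near:basketball court', 'sports lover', 'suburban dad']
```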

This is the future of “programmatic,” whatever that means.

[This originally appeared in AdExchanger on 8/31/2016]

Data Science is the New Measurement

It’s a hoary old chestnut, but “understanding the customer journey” in a world of fragmented consumer attention and multiple devices is not just an AdExchanger meme. Attribution is a big problem, and one that marketers pay dearly for. Getting away from last-touch models is hard to begin with. Add in the fact that many of the largest marketers have no actual relationship with the customer (such as CPG, where the customer is actually a wholesaler or retailer), and it gets even harder. Big companies are selling big-money solutions to marketers for multi-touch attribution (MTA) and media-mix modeling (MMM), but some marketers feel light years away from a true understanding of what actually moves the sales needle.

As marketers take more direct ownership of their own customer relationships via data management platforms, “consumer data platforms” and the like, they are starting to obtain the missing pieces of the measurement puzzle: highly granular, user-level data. Marketers are now pulling in not just media exposure data but also offline data such as beacon pings, point-of-sale data (where they can get it), modeled purchase data from vendors like Datalogix and IRI, weather data and more to build a true picture. When that data can be associated with a person through a cross-device graph, it’s like going from a blunt 8-pack of Crayolas to a full set of Faber-Castells.
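
Here is a rough illustration of what that association step looks like, under the assumption of a simple lookup-table device graph; the IDs and event shapes are invented, not any platform’s real format.

```python
# Illustrative only: fold device-level events into a single person-level timeline.
from collections import defaultdict

# device graph: any device/browser/loyalty ID -> stable person ID
device_graph = {
    "cookie-aaa": "person-1",
    "idfa-bbb":   "person-1",
    "loyalty-77": "person-1",
}

events = [
    {"id": "cookie-aaa", "type": "ad_impression", "campaign": "fall-launch"},
    {"id": "idfa-bbb",   "type": "beacon_ping",   "store": "midtown"},
    {"id": "loyalty-77", "type": "pos_purchase",  "sku": "coffee-pods"},
]

person_timeline = defaultdict(list)
for event in events:
    person_id = device_graph.get(event["id"], "unmatched")
    person_timeline[person_id].append(event["type"])

print(dict(person_timeline))
# {'person-1': ['ad_impression', 'beacon_ping', 'pos_purchase']}
```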

Piercing the Retail Veil

Think about the company that makes single-serve coffee machines. Some make their money on the coffee they sell, rather than the machine—but they have absolutely no idea what their consumers like to drink. Again, they sell coffee but don’t really have a complete picture of who buys it or why. The same problem faces the beer or soda company, where the sale (and the customer data relationship) resides with the retailer. The default is to go to panel-based solutions that sample a tiny percentage of consumers for insights, or to wait for complicated and expensive media-mix models to reveal what drove sales lift. But what if a company could partner with a retailer and a beacon company to understand how in-store visitation, and even an offline visit to a store shelf, compares with online media exposure? The marketer could use geofencing to understand where else consumers shopped, offer a mobile coupon so the user could authenticate upon redemption, get access to POS data from the retailer to confirm purchase and understand basket contents—and ultimately tie that data back to media exposure. That sounds a lot like closed-loop attribution to me.
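
Once media exposure and POS-confirmed purchases can be joined to the same people, the closed-loop math itself is simple. The numbers below are made up; the only point is comparing purchase rates for exposed versus unexposed shoppers and reading off the lift.

```python
# Toy closed-loop check: exposed vs. unexposed purchase rates from joined data.
exposed = {"p1", "p2", "p3", "p4"}    # people who saw the online media
purchased = {"p2", "p3", "p7", "p8"}  # purchases confirmed via retailer POS data
unexposed = {"p5", "p6", "p7", "p8", "p9", "p10"}

exposed_rate = len(exposed & purchased) / len(exposed)
unexposed_rate = len(unexposed & purchased) / len(unexposed)

print(f"exposed purchase rate:   {exposed_rate:.0%}")        # 50%
print(f"unexposed purchase rate: {unexposed_rate:.0%}")      # 33%
print(f"lift: {(exposed_rate / unexposed_rate - 1):.0%}")    # ~50% lift
```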

Overcoming Walled Gardens

Why do specialty health sites charge so much for media? Like any other walled garden, they are taking advantage of a unique set of data—and their own data science capabilities—to better understand user intent. (There’s nothing wrong with that, by the way.) If I’m a maker of allergy medicine, the most common trigger for purchase is probably the onset of an allergy attack, but how am I supposed to know when someone is about to sneeze? It’s an incredibly tough problem, but one that the large health site can solve, largely thanks to people who have searched for “hay fever” online. Combine that with a 7-day weather forecast, pollen indices, and past search intent behavior, and you have a pretty good model for finding allergy sufferers. However, almost all of that data—plus past purchase data—can be ingested and modeled inside a marketer’s DMP, enabling the allergy medicine manufacturer to segment those users in a similar way—and then use an overlap analysis to find them on sites with $5 CPMs, rather than $20. That’s the power of user modeling. Why don’t sites like Facebook give marketers user-level media exposure data? The question answers itself.
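
A hypothetical sketch of that kind of propensity model follows; the features, weights and threshold are invented purely to show the shape of the scoring, not how any health site or DMP actually does it.

```python
# Invented propensity model: score users for an imminent allergy-medicine need.
def allergy_propensity(searched_hay_fever, pollen_index, days_until_high_pollen):
    """Return a 0-1 score; higher means more likely to need allergy medicine soon."""
    score = 0.0
    if searched_hay_fever:
        score += 0.5                                  # past search intent is the strongest signal
    score += min(pollen_index / 12.0, 1.0) * 0.3      # today's pollen level (0-12 scale)
    if days_until_high_pollen <= 3:
        score += 0.2                                  # forecast says it is about to get worse
    return min(score, 1.0)


users = {
    "u1": allergy_propensity(True,  9, 1),   # searched, high pollen, spike imminent
    "u2": allergy_propensity(False, 2, 10),  # no intent signal, low pollen
}
target_segment = {uid for uid, s in users.items() if s >= 0.6}
print(users)           # {'u1': 0.925, 'u2': 0.05}
print(target_segment)  # {'u1'} -> the segment to find cheaply via overlap analysis
```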

Understanding the Full Journey

Building journeys always falls down due to one missing piece of the puzzle or another. Panel-based models continually overemphasize the power of print and linear television. CRM-based models always look at the journey from the e-mail perspective and value declared user data above all else. Digital journeys can get pretty granular with media exposure data, but miss big pieces of data from social networks, website interactions, and things that are hard to measure (like location data from beacon exposure). What we are starting to see today is that, through the ability to ingest highly differentiated signals, marketers can combine granular attribute data to complete the picture. Think about the data a marketer can ingest: all addressable media exposure (ad logs), all mobile app data (SDKs), location data (beacon or third party), modeled sales data (IRI or DLX), actual sales data (POS systems), website visitation data (JavaScript on the site), media performance data (through click and impression trackers), real people data through a CRM (that has been hashed and anonymized), survey data that has been mapped to a user (pixel-enabled online survey), and even addressable TV exposure (think Comscore’s Rentrak data set). Wow.
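
Combining signals that heterogeneous usually starts with normalizing everything into one common event shape keyed to a (hashed) person ID. The sketch below is hypothetical; the source names and field names are placeholders, not a real schema.

```python
# Hypothetical normalization step: map source-specific records into one event shape.
def normalize(source, raw):
    """Map a source-specific record into a common {person_id, source, event, ts} shape."""
    if source == "ad_log":
        return {"person_id": raw["hashed_id"], "source": source,
                "event": f"impression:{raw['creative']}", "ts": raw["ts"]}
    if source == "beacon":
        return {"person_id": raw["device_person_id"], "source": source,
                "event": f"visit:{raw['store']}", "ts": raw["seen_at"]}
    if source == "pos":
        return {"person_id": raw["loyalty_person_id"], "source": source,
                "event": f"purchase:{raw['sku']}", "ts": raw["time"]}
    raise ValueError(f"unknown source: {source}")


records = [
    normalize("ad_log", {"hashed_id": "p1", "creative": "video-30s", "ts": "2016-08-01T10:00"}),
    normalize("beacon", {"device_person_id": "p1", "store": "midtown", "seen_at": "2016-08-01T17:30"}),
    normalize("pos",    {"loyalty_person_id": "p1", "sku": "ale-6pk", "time": "2016-08-01T17:45"}),
]
print([r["event"] for r in records])
# ['impression:video-30s', 'visit:midtown', 'purchase:ale-6pk']
```

From there, the joined person-level timeline is what the data science teams actually model against.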

Why is data science “the new measurement”? Because, when a marketer has all of that data at their fingertips, something close to true attribution becomes possible. Now that marketers have the right tools to draw with, the winners are going to be the ones with the most artists (data scientists).

It’s a really interesting space to watch. More and more data is becoming available to marketers, who are increasingly owning the data and technology to manage it, and the models are growing more powerful and accurate with every byte of data that enters their systems.

It’s a great time to be a data-driven marketer!

[This post originally appeared in AdExchanger on 8/12/16]