Data Science is the New Measurement

It’s a hoary old chestnut, but “understanding the customer journey” in a world of fragmented consumer attention and multiple devices is not just an AdExchanger meme. Attribution is a big problem, and one that marketers pay dearly for. Getting away from last-touch models is hard to begin with. Add in the fact that many of the largest marketers have no actual relationship with the end customer (such as CPG, where the direct customer is actually a wholesaler or retailer), and it gets even harder. Big companies are selling big-money solutions to marketers for multi-touch attribution (MTA) and media-mix modeling (MMM), but some marketers feel light years away from a true understanding of what actually moves the sales needle.

As marketers take more direct ownership of their customer relationships via data management platforms, “consumer data platforms” and the like, they are starting to obtain the missing piece of the measurement puzzle: highly granular, user-level data. Marketers are now pulling in not just media exposure data but also offline data such as beacon pings, point-of-sale data (where they can get it), modeled purchase data from vendors like Datalogix and IRI, weather data and more to build a true picture. When that data can be associated with a person through a cross-device graph, it’s like going from a blunt 8-pack of Crayolas to a full set of Faber-Castells.

Piercing the Retail Veil

Think about the company that makes single-serve coffee machines. Some make their money on the coffee they sell, rather than the machine—but they have absolutely no idea what their consumers like to drink. They sell coffee but don’t really have a complete picture of who buys it or why. It’s the same problem for the beer or soda company, where the sale (and the customer data relationship) resides with the retailer. The default is to go to panel-based solutions that sample a tiny percentage of consumers for insights, or to wait for complicated and expensive media-mix models to reveal what drove sales lift. But what if a company could partner with a retailer and a beacon company to understand how in-store visitation, or even a visit to a specific store shelf, compares with online media exposure? The marketer could use geofencing to understand where else consumers shopped, offer a mobile coupon so the user could authenticate upon redemption, get access to POS data from the retailer to confirm purchase and understand basket contents—and ultimately tie that data back to media exposure. That sounds a lot like closed-loop attribution to me.
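To make that concrete, here is a minimal sketch of the kind of join a closed loop requires once media exposure, beacon visits, coupon redemptions, and POS records have all been matched to a common person ID. Every ID, date, and event in it is invented for illustration; a real pipeline would run over millions of matched records inside the DMP.

```python
from collections import defaultdict

# Toy event logs, each keyed by a matched person ID (all IDs and values are made up).
ad_exposures   = [("u1", "2016-08-01"), ("u2", "2016-08-02"), ("u3", "2016-08-02")]
store_visits   = [("u1", "2016-08-03"), ("u2", "2016-08-04"), ("u5", "2016-08-03")]  # beacon pings
coupon_redeems = [("u1", "2016-08-03")]                                              # authenticated redemptions
pos_purchases  = [("u1", "2016-08-03", 4.99), ("u2", "2016-08-04", 4.99),
                  ("u4", "2016-08-04", 4.99)]                                         # retailer POS records

def user_set(events):
    """Collapse an event log to the set of person IDs it contains."""
    return {e[0] for e in events}

exposed    = user_set(ad_exposures)
purchasers = user_set(pos_purchases)

# Closed-loop view: of the people we reached with media, who walked into the
# store, redeemed the coupon, and actually bought?
journey = defaultdict(dict)
for uid in exposed:
    journey[uid]["visited"]  = uid in user_set(store_visits)
    journey[uid]["redeemed"] = uid in user_set(coupon_redeems)
    journey[uid]["bought"]   = uid in purchasers

exposed_rate   = len(exposed & purchasers) / len(exposed)
unexposed      = (user_set(store_visits) | purchasers) - exposed
unexposed_rate = len(unexposed & purchasers) / max(len(unexposed), 1)

print(dict(journey))
print(f"conversion among exposed: {exposed_rate:.0%}, among unexposed: {unexposed_rate:.0%}")
```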

Overcoming Walled Gardens

Why do specialty health sites charge so much for media? Like any other walled garden, they are taking advantage of a unique set of data—and their own data science capabilities—to better understand user intent. (There’s nothing wrong with that, by the way.) If I’m a maker of allergy medicine, the most common trigger for purchase is probably the onset of an allergy attack, but how am I supposed to know when someone is about to sneeze? It’s an incredibly tough problem, but one that the large health site can solve, largely thanks to people who have searched for “hay fever” online. Combine that with a 7-day weather forecast, pollen indices, and past search intent behavior, and you have a pretty good model for finding allergy sufferers. However, almost all of that data—plus past purchase data—can be ingested and modeled inside a marketer’s DMP, enabling the allergy medicine manufacturer to segment those users in a similar way—and then use an overlap analysis to find them on sites with $5 CPMs, rather than $20. That’s the power of user modeling. Why don’t sites like Facebook give marketers user-level media exposure data? The question answers itself.
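A toy version of that allergy model might look like the sketch below. The signals are the ones described above (search intent, weather, pollen, past purchase), but the weights, threshold, and user IDs are arbitrary assumptions; a real model would be trained against historical conversion data inside the DMP.

```python
# A toy propensity score for "likely to suffer an allergy attack soon."
# Weights and thresholds are invented for illustration only.
def allergy_score(searched_hay_fever: bool,
                  pollen_index: float,         # e.g., a 0-12 scale from a pollen feed
                  days_until_warm_front: int,  # from a 7-day weather forecast
                  past_category_buyer: bool) -> float:
    score = 0.0
    if searched_hay_fever:
        score += 0.4                           # recent search intent is the strongest signal
    score += min(pollen_index / 12.0, 1.0) * 0.3
    if days_until_warm_front <= 2:
        score += 0.1                           # pollen spikes often follow warm fronts
    if past_category_buyer:
        score += 0.2
    return score

# Segment anyone above a chosen threshold, then use overlap analysis to find
# where those same users also appear on lower-CPM sites.
users = {
    "u1": allergy_score(True, 9.5, 1, True),
    "u2": allergy_score(False, 3.0, 5, False),
}
segment = [uid for uid, s in users.items() if s >= 0.6]
print(segment)   # -> ['u1']
```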

Understanding the Full Journey

Building journeys always falls down due to one missing piece of the puzzle or another. Panel-based models continually overemphasize the power of print and linear television. CRM-based models always look at the journey from the e-mail perspective, and value declared user data above all else. Digital journeys can get pretty granular with media exposure data, but miss big pieces from social networks, website interactions, and things that are hard to measure (like location data from beacon exposure). What we are starting to see today is that, by ingesting highly differentiated signals, marketers can combine granular attribute data to complete the picture. Think about the data a marketer can ingest:

  • All addressable media exposure (ad logs)
  • All mobile app data (SDKs)
  • Location data (beacon or third party)
  • Modeled sales data (IRI or DLX)
  • Actual sales data (POS systems)
  • Website visitation data (JavaScript on the site)
  • Media performance data (click and impression trackers)
  • Real people data through a CRM (hashed and anonymized)
  • Survey data that has been mapped to a user (pixel-enabled online surveys)
  • Even addressable TV exposure (think comScore’s Rentrak data set)

Wow.
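One way to picture the end state is a single profile object per person, keyed by the cross-device graph, with a slot for each of those signals. The field names below are purely illustrative; every DMP models this differently.

```python
from dataclasses import dataclass, field
from typing import Optional

# A sketch of a unified profile keyed by a cross-device person ID.
# The schema here is hypothetical, not any particular vendor's.
@dataclass
class UnifiedProfile:
    person_id: str                                    # from the cross-device graph
    ad_exposures: list = field(default_factory=list)  # ad-server log records
    app_events: list = field(default_factory=list)    # mobile SDK events
    locations: list = field(default_factory=list)     # beacon / third-party location pings
    modeled_sales: list = field(default_factory=list) # IRI / DLX style modeled purchases
    pos_purchases: list = field(default_factory=list) # actual POS transactions
    site_visits: list = field(default_factory=list)   # JavaScript tag events
    crm_hash: Optional[str] = None                    # hashed, anonymized CRM key
    survey_responses: list = field(default_factory=list)
    tv_exposures: list = field(default_factory=list)  # addressable TV exposure records

profile = UnifiedProfile(person_id="person-123", crm_hash="a1b2c3")
profile.ad_exposures.append({"campaign": "spring-launch", "ts": "2016-08-01T10:00:00"})
profile.locations.append({"beacon": "store-42-aisle-7", "ts": "2016-08-03T17:30:00"})
print(profile)
```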

Why is data science “the new measurement”? Because, when a marketer has all of that data at their fingertips, something close to true attribution becomes possible. Now that marketers have the right tools to draw with, the winners are going to be the ones with the most artists (data scientists).

It’s a really interesting space to watch. More and more data is becoming available to marketers, who are increasingly owning the data and technology to manage it, and the models are growing more powerful and accurate with every byte of data that enters their systems.

It’s a great time to be a data-driven marketer!

[This post originally appeared in AdExchanger on 8/12/16]


Trends in Programmatic Buying

 


The digital marketing future we were promised years ago looks pretty lame in retrospect. This is an image of a trading desk supervisor at Razorfish, circa 2013.

2015 has been one of the most exciting years in digitally driven marketing to date. Although publishers have been leading the way in terms of building their programmatic “stacks” to enable more efficient selling of digital media, marketers are now catching up. Wide adoption of data management platforms has given rise to a shift in buying behaviors, where data-driven tactics for achieving effectiveness and efficiency rule. Here are some interesting trends that have arisen.

 

Purchase-Based Targeting

Remember when finding the “household CEO” was as easy as picking a demographic target? Marketers are still using demographic targeting (women aged 25-44) to some extent, but we have seen them shift rapidly to behavioral and contextual segments (“Active Moms”), and now to Purchase-Based Targeting (PBT). This trend has existed in categories like Automotive and Travel, but is now being seen in CPG. Today, marketers are using small segments of people who have actually purchased the product they are marketing (“Special K Moms”) and using lookalike modeling to drive scale and find more of them. These purchase-defined segments are a more precise starting point in digital segmentation—and can be augmented by behavioral and contextual data attributes to achieve scale. The big winners here are the folks who actually have the in-store purchase information, such as Oracle’s Datalogix, 84.51, Nielsen Catalina Solutions, INMAR, and News Corp’s News America Marketing.
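Here is a rough sketch of how a purchase-based seed can drive lookalike expansion. The attributes, audiences, and the choice of a simple logistic regression are all assumptions for illustration; production lookalike models are considerably more sophisticated.

```python
# A toy lookalike model: start from a small seed of verified purchasers and
# score a broader pool by similarity of their data attributes.
from sklearn.linear_model import LogisticRegression

# Each row: [visits_recipe_sites, has_kids_in_household, bought_category_before]
seed_purchasers = [[1, 1, 1], [1, 1, 0], [0, 1, 1]]   # e.g., verified buyers of the product
non_purchasers  = [[0, 0, 0], [1, 0, 0], [0, 0, 1]]

X = seed_purchasers + non_purchasers
y = [1] * len(seed_purchasers) + [0] * len(non_purchasers)

model = LogisticRegression().fit(X, y)

# Score the wider addressable pool and keep the closest lookalikes for scale.
pool = {"u101": [1, 1, 0], "u102": [0, 0, 1], "u103": [1, 1, 1]}
scores = {uid: model.predict_proba([attrs])[0][1] for uid, attrs in pool.items()}
lookalikes = [uid for uid, p in sorted(scores.items(), key=lambda kv: -kv[1]) if p > 0.5]
print(lookalikes)
```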

Programmatic Direct

For years we have been talking about disintermediating the space between advertisers and publishers (essentially, the entire Lumascape of technology vendors) and finding scalable, direct connections between them. It doesn’t make sense that a marketer has to go through an agency, a trading desk, a DSP, an exchange, an SSP, and assorted other technologies to get to space on a publisher’s website. Marketers have seen $10 CPMs turn into just $2 of working media. Early efforts with “private marketplaces” inside of exchanges created more automation, but ultimately kept much of the cost structure. A nascent, but quickly emerging, movement toward “automated guaranteed” procurement is finally starting to take hold. Advertisers can create audiences inside their DMP and push them directly to a publisher’s ad server where user matching is in place. This is especially effective where marketers seek an “always on” insertion order with a favored, premium publisher. This trend will grow in line with marketers’ adoption of people-based data technology.

Global Frequency Management

The rise of DMPs has also led to another fast-growing trend: global frequency management. Before marketers could effectively map users to all of their various devices (cross-device identity management, or CDIM) and also match users across various execution platforms (hosting a “match table” that assures user #123 in my DMP is the same person as user #456 in DataXu, as an example), they were helpless to control frequency to an individual. Recent studies have revealed that, when marketers cannot cap frequency at the individual level, they end up serving as many as 100 or more ads to a single user every month, and sometimes much, much more. What if the user’s ideal point of effective frequency is only 10 impressions a month? As you can see, there are tremendous opportunities to reduce waste and gain efficiency in communication. This means big money for marketers, who can finally start to control their messaging—putting recovered dollars back into finding more reach, and starting to influence their bidding strategies to get users into their “sweet spot” of frequency, where conversions happen. It’s bad news for publishers, who have inadvertently benefited from this “frequency blindness.” Now, marketers understand when to shut off the spigot.
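A stripped-down sketch of person-level capping is below. The match table, the cap of 10, and the IDs are invented; the point is simply that two platform IDs belonging to the same person draw down one shared frequency budget.

```python
from collections import defaultdict

# Hypothetical match table: each (platform, platform ID) pair resolves to one
# person in the cross-device graph.
MATCH_TABLE = {
    ("dmp", "user-123"): "person-A",
    ("dsp", "user-456"): "person-A",      # same individual, different platform ID
    ("dsp", "user-789"): "person-B",
}
MONTHLY_CAP = 10                           # the assumed "sweet spot" of effective frequency

impressions = defaultdict(int)             # person_id -> impressions served this month

def should_serve(platform: str, platform_user_id: str) -> bool:
    """Return True only if this *person* is still under the monthly cap."""
    person = MATCH_TABLE.get((platform, platform_user_id))
    if person is None:
        return True                        # unmatched user: fall back to platform-level capping
    if impressions[person] >= MONTHLY_CAP:
        return False
    impressions[person] += 1
    return True

# The same individual seen on two platforms draws down one shared budget.
for _ in range(8):
    should_serve("dmp", "user-123")
print(should_serve("dsp", "user-456"))     # True  (9th impression for person-A)
print(should_serve("dsp", "user-456"))     # True  (10th)
print(should_serve("dsp", "user-456"))     # False (cap reached)
```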

Taking it in-House

More and more, we are seeing big marketers decide to “take programmatic in-house.” That means hiring former agency and vendor traders, licensing their own technologies, and (most importantly) owning their own data. This trend isn’t as explosive as one might think, based on the industry trades—but it is real and happening steadily. What brought along this shift in sentiment? Certainly concerns about transparency; there is still a great deal of inventory arbitrage going on with popular trading desks. There is also the notion of control. Marketers want and deserve a more direct connection to one of their biggest marketing costs, and now the technology is readily available. Even the oldest-school marketer can license their way into a technology stack any agency would be proud of. The only thing really holding back the trend is the difficulty of staffing such an effort. Programmatic experts are expensive, and that’s just the traders! When the inevitable call for data-science-driven analytics comes in, things can really start to get pricey. But this trend will continue for the next several years nonetheless.

Closing the Loop with Data

One of the biggest gaps in digital media, especially programmatic, is attribution. We still seem to have the Wanamaker problem, where “50% of my marketing works, I just don’t know which 50%.” Attitudinal “brand lift” studies and latent post-campaign sales attribution modeling have been the de facto standard for the last 15 years, but marketers are increasingly insisting on real “closed loop” proof. “Did my Facebook ad move any items off the shelf?” We are living in a world where technology is starting to shed some light on actual in-store purchases, such that we are going to be able to get eCommerce-like attribution for corn flakes soon. In one real-world example, a CPG company has partnered with 7-Eleven and placed beacon technology in the store. Consumers can receive a “get 20% off” offer on their mobile device, via notification, when they approach the store; the beacon can verify whether or not they arrive at the relevant shelf or display; and an integration with the point-of-sale (POS) system can tell (immediately) whether the purchase was made. These marketing fantasies are becoming more real every day.
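Here is a hypothetical sketch of that offer-to-purchase funnel, assuming the push notification, beacon, and POS events can all be keyed to the same person. The event names, timestamps, and two-hour attribution window are made up for illustration.

```python
from datetime import datetime, timedelta

# Toy event stream: (person_id, event, timestamp). All values are hypothetical.
events = [
    ("p1", "offer_push",   datetime(2016, 8, 3, 17, 0)),
    ("p1", "store_entry",  datetime(2016, 8, 3, 17, 5)),   # geofence / door beacon
    ("p1", "shelf_beacon", datetime(2016, 8, 3, 17, 9)),   # relevant shelf or display
    ("p1", "pos_purchase", datetime(2016, 8, 3, 17, 15)),  # confirmed at the register
    ("p2", "offer_push",   datetime(2016, 8, 3, 18, 0)),
    ("p2", "store_entry",  datetime(2016, 8, 3, 18, 20)),
]

FUNNEL = ["offer_push", "store_entry", "shelf_beacon", "pos_purchase"]
WINDOW = timedelta(hours=2)                # purchase must follow the offer within two hours

def funnel_stage(person_events):
    """Return how far down the funnel this person got, in order and in-window."""
    stage, start = 0, None
    for name, ts in sorted(person_events, key=lambda e: e[1]):
        if stage < len(FUNNEL) and name == FUNNEL[stage]:
            if stage == 0:
                start = ts
            if start is not None and ts - start <= WINDOW:
                stage += 1
    return FUNNEL[stage - 1] if stage else None

for pid in ("p1", "p2"):
    person_events = [(e, ts) for who, e, ts in events if who == pid]
    print(pid, funnel_stage(person_events))   # p1 -> pos_purchase, p2 -> store_entry
```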

Letting the Machines Decide

What’s next? The adoption of advanced data technology is starting to change the way media is actually planned and bought. In the past, planners would make educated guesses about which online audience segments to target, and test-and-learn their way to more precision. Marketers basically had to guess the data attributes that comprised the ideal converter. Soon, algorithms will start doing the heavy lifting. What if, instead of guessing at the type of person who buys something, you could start with the exact composition of that buyer? Today’s machine learning algorithms are starting at the end point in order to give marketers a huge edge in execution. In other words, now we can look at a small group of 1,000 people who have purchased something and understand the clusters of data attributes they have in common. Maybe all buyers of a certain car share 20 distinct data attributes. Marketers can have a segment automatically generated from that data and expand it from there. This brand-new approach to segmentation is a small harbinger of things to come, as algorithms start to take over the processes and assumptions of the past 15 years and truly transform marketing.
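A toy version of that “start from the buyers” approach is sketched below: count how often each attribute appears across a seed of converters and keep the ones most of them share. The attributes and the 60% threshold are arbitrary assumptions.

```python
from collections import Counter

# Each converter is represented as a set of data attributes (names are invented).
purchasers = [
    {"age_25_34", "urban", "ski_content", "luxury_auto_intender"},
    {"age_25_34", "urban", "ski_content"},
    {"age_25_34", "suburban", "ski_content", "luxury_auto_intender"},
    # ...imagine ~1,000 of these converter profiles
]

counts = Counter(attr for profile in purchasers for attr in profile)
threshold = 0.6 * len(purchasers)

# Attributes shared by most converters become the automatically generated segment
# definition, which can then be expanded with lookalike modeling for scale.
segment_rule = {attr for attr, n in counts.items() if n >= threshold}
print(segment_rule)   # e.g., {'age_25_34', 'urban', 'ski_content', 'luxury_auto_intender'}
```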

It’s a great time to be a data-driven marketer!

 

Creating the Fabled 360 View of the Consumer

Despite years of online targeting, the idea of having a complete, holistic “360 degree view” of the consumer has been somewhat of a unicorn. Today’s new DMP landscape and cross-device identification technologies are starting to come close, but they are missing a key piece of the puzzle: the ability to incorporate social affinities.

Online consumers tell us all about themselves in a number of ways:

Viewing Affinities: Where people go online and what they like to look at provide strong signals of what they are interested in. Nielsen, comScore, Arbitron and others have great viewership and listenership data that is strong on demographics, so we can get a good sense of the type of folks a certain website or show attracts. This is great, but brands still struggle to align demographic qualities perfectly with brand engagement. 34-year-old men should like ESPN, but they could easily love Cooking.com more.

Buying Affinities: What about a person’s buying habits? Kantar Retail, OwnerIQ, and Claritas data all tell us in great detail what people shop for and own—but they lack information on why people buy the stuff they do. What gets folks from staring at a shelf to “The Moment of Truth” (in P&G parlance), when they decide to make a purchase? The buying data alone cannot tell us.

Conversational Affinity: What about what people talk about online? Radian6 (Salesforce), Crimson Hexagon, and others really dig into social conversations and can provide tons of data that brands can use to get a general sense of sentiment. But this data alone lacks the lens of behavior to give it actionable context.

Social Behavioral Affinity: Finally, what about the actions people take in social environments? What if we could measure not just what people “like” or “follow” online, but what they actually do (like post a video, tweet a hashtag, or engage with a fan page)? That data not only covers multiple facets of consumer affinity, but also gives a more holistic view of what the consumer is engaged with.

Adding social affinity data to the mix can be a powerful way to understand how brands relate to the many things people spend their time with (celebrities, teams, books, websites, musicians, etc.). Aligning this data with viewing, buying, and conversational data gets you as close as possible to that holistic view.

Let’s take an example of actionable social affinity in play. Say Whole Foods is looking for a new celebrity to use in television and online video ads. Conventional practice would be to engage a research firm that would employ the “Q Score” model to measure which celebrity had the most consumer appeal and recognition. This attitudinal data is derived from surveys, some with large enough sample sizes to offer validity, but it is still “soft data.”

Looking through the lens of social data, you might also measure forward affinity: how many social fans of Whole Foods expressed a Facebook “like” for Beyonce, or followed her account on Twitter? This measurement has some value, but fails to deliver relevance because of the scale effect. In other words, I like Beyonce, so does my wife, and so does my daughter . . . along with many millions of other fans—so many that it’s hard to differentiate them. The more popular something is, the broader its appeal and the less targetable that attribute becomes.

So, how do you make social affinity data relevant to get a broader, more holistic, understanding of the consumer?

Obviously, both Q Score and forward affinity can be highly valuable. But when mixing viewing, buying, and listening data with real social affinity data, much more becomes possible. The real power of this data comes out when you measure two things against one another. Sree Nagarajan, CEO of Affinity Answers, explained this mutual affinity concept to me recently:

“In order for the engagement to be truly effective, it needs to be measured from both sides (mutual engagement). The parallel is a real-world relationship. It’s not enough for me to like you, but you have to like me for us to have a relationship. Mapped to the brand affinity world, it’s not enough for Whole Foods fans to engage with Beyonce; enough Beyonce fans have to engage with Whole Foods (more than the population average on both sides) to make this relationship truly meaningful and thus actionable. When true engagement is married with such mutual engagement, the result is intelligence that filters out the noise in social networks to surface meaningful relationships.”

As an example, this approach was recently employed by Pepsi to choose Nicki Minaj as their spokesperson over several other well-known celebrities.
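Here is a simple sketch of how that mutual, two-directional over-indexing could be computed. All of the counts and averages are invented, and the mechanics of Affinity Answers’ actual scoring are their own.

```python
# A toy "mutual affinity" check: engagement has to over-index in both directions
# before the pairing is treated as meaningful. All numbers are hypothetical.

# Direction 1: how actively do the brand's engaged fans interact with the celebrity?
brand_fans = 50_000
brand_fan_actions_on_celeb = 40_000        # posts/comments/shares by brand fans on the celebrity's pages
avg_actions_on_celeb_per_person = 0.30     # average across everyone observed on the platform

# Direction 2: how actively do the celebrity's engaged fans interact with the brand?
celeb_fans = 200_000
celeb_fan_actions_on_brand = 30_000
avg_actions_on_brand_per_person = 0.05

lift_brand_to_celeb = (brand_fan_actions_on_celeb / brand_fans) / avg_actions_on_celeb_per_person
lift_celeb_to_brand = (celeb_fan_actions_on_brand / celeb_fans) / avg_actions_on_brand_per_person

# Act on the pairing only when *both* directions beat the population average.
mutual_lift = min(lift_brand_to_celeb, lift_celeb_to_brand)
print(f"{lift_brand_to_celeb:.1f}x / {lift_celeb_to_brand:.1f}x -> mutual {mutual_lift:.1f}x")
```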

What else can social affinity data do?

  • Brands can use social affinity data to decide what content or sponsorships to produce for their users. Looking at their users’ mutual affinity between the brand and music, for example, might suggest which bands to sponsor and blog about.
  • A publisher’s ad sales team can use such data to understand the mutual affinity between its audience and different brands. A highly correlated affinity between activated social visitors to GourmetAds’ Facebook page and those who post on Capital One’s Facebook page may suggest a previously unknown sales opportunity. The publisher can now prove that its audience has a positive predisposition toward the brand, which can yield higher conversions in an acquisition campaign.
  • What about media buying? Understanding the social affinity of fans for a television show can produce powerful actionable insights. As an example, understanding that fans of “Teen Wolf” spend more time on Twitter than Facebook will instruct the show’s marketing team to increase tweets—and post more questions that lead to increased retweets and replies. Conversely, an Adult Swim show may have more Facebook commenters, leading the marketer to amplify the effect of existing “likes” by purchasing sponsored posts.
  • Keyword buying is also interesting. Probing the mutual affinities between brands and celebrities, shows, music acts, and more can yield long tail suggested keyword targets for Google, Bing/Yahoo, and Facebook that are less expensive and provide more reach than those that are automatically suggested. As an example, when “Beavis and Butthead” re-launched on MTV, Google suggested keywords for an SEM campaign such as “Mike Judge” (the show’s creator) and “animated show.” Social affinity data suggested that socially activated Beavis fans also loved “Breaking Bad.” Guess what? Nobody else was bidding on that keyword, and that meant more reach, relevance, and results.

I believe that understanding social affinity data is the missing piece of the “360 degree view” puzzle. Adding this powerful data to online viewing, buying, and social listening data can open up new ways to understand consumer behavior. Ultimately, this type of data can be used to generate results (and measure them) in online branding campaigns that have thus far been elusive.

Want a full view of the people who are predisposed to love your brand? Understand what you both mutually care about through social affinities—and measure it.

[This post originally appeared in AdExchanger on 4.14.14]

 

Watching you Watch


A few years ago, I had coffee with Nick Langeveld, who left Nielsen to run business development for an interesting company called Affectiva. He was telling me how the company, an MIT Media Lab spin-off, was going to take measurement in a new direction by reading people’s facial expressions.

Like Intel, which plans to ship set-top boxes that know who is watching television, Affectiva uses webcams to watch consumers as they view video and measure their emotions in real time.

Now, marketers could see the exact moment when they captured surprise, delight, or revulsion in a consumer—and scale that effort to anyone with a webcam who opted into their panel. This sounded great, but I wondered if and when large marketers would adopt such technology.

The question of adoption was answered early this year, when the company announced that both Unilever and Coca-Cola would use the technology to measure all of their marketing efforts this year. In the consumer products wars, perhaps tweaking your video assets to get an extra smile or “wow moment” will give Coke what it needs to pick up market share from Pepsi. Even if that is not the case, the measurement will give their agency creative teams a very real—and real-time—indication of how impactful their content is.

I think this is a great sign.

The relentless automation of digital media seems to have taken a lot of emphasis off the creative. Now that digital marketers can buy audiences so precisely, delivering the right message doesn’t seem to matter as much. For agency people like R/GA’s Michael Lowenstern, who recently spoke at the Digiday Agency Summit on this topic, the right equation is right audience plus right creative equals more performance at lower cost. His recent Verizon campaign used hundreds of individual banner creatives, matched to audiences, and raised sales of FiOS by 187%. That’s a sales increase—not just an increase in campaign performance.

Creative matters

When talking about higher-funnel branding activities using rich media and video, creative matters even more. Building brands means connecting emotionally with consumers, which makes technologies like Affectiva’s, which use “big data” approaches to drive branding, all the more relevant for this day and age. Research presented by The Intelligence Group’s Allison Arling-Giorgi at the recent 4As conference showed that Generation Y consumers find humor the most effective form of advertising, so Unilever leveraging technology to squeeze one more smile out of a 30-second spot starts to make a lot of sense.

In the case of Intel, adding a “webcam” to a set-top box sounds really scary from a privacy perspective, but it starts to make a lot of sense when you think about the migration of television ad dollars to display.

Theoretically, the shifting dollars reflect a desire on the advertisers’ part to achieve more granular targeting, and put the right ads in front of the right people at scale. The digital display channel offers lots of targeting, but the low-quality inventory, commoditized and bland creative units, and tremendous amount of fraud have kept a lot of TV money on the sidelines.

In 2009, Morgan Stanley analyst Mary Meeker estimated the gap between where people spend their time (increasingly online) and where advertisers spend their money (TV) represented a $50 billion global opportunity. While that has shrunk considerably, the gap is still measured in the tens of billions, despite the enormity of Google, and the embrace of ubiquitous social platforms such as Facebook and Twitter.

Connected set-top boxes may be the biggest online advertising disruptor

Despite television being a mass-reach vehicle today, plans like Intel’s threaten to disrupt online advertising in a far more fundamental way than digital has been disrupting traditional ads. Connected set-top boxes are tied to households, whose composition is easy to access from both a demographic and a financial perspective. They also tend to deliver a much more engaging video experience for the brand advertiser than ignored “pre-roll” inside a tiny 300×250 pixel video player.

Now, add the element of being able to actually see who is on the couch watching—and tailor ad messaging to those viewers based on their age and the contextual relevance of the content they are consuming. That’s powerful in a way that digital can never match, despite the rise of “social TV” watchers who tweet during “appointment viewing” shows like The Walking Dead.

Delivering ads to TVs, depending on who is watching them? That’s something to really watch out for. 

[This post originally appeared on the EConsultancy Blog on 4/8]

Social Affinity

Is your media measurement as dated as this 1970s den?

The New Panel-Based Audience Measurement for Brands

With the prevalence of social data, yesterday’s panel-based measurement for digital campaigns is starting to look like the wood paneling in your grandmother’s den: a bit out of fashion. Marketers have been trained to buy media based on demographics, and it is natural to want your ads to be where you think your customers are. For BMW’s new entry-level sedan, that might mean finding the media that males aged 26-34, earning $75,000 or more a year, consume. That makes a lot of sense, but it also means that your paid media will always compete alongside ads for your competitors. That is a big win for websites with premium inventory that fits your demographic, because it means scarcity and high prices for marketers.

What if there were another way to measure which audiences are right for brands? And what if that data were based on a panel of a few hundred million people, rather than a few thousand? Well, thanks to Facebook and Twitter, we have just such a web-based panel of consumers, and they are always eager to share their opinions in the form of “likes,” “follows,” and (more importantly) engagement. Social listening platforms have been able to tell brands what people think about them directionally, and measure how certain marketing efforts move the social needle. Listening is great, but how do you turn what you hear into what to buy?

A company called Colligent has been going beyond listening by measuring what people actually do on Twitter, Facebook, and other social sites. “Liking” is not enough (when I, my 10-year-old daughter, and my mom all “like” Lady Gaga, the audience I am a part of gets too broad to target against). What matters is when people express true affinity by sharing videos, tweeting, and commenting. When people who are nuts about a certain celebrity are also nuts about a certain brand—and that relationship over-indexes against normal affinity—you have struck real social gold: data that can make a difference. Pepsi recently used such data to choose Nicki Minaj as a spokesperson over dozens of other choices.

What about other media? Nielsen defines television, Arbitron measures radio, and MRI defines magazine audiences by demographics. Now, for the first time, marketers can use social data—gathered from panels nearly as large as the buying population—to define audiences in their own brand and category terms. That’s a world in which Pepsi can purchase “Pepsi GRPs” across all media, rather than GRPs in a specific medium.

This is the way brands will buy in the future.

[This post originally appeared on 2/21/13 in The CMO Site, a United Business Media publication]