Sharing some thoughts on launching products at Salesforce. Watch the video with password “pmwsf”
Proud to announce that my book, Data Driven, has won the 2019 Silver medal for best Business Technology book!
In August of 2007, Jenkins Group launched the Axiom Awards, “recognizing and promoting the year’s best business books.” Now, 12 years later, they have announced the winners of the 2019 Axiom Business Book Awards, honoring this year’s best business books, their authors, and publishers.
The Axiom Business Book Awards are intended to bring increased recognition to exemplary business books and their creators, with the understanding that business people are an information-hungry segment of the population, eager to learn about great new books that will inspire them and help them improve their careers and businesses.
If you want to learn about data, Chris O’Hara is the right person to ask. O’Hara, who leads global product marketing for Salesforce Marketing Cloud’s suite of data and audience products, is a big believer in the data revolution—but first, marketers need to take stock of what data they actually have.
“Some marketers think they have way more data than they actually have, and others think they don’t have a lot of data but actually do,” O’Hara said.
Before joining Salesforce, O’Hara was at Krux, the data management platform that Salesforce acquired in 2016, working on data marketing. In October, O’Hara, along with Krux alums Tom Chavez and Vivek Vaidya, released a book, “Data Driven,” which dives into how marketers should think about using data to overhaul customer engagement and experience.
Before the book’s release, Adweek talked with O’Hara about the book and about how marketers can leverage the data they have while keeping data privacy and consumer trust in mind. A portion of that conversation, which has been edited and condensed for clarity, is below.
Adweek: A lot of marketers have talked about the importance of getting better at explaining to consumers what exactly is being collected and how exactly data is being used. Do you think it’s the responsibility of tech and advertising companies to explain that to the public?
Chris O’Hara: Marketing is better when you have the permission of consumers. Consumers are entitled to know exactly how their data is being used, and consumers are absolutely entitled to have control over their own data. As you talk about the opportunities to get more personalized with customers, you’re allowed to deliver great personalization if the customer has opted in for you to do that on their behalf. If you do that without their consent, it feels creepy and wrong, right? It’s common sense. We’re always going to lead with the idea that trust comes first and that marketing is better with consent. Period.
You write in your book that the biggest risks of harnessing data are centered around privacy, security and trust. As concerns about data privacy grow, and as data breaches continue to occur, how does the industry best rebuild trust with the public? Where does the industry start with reestablishing trust and maintaining trust with consumers?
It’s all based on permission and an opted-in consumer. I like getting advertising messages that are relevant. When I am shopping for a car and I give Cars.com permission to introduce me to new models and send me an email every week, I appreciate it because I’ve asked for it. When I engage with certain sites on the web, like The Wall Street Journal, where I pay for content, I trust them with a certain amount of my data so they can make my reading experience better. That’s the way it should have been, always. Unfortunately, there are some companies in the space that have taken advantage of little oversight to do otherwise. But what we’ve seen in the market is that companies that are not leading with trust are not valued as highly, or perceived as favorably, as companies that do put trust at the center of their relationship with customers.
What’s the biggest misconception marketers have with data?
Something we write about in the book is that some marketers think they have way more data than they actually have, and others think they don’t have a lot of data but actually do. One of Pandora’s SVPs, Dave Smith, came to us and said, ‘I have one of the biggest mobile data assets in the world. Everyone who uses Pandora is logged in, so we know so much about our customers: what kind of cellphone they have, what kind of music they like, perhaps the ages of the kids in their home, when they listen.’ That’s a lot of data. Pandora probably has one of the largest data assets in the entire world. But Pandora doesn’t know when people are going to buy a car or people’s incomes, necessarily. They don’t know when you’re planning on taking a family vacation. So they turned to second- and third-party data to enrich their understanding of consumers.
COLOGNE – At Salesforce, the acquisitions keep on coming, most recently that of AI-powered marketing intelligence and analytics platform Datorama. The company’s ongoing mantra is “integration” and it seems to have no shortage of assets to leverage in that quest.
It all stems from what Chris O’Hara, VP, Product Marketing, calls the “fourth industrial revolution” led by things like data, AI and the internet of things.
“It’s harder for marketers to deliver personalization at scale to consumers and that’s the goal. So everything we’re doing at Salesforce is really about integration,” O’Hara says in this interview with Beet.TV at the recent DMEXCO conference.
By way of examples, he cites the acquisition of ExactTarget about four years ago with the intention of making email “a very sustainable part of marketing, such that it’s not just batch and blast email marketing but it’s also your single source of segmentation for the known consumer.” The end result was the ExactTarget Marketing Cloud Salesforce Integration.
In late 2016, Salesforce bought a company called Krux and within six months had morphed it into Salesforce DMP. It was a way to assist marketers in making sense of households “comprised of hundreds of cookies and dozens of different devices” and aggregate them to a single person or household “so [you] can get to the person who makes the decision about who buys a car or what family vacation to take,” O’Hara says.
Salesforce DMP benefits from machine-learned segmentation, now known as Einstein Segmentation, to make sense out of the thousands of attributes that can be associated with any given individual and determine what makes them valuable. Developing segments by machine replaces “you as a marketer using your gut instinct to try to figure out who’s the perfect car buyer. Einstein can actually tell you that.”
In March of 2018, MuleSoft, one of the world’s leading platforms for building application networks, joined the Salesforce stable to power the new Salesforce Integration Cloud. It enables companies with “tons of legacy data sitting in all kinds of databases” to develop a suite of APIs to let developers look into that data and “make it useful and aggregate it and unify it so it can become a really cool, consumer-facing application, as an example.”
Datorama now represents what O’Hara describes as a “single source of truth for marketing data, a set of APIs that look into campaign performance and tie them together with real marketing KPIs and use artificial intelligence to suggest optimization.”
In addition to driving continual integration, Salesforce sees itself as “democratizing” artificial intelligence, according to O’Hara. “There’s just too much data for humans to be able to make sense of on their own. You don’t have to be a data statistician to be able to use a platform like ours to get better at marketing.”
This interview is part of a series titled Advertising Reimagined: The View from DMEXCO 2018, presented by Criteo. Please find more videos from the series here.
Good time with the Canadian Marketing Association digging into data management!
Currently, marketers don’t have a single source of truth about their consumers. Tomorrow, there must be a single place to build consumer profiles with rich attribute data and provision them to the systems of engagement where each consumer spends their time.
At a recent industry event, we heard a lot about the upcoming year in marketing, and how data and identity will play a key role in driving marketing success.
As a means to master identity, some companies have heralded the idea of the customer data platform (CDP), but the category is still largely undefined. For example, many Salesforce customers believe that they already have a CDP. The reason? They have several different ways of segmenting known and unknown audiences between a data management platform (DMP) and CRM platform.
In an article I wrote here last year, I introduced a simple “layer cake” marchitecture, describing the three core competencies for effective modern marketing. In such a fast-moving and evolving industry, I have since refined it to the core pillars of identity, orchestration and intelligence:
With this new marchitecture, brands have the ability to know consumers, engage with them through each touchpoint and use artificial intelligence to personalize each experience.
Mastering each layer of complexity is difficult, requiring an investment in time, technology and people. Let’s focus on perhaps the most important – the data management layer, where the new CDP category is trying to take hold.
The next wave of data management
By now, it’s safe to say marketers have mastered managing known data. A few years ago, when I was working for a software company that also managed postal mailing lists, I was astonished at the rich and granular data attached to mailing lists. There is a reason direct mail companies can justify $300 CPMs – it works, because direct marketers truly know their customers.
After joining Salesforce, I was similarly awed by the power to carefully segment CRM data, and provision journeys for known customers spanning email, mobile, Google and Facebook, customer service interactions and even community websites.
How can we get to this level of precision in the world of unknown (anonymous) consumer data?
As marketing technology and advertising technology converge, so must the identity infrastructure that underlies both. Put more simply, tomorrow’s systems need a single, federated ID that is trust-based. Companies must have a single source of truth for each person, the ability to attach various keys and IDs to that unified identity, as well as have a reliable and verifiable way to opt people out of targeting.
Let’s take a look at what that might look like:
This oversimplification looks at the various identity keys used for each system and the channels they operate in. Today, the CRM is the system of record for engaging consumers directly in channels like direct mail, email campaigns and service call centers. The DMP, on the other hand, is the system of record for more passive, anonymous engagement in channels like display, video and mobile.
When consumers make themselves known, they “pull” engagement from their favorite brands by requesting more information and opting into messaging. At the top of the funnel, we “push” engagements to them via display ads and social channels.
As a marketer, if you have the right technologies in place, you can seamlessly connect the two worlds of data for more precise consumer engagement. The good news is that martech and adtech have already converged. Recent research from Salesforce shows that 94% of marketers use CRM data to better engage with consumers through digital advertising, and over 91% either already own or plan to adopt a DMP over the next year.
So, if mastering consumer identity is the most important element in building tomorrow’s data platform then what, exactly, are the capabilities that need to be addressed? There are three:
1. A single data segmentation engine
Currently, marketers don’t have a single source of truth about their consumers.
Here’s why: Brands build direct mail lists and email lists in their CRM. Separately, they build digital lists of consumers in a DMP tool. Then, they have lists of social handles for followers in various platforms like Facebook and Twitter. Consumer behaviors like browsing and buying that happen on ecommerce platforms are often not integrated into a master data record. And distributed marketing presents a challenge, because a big mobile company or auto manufacturer may have thousands of franchised locations with their own individual databases.
Segmentation is all over the place. Tomorrow, there must be a single place to build consumer profiles with rich attribute data and provision them to the systems of engagement where each consumer spends their time.
2. Data pipelining and governance capabilities
This identity layer must also have the ability to provision data, based on privacy and usage restrictions, to systems of engagement.
For example, when a consumer buys shoes, they should be suppressed from promotions for that product across all channels. When a consumer logs a complaint on a social channel, a ticket needs to be opened in the call center’s system for better customer service. When a person opts out and chooses to be “forgotten,” the system needs the ability to delete not only email addresses but hundreds of cookies, platform IDs and other addressable IDs, in order to meet compliance standards under increasingly restrictive privacy laws and, more importantly, give consumers control over their own data.
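A minimal sketch of that deletion requirement (the profile shape and all names here are hypothetical) shows why "forgetting" a person means purging every addressable key tied to the unified profile, not just an email address:

```python
# Hypothetical "right to be forgotten" flow: an opt-out must remove every
# addressable ID linked to a person's unified profile.

profiles = {
    "person-42": {
        "email": "jane@example.com",
        "cookies": ["ck_a1", "ck_b2"],
        "platform_ids": ["fb_123", "tw_456"],
    }
}

def forget(person_id, profiles):
    """Drop the unified profile and return every key that must be purged."""
    profile = profiles.pop(person_id, None)
    if profile is None:
        return []
    purged = [profile["email"], *profile["cookies"], *profile["platform_ids"]]
    # In a real system, each downstream system of engagement would also be
    # notified to delete these keys to stay compliant.
    return purged

print(forget("person-42", profiles))
```

The point of the sketch is the fan-out: one opt-out request becomes a deletion across five different IDs in this toy example, and potentially hundreds in production.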
Finally, marketers need the ability to ingest valuable DMP data back into their own data environments to enrich user profiles, perform user scoring, as well as build propensity models and lifetime value scores. This requires granular data storage, fast processing speeds and smart pipelines to provision that data.
3. Leaping from DMPs to holistic data management
Ad technology folks are guilty of treating cross-device identity management (CDIM) as the definition of identity management. Both deterministic and predictive cross-device approaches are more important than ever, but in a world where martech and adtech are operating on the same budgets and platforms, today’s practitioner must think more broadly.
Marketers can no longer depend solely on another party’s match table to bridge the divide between CRM and DMP data. A more durable, privacy-led connector between known and unknown ID types is required. Moreover, when they can, marketers need the ability to enrich email lists with anonymous DMP attributes to drive more performance in known channels—now only possible when a single party manages the relationship.
These three tenets of identity are the starting point for building the data platform of the future. The interest and excitement around CDPs is well placed, and a positive sign that we are evolving our understanding of identity as the driving force behind the changes in marketing.
[This article originally appeared in Econsultancy’s blog on 2/1/2018]
How Granular Data Collection and a Robust Second-Party Data Strategy Changes the Game
The world’s largest marketers and media companies have strongly embraced data management technology to provide personalization for customers who demand Amazon-like experiences. As a single, smart hub for all of their owned data (CRM, email, etc.) and acquired data, such as third-party demographic data, DMPs go a long way toward building a sustainable, modern marketing strategy that accounts for massively fragmented digital audiences.
The good news is most enterprises have taken a technological leap of faith, and embraced a data strategy to help them navigate our digital future. The bad news is, the systems they are using today are deeply flawed and do not produce optimal audience segmentation.
A Little DMP History
Marketers were slower to embrace DMP technology, but once they did, they quickly grasped the opportunity too. Now, instead of depending on ad networks to aggregate reach for them, they started to assemble their own first-party data asset—overlapping their known users with publishers’ segments, and buying access to those more relevant audiences. The more cookies, mobile IDs, and other addressable keys they could collect, the bigger their potential reach. Since most marketers had relatively small amounts of their own data, they supplemented with third-party data—segments of “intenders” from providers like Datalogix, Nielsen, and Acxiom.
The two primary use cases for DMPs have not changed all that much over the years: both sides want to leverage technology to understand their users (analytics) and grow their base of addressable IDs (reach). Put simply, “who are these people interacting with my brand, and how can I find more of them?” DMPs seem really efficient at tackling those basic use cases, until you find out that they have been doing it the wrong way the whole time.
What’s the Problem?
To dig a bit deeper, the way first-generation DMPs go about analyzing and expanding audiences is by mapping cookies to a predetermined taxonomy, based on user behavior and context. For example, if my 17-year-old son is browsing an article on the cool new Ferrari online, he would be identified as an “auto intender” and placed in a bucket of other auto intenders. The system would not store any of the data associated with that browsing session, or additional context. It is enough that the online behavior met a predetermined set of rules for “auto intender” to place that cookie among several hundred thousand other “auto intenders.”
The problem with a fixed, taxonomy-based collection methodology is just that—it is fixed, and based on a rigid set of rules for data collection. Taxonomy results are stored (“cookie 123 equals auto-intender”)—not the underlying data itself. That is called “schema-on-write,” an approach that writes taxonomy results to an existing table when the data is collected. That was fine for the days when data collection was desktop-based and the costs of data storage were sky-high, but it fails in a mobile world where artificial intelligence systems crave truly granular, attribute-level data collected from all consumer interactions to power machine learning.
There is another way to do this. It’s called “schema-on-read,” which is the opposite of schema-on-write. In these types of systems, all of the underlying data is collected, and the taxonomy result is created upon reading all of the raw data. In this instance, say I collected everything that happened on a popular auto site like Cars.com. I would collect how many pages were viewed, dwell times on ads, all of the clickstream collected in the “build your own” car module, and the data from event pixels that collected how many pictures a user viewed of a particular car model. I would store all of this data so I could look it up later.
Then, if my really smart data science team told me that users who viewed 15 of the 20 car pictures in the photo carousel in one viewing session were 50% more likely to buy a car in the next 30 days than the average user, I would build a segment of such users by “reading” the attribute data I had stored. This notion—total data storage at the attribute (or “trait”) level, independent of a fixed taxonomy—is called completeness of data. Most DMPs don’t have it.
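As a rough sketch (the event shape, users and threshold are all invented for illustration), schema-on-read segmentation stores raw events untouched and applies the segment rule only when the data is read, so new rules from the data science team can be run over old data at any time:

```python
# Illustrative schema-on-read segmentation: raw, attribute-level events are
# stored as-is; the segment definition is applied at read time.

from collections import defaultdict

# Raw events exactly as collected -- nothing pre-bucketed into a taxonomy.
events = (
    [{"user": "u1", "type": "photo_view", "model": "roadster"} for _ in range(16)]
    + [{"user": "u2", "type": "photo_view", "model": "roadster"} for _ in range(3)]
)

def build_segment(events, min_photo_views=15):
    """Derive a 'likely buyer' segment by reading the stored attributes."""
    views = defaultdict(int)
    for event in events:
        if event["type"] == "photo_view":
            views[event["user"]] += 1
    return {user for user, count in views.items() if count >= min_photo_views}

print(build_segment(events))  # only u1 crosses the 15-view threshold
```

A schema-on-write system would have discarded the per-view events after writing “auto intender,” making a rule like this impossible to apply retroactively.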
Why Completeness Matters
Isn’t one auto intender as good as another, regardless of how the data was collected? No. Think about the other main uses of DMPs: overlap reporting and indexing. Overlap reporting seeks to overlay an enterprise’s first-party data asset with another. This is like taking all the visitors to Ford’s website and comparing that audience to every user on a non-endemic site, like the Wall Street Journal. Every auto marketer would love to understand which high-income WSJ readers were interested in their latest model. But how can they understand the real intent of users if they are just tagged as “auto intenders”? How did the publisher come to that conclusion? What signals qualified those users as “intenders” in the first place? How long ago did they engage with an auto article? Was it a story about a horrific traffic crash, or an article on the hottest new model? Without completeness, these “auto intenders” become very vague. Without all of the attributes stored, Ford cannot put their data science team to work to better understand their true intent.
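Mechanically, an overlap report is simple; it is the meaning of the overlapping IDs that depends on how the data was collected. A toy version (all IDs invented) is a set intersection plus a share calculation:

```python
# Toy overlap report: compare a brand's first-party audience with a
# publisher's audience and measure the shared IDs.

brand_visitors = {"id1", "id2", "id3", "id4"}
publisher_readers = {"id3", "id4", "id5", "id6"}

overlap = brand_visitors & publisher_readers          # shared addressable IDs
share = len(overlap) / len(publisher_readers)         # publisher audience covered

print(sorted(overlap), share)  # ['id3', 'id4'] 0.5
```

The computation is trivial; the hard part the article describes is knowing what an ID in either set actually represents.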
Indexing, the other prominent use case, scores user IDs based on their similarity to a baseline population. For example, a popular women’s publisher like Meredith might have an index score of 150 against a segment of “active moms.” Another way of saying this is that indexing helps understand the “momness” of those women, based on similarity to the overall population. Index scoring is the way marketers have been buying audience data for the last 20 years. If I can get good reach with an index score above 100 at a good price, then I’m buying those segments all day long. Most of this index-based buying happens with third-party data providers who have been collecting the data in the same flawed way for years. What’s the ultimate source of truth for such indexing? What data underlies the scoring in the first place? The fact is, it is impossible to validate these relevancy scores without the granular, attribute-level data being available to analyze.
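As a hedged illustration (the numbers are invented), an index score of the kind described here is just a ratio of shares, which is exactly why it is so hard to validate without the underlying attribute data:

```python
# Illustrative audience index scoring: compare how common a trait is in a
# given audience versus a baseline population. 100 means exactly average;
# 150 means the trait is 50% more prevalent than in the baseline.

def index_score(segment_share, baseline_share):
    """Index = (trait share in audience / trait share in baseline) * 100."""
    return round(segment_share / baseline_share * 100)

# e.g. 30% of a publisher's audience are "active moms" versus 20% of the
# baseline population: an index of 150 against that segment.
print(index_score(0.30, 0.20))  # -> 150
```

The formula is transparent, but both inputs depend entirely on how the trait was assigned to each ID in the first place, which is the article’s completeness argument.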
Therefore, it is entirely fair to say that most DMPs have excellent intentions, but lack the infrastructure to fully perform the most important things DMPs are meant to do: understand IDs, and grow them through overlap analysis and indexing. If the underlying data has been improperly collected (or is not there at all), then any type of audience profiling by any means is fundamentally flawed.
What to do?
To be fair, most DMPs were architected at a time when collecting data through a schema-on-read methodology was unnecessary—and extremely costly. Today’s unrelenting shift to AI-driven marketing necessitates this approach to data collection and storage, and older systems are tooling up to compete. If you want to create a customer data platform (CDP), the hottest new buzzword in marketing, you need to collect data in this way. So, the industry is moving there quickly. That said, many marketers are still stuck in the 1990s. Older DMPs are somewhat like the technology mullet of marketing—businesslike in the front, with something awkward and hideous hidden behind.
Beyond licensing a modern, schema-on-read system for data management so marketers can collect their own data in a granular way, there is another way to do things like indexing and overlap analysis well: license data from other data owners who have collected their data in such a way. This means going well beyond leveraging commoditized third-party data, and looking at the world of second-party data. Done correctly, real audience planning starts with collecting your own data effectively and extends to leveraging similarly collected data from others—second party data that is transparent, exclusive, and unique.
Today’s consumers are highly demanding. They expect curated movie recommendations from Netflix, one-click restaurant reservations from OpenTable, on-demand limousine service from Uber, limitless housing options from AirBnB and the world of commerce available 24/7 from Amazon Prime. It’s a great time to be alive for a consumer, but perhaps the worst possible time for the CMO of any other company. Just think, Uber doesn’t own cars. They are a technology company built from the ground up to deliver personalized service at scale to consumers—that’s what today’s marketing is all about.
Only a few short years ago, CMOs had a difficult, but simpler, remit: build the brand and the consumers follow. Absolut vodka was about as undifferentiated a product as anything on the market, but great packaging and a clever ad campaign made it a power brand. It thrived because the world still worked on the principles of How Brands Grow, Byron Sharp’s 2010 book. Sharp posited that a marketer needed two things to succeed: availability in the consumer’s mind and availability of the product at the shelf. Brands like P&G’s Tide control lots of mindshare with mass media budgets, and P&G ensures it is widely available at every supermarket so a consumer can easily choose between it and Wisk at the “moment of truth.”
That system is dying rapidly, as mass media channels fragment into thousands of websites, apps, streaming media channels and experiences we don’t even understand yet. As a marketer, you can’t “buy eyeballs” today like you used to. This paradigm is largely responsible for the ever-shrinking average CMO tenure (from 44 months last year to only 42 months today). CMOs must be prepared to insert themselves along the steps of a consumer journey that moves from channel to channel, and must also be able to capture each tiny piece of digital exhaust that consumers’ gadgets and gizmos throw off, informing their understanding of how consumers engage with a brand.
To make it clear, here’s a chart:
| Old CMO | New CMO |
| --- | --- |
| Rents access to people | Owns people data |
| One-to-many marketing | One-to-one engagement |
| Big bets on limited channels | Small bets on dozens of channels |
| One “big tent” message for many | Dozens of messages for segments |
| Panel-based attribution | Real-time feedback |
| Agency defines strategy | Marketer owns strategy, agency executes |
Yesterday’s CMO would “buy eyeballs” with big TV and print campaigns, and use subscriber information as a proxy for targeted reach. Today’s CMO wants to own cookies and mobile keys so they can have a one-to-one conversation. Yesterday’s CMO looked at the performance at the end of a campaign, and optimized for the next one based on results from a survey. Today’s CMOs crave access to real-time performance data so they can optimize at run time. Things couldn’t be more different.
In this new norm, what should CMOs do to ensure they stay ahead of the curve? They have to change the way they think about consumer identity and how that impacts their work as marketers–and redefine the way they think about “marketing” in general.
Identity beyond IDs
A few months ago, I wrote that “identity is the new basis of competition” in marketing. That’s still true—you can’t build meaningful cross-channel experiences if you can’t tie people together with their devices. To that end, I was recently invited to an internal town hall with marketers from a large beauty company, where the CMO announced that they had just eclipsed 500 million addressable IDs in their data management platform. Her staff started clapping. Why? Because these weren’t known buyers, just cookies and mobile IDs—but they represented the ability for marketers to connect and build experiences for anonymous people who interacted with their website, mobile app or an ad. That has real, tangible value.
But, devices don’t buy things, people do. Just because you have a good device graph with billions of cookies, e-mail addresses and mobile keys doesn’t mean you have a good view of the people behind that information. Identity data must be augmented with data from systems of engagement to formulate a true view of the consumer. Every click, download, article read, and video view throws off digital exhaust that is filled with scraps of information that machines can use to paint a truer picture of a consumer’s identity. When marketers start valuing all data as a financial asset, they are starting the process of turning IDs into people.
For those of us in the industry, we can be forgiven if we think the world revolves around display, social, mobile and video advertising. We’ve gotten really good at delivering personalized digital experiences in real time, and we have a Lumascape full of clever technologies that are moving the needle for brands that are trying to reach connected consumers. CMOs must think outside the Lumascape, and connect these important addressable touchpoints to mass channels like TV, radio and print in order to deliver personalized experiences at scale.
More than Marketing
The problem is that our definition of marketing often misses the concept of touch points that can exist separately from marketing. These touch points can include interactions between salespeople and potential customers, what happens when a product is returned, conversations on community sites and forums where customers talk to each other about a brand, and also within the e-commerce experience when a consumer is making a purchase. These are arguably more valuable interactions with consumers than a digital banner ad or email because these are either people that are existing customers or those about to buy. They’re incredibly valuable to a brand and don’t involve the traditional notion of “marketing” whatsoever.
Let’s take a look at an example. I fly Delta because I love their app, and they reward my loyalty with special phone numbers so I can reach someone no matter how hairy things get throughout my travel experience. Every time I interact with their website, app or a service representative is an opportunity for Delta to market to me—and also an opportunity for the brand to learn more about the way I fly and what matters most to me as a consumer. Getting it consistently right keeps me loyal, but getting it even slightly wrong brings me one step closer to tweeting #DeltaStinks. Not fair, but that’s representative of brand relationships today.
To be successful, CMOs must expand their definition of identity. “Identity” is more than just an ID. It’s what is formed after capturing every possible insight from every interaction. And “marketing” is not just about cross-channel messaging; it’s about creating great consumer experiences at every touchpoint, including sales, service, commerce and more.
It’s a great time to be a data-driven marketer.
[This article originally appeared in AdExchanger on 10.16.2017]
The Rules of Data-Driven Marketing are Changing as Data Rights Management Takes Center Stage
Unless you’ve been living off the grid, you’ve seen the promise of “data as the new oil” slowly come to fruition over the last five years. Connected devices are producing data at a Moore’s Law-like rate, and companies are building the artificial intelligence systems to mine that data into fuel that will power our ascension into a new paradigm we can’t yet understand. Whether you are in the Stephen Hawking camp (“The development of full artificial intelligence could spell the end of the human race”) or the Larry Page camp (“artificial intelligence [is] the ultimate version of Google”), we can all agree that data is the currency in the AI future.
In our world, we are witnessing an incredible synthesis of fast-moving, data-driven advertising technology coming rapidly together with the slower (yet still data-driven) world of marketing technology. Gartner’s Marty Kihn thinks the only way these two worlds tie the knot for the long term is centered around data management platforms. I think he’s right, but I also think what we know as a DMP today will evolve quickly as the data it manages grows and its applications evolve alongside it.
I think the most immediate changes we will bear witness to in this ongoing evolution are changes in how data—the lifeblood of modern marketing—will be piped among data owners and those who want to use it. Why? Because the way we have been doing it for the past 20 years is incredibly flawed, and second- and third-party data owners are getting the short end of the stick.
Unless you are Google, Facebook, Amazon or the United States government, you will never have enough data as a marketer. Big CPG companies have been collecting data for years (think of rewards programs and the like), but the tens or even hundreds of millions of addressable IDs they have managed to gather often pale in comparison to the billions of people who interact with their brands every day across the globe. To fill the gaps, they turn to second- and third-party sources of data for segmentation, targeting and analytics.
The real usage of the data was sometimes unknown. Many cookies got hijacked for use in other—even competitive—systems, and there was little transparency into what was happening with the underlying data asset. But the checks still came every single month. The approach worked when the best data owners (quality publishers) had a thriving direct sales channel.
Fast-forward to today, the game has changed considerably. More than half of enterprise marketers own a DMP, and even smaller mid-market advertisers are starting to license data technology. Data is being valued as a true financial asset and differentiator. On the publisher’s side, manual sales continue to plummet as programmatic evolves and header bidding supercharges the direct model with big data technology. In short, marketers need more and more quality data to feed the machines they are building to compete, and publishers are getting better and more granular control over their data.
More importantly, data owners are beginning to organize around a core principle: Any system that uses my data for insights that doesn’t result in a purchase of that data is theft.
Theft is a strong word but, if we truly value data and agree that it’s a big differentiator, it’s hard to argue with. For years, data owners have accepted a system that allowed wide access to their data for modeling and analytics in return for the occasional check. For every cookie targeted in programmatic that was activated to create revenue, a million more were churned to power analytics in another system. Put simply from the data owner’s perspective, if you are going to use my data for analytics and activation, but only pay me for activation, that’s going to be a problem.
In order to fix this, the systems of the future have to offer the ability for data owners to provision their data in more granular ways. Data owners need complete control of the following:
How is the data being used? Is it for activation, lookalike modeling, analytics in a data warehouse, user matching, cross-device purposes or another use case? Data owners need to be able to approve the exact modalities in which the data are leveraged by their partners.
What is the business model? Is this a trade deal, paid usage, fixed-price or CPM? How long is the term—a single campaign, or a year’s worth of modeling? Data owners should be able to set their own price—directly with the buyer—with full transparency into all fees associated with piping the data to a partner.
What is being shared? What attributes or traits are being shared? Is it just user IDs, or IDs loaded with valuable attributes, such as a device graph that links an individual to all the devices they use? Data owners need powerful tools that offer granular control over data at the attribute level, letting them decide how much of their data they are willing to share–and at what price.
Outside of big data and blockchain conversations, the phrase “data provisioning” is rarely heard, but it’s about to be a big part of our advertising ecosystem. Until now, security concerns have kept data sharing at scale from becoming a reality. The answer is an ecosystem that offers complete control and transparency–and a smart layer of software-enabled governance tools that can stay ahead of nuances in law, such as the new GDPR requirements. As adtech and marketing tech continue to come together, and systems evolve in parallel with their ability to make the best use of data, the systems of the future must first ensure data security before data innovation can truly happen.
Data may be the new oil, but will it be run by adtech wildcatters, or will the rules be governed by the data owners themselves?
[This was originally published in AdExchanger on 9/26/17]
Every marketer and media company these days is trying to unlock the secret to personalization. Everyone wants to be the next Amazon, anticipating customer wants and desires and delivering real-time customization.
Actually, everyone might need to be an Amazon going forward; Harris Interactive and others tell us that getting customer experience wrong means up to 80% of customers will leave your brand and try another – and it takes seven times more money to reacquire that customer than it did initially.
How important is personalization? In a recent study, 75% of marketers said that there’s no such thing as too much personalization for different audiences, and 94% know that delivering personalized content is important to reaching their audiences.
People want and expect personalization and convenience today, and brands and publishers that cannot deliver it will suffer similar fates. However, beyond advanced technology, what do you need to believe to make this transformation happen? What are the core principles a company needs to adhere to, in order to have a shot at transforming themselves into customer-centric enterprises?
Here are five:
Put People First
It’s a rusty old saw but, like any cliché, it’s fundamentally true. For years, we have taken a very channel-specific view of engagement, thinking in terms of mobile, display, social and video. But those are channels, apps and browsers. Browsers don’t buy anything; people do.
A people-centric viewpoint is critical to being a modern marketer. True people-based marketing needs to extend beyond advertising and start to include things like sales, service and ecommerce interactions – every touchpoint people have with brands.
People – customers and consumers – must reside at the center of everything, and the systems of engagement we use to touch them must be tertiary. This makes the challenges of identity resolution the new basis of competition going forward.
Collect Everything, Measure Everything
A true commitment to personalized marketing means that you have to understand people. For many years, we have assigned outsized importance to small scraps of digital exhaust such as clicks, views and likes as signals of brand engagement and intent. Mostly, they’ve lived in isolation, never informing a holistic view of people and their wants and desires.
Now we can collect more of this data and do so in real time. Modern enterprises need to become more obsessive about valuing data. Every scrap of data becomes a small stitch in a rich tapestry that forms a view of the customer.
We laughed at the “data is the new oil” hyperbole a few years back – simply because nobody had a way to store and extract real value from the sea of digital ephemera. Today is vastly different because we have both the technology and processes to ingest signals at scale – and use artificial intelligence to refine them into gold. Businesses that let valuable data fall to the floor without measuring it might already be dead, but they just don’t know it yet.
Be A Retailer
A lot of brands aren’t as lucky as popular hotel booking sites. To book a room, you need to sign up with your email. Once you become a user, the company collects data on where you like to go, how often you travel, how much you pay for a room and even what kind of mattress you prefer. Any brand would kill for that kind of one-to-one relationship with a customer.
Global CPG brands touch billions of lives every day, yet often have to pay other companies to learn how their marketing spend affected sales efforts. Brands must start to own customer relationships and create one-to-one experiences with buyers. We are seeing the first step with things like Dash buttons and voice ordering, though still through a partner, but we will see this extend even further as brands change their entire business models to start to own the retail relationship with people. The key pivot point will come when brands actually value people data as an asset on their balance sheets.
See The World Dynamically
The ubiquity of data has led to an explosion of microsegmentation. I know marketers and publishers that can define a potential customer with 20 individual attributes. But people can go from a “Long Island soccer mom” on Monday to an “EDM music lover” on Friday night. Today’s segmentation is very much static – and very ineffective for a dynamic world where things change all the time.
To get the “right message, right place, right time” dynamic right, we need to understand things like location, weather, time of day and context – and make those dynamic signals part of how we segment audiences. To be successful, marketers and media companies must commit to thinking of customers as the dynamic and vibrant people they are and enable the ability to collect and activate real-time data into their segmentation models.
Think Like A Technologist
Finally, to create the change described above requires a commitment to understanding technology. You can’t do “people data” without truly understanding data management technology. You can’t measure everything without technology that can parse every signal. To be a retailer, you have to give customers a reason to buy directly from you. Thinking about customers dynamically requires real-time systems of collection and activation.
But technology and the people to run it are expensive investments, often taking months and years to show ROI, and the technology changes at the velocity of Moore’s Law. It’s a big commitment to change from diaper manufacturer to marketing technologist, but we are starting to understand that it is the change required to survive an era where people are in control.
Some say that it wasn’t streaming media technology that killed Blockbuster, but the fact that people hated their onerous late fees. It was probably both of those things. Tomorrow’s Blockbusters will be the companies that cannot apply these principles of modern, personalized marketing – or do not want to make the large investments to do so.
[This article originally appeared in AdExchanger on 8/7/2017.]
Gartner’s Marty Kihn recently made an argument that ad tech and mar tech would not come together, contrary to what he had predicted a few years ago. When Marty speaks about ad tech, people listen.
Like many people, when I read the headline, I thought to myself, “That makes no sense!” But those who read the article more closely understand that the disciplines of ad tech and mar tech will certainly be bound closer together as systems align – but the business models are totally incompatible.
Advertising technology and the ecosystem that supports it, with its commercial business model (a percentage of media spend billed in arrears) and the strong influence of agencies in the execution process, mean that alignment with software-as-a-service (SaaS) marketing technology is not just an engineering problem to solve.
Marketing leaders and brands need to change the way they handle P&L and budgeting, and reevaluate business process flows both internally and with outside entities such as agencies, so that even when the technology is right, the execution is optimized to achieve the desired results.
There are also plenty of technical hurdles to overcome to truly integrate mar tech and ad tech – most notably, finding a way to let personally identifiable information and anonymous data flow from system to system securely. While those technical problems may be overcome through great software engineering, the business model challenge is a more significant hurdle.
I remember getting some advice from AdExchanger contributor Eric Picard when we worked together some years ago. I was working at a company that had a booming ad tech business with lots of customers and a great run rate, operating on the typical ad network/agency percentage-of-spend model.
At the time, we were facing competition from every angle and getting disrupted quickly. Eric’s suggestion was to transform the company to a platform business, license our technology for a fixed monthly fee and begin to build more predictable revenues and a dedicated customer base. That would have meant parting ways with our customers who would not want to pay us licensing fees and rebuilding the business from scratch.
Not an easy decision, but one we should have taken at the time. Eric was 100% right, but transforming a “run rate” revenue ad tech business into a SaaS business takes a lot of guts, and most investors and management didn’t sign up for that in the first place.
This is a long way of saying that Marty is right. There are tons of ad tech businesses that simply cannot transform themselves into marketing software stacks, simply because it requires complete change – from a structural financial perspective (different business model) and a people perspective (different sales skills required).
[This post appeared in AdExchanger on 5/9/2017]
I saw a great presentation at this year’s Industry Preview where Brian Anderson of LUMA Partners presented on the future of marketing clouds. His unifying marketechture drawings looked like an amalgamation of various whiteboarding sessions I have had recently with big enterprise marketers, many of whom are building the components of their marketing “stacks.” Marketers are feverishly licensing offerings from all kinds of big software companies and smaller adtech and martech players to build a vision that can be summed up like this:
The Data Management Layer
Today’s “stack” really consists of three individual layers when you break it down. The first layer, Data Management (DM), contains all of the “pipes” used to connect people identity together. Every cloud needs to take data in from all kinds of sources, such as internet cookies, mobile IDs, hashed e-mail identity keys, purchase data, and the like. Every signal we can collect results in a richer understanding of the customer, and the DM layer needs access to rich sets of first, second, and third-party data to paint the clearest picture.
The DM layer also needs to tie every single ID and attribute collected to an individual, so all the signals collected can be leveraged to understand their wants and desires. This identity infrastructure is critical for the enterprise; knowing that you are the same guy who saw the display ad for the family minivan, and visited the “March Madness Deals” page on the mobile app goes a long way to attribution. But the DM layer cannot be constrained by anonymous data. Today’s marketing stacks must leverage DMPs to understand pseudonymous identity, but must find trusted ways to mix PII-based data from e-mail and CRM systems. This latter notion has created a new category—the “Customer Data Platform” (CDP), and also resulted in the rush to build data lakes as a method of collecting a variety of differentiated data for analytics purposes.
Finally, the DM layer must be able to seamlessly connect the data out to all kinds of activation channels, whether they are e-mail, programmatic, social, mobile, OTT, or IOT-based. Just as people have many different ID keys, people have different IDs inside of Google, Facebook, Pinterest, and the Wall Street Journal. Connecting those partner IDs to an enterprise’s universal ID solves problems with frequency management and attribution, and offers the ability to sequence messages across various addressable channels.
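The ID connection the DM layer performs can be pictured as a simple lookup structure. The sketch below is purely illustrative (no vendor's actual API; all names are invented): a minimal identity graph mapping an enterprise's universal ID to the partner-specific IDs each activation channel understands, which is what makes cross-channel frequency management and sequencing possible.

```python
# Hypothetical sketch of an identity graph: one universal ID per person,
# linked to the IDs each activation partner uses for that same person.

class IdentityGraph:
    def __init__(self):
        # universal_id -> {partner_name: partner_specific_id}
        self._partner_ids = {}

    def link(self, universal_id, partner, partner_id):
        """Record that `partner_id` on `partner` belongs to this person."""
        self._partner_ids.setdefault(universal_id, {})[partner] = partner_id

    def resolve(self, universal_id, partner):
        """Translate our universal ID into the partner's ID space, if known."""
        return self._partner_ids.get(universal_id, {}).get(partner)

graph = IdentityGraph()
graph.link("u-123", "google", "g-abc")
graph.link("u-123", "facebook", "fb-xyz")

print(graph.resolve("u-123", "google"))    # g-abc
print(graph.resolve("u-123", "snapchat"))  # None: no match for that channel yet
```

In practice the hard part is not the lookup but building the links in the first place, through deterministic and probabilistic matching.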
You can’t have a marketing cloud without data management. This layer is the “who” of the marketing cloud—who are these people and what are they like?
The Orchestration Layer
The next thing marketers need to have (and they often build it first, in pieces) is an orchestration layer. This is the “When, Where, and How” of the stack. E-mail systems can determine when to send that critical e-mail; marketing automation software can decide whether to put someone in a “nurture” campaign, or have a salesperson call them right away; DSPs decide when to bid on a likely internet surfer, and social management platforms can tell us when to Tweet or Snap. Content management systems and site-side personalization vendors orchestrate the perfect content experience on a web page, and dynamic creative optimization systems have gotten pretty good at guessing which ad will perform better for certain segments (show the women the high-heeled shoe ad, please).
The “when” layer is critical for building smart customer journeys. If you get enough systems connected, you start to realize the potential for executing on the “right person, right message, right time” dynamic that has been promised for many years, but never quite delivered at scale. Adtech has been busy nailing the orchestration of display and mobile messages, and the big social platforms have been leveraging their rich people data to deliver relevant messages. However, with lots of marketing money and attention still focused on e-mail and broadcast, there is plenty of work to be done before marketers can build journeys that feature every touchpoint their customers are exposed to.
Marketers today are busy building connectors to their various systems and getting them to talk to each other to figure out the “when, where, and how” of marketing.
The Artificial Intelligence Layer
When every single marketer and big media company owns a DMP, and has figured out how to string their various orchestration platforms together, it is clear that the key point of differentiation will reside in the AI layer. Artificial intelligence represents the “why” problem in marketing—why am I e-mailing this person instead of calling her? Should I be targeting this segment at all? Why does this guy score highly for a new car purchase, and this other guy who looks similar doesn’t? What is the lifetime value of this new business traveler I just acquired?
While the stacks have tons of identity data, advertising data, and sales data, they need a brain to analyze all of that data and decide how to use it most effectively. As marketing systems become more real-time and more connected to on-the-go customers than ever before, artificial intelligence must drive millions of decisions quickly, gleaned from billions of individual data points. How does the soda company know when to deliver an ad for water instead of diet soda? It requires understanding location, the weather, the person, and what they are doing in the moment. AI systems are rapidly building their machine learning capabilities and connecting into orchestration systems to help with decisioning.
All Together Now
The layer cake is a convenient way to look at what is happening today. The vision for tomorrow is to squish the layer cake together in such a way that enterprises get all of that functionality in a single cake. In four or five years, every marketing orchestration system will have some kind of built-in DMP—or seamless connections to any number of them. We see this today with large DSPs; they all need an internal data management system for segmentation. Tomorrow’s orchestration systems will all have built-in artificial intelligence as a means for differentiation. Look at e-mail orchestration today. It is not sold on its ability to deliver messages to inboxes, but rather on its ability to provide that service in a smarter package to increase open rates and provide richer analytics.
It will be fun to watch as these individual components come together to form the marketing clouds of the future. It’s a great time to be a data-driven marketer!
[This post was originally published April 4, 2017 on the Econsultancy blog.]
How Beacons Might Alter The Data Balance Between Manufacturers And Retailers
As Salesforce integrates DMP Krux, Chris O’Hara considers how proximity-based personalization will complement access to first-party data. For one thing, imagine how coffeemakers could form the basis of the greatest OOH ad network.
For years, marketers have been talking about building a bridge between their existing customers, and the potential or yet-to-be-known customer.
Until recently, the two have rarely been connected. Agencies have separate marketing technology, data and analytics groups. Marketers themselves are often separated organizationally between “CRM” and “media” teams – sometimes even by a separate P&L.
Of course, there is a clearer dividing line between marketing tech and ad tech: personally identifiable information, or PII. Marketers today have two different types of data, from different places, with different rules dictating how it can be used.
In some ways, it has been natural for these two marketing disciplines to be separated, and some vendors have made a solid business from the work necessary to bridge PII data with web identifiers so people can be “onboarded” into cookies.
After all, marketers are interested in people, from the very top of the funnel when they visit a website as an anonymous visitor, all the way down the bottom of the funnel, after they are registered as a customer and we want to make them a brand advocate.
It would be great — magic even — if we could accurately understand our customers all the way through their various journeys (the fabled “360-degree view” of the customer) and give them the right message, at the right place and time. The combination of a strong CRM system and an enterprise data management platform (DMP) brings these two worlds together.
Much of this work is happening today, but it’s challenging with lots of ID matching, onboarding, and trying to connect systems that don’t ordinarily talk to one another. However, when CRM and DMP truly come together, it works.
What are some use cases?
Targeting people who haven’t opened an email
You might be one of those people who don’t open or engage with every promotional email in your inbox, or uses a smart filter to capture all of the marketing messages you receive every month.
To an email marketer, these people represent a big chunk of their database. Email is without a doubt one of the most effective digital marketing channels, even though as few as 5% of people who engage are active buyers. It’s also a relatively straightforward way to predict return on advertising spend, based on historical open and conversion rates.
The connection between CRM and DMP enables the marketer to reach the 95% of their database everywhere else on the web, by connecting that (anonymized) email ID to the larger digital ecosystem: places like Facebook, Google, Twitter, advertising exchanges, and even premium publishers.
Understanding where the non-engaged email users are spending their time on the web, what they like, their behavior, income and buying habits is all now possible. The marketer has the “known” view of this customer from their CRM, but can also utilise vast sets of data to enrich their profile, and better engage them across the web.
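The "anonymized email ID" behind this kind of matching is typically a one-way hash of the address. The sketch below is a hedged illustration, not any platform's actual onboarding flow: hashing a normalized email (SHA-256 is a common convention, though the exact normalization rules vary by partner) produces a key both sides can compare without exposing PII.

```python
# Illustrative sketch of PII-safe email matching via hashing.
# The normalization rule (strip + lowercase) is an assumption; real
# onboarding partners publish their own exact requirements.
import hashlib

def hashed_email(email: str) -> str:
    """Return a SHA-256 hex digest of the normalized email address."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The 95% of a CRM database that never opens an email...
crm_non_openers = ["Jane.Doe@example.com", "bob@example.com "]
onboarded_keys = {hashed_email(e) for e in crm_non_openers}

# ...can be matched wherever the same person shows up, as long as both
# sides normalize identically before hashing:
assert hashed_email("jane.doe@EXAMPLE.com") in onboarded_keys
```

The match only works when both parties agree on the normalization, which is why mismatched match rates are a persistent complaint in onboarding.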
Combining commerce and service data for journeys and sequencing
When we think of the customer journey, it gets complicated quickly. A typical ad campaign may feature thousands of websites, multiple creatives, different channels, a variety of different ad sizes and placements, delivery at different times of day and more.
When you map these variables against a few dozen audience segments, the combinatorial values get into numbers with a lot of zeros on the end. In other words, the typical campaign may have hundreds of millions of activities — and tens of millions of different ways a customer goes from an initial brand exposure all the way through to a purchase and becoming a brand advocate.
How can you automatically discover the top 10 performing journeys?
Understanding which channels go together, and which sequences work best, can add up to tremendous lift for marketers.
For example, a media and entertainment company promoting a new show recently discovered that doing display advertising all week and then targeting the same people with a mobile “watch it tonight” message on the night it aired produced a 20% lift in tune-in compared to display alone. Channel mix and sequencing work.
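At its simplest, discovering top-performing journeys starts with counting which touchpoint sequences most often precede a conversion. The sketch below is a toy illustration (the event names and data are invented); real journey analytics must also handle attribution windows, deduplication, and far larger path spaces.

```python
# Toy sketch: rank the most common touchpoint sequences that end in a
# conversion, from user-level journey logs. Data here is invented.
from collections import Counter

journeys = [
    ("display", "display", "mobile", "purchase"),
    ("email", "display", "purchase"),
    ("display", "display", "mobile", "purchase"),
    ("social", "purchase"),
]

# Count the path of touchpoints leading up to each purchase.
converting_paths = Counter(j[:-1] for j in journeys if j[-1] == "purchase")

for path, count in converting_paths.most_common(2):
    print(" -> ".join(path), count)
# display -> display -> mobile appears twice: the strongest pattern here
```

With tens of millions of possible paths, this frequency counting is only a starting point; the "top 10 journeys" question ultimately needs lift measurement against a holdout, not raw counts.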
And that’s just the tip of the iceberg — we are only talking about web data.
What if you could look at a customer journey and find out that the call-to-action message resonated 20% higher one week after a purchase?
A pizza chain that tracks orders in its CRM system can start to understand the cadence of delivery (e.g. Thursday night is “pizza night” for the Johnson family) and map its display efforts to the right delivery frequency, ensuring the Johnsons receive targeted ads during the week, and a mobile coupon offer on Thursday afternoon, when it’s time to order.
How about a customer that has called and complained about a missed delivery, or a bad product experience? It’s probably a terrible idea to try and deliver a new product message when they have an outstanding customer ticket open. Those people can be suppressed from active campaigns, freeing up funds for attracting net new customers.
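Suppression is one of the simplest CRM-to-DMP use cases to reason about. The sketch below is a minimal, hypothetical illustration (function and field names are invented): filter the active-campaign audience against the set of customers with open service tickets.

```python
# Minimal sketch of campaign suppression: anyone with an open service
# ticket is removed from the targeting list before activation.

def active_audience(segment_ids, open_ticket_ids):
    """Return the segment minus customers with open service tickets."""
    suppressed = set(open_ticket_ids)
    return [uid for uid in segment_ids if uid not in suppressed]

segment = ["u1", "u2", "u3", "u4"]
open_tickets = ["u2"]

print(active_audience(segment, open_tickets))  # ['u1', 'u3', 'u4']
```

The payoff is the one described above: media dollars stop being wasted on people you should not be messaging, and that budget can go toward net new customers instead.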
There are a lot of obvious use cases that come to mind when CRM data and web behavioral data is aligned at the people level. It’s simple stuff, but it works.
As marketers, we find ourselves seeking more and more precise targeting but, half the time, knowing when not to send a message is the more effective action.
As we start to see more seamless connections between CRM (existing customers) and DMPs (potential new customers), we imagine a world in which artificial intelligence can manage the cadence and sequence of messages based on all of the data — not just a subset of cookies, or email open rate.
As the organizational and technological barriers between CRM and DMP break down, we are seeing the next phase of what Gartner says is the “marketing hub” of interconnected systems or “stacks” where all of the different signals from current and potential customers come together to provide that 360-degree customer view.
It’s a great time to be a data-driven marketer!
Chris O’Hara is the head of global marketing for Krux, the Salesforce data management platform.
The term “real time” is bandied about in the ad technology space almost as heavily as the word “programmatic.”
Years later, the meaning of programmatic is finally starting to be realized, but we are still a few years away from delivering truly real-time experiences. Let me explain.
The real-time delivery of targeted ads basically comes down to user matching. Here is a common use case: A consumer visits an auto site, browses a particular type of minivan, leaves the site and automatically sees an ad on the very next site he or she visits. That’s about as “real-time” as it gets.
How did that happen? The site updated the user segment to include “minivan intender,” processed the segment immediately and sent that data into a demand-side platform (DSP) where the marketer’s ID was matched with the DSP’s ID and delivered with instructions to bid on that user. That is a dramatic oversimplification of the process but clearly many things must happen very quickly – within milliseconds – and perfectly for this scenario to occur.
Rocket Fuel, Turn and other big combo platforms have an advantage here because they don’t need to match users across an integrated data-management platform (DMP) and DSP. As long as marketers put their tags on their pages and stay within the confines of a single execution system, this type of retargeting gets close to real time.
However, as soon as the marketer wants to target that user through another DSP or in another channel, user matching comes back into play. That means pushing the “minivan intender” ID into a separate system, and the “real-time” nature of marketing starts to break down. That’s a big problem because today’s users move quickly between channels and devices and are not constrained by the desktop-dominated world of 10 years ago.
User matching has its own set of challenges, from a marketer’s ability to match users across their devices to how platforms like DMPs match their unique IDs to those of execution platforms like DSPs. Assuming the marketer has mapped the user to all of his or her device IDs, which is a daunting challenge, the marketer’s DMP has to match that user as quickly as possible to the execution platform where the ads are going to be targeted and run.
Let’s think about how that works for a second. Let’s say the marketer has DMP architecture in the header of the website, which enables a mom to be placed in the “minivan” segment as soon as the page loads. After processing the segment, it must be immediately sent to the DSP. Now the DSP has to add that user (or bunch of users) to their “minivan moms” segment. If you picture the internet ID space as a big spreadsheet, what is happening is that all the new minivan moms are added to the DSP’s big existing table of minivan moms so they are part of the new targeting list.
Some DSPs, such as The Trade Desk, TubeMogul and Google’s DBM, do this within hours or minutes. Others manage this updating process nightly by opening up a “window” where they accept new data and process it in “batches.” Doesn’t sound very “real-time” at all, does it?
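The difference between those two processing models can be sketched in a few lines. The classes below are purely illustrative (no actual DSP works exactly this way): one makes a new segment member targetable the moment the ID arrives, the other queues IDs until a batch window opens.

```python
# Illustrative contrast: real-time vs. batch segment ingestion on the
# DSP side. Class and segment names are invented for this sketch.

class RealTimeDSP:
    def __init__(self):
        self.segments = {}  # segment name -> set of targetable user IDs

    def receive(self, segment, user_id):
        # New IDs become targetable as soon as they arrive.
        self.segments.setdefault(segment, set()).add(user_id)

class BatchDSP:
    def __init__(self):
        self.segments = {}
        self._pending = []  # IDs wait here until the processing window

    def receive(self, segment, user_id):
        # New IDs queue up until the nightly batch runs.
        self._pending.append((segment, user_id))

    def run_nightly_batch(self):
        for segment, user_id in self._pending:
            self.segments.setdefault(segment, set()).add(user_id)
        self._pending.clear()

rt, batch = RealTimeDSP(), BatchDSP()
rt.receive("minivan_moms", "u-42")
batch.receive("minivan_moms", "u-42")

print("u-42" in rt.segments["minivan_moms"])  # True: targetable immediately
print("minivan_moms" in batch.segments)       # False: not until the batch runs
batch.run_nightly_batch()
print("u-42" in batch.segments["minivan_moms"])  # True, up to a day later
```

That gap of hours is exactly why the "minivan intender" from this morning may not be targetable in a second channel until tomorrow.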
While many DMPs can push segments in real time, the practical issue remains the ability of all the addressable channels a marketer wants to target to “catch” that data and make it available. The good news is that the speed at which execution channels are starting to process data is increasing every day as older ad stacks are re-engineered with real-time back-end infrastructure. The bad news is that until that happens, things like global delivery management and message sequencing across channels will remain overly dependent upon how marketers choose to provision their “stacks.”
The Future Is Dynamic
Despite the challenges in the real-life execution of real-time marketing, there are things happening that will put the simple notion of retargeting to shame. Everything we just discussed depends on a user being part of a segment. I probably exist as a “suburban middle-aged male sports lover with three kids” in a variety of different systems. Sometimes I’m an auto intender and sometimes I’m a unicorn lover, depending on who is using the family desktop, but my identity largely remains static. I’m going to be middle aged for a long time, and I’m always going to be a dad.
But marketers care about a lot more than that. The beer company wants to understand why sometimes I buy an ice-cold case of light beer (I’m about to watch a football game, and I might drink three or four of them with friends) and when I buy a six-pack of their craft-style ale (I’m going to have one or two at the family dinner table).
The soda company is competing for my “share of thirst” with everything from coffee to the water fountain. They want to know what my entry points are for a particular brand they sell. Is it their sports drink because I’m heading to the basketball court on a hot day, or is it a diet cola because I’m at the baseball game? The coffee chain wants to know whether I want a large hot coffee (before work) or an iced latte macchiato (my afternoon break).
This brings up the idea of dynamic segmentation: Although I am always part of a static segment, the world changes around me in real time. The weather changes, my location changes, the time changes and the people around me change constantly. What if all of that dynamic data could be constantly processed in the background and appended to static segments at the moment of truth?
In a perfect world, where the machines all talked to each other in real time and spoke the same language, this might be called real-time dynamic segmentation.
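One way to picture real-time dynamic segmentation is static traits on the profile, with contextual signals appended at decision time. The sketch below is a toy under invented assumptions (all trait names, thresholds, and rules are made up for illustration):

```python
# Toy sketch of dynamic segmentation: static profile traits are enriched
# with contextual signals (weather, time, location) at the moment of
# decision. Every rule and name here is invented for illustration.

def dynamic_segment(static_traits, context):
    """Combine a static profile with real-time context into one segment."""
    traits = set(static_traits)
    if context.get("temp_f", 70) > 85:
        traits.add("hot_weather")          # e.g. pitch water, not hot coffee
    traits.add("morning" if context.get("hour", 12) < 12 else "afternoon")
    if context.get("location") == "stadium":
        traits.add("at_event")             # the "share of thirst" moment
    return traits

profile = {"middle_aged_dad", "sports_lover"}
context = {"temp_f": 92, "hour": 15, "location": "stadium"}

print(sorted(dynamic_segment(profile, context)))
# ['afternoon', 'at_event', 'hot_weather', 'middle_aged_dad', 'sports_lover']
```

The static traits persist for years; the contextual ones may only be true for the next hour, which is why they have to be computed at activation time rather than stored.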
This is the future of “programmatic,” whatever that means.
[This originally appeared in AdExchanger on 8/31/2016]
It’s a hoary old chestnut, but “understanding the customer journey” in a world of fragmented consumer attention and multiple devices is not just an AdExchanger meme. Attribution is a big problem, and one that marketers pay dearly for. Getting away from last-touch models is hard to begin with. Add in the fact that many of the largest marketers have no actual relationship with the customer (such as CPG, where the customer is actually a wholesaler or retailer), and it gets even harder. Big companies are selling big money solutions to marketers for multi-touch attribution (MTA) and media-mix modeling (MMM), but some marketers feel light years away from a true understanding of what actually moves the sales needle.
As marketers take more direct ownership of their own customer relationships via data management platforms, “consumer data platforms” and the like, they are starting to obtain the missing pieces of the measurement puzzle: highly granular, user-level data. Now marketers are pulling in not just media exposure data but also offline data such as beacon pings, point-of-sale data (where they can get it), modeled purchase data from vendors like Datalogix and IRI, weather data and more to build a true picture. When that data can be associated with a person through a cross-device graph, it’s like going from a blunt 8-pack of Crayolas to a full set of Faber-Castells.
Piercing the Retail Veil
Think about the company that makes single-serve coffee machines. Some make their money on the coffee they sell, rather than the machine—but they have absolutely no idea what their consumers like to drink. Again, they sell coffee but don’t really have a complete picture of who buys it or why. Same problem for the beer or soda company, where the sale (and customer data relationship) resides with the retailer. The default is to go to panel-based solutions that sample a tiny percentage of consumers for insights, or to wait for complicated and expensive media mix models to reveal what drove sales lift. But what if a company could partner with a retailer and a beacon company to understand how in-store visitation and even things like an offline visit to a store shelf compared with online media exposure? The marketer could use geofencing to understand where else consumers shopped, offer a mobile coupon so the user could authenticate upon redemption, get access to POS data from the retailer to confirm purchase and understand basket contents—and ultimately tie that data back to media exposure. That sounds a lot like closed-loop attribution to me.
Overcoming Walled Gardens
Why do specialty health sites charge so much for media? Like any other walled garden, they are taking advantage of a unique set of data—and their own data science capabilities—to better understand user intent. (There’s nothing wrong with that, by the way.) If I’m a maker of allergy medicine, the most common trigger for purchase is probably the onset of an allergy attack, but how am I supposed to know when someone is about to sneeze? It’s an incredibly tough problem, but one that the large health site can solve, largely thanks to people who have searched for “hay fever” online. Combine that with a 7-day weather forecast, pollen indices, and past search intent behavior, and you have a pretty good model for finding allergy sufferers. However, almost all of that data—plus past purchase data—can be ingested and modeled inside a marketer DMP, enabling the allergy medicine manufacturer to segment those users in a similar way—and then use an overlap analysis to find them on sites with $5 CPMs, rather than $20. That’s the power of user modeling. Why don’t sites like Facebook give marketers user-level media exposure data? The question answers itself.
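The overlap analysis described above boils down to simple set arithmetic. Here is a toy sketch; the user IDs, site names and CPMs are all invented for illustration, and a real DMP would run this against billions of IDs, not a handful.

```python
# Sketch of an overlap analysis: compare a DMP-modeled "allergy intender"
# segment against each site's audience to see where it can be reached cheaply.
modeled_segment = {"u1", "u2", "u3", "u4", "u5"}         # modeled users
site_audiences = {
    "premium_health_site": {"u1", "u2", "u9"},           # $20 CPM
    "general_news_site": {"u1", "u2", "u3", "u4", "u8"}, # $5 CPM
}

def overlap_rate(segment, audience):
    """Share of the modeled segment reachable on a given site."""
    return len(segment & audience) / len(segment)

# Reach of the modeled segment on each site.
reach = {site: overlap_rate(modeled_segment, aud)
         for site, aud in site_audiences.items()}
```

In this made-up example, the cheaper general-news site actually reaches more of the modeled segment than the premium health site, which is exactly the arbitrage the passage describes.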
Understanding the Full Journey
Why is data science “the new measurement”? Because when a marketer has all of that data at their fingertips, something close to true attribution becomes possible. Now that marketers have the right tools to draw with, the winners are going to be the ones with the most artists (data scientists).
It’s a really interesting space to watch. More and more data is becoming available to marketers, who are increasingly owning the data and technology to manage it, and the models are growing more powerful and accurate with every byte of data that enters their systems.
It’s a great time to be a data-driven marketer!
[This post originally appeared in AdExchanger on 8/12/16]
As a longtime digital practitioner, I sometimes feel ashamed that I haven’t clicked on many banner ads in the last 10 years or so. It’s not that I don’t like banner ads. I recognize that advertising is the thing that supports all of the great content I read. I don’t even mind lots of ads in my paid, expensive, print and online versions of The Wall Street Journal – I sometimes even read them.
But standard banners rarely get any consideration or clicks from me, unless they are incredibly relevant. Standard banner ads aren’t particularly engaging – and the marketers buying them are getting frustrated with an ecosystem rife with fraud, technology taxes and nonhuman traffic.
Some of the world’s top marketers are actively working on “ban-the-banner” initiatives, driven by the theory that nothing but engagement matters – a KPI more easily correlated to watching an entire video than the much-maligned click. They believe great brands should tell great stories, so it seems obvious that the scant real estate and functionality offered by banner slots makes creating consumer engagement difficult, if not impossible.
At the intersection of an amazing technology-driven programmatic buying landscape and the increasingly creative social-led atmosphere of the new web is online video. It kind of snuck up on us, steadily creeping into our social feeds, blogs and favorite website destinations. That’s a very good thing: The reason linear television continues to command the lion’s share of media dollars is that people like to be entertained, and watching something is so much easier than reading something. Watching is a passive experience, but an emotional one.
Video is a place where brands can tell amazing stories, make a great pitch and drive consumer engagement. After years of perfecting 15-second, 30-second and one-minute spots, media agencies are eager to leverage their linear creative in new formats to reach audiences that seem to be abandoning traditional television in droves. This, coupled with a few other factors, is causing advertisers to rapidly move away from animated banners to video.
Millennials Don’t Like Television
Perhaps the most pressing dynamic forcing more video adoption among marketers is that millennials – who comprise an estimated 80 million-plus US consumers and will spend $200 billion next year – don’t really watch television anymore.
This is both a delivery and a physical dynamic: They do not watch video on television sets as much as they consume it on tablets, phones and other devices, and they prefer on-demand viewing to scheduled programming. It makes sense. Even a 13” laptop with a retina display beats a 70” HDTV when held several inches in front of one’s face.
And streaming services have matured, giving consumers a legitimate option to “unplug” from traditional cable services and consume all the content they want on demand. Marketers must adapt to a reality that makes mobile the top priority for younger consumers, and adjust to the fact that many of the places millennials consume their content are relatively ad-free zones – at least in terms of traditional advertising units. It just so happens that video advertising fits into this new world nicely.
Time-Spent: The New Currency
If video ad delivery is going to be the mainstream unit, then it also follows that things like impressions and clicks are quickly becoming irrelevant. For video, the coin of the realm is time spent, which is actually a pretty strong sign of engagement and a valuable proxy for brand sentiment.
While it may be true that we are forced to consume some pre-roll before engaging with organic content, the best part of online video is that consumers waiting for content on their iPhones are much less likely to take a trip to the kitchen for a snack, as they do when standard commercials come on the tube. Instead of a solid three-minute block of commercials, they only have to engage with a single ad. Also, that ad can be tailored to individual preferences. That means even more engagement, less ad abandonment and a lot of measurability.
Data management platforms are helping marketers segment audiences that are prone to engage through an entire length of video and understand the types of content that produce longer viewing times and true engagement – and modeling those audiences to find lookalikes.
Linear And Online Video Must Connect For True Attribution
Probably the greatest thing about online video is the hope of leveraging data to connect online audiences with linear ones, and getting a better sense of media mix modeling and multitouch attribution.
Comcast certainly gets it. Connecting set-top box data with online ad serving means being able to touch a consumer with video across multiple screens – and bring real measurement of audiences that are increasingly device agnostic. Large telecoms, such as Verizon, are acquiring companies that provide the “last mile” of value from their broadband pipes, and that mile is as much about online video ad delivery as it is about website content.
The battle to do this correctly will be won at the “people level,” which is why we are seeing such a pitched battle over cross-device graphs; unless marketers can connect people with all of their devices, true attribution is simply impossible, rather than just being hard.
It’s an interesting time for modern marketers and publishers as they try to grow out of what we will come to see as the very early days of addressable advertising, and into a world dominated by on-demand content across a multitude of screens. The common denominator is video advertising, and I’m going long on the companies in the ecosystem that are going to power this new reality.
[This article originally appeared in AdExchanger on 6.30.16]
The report looks at the challenges and opportunities for agencies that want to become trusted stewards of their clients’ data.
I sat down with the author, Chris O’Hara, to find out more.
Q. It seems like the industry press is continually heralding the decline of media agencies, but they seem to be very much alive. What’s your take on the current landscape?
For a very long time, agencies have been dependent upon using low-cost labor for media planning and other low-value operational tasks.
While there are many highly skilled digital media practitioners – strategists and the like – agencies still work against “cost-plus” models that don’t necessarily map to the new realities in omnichannel marketing.
Over the last several years as marketers have come to license technology – data management platforms (DMP) in particular – agencies have lost some ground to the managed services arms of ad tech companies, systems integrators, and management consultancies.
Q. How do agencies compete?
Agencies aren’t giving up the fight to win more technical and strategic work.
Over the last several years, we have seen many smaller, data-led agencies pop up to support challenging work – and we have also seen holding companies up-level staff and build practice groups to accommodate marketers that are licensing DMP technology and starting to take programmatic buying “in-house.”
It’s a trend that is only accelerating as more and more marketer clients are hiring Chief Data Officers and fusing the media, analytics, and IT departments into “centers of excellence” and the like.
Not only are agencies starting to build consultative practices, but it looks like traditional consultancies are starting to build out agency-like services as well.
Not long ago, you wouldn’t have thought of names like Accenture, McKinsey, Infinitive, and Boston Consulting Group in connection with digital media, but they are working closely with a lot of Fortune 500 marketers to do things like DMP and DSP (demand-side platform) evaluations, programmatic strategy, and even creative work.
We are also seeing CRM-type agencies like Merkle and Epsilon acquire technologies and partner with big cloud companies as they start to work with more of a marketer’s first-party data.
As services businesses, they would love to take share away from traditional agencies.
Q. Who is winning?
I think it’s early days in the battle for supremacy in data-driven marketing, but I think agencies that are nimble and willing to take some risk upfront are well positioned to be successful.
They are the closest to the media budgets of marketers, and those with transparent business models are strongly trusted partners when it comes to bringing new products to market.
Also, as creative starts to touch data more, this gives them a huge advantage.
You can be as efficient as possible in terms of reaching audiences through technology, but at the end of the day, creative is what drives brand building and ultimately sales.
Q. Why should agencies embrace DMPs? What is in it for them? It seems like yet another platform to operate, and agencies are already managing DSPs, search, direct buys, and things like creative optimization platforms.
Ultimately, agencies must align with the marketer’s strategy, and DMPs are starting to become the single source of “people data” that touch all sorts of execution channels, from email to social.
That being said, DMP implementations can be really tough if an agency isn’t scoped (or paid) to do the additional work that the DMP requires.
Think about it: A marketer licenses a DMP and plops a pretty complicated piece of software on an agency team’s desk and says, “get started!”
That can be a recipe for disaster. Agencies need to be involved in scoping the personnel and work they will be required to do to support new technologies, and marketers are better off involving agencies early on in the process.
Q. So, what do agencies do with DMP technology? How can they succeed?
As you’ll read in the new guide, there are a variety of amazing use cases that come out of the box that agencies can use to immediately make an impact.
Because the DMP can control for the delivery of messages against specific people across all channels, a really low-hanging fruit is frequency management.
Doing it well can eliminate anywhere from 10% to 40% of wasteful spending on media that reaches consumers too many times.
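The frequency-management use case reduces to a simple check against a cross-channel exposure count. The sketch below is purely illustrative: the cap value is an assumption, and a production DMP would keep these counts in a distributed store keyed by a universal ID rather than an in-memory dict.

```python
from collections import defaultdict

# Assumed campaign-level cap; real caps vary by campaign and channel.
FREQUENCY_CAP = 4

# user_id -> impressions served so far, across all channels the DMP sees.
exposures = defaultdict(int)

def should_serve(user_id):
    """Suppress the ad once a user has hit the cross-channel cap."""
    if exposures[user_id] >= FREQUENCY_CAP:
        return False  # suppressed impression = reclaimed spend
    exposures[user_id] += 1
    return True
```

Every `False` returned here is an impression that would otherwise have been bought and wasted on an over-exposed consumer.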
Doing analytics around customer journeys is another use case – and one that attribution companies get paid handsomely for.
With this newly discovered data at their fingertips, agencies can start proving value quickly, and build entire practice groups around media efficiency, analytics, data science – even leverage DMP tech to build specialized trading desks. There’s a lot to take advantage of.
Q. You interviewed a lot of senior people in the agency and marketer space. Are they optimistic about the future?
Definitely. It’s sort of a biased sample, since I interviewed a lot of practitioners that do data management on a daily basis.
But I think ultimately everyone sees the need to get a lot better at digital marketing and views technology as the way out of what I consider to be the early and dark ages of addressable marketing.
The pace of change is very rapid, and I think we are seeing that people who really lean into the big problems of the moment like cross-device identity, location-based attribution, and advanced analytics are future-proofing themselves.
Although it’s starting to become a well-worn aphorism, “data is the new oil” resonates more than ever. Like oil, data is an abundant resource, but it doesn’t become useful until it is refined for use and turned into fuel.
Without the proper refinement, big data may be worthless. The stock of big data unicorn Palantir, for example, sank on news that it lost key client relationships due to a lack of perceived value. The company collected abundant data from CPG companies but was unable to apply it to practical use cases, according to a recent article.
Marketers are starting to turn away from using abundant, yet commoditized, third-party data sources in exchanges and move toward creating peer-to-peer data relationships and leveraging second-party data for targeting. This speaks to the refinement of targeting data: Better quality in the raw materials always yields more potent fuel for performance. Not all data is the same, and not every technology platform can spin data straw into gold.
Marketers have been using available data for addressable marketing for years, but now are starting to mine their own data and get value from the information they collect from registrations, mobile applications, media performance and site visitation. Data management platforms (DMPs) are helping them collect, refine, normalize and associate their disparate first-party data with actual people for targeting.
This is a beautiful thing. Technology is enabling marketers to mine their own data and own it. Yet many marketers are still just scraping the surface of what they can do, and using data primarily for the targeting of addressable media.
Some, however, are starting to deliver customer experiences that go beyond targeting display advertising by using data to shape the way consumers interact with their brands beyond media.
The case for personalization – customer experience management, or CX – is palpable. When the Watermark Group studied [PDF] the cumulative stock performance of Forrester Research-rated “leaders” or “laggards” in customer experience, the results were staggering. During a period in which the S&P 500 grew by 72%, those focused on personalized experiences outperformed the market by 35%, and the laggards underperformed by 45% on average. That’s a delta of nearly 80% in stock price performance between the winners and losers.
Moreover, 89% of customers who have an unsatisfactory experience will leave a brand, according to a recent study; the cost of reacquiring a churned customer can run up to seven times the amount it took to win a new customer.
The stakes could not be higher for marketers and publishers looking to drive bottom-line performance. For many companies, whether they are marketing print or online subscriptions, promoting their content or selling products off the shelf, it’s hard to justify to their CFOs the heavy costs associated with licensing platforms to gather the right data and use that data to drive relevant customer experiences. Yet, when looking at big-company priorities across multiple surveys, the desire to “create more relevant customer experiences” is right up there with “earn more revenue” and “increase profits.” Why?
The simple answer is that customer experience has an enormous impact on both revenue and profitability. Giving new customers the right experience provides a higher probability of winning them, and giving existing customers relevant experiences reduces churn – and creates opportunities to sell them more products, more often. When both top-line revenue and profitability can be driven through a single initiative, most CFOs start to invest and will continue to invest as results confirm the initial thesis.
Take the “heavy user” of a quick-service restaurant who dines several times a week and consistently transacts an above-average per-visit receipt. QSRs understand the impact these valued customers have on the bottom line. These users provide a strong baseline of predictable revenue, are usually the first to try new product offerings and respond to market-facing initiatives, such as discounting and couponing, which can strategically increase short-term receipts. Smart marketers should not be content to sit back and let this valuable segment remain stagnant or find new offerings with a competitive restaurant. They must show these users that they are valued, ensure they retain or increase store visits and keep them away from the hamburger next door.
That can be as simple as offering a coupon for a regular’s favorite order. Or it can be as complex as developing a mobile application that enables the customer to order his food in advance and pick it up as soon as it’s ready.
Since the restaurant collects point-of-sale data and has authenticated user registration data from the mobile app, it can now personalize the customer’s order screen with his most popular orders to shorten the mobile ordering experience. Perhaps the app can offer special discounts to frequent diners for trying – and rating – new menu items. When on the road, the app can recommend other locations and direct him right to the drive-in window through popular map APIs. The possibilities are endless when you start to imagine how data can drive your next customer interaction.
Marketers and publishers are quickly embracing their first-party data and aligning it with powerful applications that drive customer experience, increase profits, reduce customer churn and boost lifetime value.
It’s a great time to be a data-driven marketer.
[This post originally appeared in AdExchanger on 5/23/16]
Any AdExchanger reader probably knows more about data management technology than the average Joe, but many probably associate data management platforms (DMPs) with creating audience segments for programmatic media.
While segmentation, audience analytics, lookalike modeling and attribution are currently the primary use cases for DMP tech, there is so much more that can be done with access to all that user data in one place. These platforms sitting at the center of a marketer’s operational stack can make an impact far beyond paid media.
As data platforms mature, both publishers and marketers are starting to think beyond devices and browsers, and putting people in the center of what they do. Increasingly, this means focusing on giving the people what they want. In some cases that means no ads at all, while in others it’s the option to value certain audiences over others and serve them an ad first or deliver the right content – not just ads – based on their preferences.
Beyond personalization, there are DMP plays to be made in the areas of ad blocking and header bidding.
DMPs see a lot of browsers and devices on a monthly basis and strive to aggregate those disparate identities into a single user or universal consumer ID. They are also intimately involved in the serving of ads by either ingesting ad logs, deploying pixels or having a server-to-server connection with popular ad servers. This is great for influencing the serving of online ads across channels, but maybe it can help with one of the web’s most perplexing problems: the nonserving of ads.
With reports of consumers using applications to block as many as 10% of ads, wouldn’t it be great to know exactly who is blocking those ads? For publishers, that might mean identifying those users and suppressing them from targeting lists so they can help marketers get a better understanding of how much reach they have in certain audience segments. Once the “blockers” are segmented, publishers can get a fine-grained understanding of their composition, giving them insights about what audiences are more receptive to having ad-free or paid content experiences.
A lot of these issues are being solved today with specialized scripts that either aren’t very well coded, leading to page latency, or are scripted in-house, adding to complexity. Scripts trigger the typical “see ads or pay” notifications, which publishers have seen become more effective over time. The DMP, already installed and residing in the header across the enterprise, can provision this small feature alongside the larger application.
Speaking of DMP architecture being in the header, I often wonder why publishers who have a DMP installed insist on deploying a different header-bidding solution to manage direct deals. Data management tech essentially operates by placing a control tag within the header of a publisher website, enabling a framework that gives direct and primary access to users entering the page. Through an integration with the ad server, the DMP can easily and quickly decide whether or not to deliver high-value “first looks” at inventory.
Today, the typical large publisher has a number of supply-side platforms (SSPs) set up to handle yield management, along with possibly several pieces of infrastructure to manage that critical programmatic direct sale. Publishers can reduce complexity and latency by simply using the pipes that have already been deployed for the very reason header bidding exists: understanding and managing the serving of premium ads to the right audiences.
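The DMP-as-header-bidder argument comes down to a decision function at impression time: route the impression to a direct deal when the user belongs to a high-value segment, otherwise fall through to the open exchange. This is a hypothetical sketch of that logic; the segment names, deal floors and return shape are all invented, and real integrations happen between the DMP, ad server and SSPs rather than in a single function.

```python
# Hypothetical "first look" decision inside a DMP-powered header integration.
# CPM floors for direct deals, keyed by audience segment (invented numbers).
direct_deals = {"auto_intender": 12.0, "luxury_traveler": 18.0}

def first_look(user_segments, open_exchange_bid):
    """Return the winning demand source for this impression."""
    eligible = [(seg, cpm) for seg, cpm in direct_deals.items()
                if seg in user_segments]
    if eligible:
        best_seg, best_cpm = max(eligible, key=lambda pair: pair[1])
        # Give the direct deal the first look only if it beats the open market.
        if best_cpm >= open_exchange_bid:
            return ("direct", best_seg, best_cpm)
    return ("open_exchange", None, open_exchange_bid)
```

Because the DMP already sits in the header and already knows the user’s segments, this decision adds no new tag and no extra network hop.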
Maybe publishers should be thinking about header bidding in a new way. Header-bidding tags are just another tag on the page. Those with tag management-enabled DMPs could have their existing architecture handle that – a salient point made recently by STAQ’s James Curran.
Curran also noted that the DMP has access, through ad log ingestion, to how much dough publishers get from every drop in the waterfall, including private marketplaces, programmatic direct, header bidding and the open exchanges. Many global publishers are looking at the DMP inside their stack as a hub that can see the pricing landscape at an audience level and power ad servers and SSPs with the type of intelligent decisioning that supercharges yield management.
In ad technology, we talk a lot about the various partners enabling “paid, owned and earned” exposures to consumers, but we usually think of DMPs as essential only for the paid part.
But the composition of a web page, for example, is filled with dozens of little boxes, each capable of serving a display ad, video ad, social widget or content. Just as the DMP can influence the serving of ads into those little boxes, it can also influence the type of content that appears to each user. The big automaker might want to show a muscle car to that NASCAR Dad when he hits the page or a shiny new SUV to the Suburban Mom who shuttles the kids around all day.
Or, a marketer with a lot of its own content (“brands are publishers,” right?) may want to recommend its own articles or videos based on the browsing behavior of an anonymous user. The big global publisher may want to show a subscriber of one magazine a series of interesting articles from its other publications, possibly outperforming the CPA deals it has with third parties for subscription marketing.
This one-to-one personalization is possible because DMPs can capture not only the obvious cookie data but also the other 60% of user interactions and data, including mobile apps, mobile web, beacon data and even modeled propensity data from a marketer or publisher’s data warehouse.
Wouldn’t it be cool to serve an ad for a red car when the user has a statistically significant overlap with 10,000 others who have purchased red cars in the past year? That’s how to apply data science to drive real content personalization, rather than typical retargeting.
These are just some of the possibilities available when you start to think of the DMP as not just a central part of the ad technology “stack” but the brains behind everything that can be done with audiences. This critical infrastructure is where audience data gets ingested in real time, deployed to the right channels at speed and turned into insights about people. In a short period of time, the term “DMP” will likely be shorthand for the simple audience targeting use case inside of the data-driven marketing hub.
It’s a great time to be a data-driven marketer.
We’ve been hearing about big data driving marketing for a long time, and to be honest, most of it is purely aspirational.
Using third-party data to target an ad in real time does deploy some back-end big-data architecture for sure. But the real promise of data-driven marketing has always been that computers, which can crunch more data than people and do it in real time, could find the golden needle of insight in the proverbial haystack of information.
This long-heralded capability is finally moving beyond the early adopters and starting to “cross the chasm” into early majority use among major global marketers and publishers.
Leveraging Machine Learning For Segmentation
Now that huge global marketers are embracing data management technology, they are finally able to start activating their carefully built offline audience personas in today’s multichannel world.
Big marketers were always good at segmentation. All kinds of consumer-facing companies already segment their customers along behavioral and psychographic dimensions. Big Beer Company knows how different a loyal, light-beer-drinking “fun lover” is from a trendsetting “craft lover” who likes new music and tries new foods frequently. The difference is that now they can find those people online, across all of their devices.
The magic of data management, however, is not just onboarding offline identities to the addressable media space. Think about how those segments were created. Basically, an army of consultants and marketers took loads of panel-based market data and gut instincts and divided their audience into a few dozen broad segments.
There’s nothing wrong with that. Marketers were working with the most, and best, data available. Those concepts around segmentation were taken to market, where loads of media dollars were applied to find those audiences. Performance data was collected and segments refined over time, based on the results.
In the linear world, those segments are applied to demographics, where loose approximations are made based on television and radio audiences. It’s crude, but the awesome reach power of broadcast media and friendly CPMs somewhat obviate the need for precision.
In digital, those segments find closer approximation with third-party data, similar to Nielsen Prizm segments and the like. These approximations are sharper, but in the online world, precision means more data expense and less reach, so the habit has been to translate offline segments into broader demographic buckets, such as “men who like sports.”
What if, instead of guessing which online attributes approximated the ideal audience and creating segments from a little bit of data and lot of gut instinct, marketers could look at all of the data at once to see what the important attributes were?
No human being can take the entirety of a website’s audience, which probably shares more than 100,000 granular data attributes, and decide what really matters. Does gender matter for the “Mom site”? Obviously. Having kids? Certainly. Those attributes are evident, and they’re probably shared widely across a great portion of the audience of Popular Mom Site.
But what really defines the special “momness” of the site that only an algorithm can see? Maybe there are key clusters of attributes among the most loyal readers that are the things really driving the engagement. Until you deploy a machine to analyze the entirety of the data and find out which specific attributes cluster together, you really can’t claim a full understanding of your audience.
It’s all about correlations. Of course, it’s pretty easy to find a correlation between only two distinct attributes, such as age and income. But think about having to do a multivariable correlation on hundreds of different attributes. Humans can’t do it. It takes a machine-learning algorithm to parse the data and find the unique clusters that form among a huge audience.
Welcome to machine-discovered segmentation.
Machines can quickly look across the entirety of a specific audience and figure out how many people share the same attributes. Any time folks cluster together around more than five or six specific data attributes, you arguably have struck gold.
Say I’m a carmaker that learned that some of my sedan buyers were men who love NASCAR. But I also discovered that those NASCAR dads loved fitness and gaming, and I found a cluster of single guys who just graduated college and work in finance. Now, instead of guessing who is buying my car, I can let an algorithm create segments from the top 20 clusters, and I can start finding people predisposed to buy right away.
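A toy version of this idea can be sketched by counting which attribute combinations co-occur across users and surfacing the strongest clusters. The user IDs and attributes below are invented, and a real DMP would run something far more sophisticated (and at vastly larger scale, on frameworks like Spark), but the principle is the same: let the data, not a human, propose the segments.

```python
from collections import Counter
from itertools import combinations

# Hypothetical user -> attribute sets, modeled on the carmaker example.
users = {
    "u1": {"nascar", "fitness", "gaming", "sedan_intender"},
    "u2": {"nascar", "fitness", "gaming", "sedan_intender"},
    "u3": {"finance", "recent_grad", "single", "sedan_intender"},
    "u4": {"finance", "recent_grad", "single", "sedan_intender"},
}

def discover_clusters(profiles, size=3):
    """Count co-occurring attribute combinations of a given size,
    most common first; frequent combos suggest machine-discovered segments."""
    counts = Counter()
    for attrs in profiles.values():
        for combo in combinations(sorted(attrs), size):
            counts[combo] += 1
    return counts.most_common()

top_clusters = discover_clusters(users)
```

In this tiny example, the NASCAR/fitness/gaming cluster and the finance/recent-grad/single cluster both surface without anyone having hand-built them, which is the essence of machine-discovered segmentation.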
This trend is just starting to happen in both publishing and marketing, and it has been made available thanks to the wider adoption of real big-data technologies, such as Hadoop, MapReduce and Spark.
This also opens up a larger conversation about data. If I can look at all of my data for segmentation, is there really anything off the table?
Using New Kinds Of Data To Drive Addressable Marketing
That’s an interesting question. Take a company that manufactures coffee machines for home use. Its loyal customer base buys a machine every five years or so and brews many pods every day.
The problem is that the manufacturer has no clue what the consumer is doing with the machine unless that machine is data-enabled. If a small chip enabled it to connect to the Internet and share data about what was brewed and when, the manufacturer would know everything its customers do with the machine.
Would it be helpful to know that a customer drank Folgers in the morning, Starbucks in the afternoon and Twinings Tea at night? I might want to send the family that brews 200 pods of coffee every month a brand-new machine after a few years for free and offer the lighter-category customers a discount on a new machine.
Moreover, now I can tell Folgers exactly who is brewing their coffee, who drinks how much and how often. I’m no longer blind to customers who buy pods at the supermarket – I actually have hugely valuable insights to share with manufacturers whose products create an ecosystem around my company. That’s possible with real big-data technology that collects and stores highly granular device data.
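A minimal sketch of the retention logic described above, assuming the connected machine reports monthly pod counts; the thresholds, field names and offers are invented for illustration:

```python
def retention_offer(pods_per_month, machine_age_years):
    """Decide an offer from connected-machine brew telemetry (hypothetical thresholds)."""
    if pods_per_month >= 200 and machine_age_years >= 3:
        return "free_replacement_machine"   # reward the heaviest brewers
    if machine_age_years >= 4:
        return "discount_on_new_machine"    # nudge lighter-category customers
    return "no_offer"

print(retention_offer(220, 3))  # the 200-pods-a-month family
print(retention_offer(30, 5))   # a lighter-category household
```

The point is not the rules themselves but that, without device data flowing back, neither branch of this decision is even possible.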
Marketers are embracing big-data technology, both for segmentation and to go beyond the cookie by using real-world data from the Internet of Things to build audiences.
It’s creating somewhat of a “cluster” for companies that are stuck in 2015.
We have all heard about the Democratic Party’s skill with data, and there is no doubt the Obama campaign’s masterful use of first-party registration data to drive online engagement, raise funds and influence political newbies helped put him over the line.
Four years later, the dynamics are mostly similar, but we have moved into a world where mobile is dominant, more young new voters are highly engaged and the standard segmentation – at least on the Republican side – might as well be thrown out the window.
In other words, everyone is getting influenced on their mobile phone, especially through news and social channels. There are a ton more mobile-first, new voters out there, and nobody is really sure which voters make up this weird new Trump segment.
To get a handle on this, political advertisers need to properly onboard and analyze their data to identify who they should target, where they live and what they like.
Understand Voter Identity
In politics, a strong “ground game” is key. That means real, old-school retail politics, such as knocking on doors and getting voters in specific precincts out on Election Day. All campaigns have the voter rolls and can do their fill of direct mail, robocalls and door knocking.
But how do you influence voters who are tethered to their devices all day and night, well before Election Day? It requires a digital strategy that can reach voters across the addressable channels they are on, including display, video, mobile and email. This strategy should leverage an identity graph to ensure the right messaging is hitting the same voter – at the right cadence.
Maybe “Joe the Firefighter,” a disaffected moderate Democrat who has had it with the Clintons, visited the Donald’s website and is ready to “Make America great again.” Before cross-device capabilities were strong, you could only retarget Joe the next time you saw his cookie online.
Today, Joe can get an equity message reinforced on display (“Make America great again!”), a mobile “nudge” to take action when we see him on his tablet at night (“Donate now!”) and a follow-up email a few days before the big rally (“Come see the Donald at the Civic Center!”).
Beyond this capability is the incredibly important task of laddering up individual identity into householding, so we can understand the composition of Joe’s family, since households often vote together and contain more than one registered voter.
Nail Geographic Targeting by County and District
Since “all politics is local,” it follows that all digital advertising should be locally targeted. This is table stakes for digital providers that work with campaigns, and targeting down to the ZIP+4 level has brought a level of precision to district-level outreach that approaches direct mail.
But direct mail (household targeting) remains the crown jewel. Digital is still trying to cross that divide, held back by a fragmented ecosystem of identity and, more importantly, privacy considerations.
This has always been a key consideration, given the fact that a small percentage of key districts can flip the presidency to one party or another.
Affiliation Modeling Through Behavior
Sometimes getting an understanding of someone’s party affiliation is super obvious, such as when a user “likes” a specific candidate on social media. But, sometimes, a user’s affinity has to be derived from his or her behavior and the context of content consumed over time.
Data management platforms are bringing more precision to this type of modeling. Functionality, such as algorithmic segmentation, is helping digital analysts go beyond the basics. It’s fairly easy to correlate two or three attributes, such as income and gender, to estimate party affiliation. In this cycle, for example, we have seen a strong bias toward Trump from lower-income males with less than a college degree.
However, it’s hard for humans to correlate eight or more distinct attributes. Maybe those lower-education, low-income, rural males who love NASCAR actually lean toward Bernie Sanders in certain districts. Letting the machines crunch the numbers can give digital campaign managers an unseen advantage, and that capability has just now become available at scale.
“In 2016, relying on TV advertising to sway voters is no longer a solid campaign tactic,” JC Medici, Rocket Fuel’s national director of politics and advocacy, told me via email. “To secure the White House in November, candidates must now add a strong digital media strategy by utilizing best-in-class AI, correlated with strong voter and propensity data assets to ensure they are delivering ads to the right voter, on the right screen, at the right time.”
One of the hot new areas for political campaign targeting is social affinity, the idea that there is a mutual affinity that can be measured between interests.
Yes, when someone “likes” Hillary, you have an obvious target. But, how about those folks who haven’t stated an obvious choice? Maybe 80% of Hillary fans also liked cat shelters, yellow dresses and Chris Rock.
When strong correlations between deterministic social behaviors appear, it becomes fairly easy to leverage that data for targeting – and make informed choices regarding media. People who liked Hillary also like certain TV shows, actors, causes and websites. Campaign managers can leverage data from Affinity Answers, Affinio and other companies to understand these relationships and exploit them to build support for candidates, while leveraging the ability to geotarget at very granular levels on Facebook.
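One simple way such affinities are measured is lift: how much more likely a candidate’s fans are to share an interest than the audience at large. A toy sketch with invented “like” data, not any vendor’s actual method:

```python
# Each user is the set of pages they "liked" (toy, hypothetical data).
likes = [
    {"hillary", "cat_shelters", "chris_rock"},
    {"hillary", "cat_shelters", "yellow_dresses"},
    {"hillary", "cat_shelters"},
    {"nascar", "trucks"},
    {"cat_shelters", "knitting"},
]

def affinity_lift(likes, seed, interest):
    """Lift = P(interest | seed) / P(interest) across the whole audience."""
    n = len(likes)
    seed_users = [u for u in likes if seed in u]
    p_interest = sum(interest in u for u in likes) / n
    p_given_seed = sum(interest in u for u in seed_users) / len(seed_users)
    return p_given_seed / p_interest

print(affinity_lift(likes, "hillary", "cat_shelters"))  # > 1 means positive affinity
```

A lift well above 1 suggests cat-shelter fans are a promising proxy audience for voters who have not stated an obvious choice.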
The Free State Project, an organization committed to getting 20,000 “liberty-loving” people to move to New Hampshire and work toward limited government, just reached its goal – talk about a tough conversion. President Carla Gericke credits the use of data-driven targeting on Facebook for the achievement.
Speaking of social, it is also highly important to get the context right.
“Programmatic has introduced two new challenges: bots (who don’t vote) and brand safety,” Trust Metrics CRO Marc Goldberg told me. “In the age of immediate and shocking news, it has become more important that a political ad does not end up next to porn, hate or issues that are contradictory to the politician’s beliefs. One screen shot and bam, you are on Twitter.”
Onboarding And Offboarding
Perhaps the most critical functionality for digital political campaigns continues to be the ability to “onboard” offline data, such as phone numbers, email addresses and party affiliation, and match it to an online ID for targeting purposes. This is essentially table stakes, considering the years of political investment in collecting offline records for phone banks and direct mail campaigns.
Previously, the onboarding of such data was limited to associating it with an active cookie for retargeting use. But with the emergence of real cross-channel device graphs, this data can now be tied to a universal consumer ID that is persistent and collects attributes over time.
Simply put, that onboarded email – now a UID – can be mapped to a number of identities, including Apple and Android mobile identifiers, third-party IDs from Experian and the like and device IDs from Roku and other OTT devices. In other words, the device graph enables that email to be associated with the voter’s omnichannel footprint, giving campaigns the ability to sequentially target messages, map creative to execution channels and truly understand attribution.
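A rough sketch of that mapping, with a hashed email standing in for the universal ID; the identifier types and normalization rules here are assumptions for illustration, not any specific vendor’s implementation:

```python
import hashlib

def to_uid(email):
    """Hash an onboarded email into a stable, pseudonymous UID."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()[:16]

# Hypothetical device graph: UID -> identifiers seen across channels.
device_graph = {}

def link_device(email, id_type, device_id):
    device_graph.setdefault(to_uid(email), {}).setdefault(id_type, set()).add(device_id)

link_device("joe@example.com", "idfa", "AAAA-1111")
link_device("Joe@Example.com ", "cookie", "c-998877")   # normalizes to the same UID
link_device("joe@example.com", "roku", "rk-5521")

uid = to_uid("joe@example.com")
print(sorted(device_graph[uid]))  # channels now tied to one persistent ID
```

Once every channel’s ID resolves to the same UID, sequential messaging and cross-channel attribution become bookkeeping rather than guesswork.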
What’s even more exciting is the idea of offboarding some digital data back into the CRM. How valuable would it be to know that a potential voter watched an entire YouTube video on a candidate after being reached by the phone bank? Certain types of behavioral data, depending on compliance with privacy policies, can be brought back into the CRM to impact the effectiveness of offline voter outreach.
It is fair to say that 2016 is the most exciting campaign season we’ve had in a generation – and it’s only the primary season. As data-driven marketers, we will see campaigns push the limit in applying big marketing dollars to digital channels, trying to unlock new, mobile-first millennial voters, while persuading independents through more addressable advertising than ever.
It’s a great time to be a data-driven marketer.
If you think about the companies with perhaps the least amount of consumer data, you would automatically think about consumer packaged goods (CPG) manufacturers. Hardly anybody registers for their websites or joins their loyalty clubs; moms don’t flock to their branded diaper sites; and they are at arm’s length from any valuable transaction data (store sales) until well after the fact. So, with little registration, website, or offline sales data, why are so many large CPG firms licensing an expensive first-party data management platform?
While CPG companies will never have the vast amounts of point-of-sale, loyalty-card, app, and website data that a big box retailer might have, they do spend a ton of dough on media. And, as we all know, with large media expenditures come tons of waste. Combine this with the increasingly large investment and influence that activist investors and private equity companies have in CPG, and you can see where this leads. PE companies have installed zero-based budgeting that forces CPG concerns to rationalize every penny of the marketing budget—which, until lately, has been subject to the Wanamaker Rule (“I know half of my budget is working, but not which half”). Enter the DMP for measurement and global frequency control, cutting off and reallocating potentially millions of dollars in “long tail” spending. Now, the data that the CPG marketer actually has in abundance (media exposure data) can be leveraged to the hilt.
This first and most obvious CPG use case has been discussed extensively in past articles. But there is much more to data management for CPG companies. Here are just a few tactics big consumer marketers have written into their data-driven playbooks:
The Move to Purchase-Based Targeting (PBT)
Marketers have come a long way from demographic targeting. Yes, gender, age and income are all reliable proxies for finding those “household CEOs,” but we live in complicated times, and “woman, aged 25-54, with 2 children in household” is still a fairly broad way to target media in 2016. Today, men are just as likely to go grocery shopping on a Thursday night. Marketers saw this and shifted more budget to behavioral, psychographic, and contextual targeting—but finding cereal buyers using proxies such as site visitation sharpened the targeting arrow only slightly more than demography.
Packaged goods marketers have long understood the value of past purchases (loyalty cards and coupons), but until the emergence of data management technologies, have struggled to activate audiences based on such data. Now, big marketers can look at online coupon redemption or build special store purchase segments (Datalogix, Nielsen Catalina, News America Marketing) and create high-value purchase-based segments. The problem? Such seed segments are small, and must be modeled to achieve scale. Also, by the time the store sales data comes in, it’s often far too late to optimize a media plan. That said, CPG marketers are finding that product purchasers share key data attributes that reveal much about their household composition, behavior, and—most interestingly—affinity for a company’s other products. It may not seem obvious that a shopping basket contains diapers and beer—until you understand that Mom sent Dad out to the store to pick up some Huggies, and he took the opportunity to grab a cold six-pack of Bud Light. These insights are shaping modern digital audience segmentation strategy, and those tactics are becoming more and more automated through the use of algorithmic modeling and machine learning. CPG has seen the future, and it is using PBT to increase relevant reach.
Optimizing Category Reach
CPG marketers are constantly thinking about how to grow the amount of product they sell, and those thoughts typically vary between focusing on folks who are immensely loyal (“heavy” category buyers) versus those who infrequently purchase (“light” or “medium” category buyers). Who to target? It’s an interesting question, and one answered more decisively with purchase-based sales data.
Take a large global soda company as an example. The average number of colas its customers consume is 15 a year, but that is an immensely deceptive number. The truth is that the company has a good number of “power users” who drink 900 colas a year (roughly two and a half per day), and a lot of people who may only drink 2-3 colas during the entire year. Using the age-old “80/20 Rule” as a guideline, you would perhaps be inclined to focus most of the marketing budget on the 20% of users who supposedly make up 80% of sales volume. However, closer examination reveals that heavy category buyers may be driving as little as 50% of total purchase volume. So, the marketer’s quandary is, “Do I try to sell the heavy buyer his 901st cola, or do I try to get the light buyer to double his purchase from two to four colas a year?”
Leveraging data means CPG companies don’t have to decide. Increasingly, companies are adopting frequency approaches that identify the right amount of messaging to nurture the heavy users (maybe 2-3 messages per user, per month) and bring light buyers to higher levels of purchase consideration (up to 20 messages per month). Moreover, by using DMP technology to segment these buyers based on their category membership, creatives can be adjusted based on the audience. Heavy buyers get messages that reinforce why they love the brand (“share the love”), and light buyers can receive more convincing messages (“tastes better”).
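The segment-based cadence described above can be sketched as a simple lookup; the caps and creative names come straight from the numbers in the text, but the function itself is a hypothetical illustration:

```python
# Monthly frequency targets by category membership, per the text:
# nurture heavy buyers lightly, work harder on light buyers.
FREQUENCY_CAPS = {"heavy": 3, "light": 20}
CREATIVE = {"heavy": "share the love", "light": "tastes better"}

def next_message(segment, impressions_this_month):
    """Return the creative to serve, or None once the segment's cap is hit."""
    if impressions_this_month >= FREQUENCY_CAPS[segment]:
        return None
    return CREATIVE[segment]

print(next_message("heavy", 1))   # still under the cap
print(next_message("heavy", 3))   # cap reached, stop spending
```

The same DMP segment drives both the cap and the creative choice, which is the whole point of segmenting on category membership.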
Increasing Lift through Cross-Channel Messaging
CPG marketers have some highly evolved models that show just how much lift a working media dollar has on sales, and they use them to guide decisions on media investment by both channel and partner. With the power of DMPs for cross-channel measurement, CPG companies are finally able to apply even small insights to tweak sales lift.
What if the data reveal that a 50% mixture of equity and direct-response ad creatives lifts coupon downloads by 200%? In other words, instead of just showing “Corn Flakes are Yummy” ads, you mixed in a few “Buy Flakes now at Kroger and save!” creatives afterwards and saw a huge impact on your display performance. Sadly, this simple insight was not available before data management platforms corralled cross-channel spending and associated it with an individual, but now these small insights are adding up to appreciable sales lift.
In another example, a large CPG company sees massive lift in in-store coupon redemptions by running branded display ads on desktop throughout the week—but giving a “mobile nudge” on the smartphone on Friday night when it’s time to fill the pantry. This cross-channel call-to-action has seen real results, and involves only grabbing a brand-favorable consumer’s attention on another device to create a big impact. Again, a simple tactic—but also impossible without the power of a DMP.
CPG marketers have been able to achieve a ton of progress by working with relatively sparse amounts of data. What can you do with yours?
Marketers are getting frustrated with spending up to 60% of their working media dollars to fund intermediaries between themselves and their publishing partners. By the time a marketer pays his agency, trading desk, exchange, third-party data provider, and subsidizes the publisher’s ad serving stack, dollars turn into dimes. Marketers want less fraud, more people, less ad tech, and to put more media dollars to work to drive performance. Quality publishers, who for so long sacrificed control for access to an always-on stream of programmatic cash, are now seeing balance return, as shady sources of inventory leave the ecosystem and start to create scarcity for premium supply.
Publishers with desired audiences are starting to leverage hacks like “header bidding” and private marketplaces to get more control and capture more revenue from transactions. But they are also starting to look at data-only transactions among trusted demand-side partners. Now that marketers are catching up with DMP technology, securely sharing audiences becomes possible, opening up a new era where “second party” data is poised to reign supreme. Before we talk about how that happens, let’s first define some data terms:
First-party data is proprietary data that marketers and publishers have collected – with permission, of course – and, therefore, own. It can be cookies collected from a site visit, offline data onboarded into addressable IDs and even data from marketing campaigns. Second-party data is simply someone else’s first-party data. Second-party data gets created any time two companies strike up a deal for data that is not publicly available. The most common use case is that of a marketer – say, a big airline – getting access to data for a publisher’s frequent travelers. Big Airline might say to Huge News Site with business travelers, “Let’s user match, so every time I see one of my frequent flyers on your site, I can serve him an ad.” Huge News Site may decide to allow Big Airline to target its users wherever they are found (a “bring your own data” deal) or make such a deal incumbent upon buying media. Either way, Big Airline now has tons of really valuable Huge News Site reader data available in its data-management platform (DMP) for modeling, analysis and targeting.
Despite the much-heralded death, or mere diminution, of third-party data, it is still a staple of addressable media buying. This is data that is syndicated and made available for anyone to buy. This data could describe user behavior (Polk “auto intenders” of various stripes) or bucket people into interesting addressable segments based on their life circumstances (Nielsen “Suburban Strivers”), describe a user’s income level (Acxiom or Experian) or tell you where a user likes to go via location data (PlaceIQ or Foursquare). Most demand-side platforms (DSPs) make a wide variety of this data available within their platforms for targeting, and DMPs enable users to leverage third-party data for segment creation – usually allowing free usage for analytics and modeling purposes and getting paid upon successful activation.
Data Quality And Scale
So, which kind of data is the best? When asked that question by a marketer, the right answer is inevitably “all of it.” But, since that’s an annoying answer, let’s talk about the relative scale and value of each type of data. It’s easily visualized by this wonderfully over-simplified triangle:
First-party data is the most limited in scope, yet the most powerful. For marketers – especially big CPG marketers who don’t get a lot of site traffic – first-party data is incredibly sparse but is still the absolute most valuable signal to use for modeling. Marketers can analyze first-party data attributes to understand what traits and behaviors consumers have in common and expand their reach using second- or third-party data. Retail and ecommerce players are more fortunate. A Big Box Store has first-party data out the wazoo: loyalty card data, point-of-sale system data, app data, website registration data, site visit data and maybe even credit card data if it owns and operates a finance arm. It can leverage a DMP to understand how media exposure drove a store visit, where customers were in the store (beacons!), what was purchased, how many coupons were remitted and whether or not customers researched their purchase on the site. Talk about getting “closed loop” sales attribution. The power of first-party data is truly amazing.
The biggest problem with third-party data is that all of my competitors have it. In programmatic marketing, that means both Ford and Chevy are likely bidding on the same “auto intender” and driving prices up. The other problem is that I don’t know how the data was created. What attributes went into deciding whether or not this “auto intender” is truly in-market for a car? There are no real rules about this stuff. A guy who read the word “car” in an article might be an “auto intender,” just as easily as someone who looked at four-door sedans three times in the last 30 days on reputable auto sites. Quality varies. That being said, there is huge value in having third-party data at your disposal. Ginormous Music App, for example, has built a service that is essentially a DMP for music; it knows how to break down every song, assign very granular attributes to it and deliver highly customized listening experiences for free and paid users. Those users are highly engaged, have demonstrated a willingness to buy premium services and are, by virtue of their mobile devices, easily found at precise geolocations. Yet, for all of that, the value to a marketer of a Maroon Five segment is rather small. Everyone likes Maroon Five, from grandmothers to tweens to dads, so the segment provides little value to an advertiser. Yet, if Ginormous Music App could push its app-based user data (IDFAs) into the cookie space and find a user match, it could effectively use third-party data to understand the income, behavior and general profile of many Maroon Five fans. And that’s what their advertisers like to buy. That’s pretty damn valuable.
So, how about “second-party” data? These are the “frequent business travelers” on Huge News Site and the “Mitsubishi intenders” on Large Auto Site. These are real users, with true demonstrated intent and behavior that has been validated on real properties. One of the most valuable things about audiences built on second-party data is that there is usually transparency regarding how those users found their way into a segment.
The ironic and kind of beautiful thing about the emergence of second-party data is that it is most often merely a connection to a premium publisher’s users. However, it can be uncoupled from a publisher’s media sales practice. Marketers, increasingly sick of all the fraud and junk in the programmatic ecosystem, are turning toward second-party data to access the same audiences they bought heavily in print 30 years ago. This time, however, they are starting to get both the quality – and the quantitative results – they were looking for. On the flip side, quality publishers are starting to understand that, when offered in a strict, policy-controlled environment that protects their largest asset – audience data – they can make way more money with data deals than media deals.
Put simply, second-party data is heralding a return to the good old days when big marketers depended on relationships with big publishers as the stewards of audiences, and they created deep, direct relationships to ensure an ongoing value exchange. Today, that exchange increasingly happens through web-based software rather than martini lunches.
[This article originally appeared in AdExchanger on 1/25/16]
2015 was a fantastic year for many data-driven marketers, with data management platforms (DMPs), consultancies and marketers getting something nice under their trees.
Unfortunately, 2015 also saw legacy networks, supply-side platforms (SSPs) and some less nimble agencies receive coal in their respective stockings for failing to keep up with the rapidly changing paradigm as marketing and ad technology merge.
In the great tradition of end-of-year prediction articles, here’s my take on the year’s biggest developments and what we’ll see in 2016, including a rapid technology adoption from big marketers, a continuing evolution of the agency model and an outright revolution in how media is procured.
I thought 2015 was supposed to herald the “death of the digital agency model.” As agencies struggled to define their value proposition to big marketers that were increasingly bringing “programmatic in house,” agencies were reputed to be on the ropes. Massive accounts with billions of dollars in marketing spend were reviewed, while agencies churned through cash pitching to win new business – or at least trying to keep old business.
The result? A ton of money swapped hands, but no serious marketers abandoned their agencies. Agencies got a lot smarter, and started spinning new digital strategies and DMP practices to combat the likes of system integrators and traditional consultancies. And the band played on.
In 2016, we will continue to see agencies strengthen their digital strategy bench, start moving “marketing automation” practices into the DMP world and offer integration services to help marketers build bespoke “stack” solutions. Trading desks will continue to aggressively pursue unique relationships with big publishers and start to embrace new media procurement methodologies that emphasize their skillset, rather than the bidded approach in open exchanges (more on that below).
Marketers Hug Big Data
Marketers started to “cross the chasm” in 2015 and more widely embrace DMPs. It’s no longer just “early adopters” such as Kellogg’s that are making the market. Massive top-100 firms have fully embraced DMP tech and are starting to treat online data as fuel for growth.
Private equity and activist investors continue to put the squeeze on CPG companies, which have turned to their own first-party data to find media efficiency as they try to control the one line item in the P&L usually immune to risk management: marketing spend.
Media and entertainment companies are wrangling their consumer data to fuel over-the-top initiatives, which put a true first-party relationship with their viewers front and center. Travel companies are starting to marry their invaluable CRM data to the anonymous online world to put “butts in seats” and “heads in beds.”
If 2015 saw 15% of the Fortune 500 engage with DMPs, 2016 is when the early majority will surge and start to make the embrace of DMP tech commonplace. The land grab for 24-month SAAS contracts is on.
It used to be that a senior-level digital guy would get sick of his job and leave it (or his job would leave him), leading to a happy consultant walking around advising three or four clients on programmatic strategy. In 2015, that still exists, but we’ve seen a rise in scale to meet the needs of a rapidly changing digital landscape.
Marketers and publishers are hiring boutique consultancies left and right to get on track (see this excellent, if not comprehensive, list). Also, big boys, including Accenture, Boston Consulting Group and McKinsey, are in the game, as are large, media-centric firms, such as MediaLink.
These shops are advising on data strategy, programmatic media, organizational change management and privacy. They are helping evaluate expensive SAAS technology, including DMPs and yield management solutions, and also doing large systems integrations required to marry traditional databases with DMPs.
Match Rates (Ugh)
Perhaps unpublicized, with the exception of a few nerdy industry pieces, we saw in 2015 a huge focus on “match rates,” or the ability for marketers to find matches for their first-party data in other execution systems.
Marketers want to activate their entire CRM databases in the dot-com space, but are finding cookie matches for only 40% to 50% of their valuable lists. When they try to map those cookies to a DSP, more disappointment ensues. As discussed in an earlier article, match rates are hard to get right, and require a relentless focus on user matching, great “onboarding services,” strong server-to-server connections between DMPs and DSPs (and other platforms) and a high frequency of user matching.
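The metric itself is simple. Here is a sketch, with invented IDs, of how a match rate gets computed between an onboarded CRM list and a partner platform’s match table:

```python
def match_rate(crm_ids, partner_match_table):
    """Share of CRM records that resolve to an ID in the partner system."""
    matched = sum(1 for uid in crm_ids if uid in partner_match_table)
    return matched / len(crm_ids)

# Hypothetical: 10 onboarded CRM IDs, but the partner only knows half of them.
crm_ids = [f"uid-{i}" for i in range(10)]
partner_match_table = {f"uid-{i}": f"dsp-{i}" for i in range(0, 10, 2)}

print(f"{match_rate(crm_ids, partner_match_table):.0%}")
```

Every hop to another execution platform applies its own match table, which is why rates compound downward and the disappointment multiplies at the DSP.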
This was the year that marketers got disappointed in match rates and started to force the industry to find better solutions. Next year, huge marketers will take bold steps to actually share data and create an available identity map of consumers. I think we will see the first real data consortium emerge for the purposes of creating an open identity graph. That’s my big prediction – and hope – for 2016.
Head For The Headers
2015 was the year of “header bidding,” the catch-all phrase for a software appliance that gives publishers the chance to offer favored demand-side partners a “first look” at valuable inventory. I am not sure if “header bidding” will ultimately become the de facto standard for “workflow automation,” but we seem to be relentlessly marching back to a world in which publishers and marketers take control of inventory procurement and get away from the gamesmanship inherent in exchange-based buying.
Big SSPs and networks that have layered bidding tech onto open exchanges are struggling. Demand-side platforms are scrambling to add all sorts of bells and whistles to their “private marketplaces,” but the industry evolves.
Next year, we will see the pace of innovation increase, and we have already seen big trade desks make deals with DMPs to access premium publisher inventory. It’s nice to see premium publisher inventory increase in value – and I believe it will only continue to do so.
2016 will be the year of “second-party data” and the winners will be the ones with the technology installed to easily transact on it.
2015 was a great year for data-driven marketing, and 2016 will be even more fun. Stay safe out there.
This post originally appeared in AdExchanger on 12/17/2015
2015 has been one of the most exciting years in digitally driven marketing to date. Although publishers have been leading the way in terms of building their programmatic “stacks” to enable more efficient selling of digital media, marketers are now catching up. Wide adoption of data management platforms has given rise to a shift in buying behaviors, where data-driven tactics for achieving effectiveness and efficiency rule. Here are some interesting trends that have arisen.
Remember when finding the “household CEO” was as easy as picking a demographic target? Marketers are still using demographic targeting (Woman, aged 25-44) to some extent, but we have seen them shift rapidly to behavioral and contextually based segments (“Active Moms”), and now to Purchase-Based Targeting (PBT). This trend has existed in categories like Automotive and Travel, but is now being seen in CPG. Today, marketers are using small segments of people who have actually purchased the product they are marketing (“Special K Moms”) and using lookalike modeling to drive scale and find more of them. These purchase-defined segments are a more precise starting point in digital segmentation—and can be augmented by behavioral and contextual data attributes to achieve scale. The big winners here are the folks who actually have the in-store purchase information, such as Oracle’s Datalogix, 84.51, Nielsen’s Catalina Solutions, INMAR, and News Corp’s News America Marketing.
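Lookalike modeling can be sketched as scoring prospects by attribute overlap with the purchase-based seed. This toy version uses Jaccard similarity against the union of seed attributes, which is far cruder than a production model; all names and thresholds are invented:

```python
def jaccard(a, b):
    """Overlap between two attribute sets, from 0 (disjoint) to 1 (identical)."""
    return len(a & b) / len(a | b)

def lookalikes(seed_profiles, prospects, threshold=0.5):
    """Expand a small purchase-based seed by attribute similarity (toy model)."""
    seed_attrs = set().union(*seed_profiles)  # crude composite seed profile
    return [pid for pid, attrs in prospects.items()
            if jaccard(attrs, seed_attrs) >= threshold]

seed = [{"mom", "coupons", "cereal_buyer"}, {"mom", "coupons", "fitness"}]
prospects = {
    "u1": {"mom", "coupons", "fitness"},                  # very seed-like
    "u2": {"gamer", "gadgets"},                           # not seed-like
    "u3": {"mom", "coupons", "cereal_buyer", "fitness"},  # nearly identical
}
print(lookalikes(seed, prospects))
```

The seed (“Special K Moms”) stays tiny and precise; the model trades a little precision for the scale the seed lacks.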
For years we have been talking about the disintermediation in the space between advertisers and publishers (essentially, the entire LUMAscape of technology vendors), and how we can find scalable, direct connections between them. It doesn’t make sense that a marketer has to go through an agency, a trading desk, a DSP, an exchange, an SSP, and other assorted technologies to get to space on a publisher website. Marketers have seen $10 CPMs turn into just $2 of working media. Early efforts with “private marketplaces” inside of exchanges created more automation, but ultimately kept much of the cost structure. A nascent, but quickly emerging, movement toward “automated guaranteed” procurement is finally starting to take hold. Advertisers can create audiences inside their DMP and push them directly to a publisher’s ad server where they have user matching. This is especially effective where marketers seek an “always on” insertion order with a favored, premium publisher. This trend will grow in line with marketers’ adoption of people-based data technology.
Global Frequency Management
The rise in DMPs has also led to another fast-growing trend: global frequency management. Before marketers could effectively map users to all of their various devices (cross-device identity management, or CDIM) and also match users across various execution platforms (hosting a “match table” that assures user #123 in my DMP is the same guy as user #456 in DataXu, as an example), they were helpless to control frequency to an individual. Recent studies have revealed that, when marketers cannot cap frequency at the individual level, they serve as many as 100+ ads to individual users every month, and sometimes much, much more. What if the user’s ideal point of effective frequency is only 10 impressions on a monthly basis? As you can see, there are tremendous opportunities to reduce waste and gain efficiency in communication. This means big money for marketers, who can finally start to control their messaging—putting recovered dollars back into finding more reach, and starting to influence their bidding strategies to get users into their “sweet spot” of frequency, where conversions happen. It’s bad news for publishers, who have inadvertently benefitted from this “frequency blindness.” Now, marketers understand when to shut off the spigot.
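To make the mechanics concrete, here is a minimal sketch (all device IDs and impression counts invented) of what a cross-device match table enables: rolling per-device impressions up to a person, then measuring how much delivery fell beyond an assumed effective-frequency cap of 10 per month.

```python
from collections import defaultdict

# A cross-device match table: each child device ID resolves to one person.
device_to_person = {
    "cookie_123": "person_A",
    "idfa_456": "person_A",
    "cookie_789": "person_A",
    "cookie_222": "person_B",
}

# Monthly impressions served per device (illustrative numbers).
impressions = {"cookie_123": 40, "idfa_456": 35, "cookie_789": 30, "cookie_222": 8}

EFFECTIVE_FREQUENCY = 10  # assumed monthly "sweet spot" cap

# Roll device-level delivery up to the person level.
per_person = defaultdict(int)
for device, count in impressions.items():
    per_person[device_to_person[device]] += count

# Impressions beyond the cap are waste that can be reinvested in reach.
wasted = {p: max(0, n - EFFECTIVE_FREQUENCY) for p, n in per_person.items()}
```

Here person_A, who looked like three separate users at the device level, turns out to have seen 105 impressions, 95 of them past the point of effective frequency.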
Taking it in-House
More and more, we are seeing big marketers decide to “take programmatic in-house.” That means hiring former agency and vendor traders, licensing their own technologies, and (most importantly) owning their own data. This trend isn’t as explosive as the industry trades might suggest—but it is real and happening steadily. What brought about this shift in sentiment? Certainly concerns about transparency; there is still a great deal of inventory arbitrage going on at popular trading desks. There is also the notion of control. Marketers want and deserve a more direct connection to one of their biggest marketing costs, and now the technology is readily available. Even the oldest-school marketer can license their way into a technology stack any agency would be proud of. The only thing really holding back the trend is the difficulty of staffing such an effort. Programmatic experts are expensive, and that’s just the traders! When the inevitable call for data-science-driven analytics comes in, things can really start to get pricey. But this trend will continue for the next several years nonetheless.
Closing the Loop with Data
One of the biggest gaps with digital media, especially programmatic, is attribution. We still seem to have the Wanamaker problem, where “50% of my marketing works, I just don’t know which 50%.” Attitudinal “brand lift” studies and latent post-campaign sales attribution modeling have been the de facto standard for the last 15 years, but marketers are increasingly insisting on real “closed loop” proof. “Did my Facebook ad move any items off the shelf?” We are living in a world where technology is starting to shed some light on actual in-store purchases, such that we are going to be able to get eCommerce-like attribution for corn flakes soon. In one real-world example, a CPG company has partnered with 7-11 and placed beacon technology in the store. Consumers can receive a “get 20% off” offer on their mobile device, via notification, when they approach the store; the beacon can verify whether or not they arrive at the relevant shelf or display; and an integration with the point-of-sale (POS) system can tell (immediately) whether the purchase was made. These marketing fantasies are becoming more real every day.
Letting the Machines Decide
What’s next? The adoption of advanced data technology is starting to change the way media is actually planned and bought. In the past, planners would make educated guesses about which online audience segments to target, then test-and-learn their way to more precision. Marketers basically had to guess the data attributes that comprised the ideal converter. Soon, algorithms will start doing the heavy lifting. What if, instead of guessing at the type of person who buys something, you could start with the exact composition of that buyer? Today’s machine learning algorithms are starting at the end point in order to give marketers a huge edge in execution. In other words, now we can look at a small group of 1,000 people who have purchased something and understand the clusters of data attributes they all have in common. Maybe all buyers of a certain car share 20 distinct data attributes. Marketers can have a segment automatically generated from that data and expand it from there. This brand-new approach to segmentation is a small harbinger of things to come, as algorithms start to take over the processes and assumptions of the past 15 years and truly transform marketing.
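A toy illustration of the idea (all attributes and names invented): derive a profile from the attributes most common among a seed group of known purchasers, then score prospects by overlap with that profile. Real lookalike modeling uses far larger seeds and proper statistical models; this only shows the shape of the approach.

```python
from collections import Counter

# Seed group: data attributes of known purchasers (invented).
buyers = [
    {"has_kids", "suv_intender", "fitness"},
    {"has_kids", "suv_intender", "coupons"},
    {"has_kids", "suv_intender", "fitness"},
]

# Attributes shared by at least two-thirds of buyers form the seed profile.
counts = Counter(attr for buyer in buyers for attr in buyer)
profile = {attr for attr, n in counts.items() if n / len(buyers) >= 2 / 3}

def lookalike_score(prospect: set) -> float:
    """Fraction of the seed profile this prospect shares."""
    return len(prospect & profile) / len(profile)

score = lookalike_score({"has_kids", "suv_intender", "travel"})
```

Starting from actual buyers and working backward to their shared attributes is the inversion the paragraph describes: the segment is generated from the converters, not guessed in advance.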
It’s a great time to be a data-driven marketer!
If you work in digital marketing for a brand or an agency, and you are in the market for a data management platform, you have probably asked a vendor about match rates. But, unless you are really ahead of the curve, there is a good chance you don’t really understand what you are asking for. This is nothing to be ashamed of – some of the smartest folks in the industry struggle here. With a few exceptions, like this recent post, there is simply not a lot of plainspoken dialogue in the market about the topic.
Match rates are a key factor in deciding how well your vendor can provide cross-device identity mapping in a world where your consumer has many, many devices. Marketers are starting to request “match rate” numbers as a method of validation and comparison among ad tech platforms in the same way they wanted “click-through rates” from ad networks a few years ago. Why?
As a consumer, I probably carry about twelve different user IDs: a few Chrome cookies, a few Mozilla cookies, several IDFAs for my Apple phone and tablets, a Roku ID, an Experian ID, and a few hashed e-mail IDs. Marketers looking to achieve true 1:1 marketing have to reconcile all of those child identities to a single universal consumer ID (UID) to make sure I am the “one” they want to market to. It seems pretty obvious when you think about it, but the first problem to solve before any “matching” takes place whatsoever is a vendor’s ability to match people to the devices and browsers attached to them. That’s the first, most important match!
So, let’s move on and pretend the vendor nailed the cross-device problem—a fairly tricky proposition for even the most scaled platforms that aren’t Facebook and Google. They now have to match that UID against the places where the consumer can be found. The ability to do that is generally understood as a vendor’s “match rate.”
So, what’s the number? Herein lies the problem. Match rates are really, really hard to determine, and they change all the time. Plus, lots of vendors find it easier to say, “Our match rate with TubeMogul is 92%” and just leave it at that—even though it’s highly unlikely to be the truth. So, how do you separate the real story from the hype and discover what a vendor’s real ability to match user identity is? Here are two great questions you should ask:
What am I matching?
This is the first and most obvious question: Just what are you asking a vendor to match? There are actually two types of matches to consider: A vendor’s ability to match a bunch of offline data to cookies (called “onboarding”), and a vendor’s ability to match a set of cookie IDs to another set of cookie IDs.
First, let’s talk about the former. In onboarding—or matching offline personally identifiable information (PII) identities such as an e-mail with a cookie—it’s pretty widely accepted that you’ll manage to find about 40% of those users in the online space. That seems pretty low, but cookies are a highly volatile form of identity, prone to frequent deletion, and dependent upon a broad network of third parties to fire “match pixels” on behalf of the onboarder to constantly identify users. Over time, a strong correlation between the consumer’s offline ID and their website visitation habits—plus rigor around the collection and normalization of identity data—can yield much higher offline-to-online match results, but it takes effort. Beware the vendor who claims they can match more than 40% of your e-mails to an active cookie ID from the get-go. Matching your users is a process, and nobody has the magic solution.
As far as cookie-to-cookie user mapping goes, the ability to match users across platforms has more to do with how frequently your vendors fire match pixels. This happens when one platform (a DMP) calls the other platform (the DSP) and asks, “Hey, dude, do you know this user?” That action is a one-way match. It’s even better when the latter platform fires a match pixel back—“Yes, dude, but do you know this guy?”—creating a two-way identity match. Large data platforms will ask their partners to fire multiple match pixels to make sure they are keeping up with all of the IDs in their ecosystem. As an example, this would consist of a DMP with a big publisher client who sees most of the US population firing a match pixel for a bunch of DSPs like DataXu, TubeMogul, and The Trade Desk at the same time. Therefore, every user visiting a big publisher site would get that publisher’s DMP master ID matched with the three separate DSP IDs. That’s the way it works.
Given the scenario I just described, and even accounting for a high degree of frequency over time, match rates in the high 70 percentile are still considered excellent. So consider all of the work that needs to go into matching before you simply buy a vendor’s claim to have “90%” match rates in the cookie space. Again, this type of matching is also a process—and one involving many parties and counterparties—and not just something that happens overnight by flipping a switch, so beware of the “no problem” vendor answers.
What number are you asking to match?
Let’s say you are a marketer and you’ve gathered a mess of cookie IDs through your first-party web visitors. Now, you want to match those cookies against a bunch of cookie IDs in a popular DSP. Most vendors will come right out and tell you that they have a 90%+ match rate in such situations. That may be a huge sign of danger. Let’s think about the reality of the situation. First of all, many of those online IDs are not cookies at all, but Safari IDs that cannot be matched. So eliminate a good 20% of matches right off the bat. Next, we have to assume that a bunch of those cookies are expired and no longer matchable, which removes another 20%. I could go on, but as you can see, I’ve just made a pretty realistic case for eliminating about 40% of possible matches. That means a 60% match rate is pretty damn good.
Lots of vendors are actually talking about their matchable population of users, or the cookies you give them that they can actually map to their users. In the case of a DMP that is firing match pixels all day long, several times a day, with a favored DSP, the match rate at any one time with that vendor may indeed be 90–100%—but only of the matchable population. So always ask what the numerator and the denominator represent in a match question.
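The arithmetic is easy to sketch. In this hypothetical (all counts invented), the same vendor can honestly quote two very different “match rates” depending on which denominator they use:

```python
# A marketer hands over 1M cookie IDs.
total_cookies = 1_000_000

# IDs the vendor cannot match: un-matchable Safari IDs and expired
# cookies (roughly 20% each, per the discussion above).
unmatchable = 200_000 + 200_000
matchable = total_cookies - unmatchable      # 600,000

matched = 540_000                            # matches actually made

rate_vs_total = matched / total_cookies      # the honest number
rate_vs_matchable = matched / matchable      # the sales-deck number
```

The first figure is 54%; the second is 90%. Both describe the same performance, which is exactly why you should ask what the numerator and denominator are.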
You might ask whether this means the popular DMP/DSP “combo” platforms come with higher match rates—so-called “lossless integration,” since the DMP and DSP share a single architecture and, therefore, a unified identity. The answer is yes, but that offers little differentiation when two separate DMP and DSP platforms are closely synced and user-matched.
Marketers are obsessing over match rates right now, and they should be. There is an awful lot of “FUD” (fear, uncertainty, and doubt) being thrown around by vendors around match rates—and also a lot of BS being tossed around in terms of numbers. The best advice when doing an evaluation?
- Ask what kind of cross-device graph your vendor supports. Without the fundamental ability to match people to devices, the “match rate” number you get is largely irrelevant.
- Ask what numbers your vendor is matching. Are we talking about onboarding (matching offline IDs to cookies) or are we talking about cookie matching (mapping different cookie IDs in a match table)?
- Ask how they are matching (what is the numerator and what is the denominator?)
- Never trust a number without an explanation. If your vendor tells you “94.5%” be paranoid!
- And, ask for a match test. The proof is in the pudding!
As I’ve previously discussed, there are several basic use cases of the modern data management platform (DMP) for marketers. They include getting “people data” from addressable devices into a single system, controlling how it’s matched with different execution platforms and managing the frequency of messaging across devices.
In a world of ultra-fragmented device identity and multiple addressable media channels, you should be able to tie them together and make sure consumers get the optimal amount of messages. Big marketers use these tactics to save tons of money by chopping off the “long tail” of impressions, such as when marketers deliver more than 30 impressions per user each month, and reinvesting to find more deduplicated reach.
There is so much more to the successful application of a DMP, though. The most cutting-edge marketers are taking DMPs to the next level, after investing the time in building consumer identity graphs and getting their match rates with execution platforms as high as possible.
There are several plays you can run when you start to dig in and put the data to work.
Supercharge The Bidding Strategy
After identifying the long tail of impression frequency and diverting that investment into reach, where users are served up to three impressions per month, the key is driving users up into the sweet spot of frequency. This is where users are more likely to download more coupons, for example, or complete more video views.
If that sweet spot is between four and 20 impressions, marketers can adjust their strategy in biddable environments to ensure they are willing to pay more to “win” users who have only been exposed to three impressions so far. DMPs can match users with fidelity and deliver these types of targeting sets in near real time to multiple execution platforms, including those for display, video and search.
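One way to express such a strategy is a bid-multiplier rule keyed to a user’s current monthly frequency. This sketch assumes the four-to-20 sweet spot above; the multiplier values are invented for illustration, not a recommendation:

```python
SWEET_SPOT = range(4, 21)  # assumed sweet spot: 4-20 impressions per month

def bid_multiplier(frequency: int) -> float:
    """Scale the base bid according to how many impressions a user has seen."""
    if frequency < min(SWEET_SPOT):
        return 1.5   # pay a premium to push the user into the sweet spot
    if frequency in SWEET_SPOT:
        return 1.0   # standard bid while conversions remain likely
    return 0.0       # past effective frequency: stop bidding entirely
```

A DMP syncing person-level frequency counts to the execution platforms in near real time is what makes a rule like this enforceable across display, video and search.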
Optimize Partner Investment Through Reach Analysis
It’s a great start to manage addressable media delivery on a global basis, but what happens after you have identified all of those wasted impressions?
Naturally, the money marketers are spending reaching consumers for the 100th time can be better spent looking for net new consumers. But how do you get them?
For a diaper manufacturer that wants to reach the estimated 6 million new mothers in market every year, it’s critically important to get to 100% reach against that audience. Many marketers start with a single, broad reach partner, such as Yahoo, and see how close they can get to total reach.
It’s fantastic to leverage big spending power to drive down prices and get massive customer service attention to spread a message to as many unique users as possible. But no single partner can get a marketer to 100%. That’s where the DMP comes in.
It’s not just about filling in the missing 25% of an audience that matters; the diaper manufacturer wants to hit those incremental moms across quality, well-lit sites. Determining where you can get a few more million deduplicated moms is the first step. The key is to then decide where to find them more effectively from an investment standpoint, which requires an overlap analysis.
Enhance Partner Selection Through Overlap Analysis
Say our diaper manufacturer found 4 million new moms on Yahoo at a reasonable CPM. The DMP can then look across all addressable media investments and run a “Where are my people?” type of analysis.
Maybe this advertiser has another 20 partners on the plan after getting the bulk of unique reach from a single partner. How many more unique moms were found on Meredith? Moreover, how about finding moms on classic news and entertainment sites, such as NBC or Turner properties, or even non-endemic sites? Maybe there is an incremental 500,000 first-party “diaper moms” on a particular site, but now the advertiser can decide, based on performance KPIs and price, how valuable those particular moms are.
If those moms on a popular news site can be had for $5 CPM, maybe they are a more valuable reach vehicle than those found on the obvious “Moms.com” site. Without the DMP, they’ll never know.
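As a sketch of the underlying set math (audience IDs invented and scaled down for readability): incremental reach is just a set difference against the anchor buy, and cost per incremental person, not cost per impression, decides where the next dollar goes.

```python
# Deduplicated person IDs reached by each partner (toy data).
anchor = set(range(0, 40_000))            # e.g. moms reached via the anchor buy
partner_a = set(range(35_000, 45_000))    # overlaps heavily with the anchor
partner_b = set(range(45_000, 50_000))    # entirely incremental

incremental_a = len(partner_a - anchor)   # net-new people beyond the anchor
incremental_b = len(partner_b - anchor)

# Cost per *incremental* person drives the investment decision.
cpm_a, cpm_b = 8.0, 5.0
cost_a = cpm_a * len(partner_a) / 1000 / incremental_a
cost_b = cpm_b * len(partner_b) / 1000 / incremental_b
```

Both partners add the same 5,000 net-new people here, but partner B delivers them at a fraction of the cost per incremental person, even before site quality is considered.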
Plus, marketers are also starting to optimize the way they procure such audiences, by leapfrogging over the existing ad tech ecosystem and doing audience-based programmatic direct buying using their new DMP pipes.
Understand KPI Drivers Through Journey Building
Marketers that have deduplicated their audience and built an effective reach strategy can now go to the next level and start finding how those diaper moms moved from their first touch point in the customer journey to an actual action, such as downloading a retail coupon or requesting a sample package. When an audience is unified through a DMP, it’s possible to see the channels through which people move across their “customer journey” from awareness to action.
As an example, large CPG companies are putting more investment into online video and, in fact, one of the world’s largest marketers has embraced a “ban the banner” approach and values engagement more than any other KPI – a metric more easily understood with video. With that in mind, a journey analysis can show marketers whether seeing a few search impressions helped drive more completed views on (expensive) video and drive more brand engagement.
Did consumers download more coupons after viewing two equity (branding) impressions or before seeing the “buy now” (direct-response) message? The ability to understand how messages work together sequentially is the ultimate guide to being able to inform media investment strategy.
These are just a few of the next-level media use cases that can be accomplished once DMP fundamentals are put in place. DMPs are starting to shine a light on the “people data” that will drive the next decade of smart media investment. I think we will look back on the last 15 years of addressable marketing and wonder how we ever made such decisions without a clear view of audience first.
DMPs are starting to shine a light on the effectiveness of marketing, and giving marketers lots of new knobs and levers to pull.
It’s a great time to be a data-driven marketer.
Marketers have always craved access to quality audience at scale. That was once as easy as scheduling buys on the top three broadcast networks and buying full-page ads in national newspapers. Today, the world is more complicated, with attention shifting into a splintered digital universe of thousands of channels across multiple media types.
Ad tech companies have tried to corral a massively expanding world of inventory in ad exchanges, along with the means to bid inside them. This “programmatic” world of inventory procurement is deeply flawed, yet still the best thing we have at the moment.
It’s flawed because it mostly offers access to commoditized “display” ad units of dubious value and struggles to deliver real audiences, rather than robots. But it’s also good because we have taken the first steps past a ridiculous paradigm of buying media through relationships and fax machines, while starting to bring an analytical discipline to media investment that is based on measurement.
So, as we sled down the slope of the programmatic buying Hype Cycle, we are starting to see some new trends in inventory procurement – namely, a strategy that involves replacing some or all of the licensed programmatic architecture, as well as a growing reliance on one’s own data.
But first, before we get into the nuts and bolts of how that works, some history:
The Monster We Created
After convincing ourselves that the direct model, where we would call an ad rep, lacked scalability, we have set up a lot of distance between a marketer and their desired audience.
Imagine I am a cereal manufacturer and have discovered through media mix modeling that digital moms on Meredith sites drive a lot of offline purchases. They are the “household CEOs” that drive grocery store purchasing, try new things and are influential among their peer group, in terms of recommending new products. In today’s new media procurement paradigm, there are many “friends” standing between my target and me:
- Media agency: This is a must-have, unless marketers want to add another 100 people to their headcount with an expertise in media, but this adds 5% to 10% in costs to media buys.
- Trading desk: Although many marketers are starting to take this functionality in-house, whether you trade internally or leverage an agency trading desk, you can expect 10% to 15% of media costs to go to the personnel needed to run this type of operation.
- Demand-side platform (DSP): Don’t forget about the technology. A 15% bid reduction fee is usually required to leverage the smart tools necessary to find your inventory at scale across exchanges.
- Private marketplace: But wait! We use private marketplaces to make exclusive deals among a small pool of preferred vendors. Yes, but they operate inside DSPs and carry transactional fees that can add between 5% and 10% extra.
- Third-party data: You can’t target effectively without adding a nice layer of audience data on your buy, but expect to pay at least $1 CPM for the most basic demographic targeting – a significant percentage of cost even on premium buys.
- Exchanges: Maybe you pay for this via your DSP, but someone is paying for a seat on an ad exchange and that cost is passed through a provider, which can add another several percentage points.
- Supply-side platform (SSP): It’s not just the demand side that needs to leverage expensive technology to navigate the new world of digital media. Publishers pay up to 15% in fees to deploy SSPs, a smart inventory management technology to help them manage their “daisy chain” of networks and channel sales providers to get the best yield. This is baked into the media cost and passed along to the advertiser.
- Ad server: Finally, the publisher pays a fee to get the ad delivered to the site. It is a somewhat small price, but one that is passed along to the advertiser, usually baked in to the media cost.
This is essentially the middle of a crowded LUMAscape, a bunch of different disintermediating technologies that stand between an advertiser and the publisher. Marketers pay for everything I just described. They may not license the publisher’s SSP for them, but they are subsidizing it. After running this gauntlet, marketers with $10 to spend on “cereal moms” end up with much less than half in media value – the amount the publisher ends up with after the disintermediation takes place. This can be anywhere from 10% to 40% of the working media spend.
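Putting rough midpoints on the fee ranges above makes the erosion easy to see. The exact percentages below are illustrative assumptions, not quoted rates:

```python
cpm = 10.00  # gross CPM the marketer commits

# Midpoint fee assumptions, each as a fraction of the gross CPM.
fees = {
    "agency": 0.075, "trading_desk": 0.125, "dsp": 0.15, "pmp": 0.075,
    "exchange": 0.03, "ssp": 0.10, "ad_server": 0.02,
}
third_party_data = 1.00  # flat $1 CPM for basic audience data

working_media = cpm - third_party_data
for fee in fees.values():
    working_media -= cpm * fee  # each intermediary takes a slice of the gross
```

Under these assumptions, roughly $3.25 of the original $10 survives as media value for the publisher, consistent with “much less than half.”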
That’s probably the biggest problem in ad tech right now.
We’ve essentially created a layer of technology between marketers and audiences so gigantic that 60% to 70% of media investment dollars end up in venture-funded technology companies’ hands, rather than with the media owner creating the perceived value. How do we change that paradigm?
Leapfrogging the Middleware
Data management technology is increasingly replacing some of the middleware in this procurement equation, effectively writing the third chapter in the saga we know as programmatic direct.
Here is a bit of background.
What I call “Programmatic Direct 1.0” was the short-lived period in which companies leveraging the DoubleClick for Publishers (DFP) ad-serving API built static marketplaces of premium inventory.
For example, a premium publisher like Forbes might decide to place a chunk of 500,000 home page impressions in a marketplace at a $15 CPM. Buyers could go into an interface, transact directly with the publisher and secure the inventory. The problem was that inventory owners had a hard time valuing their future inventory, and buyers weren’t keen to log into yet another platform to buy media. This phase effectively ended with the Rubicon Project buying several leaders in the space, ShinyAds and iSocket, and AdSlot taking over workflow automation software provider Facilitate Media. Suddenly, “programmatic direct” platforms started to live inside the systems where media planners actually bought things.
Programmatic direct’s second act (2.0) is prevalent today. Companies use deal IDs or build PMPs within real-time systems and exchanges to have more control over procurement than what is available in an auction environment. Sellers can set prices and buyers can secure rights to inventory at a set, transparent cost. This works pretty well, but comes with the same gigantic stack of providers as before and includes additional transaction fees. This is akin to making a deal to buy a house directly from the owner, but agreeing to pay the real estate broker fee anyway. The thing about programmatic direct transactions is that they are fundamentally different than RTB because they don’t have to take place in “real time,” nor do they involve bidding. A brand-new set of pipes is required.
“Programmatic Direct 3.0” – or whatever we decide to call it – looks a bit different. Let’s say the big cereal company uses a data management platform (DMP) to collect its first-party data and creates segments of users from both offline user attributes and page-level attributes from site visitation behavior. The marketer has created a universal ID (UID) for every user. Let’s imagine it discovered that 200,000 of them were women, 24 to 40 years old, living in two-child households with income greater than $150,000 and interested in health and fitness. Great.
Now imagine that a huge women’s interest site deployed its own first-party DMP and collected similar attributes about its users, who were assigned UIDs. If the marketer and publisher have the same enterprise data architecture, they could match their users, make a deal and discover that there’s an overlap of 125,000 users on the site. Maybe the marketer agrees to spend $7 CPM to target those users, along with users who are statistically similar, every time they are seen on the site during November.
The DMP can push that segment directly into the publisher’s DFP. No trading desk fees, DSP fees, third-party data costs or SSPs involved. The same is true for a variety of companies that have built header bidding solutions, although they see less data than first-party DMPs.
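Conceptually, the match-and-push step is a set intersection on shared UIDs. This sketch uses invented integer IDs standing in for hashed identifiers, and the segment name is hypothetical; the real exchange happens server-to-server between the two DMPs:

```python
# UIDs in the marketer's segment and UIDs the publisher recognizes
# (integers stand in for hashed identifiers).
marketer_segment = set(range(200_000))
publisher_users = set(range(75_000, 400_000))

# The overlap becomes the addressable deal segment.
overlap = marketer_segment & publisher_users

deal = {
    "segment": "moms_hhi150k_health",  # hypothetical segment name
    "size": len(overlap),              # 125,000 matched users
    "cpm": 7.00,
    "flight": "November",
}
```

The deal segment is then pushed straight into the publisher’s ad server, with none of the intermediary fees described earlier.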
With this 3.0 approach, most of the marketer’s $7 is spent on media, rather than a basket of technologies, and the publisher gets to keep quite a bit of that revenue.
Sounds like a good deal.
Almost every marketer is starting to lean into data management technology. Whether they are trying to build an in-house programmatic practice, use data for site personalization, or obtain the fabled “360-degree user view,” the goal is to get a handle on their data and weaponize it to beat the competition.
In the right hands, a data management platform (DMP) can do some truly wonderful things. With so many use cases, different ways to leverage data technology, and fast moving buzzwords, it’s easy for early conversations to get way too “deep in the weeds” and devolve into discussions of “match rates” and how cross-device identity management works. The truth is that data management technology can be much simpler than you think.
At its most basic level, a DMP comes down to “data in” and “data out.” While there are many nuances around the collection, normalization, and activation of the data itself, let’s look at the “data in” story, the “data out” story, and an example of how those two things come together to create an amazing use case for marketers.
The “Data In” Story
To most marketers, the voodoo that happens inside the machine isn’t the interesting part of the DMP, but it’s really where the action happens. Understanding the “truth” of user identity (who are all these anonymous people I see on my site and apps?) is what makes the DMP useful in the first place, making one-to-one marketing and understanding customer journeys something that goes beyond AdExchanger article concepts, and starts to really make a difference!
- Not Just Cookies: Early DMPs focused on mapping cookie IDs to a defined taxonomy and matching those cookies with execution platforms. Most DMPs—from lightweight “media DMPs” inside of DSPs to full-blown “first-party” platforms—handle this type of data collection with ease. Most first-generation DMPs were architected as cookie collection and distribution platforms, meant to associate a cookie with an audience segment and pass it along to a DSP for targeting. The problem is that people are spending more time in cookie-less environments, and more time on mobile (and other devices). That means today’s DMPs must do more than organize cookies; they must also capture a wide variety of disparate identity data, which can include hashed CRM data, data from a point-of-sale (POS) system, and maybe even data from a beacon signal.
- Ability to Capture Device Data: To a marketer, I look like eight different “Chris O’Haras”: three Apple IDFAs, several Safari unique browser signatures, a Roku device ID, and a hashed e-mail identity or two. These “child identities” must be reconciled to a “universal ID” that is persistent and collects attributes over time. Most DMPs were architected to store and manage cookies for display advertising, not cross-device applications, so platforms’ ability to ingest highly differentiated structured and unstructured data is all over the map. Yet, with more and more time dedicated to mobile devices instead of the desktop, cookies only cover about 40% of today’s pie.
- Embedded Device Graph: Cross-device identification is notoriously difficult, requiring both the ability to identify people through deterministic data (users authenticate across mobile and desktop devices), or the skill to apply smart algorithms across massive datasets to make probabilistic guesses that match users and their devices. Over the next several years, the word “device graph” will figure prominently in our industry, as more companies try and innovate a path to cross-device user identity—without data from “walled garden” platforms like Google and Facebook. Since most algorithms operate in the same manner, look for scale of data; the bigger the user set, the more “truth” the algorithms can identify and model to make accurate guesses of user identity.
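The deterministic half of that idea can be shown in a few lines (login events invented): authenticated events tie each device ID to a hashed account, and the account becomes the universal ID for every device in the cluster. The probabilistic half, inferring these links without logins, is where the massive datasets and algorithms come in.

```python
from collections import defaultdict

# Authentication events: (hashed account ID, device ID seen at login).
login_events = [
    ("sha256_abc", "idfa_1"),
    ("sha256_abc", "cookie_7"),
    ("sha256_abc", "roku_3"),
    ("sha256_xyz", "cookie_9"),
]

# Build the deterministic device graph: account -> set of devices.
graph = defaultdict(set)
for account, device in login_events:
    graph[account].add(device)

# Invert it so any child device ID resolves to its universal ID.
device_to_uid = {d: acct for acct, devices in graph.items() for d in devices}
```

Scale matters here because each new authenticated event either confirms an existing cluster or stitches a new device into one.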
The “data in” story is the fundamental part of the DMP—without the ability to ingest all kinds of identifiers and understand the truth of user identity, one-to-one marketing, sequential messaging, and true attribution are impossible.
While the “data in” story gets pretty technical, the “data out” story starts to really resonate with marketers because it ties three key aspects of data-driven marketing together. Here’s what a DMP should be able to do:
- Reconcile Platform Identity: Just like I look like eight different “Chris O’Haras” based on my devices, I also look like eight different people across media channels. I am a cookie in DataXu, another cookie in Google’s DoubleClick, and yet another cookie on a site like the New York Times. The role of the DMP is to user-match with all of these platforms, so that the DMP’s universal identifier (UID) maps to lots of different platform IDs (child identities). That means the DMP must have the ability to connect directly with each platform (a server-to-server integration being preferable), and also the chops to trade data quickly, and frequently.
- Unify the Data Across Channels: To a marketer, every click, open, like, tweet, download, and view is another speck of gold to mine from a river of data. When aggregated at scale, these data turn into highly valuable nuggets of information we call “insights.” The problem for most marketers that operate across channels (display, video, mobile, site-direct, social, and search, just to name a few) is that the fantastic data points they receive all live separately. You can log into a DSP and get plenty of campaign information, but how do you relate a click in a DSP with a video view, an e-mail “open,” or someone who has watched a YouTube video on an owned-and-operated channel? The answer is that even the most talented Excel jockey running twelve macros can’t aggregate enough ad reports to get decent insights. You need a “people layer” of data that spans channels. To a certain extent, who cares what channel performed best, unless you can reconcile the data at the segment level? Maybe Minivan Moms convert at a higher percentage after seeing multiple video ads, but Suburban Dads are more easily converted on display? Without unifying the data across all addressable channels, you are shooting in the dark.
- Global Delivery Management: The other thing that becomes possible when you tie both cross-device user identity and channel IDs together with a central platform is the ability to manage delivery globally. More on this below!
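The platform-identity reconciliation described in the first bullet above can be sketched as a simple sync table. Every identifier below is made up; real user matching happens through server-to-server cookie-sync integrations, not a literal dictionary.

```python
# Illustrative sketch of a DMP-side ID sync table: one universal ID mapped to
# each execution platform's own "child" identifier (all IDs here are invented).
id_sync = {
    "uid-chris": {
        "dataxu": "dx-cookie-123",
        "doubleclick": "dc-cookie-456",
        "nytimes": "nyt-cookie-789",
    }
}

def platform_id(uid, platform):
    """Translate the DMP's universal ID into a given platform's child ID."""
    return id_sync.get(uid, {}).get(platform)

print(platform_id("uid-chris", "doubleclick"))  # dc-cookie-456
```

With a table like this, a segment built against the universal ID can be pushed to any connected platform in that platform’s own currency of identity.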
If I am a different user on each channel—and each channel’s platform or site lets me set only its own frequency cap—it is likely that I am being over-served ads. If I run ads in five channels and frequency cap each one at 10 impressions a month per user, I am virtually guaranteed to deliver 50 impressions over the course of a month—and probably more, depending on my device graph. But what if the ideal frequency to drive conversion is only 10 impressions? I just spent five times too much to make an impact. Controlling frequency at the global level means being able to reallocate ineffective long-tail impressions to the sweet spot of frequency where users are most likely to convert, and plug that money back into the short tail, where marketers get deduplicated reach.
Consider an example in which 40% of a marketer’s budget was spent delivering between one and three impressions per user every month. Another 20% was spent delivering between four and seven impressions, which conversion data revealed to be where the majority of conversions were occurring. The rest of the budget (40%) was spent on impressions with little to no conversion impact.
In this scenario, there are two basic plays to run. First, the marketer wants to eliminate the long tail of impressions entirely and reinvest that budget in more reach. Second, the marketer wants to push more people from the short tail into the “sweet spot” where conversions happen. Cutting off long-tail impressions is relatively easy: send suppression sets of users to the execution platforms.
“Sweet spot targeting” involves understanding when a user has seen her third impression, and knowing the fourth, fifth, and sixth impressions have a higher likelihood of producing an action. That means sending signals to biddable platforms (search and display) to bid higher to win a potentially more valuable user.
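The two plays can be sketched in a few lines, using the four-to-seven impression “sweet spot” from the example. The user IDs and counts are invented, and a real implementation would push these sets to execution platforms rather than print them.

```python
# Sketch of the two frequency plays: suppress users past the sweet spot,
# bid up users whose next impression lands inside it. Data is illustrative.
impressions = {"u1": 2, "u2": 3, "u3": 5, "u4": 9, "u5": 12}

SWEET_SPOT = range(4, 8)  # 4-7 impressions, where conversions concentrate

# Play 1: cut off the long tail via a suppression set.
suppress = {u for u, n in impressions.items() if n > max(SWEET_SPOT)}

# Play 2: signal biddable platforms to bid higher for users about to enter the sweet spot.
bid_up = {u for u, n in impressions.items() if n + 1 in SWEET_SPOT}

print(sorted(suppress))  # ['u4', 'u5']
print(sorted(bid_up))    # ['u2', 'u3']
```

Globally, the budget saved on the suppressed long tail is what gets plugged back into deduplicated reach in the short tail.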
It’s Rocket Science, But It’s Not
If you really want to get deep, the nuts and bolts of data management are very complicated, involving real big data science and velocity at Internet speed. That said, applying DMP science to the common problems within addressable marketing is not only accessible—it’s making DMPs the must-have technology for the next ten years, and global delivery management is only one use case out of many. Marketers are starting to understand the importance of capturing the right data (data in), and applying it to addressable channels (data out), and using the insights they collect to optimize their approach to people (not devices).
It’s a great time to be a data-driven marketer!
The story of display inventory procurement started with the Publisher Direct Era, when publishers were firmly in control of their banners, and kept them safely hidden behind sales forces and rate cards. Then the Network Era crept in, and smart companies like Tacoda took all the unwanted banners and categorized them. Advertisers liked to buy based on behavior, and publishers liked the extra check at the end of the month for hard-to-sell inventory.
That was no fun for the demand side though. They started the Programmatic Era, building trading desks, and leveraging DSPs to make sure they were the ones scraping a few percentage points from media deals. Why let networks have all of the arbitrage fun? The poor publisher was left to try and fight back with SSPs and more technology to battle the technology that was disintermediating them, kind of like a robot fight on the Science Channel.
But all of the sudden, publishers realized how silly it was to let someone else determine the value of their inventory, and launched the DMP Era. They ingested first-party data from their registration and page activity and created real “auto intenders” and “cereal moms” and wonderful segments that they could use to effectively sell to marketers. Now, every smart publisher knows more about their inventory than 3rd parties, and they can also find their readers across the wider Web through exchanges. A win-win!
Then all of the marketers in the world started reading AdExchanger, and saw the publisher example, and thought, “Wow, good call!” They started to truly understand how much money Programmatic companies were taking out of the investment they earmarked for media (silly marketers, Y U no read Kawaja’s first IAB deck?), and decided to use their own technology and data to power audience targeting. If it were a baseball game, this DMP Era for Marketers would be in the first or second inning, but the pitcher is throwing at a fast pace.
The next thing that happened was the Programmatic Direct Era, which lasted about ten minutes and effectively jumped the shark when Rubicon bought two of the more prominent companies involved (ShinyAds and iSocket). Programmatic Direct marketplaces promised a flip of the yield curve for publishers to expose the “fat middle” of undervalued impressions. They attempted this by placing blocks of inventory in a marketplace, and enabled the publisher to set rates, impression levels, and provide API access directly into their ad server. Alas, a tweak to Google’s API did not an industry make. Marketers loved the idea, but since they use audience as the primary mechanism to value inventory, PD marketplaces failed as stand-alone entities and were gobbled up. Under the steady hand of RTB-based technologies, they now evolve slowly along buy-side methodologies. Again, the demand side foils a perfectly reasonable, publisher-derived procurement scheme!
So, what’s next?
The Programmatic Direct Era still lives, albeit within private marketplaces (PMPs) and Direct Deal functionality. The IAB’s Open Direct protocol remains stuck at 1.0, but there is hope—and this time it’s a change that is positive for both marketers and publishers. The latest Era in inventory procurement is what I call Total Automation. Let me explain.
Say a big auto manufacturer has a DMP and has identified, via purchase information, the exact profile of everyone who buys their minivan. Call them “Van Moms.” Then suppose the publisher, who licenses an instance of the same DMP, is a women-friendly publication chock full of those Van Moms—and women who just happen to look like Van Moms. It’s pretty easy to pipe those Moms from the marketer right to the publisher. That process, which you might call Programmatic Direct 2.0, is interesting.
It requires no exchanges, no 3rd party data, no DSPs, no “private marketplace,” no SSP, and potentially no agencies (spare the thought!). All it requires is some technology to map users and port them directly into an ad server.
What I just described is happening today, and moving quickly. Marketers are discovering that the change from demo-based buying to purchase-based buying through 1st party data is winning them more customers. Publishers are asking for—and commanding—high CPMs, and those CPMs are backing out for marketers. Thanks to all the crap in open exchanges, paying more for quality premium, “well lit” inventory actually works better than slogging through exchanges trying to find the audience needle in a haystack full of robots and “natural born clickers.”
The new Era of Total Automation will start putting publishers back on the map—but not all of them. The big distinction between the winners and losers will not only be the quality of their audience but, more importantly, the first-party data used to derive that audience. Not long ago, it was easy to apply a layer of 3rd party data and call someone an “auto intender” if they brushed past an article on the latest BMW. But compare that to the quality of an “auto intender” on a car site who has looked at 5 sedans over the last 2 weeks, and also used a loan calculator. There’s no comparison. The latter “intender,” collected from page- and user-level attributes directly by the publisher, is 10 times more valuable (or a $30 CPM rather than $3, if you like). The reason? That user volunteered real, deterministic information about herself that the publisher can validate. I am willing to bet that an auto manufacturer would pay a high CPM for access to an identified basket of those intenders on an ongoing, “always on” basis.
This is fantastic news for publishers that have great, quality inventory and have implemented a first-party data strategy. It’s even better news for the marketers that have embraced data management, and can extract and find their perfect audience on those sites. The Era of Total Automation will be over when every single marketer has a DMP. At that time, we will discover that there is no longer a glut of display inventory—all of the quality “Van Moms” and “Business Travelers” and the like will be completely spoken for. What will be left is a large pile of unreliable, long tail inventory available for the brave DR marketer and his DSP.
I think both marketers and publishers should welcome this new Era of data-driven one-to-one marketing. The crazy thing is that, once we get it right, it looks just like an anonymized version of direct mail—perhaps the oldest, greatest, most effective and measurable marketing tactic ever invented!
[This post originally appeared in AdExchanger on 7/2/15]
Clayton Christensen, the father of “disruptive innovation,” would love the ad technology industry.
With more than 2,500 Lumascape companies across various verticals chasing an exit, venture funding drying up for companies that haven’t made an aggressive SaaS revenue case and the rapid convergence of marketing and ad technology, the next few years will see some dramatic shifts.
The coming tsunami of powerful megatrends is driving ad technology relentlessly forward at a time when data is king and the companies that best package and integrate it into multichannel inventory procurement will be the rulers.
In a world where scale matters most, the big are getting bigger and smaller players are getting forced out, which is not necessarily good for innovation.
Data: Powering The Next Decade Of Ad Tech
Data, especially as it relates to “people data,” is and will be the dominant theme for ad technology going forward.
Monolithic companies with access to a people-based identity graph are leaning in heavily to identity management, trying to own the phone book of the connected device era. Facebook’s connection to Atlas leverages powerful and deeply personal deterministic data, continually volunteered on a daily basis by its users, to drive targeting. Google is attaching its massive PII data set garnered through Gmail, search and other platforms to its execution platforms with its new DMP, DoubleClick Audience Manager.
Both platforms prefer to keep information on audience reach safely within their domains, leaving marketers wondering how smart it really is to tie the keys of user identity in a “walled garden” with media execution.
Will large marketers embrace these platforms for their consumer identity management needs, or will they continue to leverage them for media and keep their data eggs in another basket?
While some run into the arms of powerful cloud solutions that combine data management with media execution, many are choosing to take a “church and state” approach to data and media, keeping them separate. Marketers have to decide whether the risk of tying first-party data together with someone’s media business is worth having an all-in-one approach.
Agencies Must Adapt Or Die As Consultancies Edge Into Programmatic
Media agencies have also been challenged to provide more transparency around the way they procure inventory, the various incentive schemes they have with publishers and their overall methodology for finding audiences. With cross-device proliferation, agencies must be able to identify users to achieve one-to-one marketing programs, and they need novel ways to reach those users at scale.
That means a commitment to automation, albeit one that may come at the expense of revenue models derived through percentage of spend and arbitrage. Agencies will need new ways to add value in a world where demand-side players are finding closer connections to the supply side.
As media margins collapse, agencies need to act as data-driven marketing consultants to lift margins and stay relevant. They face increasing competition from large consultancies whose bread and butter has been technology integration. It’s a tough spot but opportunities abound for smart agencies that can differentiate themselves.
Zombie Companies Die Off But Edge-Case Innovation Continues
We’ve been talking about “zombie ad tech” for years now, but we are finally starting to see the end of the road for many point solution companies that have yet to be integrated into larger mar tech “stacks.”
Data-management platforms with native tag-management capabilities are displacing standalone tag-management companies. Retargeting is a tactic, not a standalone business, which is now a status quo part of many execution platforms. Fraud detection systems are slowly being dragged into existing platforms as add-on functionality. Individual data providers are being sucked into distribution platforms and data exchanges that offer customer exposure at scale. The list goes on and on.
This is an incredibly positive thing for marketers and publishers, but it is also a challenge. Cutting-edge technologies that give a competitive advantage are rarely so advantageous after they’ve moved into a larger “cloud.” Smart tech buyers must strike a balance between finding the next shiny objects that confer differentiating value, while building a stable “stack” that can scale as they grow.
That said, the big marketing technology “clouds” offered by Adobe, Oracle and Salesforce continue to grow, as they gobble up interesting pieces of the digital marketing “stack.”
Will marketers go all-in on someone’s cloud, build their own “cloud” or leverage services offerings that bring a unified capability together through outsourcing?
Right now, the jury is out, mostly because licensing your own cloud takes more than just money, but also the right personnel and company resources to make it work. Yet, marketers are starting to understand that the capability to build automated efficiency is no longer just a function of marketing, but a way to leverage people data to drive value across the entire company.
Today’s media targeting will quickly give way to tomorrow’s data-driven enterprise strategy. It’s happening now, and quickly.
New Procurement Models Explode Exchanges, Drive Direct Deals
I think the most exciting things happening in ad technology are happening in inventory procurement.
Programmatic direct technologies are evolving, adding real audience enablement. Version 1.0 of programmatic direct was the ability to access a futures marketplace of premium blocks of inventory. Most buyers, used to transacting on audience, not inventory, rejected the idea.
Version 2.0 brings an audience layer to premium, well-lit inventory, while changing the procurement methodology. I think most private marketplaces within ad exchanges are placeholders for a while, as big marketers and publishers start connecting real people data pipes together and start to buy directly. It’s happening now – quickly.
I also can see really innovative companies leaning into creating a whole new API-driven way of media planning and buying across channels that makes sense. In the near future, the future-driven approaches of companies like MassExchange will bring to cross-channel inventory procurement a methodology that is more regulated, transparent and reminiscent of financial markets. It’s a fun space to watch.
Who will begin adding algorithmic, data-science driven automation and proficiency to the planning process, not just execution and optimization in the programmatic space?
Many of those in the ad technology and media game are here for the challenge, the rapid pace of innovation and the opportunity to change the status quo. We are all getting way more than we imagined lately, in a fun, exciting and fast-moving environment that punishes failure harshly, but rewards true market innovation. Stay safe out there.
[This post was originally published in AdExchanger on 6.16.15]
I-COM Global Summit: Panel Discussion on Leveraging Big Data to take Programmatic to the Next Level – Chris O’Hara, Krux Digital, Eric Picard, Mediamath & Tom Simpson, MediaQuark
Chris O’Hara, VP Strategic Accounts, Krux Digital, USA, Eric Picard, VP Strategic Partnerships, Mediamath, UK and Tom Simpson, CEO, MediaQuark, Singapore were speakers and David Smith, CEO & Founder, Mediasmith, USA was moderator in Leveraging Big Data to take Programmatic to the Next Level. This discussion had no presentation.
In early 2012, when data management technology was somewhat nascent, I wrote about “the five things to expect from a DMP.” They were: to unlock the power of one’s first party data; decrease reliance upon third party data; generate unique audience insights; use audience data to power new channels; and create efficiency. A little over three years later, those things still continue to drive interest in DMP technology—and great value for both publishers and marketers.
The “table-stakes” functionality of DMPs—segmentation, lookalike modeling, targeting, and analytics—continue to resonate. Even the least advanced DMPs have those abilities, and this is what people who buy DMP software should expect from any system. Unfortunately, there are now dozens of “platforms” that claim DMP technology. Some are legitimate players, born from the ground up to be “first-party” DMPs. Some have been created as “lightweight” DMPs to collect and distribute cookies for display advertising. And still others are legacy tag management or network platforms that have bolted on DMP functionality as they work towards a fuller “stack” solution that marketers say they want.
Writing this article again, three years later, I would still encourage software buyers to evaluate their DMP choice on the ability of their partner to meet the above-listed criteria. But there has been so much nuance and development over the last several years that additional selection criteria present themselves for anyone expected to make a reasonably informed choice in DMP selection going forward.
Here’s what the modern DMP consumer should be looking out for:
- Lookback: Three years ago I talked about “lookback windows” in the context of giving publishers the ability to attribute future conversion events to ads shown previously on their site. That is still a compelling publisher use case. What “lookback windows” really refer to is whether or not your DMP can capture 100% of the raw, log-level user event data—and store it. This necessitates an open taxonomy (because “you don’t know what you don’t know”), and also the ability to store tons of data and make it accessible quickly. This is what a complete data architecture looks like. Many DMPs operate with a rigid, defined taxonomy and only collect segment IDs—not the underlying data. That’s a problem for businesses that need to move fast and activate new segments opportunistically. Ask how—and for how long—your DMP stores data.
- Onboarding: Lots of DMPs claim to have the ability to easily ingest CRM and other offline data and match it to cookies, but the truth is everyone depends on a limited set of “onboarding” vendors to provide the matches. That’s fine, but there are some nuances in the process by which offline data enters the online identity space (hashing). DMPs should enable seamless connection to all three major onboarding providers, the ability to select the methodology by which offline identity is matched to online, and also be able to automatically choose which onboarding partner is right for each identity. Ask how each DMP you evaluate works with each vendor, what kind of match rates you can expect, and how each stores persistent user identity to ensure better matches over time.
- Measurement: Let’s face it, the ability to tweak programmatic audience delivery to nudge online video viewability numbers up a few percentage points is great, but nothing moves the needle like linear television. Marketers spend a ton of money there, and will continue to do so for the foreseeable future—all the while moving incremental percentages of their budget into the digital channels where folks are spending an increasing amount of time. But, they are never really going to go full throttle with digital until they can reconcile reach and frequency across channels—and those channels must include linear! Your DMP should be able to handle overlap reporting, light attribution, and cross-channel media performance—but it should also start making some highly informed guesses about how linear audiences map to digital ones, in order to enable true attribution and media mix models. Ask how your DMP is positioned to tie the linear and digital strings together from a measurement perspective.
- CDIM: Three years ago, we were still waiting for the “year of mobile” to occur, so “cross device identity management” was still largely pre-funded slideware on some entrepreneur’s computer. Jump to today, and “CDIM” and “CDUI” are at the tip of every ad tech tongue! As more and more people move from device to device—almost none of which support the traditional cookie as an identifier—marketers and publishers desperately need to map devices to people. It’s the only way to deliver the fabled “360 degree view” of the user. Ask your DMP vendor how they are prepared to deliver deterministic matches and, more importantly, how they reconcile identity without seeing a user logging in across devices. Doing great probabilistic matching necessitates not only strong algorithms but, more importantly, a scale of users that breeds precise models. What is the size of their “truth set” of user data with which to probabilistically determine user identity? The quality and scale of that data will determine your choice.
- Data Governance: I think the biggest question to ask a potential DMP vendor is their philosophy on data ownership. For both marketers and publishers, audience data is likely one of their top three assets. Trusting such data to a technology vendor is not something to be considered lightly. How is that data stored? What are the policy controls available to help you share that data with trusted partners? What about privacy and governance? How can my platform help me activate data in different places, where different rules about PII and data collection and storage apply? Knowing the answers to these before you buy can save lots of heartache (and legal fees) later. More importantly, how independent is your data? Is your partner also in the business of selling media or data? That can create some conflicts of interest—especially if your data might be valuable to a competitor. Finally, what if you want your data back? You should have the right to get it out quickly, and in a usable format.
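The probabilistic matching described under CDIM above can be illustrated with a toy co-occurrence heuristic: devices repeatedly seen together on the same network at the same time are guessed to belong to one person. The sightings, threshold, and signals are all invented; production device graphs use far richer features and real statistical models.

```python
# Toy sketch of probabilistic cross-device matching: score device pairs by how
# often they appear on the same IP in the same hour. All data is illustrative.
from collections import Counter
from itertools import combinations

# (device_id, ip, hour) observations
sightings = [
    ("phone-a", "ip-1", 20), ("laptop-b", "ip-1", 20),
    ("phone-a", "ip-1", 21), ("laptop-b", "ip-1", 21),
    ("phone-a", "ip-1", 22), ("tablet-c", "ip-2", 22),
]

# Group devices by (ip, hour) slot.
by_slot = {}
for device, ip, hour in sightings:
    by_slot.setdefault((ip, hour), set()).add(device)

# Count co-occurrences for every device pair.
pair_counts = Counter()
for devices in by_slot.values():
    for pair in combinations(sorted(devices), 2):
        pair_counts[pair] += 1

THRESHOLD = 2  # co-occurrences required before we guess "same person"
matches = [pair for pair, n in pair_counts.items() if n >= THRESHOLD]
print(matches)  # [('laptop-b', 'phone-a')]
```

This is why scale of data matters so much: the bigger the truth set of deterministic logins available to calibrate thresholds like this one, the more precise the probabilistic guesses become.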
The bad news is that choosing a DMP isn’t any easier than it was three years ago. It’s a lot more complex, and you really need to dig in deeply to understand the very small nuances between platforms that appear, on the surface, to be very much the same. The good news is that there is a great deal of selection available, and some very high quality vendors to choose from. Take your time, put your vendors through a very rigorous process that includes asking the questions outlined above, and choose wisely!
[This post originally appeared in the EConsultancy blog on 5.11.15]
If you read AdExchanger regularly, you might think that nearly every global marketer has a programmatic trading strategy. They also seem to be leveraging data management technology to get the fabled “360-degree view” of their customers, to whom they are delivering concise one-to-one omnichannel experiences.
The reality is that most marketers are just starting to figure this out. Their experience ranges from asking, “What’s a DMP?” to “Tell me your current thinking on machine-derived segmentation.”
A small, but significant, number of major global marketers are aggressively leaning into data-driven omnichannel marketing, pioneering a trend that is not going anywhere anytime soon. Over the next five years, nearly every global marketer will have a data-management platform (DMP), programmatic strategy and “chief marketing technologist,” a hybrid chief marketing officer/chief information officer that marries marketing and technology. These are exciting times for people in data-driven marketing.
So, what are marketers looking for from technology today? Although these conversations ultimately become technical in nature, you soon discover that marketers want some pretty basic, “table stakes” type of stuff.
Better Segmentation Through First-Party Data
Marketers spend a lot of time building customer personas. Once a customer is in their customer relationship management (CRM) database and generates some sales data, it’s pretty easy to understand who they are, what they like to buy and where they generally can be found. From a programmatic perspective, these are the equivalent of a car dealer’s “auto intenders,” neatly packaged up by ad networks and data providers to be targeted in exchanges.
That’s still available today, but the amazing amount of robotic traffic, click fraud and media arbitrage has made marketers realize just how loose some segment definitions may be. Data companies have a great deal of incentive to create and sell lots of auto intenders, so marketers are starting to look deeper at how such segments are actually created. It turns out that some auto intenders are people who brushed past a car picture on the web, which lumped them into a $12 cost per mille (CPM) audience segment.
Those days seem to be coming to an abrupt close as marketers increasingly use their own data to curate such segments and premium publishers, which do have auto intenders among their readerships, use data-management tools to make highly granular segments available directly to the demand side. Marketers are now willing to pay premium prices for premium audiences in a dynamic being driven by more transparency into how audiences are created in the first place. Audiences comprised of first- and second-party data will win every time in a transparent ecosystem.
Less Waste, More Efficiency
Part and parcel of better audience segmentation is less waste and more media efficiency. The old saw, “I know half of my marketing works, I just don’t know which half,” goes away with good data and better attribution.
As an industry, we promised to eliminate waste 20 years ago. The banner ad was supposed to usher in a brave new world of media accountability, but we ended up creating a hell of a mess. Luckily, venture money backed “solutions” to the problems of click fraud, faulty measurement and endless complexity in digital marketing workflow.
Marketers don’t want to buy more technology problems they need to fix. And they don’t want to spend money chasing the same people around the web. They want to limit how much they spend trying to achieve reach. Data-management technology is starting to rein in wasteful spending, via tactics including global frequency management, more precise segmentation, overlap analysis and search suppression.
Marketers want to use data to be more precise. They are starting to leverage systems that help them understand viewability and get a better sense of attribution by moving away from stale last-click models. The days are numbered for marketers with black-box technology that creates a layer between their segmentation strategies and how performance is achieved against it.
One-To-One Communication Via Cross-Device Identity
Maybe the biggest trend and aspiration among marketers is the ability to truly achieve one-to-one marketing. A few years ago, that meant email, telemarketing and direct mail. Today, if you want to have a one-to-one customer relationship, you must be able to associate the “one” person with as many as five or six connected devices.
That is extremely difficult, mostly because we have been highly dependent on the browser-based “cookie” to determine identity. Cookie-based technologies evolved to ensure different cookies match up in different systems, but it’s a new world today.
Really understanding user identity means being able to reconcile different device signals with a universal ID. That means lots of cookies from different browsers, Safari’s unique browser signature, IDFAs, Android device IDs and even signals from devices like Roku, not to mention reliably “onboarding” anonymized offline data, such as CRM records.
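The “onboarding” of offline CRM records mentioned above typically starts with one-way hashing, so a raw e-mail address never enters the online identity space. A minimal sketch, assuming SHA-256 and the common normalize-then-hash convention (exact normalization rules vary by vendor):

```python
# Illustrative sketch of the hashing step in onboarding: a CRM e-mail address
# is normalized and one-way hashed before being matched to online identity.
import hashlib

def hash_email(email):
    normalized = email.strip().lower()  # onboarders typically normalize first
    return hashlib.sha256(normalized.encode()).hexdigest()

# The same person, keyed identically regardless of how the address was typed:
print(hash_email("  Jane.Doe@Example.com "))
print(hash_email("jane.doe@example.com"))  # identical digest after normalization
```

Because the hash is deterministic, two parties who hash the same address the same way can match records without ever exchanging the underlying PII.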
Without device mapping, an individual looks like seven different devices to a marketer, making it impossible to deliver the “right message, right place, right time.” Frequency management is tougher, attribution models start to break and sequential messaging is hard to do. Marketers want a reliable way to reconcile user identity across devices so they can adapt their messages to each consumer’s situation.
Marketers inject tons of dollars into the advertising ecosystem and expect detailed performance reports. Each dollar spent is an investment. Some dollars create sales results, but all dollars spent in addressable channels create some kind of data.
Surprisingly, that data is still mostly siloed, with social data signals not connected to display results. Much of it is delivered in the form of weekly spreadsheets put together by an agency account manager. It seems crazy that marketers can’t fully take advantage of all the data produced by their digital marketing, but that is still very much the reality of 2015.
Thankfully, that dynamic is changing quickly. Data technology is rapidly offering a “people layer” of intelligence across all channels. A central system ingesting that data can analyze campaign performance across many dimensions, but the key is aggregating that data at the people level. How did a segment of “shopping cart abandoners” perform on display vs. video?
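That people-level aggregation can be sketched in a few lines: raw events from every channel are keyed to people and segments, then rolled up by segment and channel. The events, IDs, and segment name are invented for illustration.

```python
# Hedged sketch of a "people layer": channel events keyed to people, then
# aggregated by (segment, channel) to compare performance. Data is invented.
from collections import defaultdict

events = [
    {"uid": "u1", "segment": "cart_abandoners", "channel": "display", "converted": True},
    {"uid": "u2", "segment": "cart_abandoners", "channel": "display", "converted": False},
    {"uid": "u3", "segment": "cart_abandoners", "channel": "video",   "converted": True},
    {"uid": "u4", "segment": "cart_abandoners", "channel": "video",   "converted": True},
]

stats = defaultdict(lambda: {"users": 0, "conversions": 0})
for e in events:
    key = (e["segment"], e["channel"])
    stats[key]["users"] += 1
    stats[key]["conversions"] += e["converted"]

for (segment, channel), s in sorted(stats.items()):
    print(segment, channel, f"{s['conversions'] / s['users']:.0%}")
```

The same roll-up that an Excel jockey cannot assemble from a dozen ad reports falls out naturally once every event carries a person-level key.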
Marketers now operate under the new but valid assumption that they will be able to track performance in this way. They are starting to understand that every addressable media investment can create more than just sales – it can produce data that helps them get smarter about their media investments going forward.
It’s a great time to be a data-driven marketer.
[This post originally appeared in AdExchanger on 4.6.15]
A recent analyst report made an astute observation that all marketers should consider: It’s not about “digital marketing” anymore – it’s about marketing in a digital world. The nuance there is subtle, but the underlying truth is huge. The world has changed for marketers, and it’s more complicated than ever.
Most consumers spend more time on web-connected devices than watching television, creating a fragmented media landscape where attention is divided among multiple devices and thousands of addressable media outlets. For marketers, the old “AIDA” (attention, interest, desire and action) funnel persists, but it fails in the face of the connected consumer.
When television, print and radio dominated, moving a consumer from product awareness to purchase had a fairly straightforward playbook. Today’s always-on, connected consumer is on a “customer journey,” interacting with social media, review sites, pricing guides and blogs, and chatting with friends to decide everything from small supermarket purchases to big investments like a new house or car.
Marketers want to be in the stream of the connected consumer and at key touch points on the customer journey. But, in order to understand the journey and be part of it, they must be able to map people across their devices. This is starting to be known as cross-device identity management (CDIM), and it is at the core of data-driven marketing.
In short, identity lies at the heart of successful people data activation.
Until very recently, managing online identity was largely about matching a customer’s online cookie with other cookies and CRM data to ensure the desktop computer user was aligned with her digital footprint. Today, the identity landscape is highly varied, requiring marketers to match ID signals from several different browsers, device IDs from mobile phones and tablets, IDs from streaming devices and video game consoles, and identifiers surfaced through mobile app SDKs.
Matching a single user across their various connected devices is a challenge. Matching millions of users across multiple millions of devices is both a big data and data science challenge.
Real one-to-one marketing is only possible when the second party – the customer – is properly identified. This can be done deterministically, using information people volunteer about themselves, or probabilistically, where the marketer infers who the person is based on behavioral patterns and signals. Most digital marketing companies that offer identity management solutions take what data they have and use a proprietary algorithm to try to map device signals to users.
The effectiveness of device identity algorithms depends on two factors: the quality of the underlying deterministic data – the “truth set” – and its scale.
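To make the deterministic/probabilistic distinction concrete, here is a toy scoring function that guesses whether two devices belong to the same person by the overlap of their observed signals. The signals, score and threshold are all illustrative assumptions, not any vendor's actual algorithm; in practice the threshold would be tuned and validated against the deterministic truth set.

```python
# Illustrative probabilistic device matching. Each device carries a
# set of observed signals (IP, geography, active hours); overlap is
# scored with Jaccard similarity. All names and values are made up.

def match_score(device_a, device_b):
    """Score the likelihood two devices belong to one person by
    the Jaccard overlap of their observed signal sets."""
    sigs_a, sigs_b = set(device_a["signals"]), set(device_b["signals"])
    if not sigs_a or not sigs_b:
        return 0.0
    return len(sigs_a & sigs_b) / len(sigs_a | sigs_b)

laptop = {"id": "cookie:abc", "signals": ["ip:1.2.3.4", "geo:nyc", "hour:22"]}
phone  = {"id": "idfa:9F2A",  "signals": ["ip:1.2.3.4", "geo:nyc", "hour:8"]}
other  = {"id": "cookie:zzz", "signals": ["ip:9.8.7.6", "geo:sf"]}

THRESHOLD = 0.4  # hypothetically tuned against the deterministic truth set

print(match_score(laptop, phone) >= THRESHOLD)  # likely the same person
print(match_score(laptop, other) >= THRESHOLD)  # likely different people
```

The deterministic truth set enters exactly here: pairs of devices known to share a logged-in user tell you how high the threshold must be to keep false matches acceptably rare.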
Data Quality Matters
There is data, and then there is data. The old software axiom of “garbage in, garbage out” certainly applies to cross-device user identity. Truly valuable deterministic data include things like age, gender and income data. In order to get such data, web publishers must offer their visitors a great deal of value and be trusted to hold such information securely. Therefore, large, trusted publishers – often with subscription paywalls – are able to collect highly valuable first-party user data.
Part of the quality equation also relates to the data’s ability to unlock cross-device signals. Does the site have users that are logged in across desktop, mobile phone and tablet? If so, those signals can be aggregated to determine that Sally Smith is the same person using several different devices. Publishers like The Wall Street Journal and The New York Times meet these criteria.
Scale Is Critical
In order to drive the best probabilistic user matches, algorithms need huge sets of data to learn from. In large data sets, even small statistical variances can yield surprising insights when tested repeatedly. The larger the set of deterministic data – the “truth” of identity – the better the machine is able to establish probability. A platform seeing several million unique users and their behavioral and technographic signatures may find similarities, but seeing billions of users will yield the minuscule differences that unlock the identity puzzle. Scale breeds precision, and precision counts when it comes to user identity.
As digital lives evolve beyond a few devices into more connected “things,” having a connected view of an individual is a top priority for marketers that want to enable the one-to-one relationship with consumers. Reliably mapping identity across devices opens up several possibilities.
Global Frequency Management: Marketers that leverage multiple execution platforms, including search, email, display, video and mobile, have the ability to limit frequency in each platform. That same user, however, looks like five different people without centralized identity management.
Many marketers don’t understand what ideal message frequency looks like at the start of a campaign, and most are serving ads far above the optimal effective frequency, resulting in large scale waste. Data management platforms can control segment membership across many different execution platforms and effectively cap user views at a “global” level, ensuring the user isn’t over-served in one channel and underserved in another.
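The mechanics of a “global” cap can be sketched as a central counter keyed by the resolved person ID and drawn down by every channel, instead of separate per-platform caps that each count the same person again. A minimal illustration, with hypothetical names:

```python
# Sketch of "global" frequency capping: one shared counter per resolved
# person ID, drawn down across all channels. Illustrative only.

from collections import defaultdict

GLOBAL_CAP = 30  # hypothetical ideal daily frequency across channels

impressions = defaultdict(int)  # person ID -> impressions served today

def should_serve(person_id, channel):
    """Serve only while the person is under the cross-channel cap."""
    if impressions[person_id] >= GLOBAL_CAP:
        return False  # capped globally, regardless of channel
    impressions[person_id] += 1
    return True

# The same person seen on display and video draws down one shared
# budget instead of two separate per-channel caps.
for channel in ["display"] * 20 + ["video"] * 15:
    should_serve("person-123", channel)

print(impressions["person-123"])  # 30: capped globally, not 35
```

In a real DMP this logic is expressed through segment membership (removing capped users from targetable segments in each execution platform) rather than a literal serve-time counter, but the accounting is the same.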
Sequential Messaging: Another benefit of cross-device identity is that a user can be targeted with different ads based on where they are in the consumer journey. Knowing where a consumer is in an established conversion path or funnel is a critical part of creative decisioning. Optimizing the delivery of cross-channel messages at scale is what separates tactical digital marketers and enterprise-class digital companies that put people data at the heart of everything they do.
Customer Journey Modeling: Without connecting user identity in a centralized platform, understanding how disparate channels drive purchase intent is impossible. Today’s models bear the legacy of desktop performance metrics, such as last click, or have been engineered to favor display tactics, including first view. The true view of performance must involve all addressable channels, and even consider linear media investment that lacks deterministic data. This is challenging even with cross-device identity management in place, and all but impossible without it.
The ubiquity of personal technology has transformed today’s consumers into “digital natives” who seamlessly switch between devices, controlling the way they transmit and receive information. Marketers and publishers alike must adapt to a new reality that puts consumers in control of how editorial and advertising content is accessed. Delivering the right consumer experience is the new battleground for CMOs. Unlocking identity is the first step in winning the war.
[This post originally appeared in AdExchanger on 3.16.15]
Twenty years after the first banner ad, the programmatic media era has firmly taken hold. The Holy Grail for marketers is a map to the “consumer journey,” a circuitous route filled with multiple addressable customer touchpoints. With consumers spending more of their time on mobile devices – and interacting with brands like never before through social channels, review sites, pricing comparison sites and apps – how can marketers influence customers everywhere they encounter a brand?
It’s a tough nut to crack, but it is starting to become an achievable reality for companies dedicated to collecting, understanding and activating their data. Marketers are starting to turn toward data management platforms (DMPs), which help them connect people with their various devices, develop granular audience segments, gain valuable insights and integrate with the various platforms where they can activate that data. In addition to technology, marketers also have to configure their entire enterprises to align with the new data-driven realities on the ground.
The question is: Where do marketers turn for help with this challenging, enterprise-level transition?
Many argue that agencies cannot support the deep domain expertise needed for the complicated integrations, data science and modeling that have become everyday concerns in modern marketing. But should data management software selection and integration be the sole province of the Accentures and IBMs of the world, or is there room for agencies to play?
For lots of software companies, having an agency in between an advertiser and their marketing platform sounds like a problem to overcome, rather than a solution. Many ad tech sellers out there have lamented the process of the dreaded agency “lunch and learn” to develop a software capability “point of view” for a big client.
Yet, there are highly compelling ways agencies add value to the software selection process. The best agencies insert themselves into the data conversation and use their media and creative expertise to influence what DMPs marketers choose, as well as their role within the managed stack.
From Digital To Enterprise
It makes perfect sense that agencies are involved with data management. The first intersection of data and media added the “targeting” column to the digital RFP. Agencies have started to evolve beyond the Excel-based media planning process to start their plans with an audience persona that is developed in conjunction with their clients. Today, plans begin with audience data applied to as many channels as are reachable. Audience data has moved beyond digital to become universal.
Agencies have also been at the tip of the spear, both from an audience research standpoint (understanding where the most relevant audiences can be found across channels) and an activation standpoint (applying huge media budgets to supply partners). Since they are on the front lines of where media dollars are expressed, they often get the first practical look at where data impacts consumer engagement. During and after campaigns conclude, the agency also owns the analytics piece. How did this channel, partner and creative perform? Why?
Having formerly limited agencies to campaign development and execution, marketers are now turning to the collected expertise of their agency media and analytics teams and asking them to embed the culture of audience data into their larger organizations. When it’s time to select the DMP—the internal machine that will drive the people-based marketing enterprise—the agency is naturally called upon.
Data Management Is About Ownership
Although a small portion of innovative marketers have begun leveraging DMP technology and taken media execution “in-house,” the vast majority still relies on agencies and ad tech platform partners to operate their stacks through a managed services approach. Whether a marketer should own the capability to manage its own ad technology stack is a matter of choice, but data ownership shouldn’t be. Brands may not want to own the process of applying audience data to cross-channel media, but they absolutely must own their data.
Where Agencies Play in Data Management
The Initial Approach: Most agencies have experience leveraging marketers’ first-party data through retargeting on display advertising. In an initial DMP engagement, marketers will rely on their agencies to build effective audience personas, map those to available attributes that exist within the marketer’s taxonomy and apply the segments to existing addressable channels. Marketers can and should rely on past campaign insights, attribution reports and other data insights from their agencies when test-driving DMPs.
Connect the Dots: For most marketers, agencies have been the de facto connector of their diverse systems. Media teams operate display, video and mobile DSPs, ad serving platforms and attribution tools. Helping a marketer and their DMP partner tie these execution platforms together and make sense of both audience data and the performance data generated from campaigns is a critical part of a successful DMP implementation.
Operator: Last, but not least, is the agency as operator of the DMP. Marketers want their data safely protected in their own DMP, with strong governance rules around how first-party data is shared. They also need a hub for utilizing third-party data and integrating it with various execution and analytics platforms. Marketers may not want to operate the DMP themselves, though. Agencies can win by helping marketers wring the most value from their platforms.
Marketers have strong expertise in their products, markets and customer base – and should focus on their core strengths to grow. Agencies are great at finding audiences, building compelling creative and applying marketing investment dollars across channels, but are not necessarily the right stewards of others’ data.
Future success for agencies will come from helping marketers implement their data management strategy, align their data with their existing technology stack and return insights that drive ongoing results.
[This post originally appeared in AdExchanger on 2.2.15]
With companies like Kraft and Kellogg’s starting to leverage the programmatic pipes for equity advertising, we are starting to hear a lot of buzz about the potential for “programmatic branding,” or the use of ad tech pipes to drive upper-funnel consumer engagement.
It makes sense. Combine 20 years of online infrastructure investment with rapidly shifting consumer attention from linear to digital channels, and you have the perfect environment to test whether or not digital advertising can create “awareness” and “interest,” the first two pieces of the age-old “AIDA” funnel.
The answer, put simply, is yes.
Online reach is considerably less expensive than linear reach, and we are starting to have the ability to reliably measure how that brand engagement is generated. Marketers don’t just want an “always-on” stream of brand advertising that comes with measurement – they also need it. With attention rapidly shifting from traditional channels, investments in linear television are starting to return fewer sales.
But most marketers are just starting to gain the digital competency to make programmatic branding a reality. That competency is called data management – the ability to segment, activate and analyze consumer audiences in a reliable way at scale.
The most fundamental problem with digital branding is that it is truly a one-to-one marketing exercise. If we dream of the “right message, right person, right time,” then matching a user with her devices is table stakes for programmatic branding. How do I know that Sally Smith on desktop is the same as Sally Smith on tablet?
Cross-device identity management is the key. Device IDs must be mapped to cookies, other mobile identifiers and Safari browser signals to get a sense of who’s who. Once you unlock user identity, many amazing things become possible.
Global Frequency Capping
One of the reasons programmatic branding has yet to gain serious ground with marketers is because of waste. This is both real, including all those wasted impressions due to invisible ads or robotic traffic, and perceived, such as impressions that are ineffective due to frequency issues.
Smart technology and market pricing solves the first problem, while data management solves the second. Assuming the marketer understands the ideal effective frequency of impressions per channel, or on a global basis, a DMP can manage how many impressions an individual sees by controlling segment membership in various platforms. Let’s say, for example, the ideal frequency for cereal advertising aimed at moms is 30 per day across channels. The advertiser knows showing fewer than 30 impressions reduces effectiveness, while more than 30 impressions has a negligible impact. Advertisers using multiple channels, such as direct-to-publisher, plus mobile, video and display DSPs, are likely overserving impressions in each channel and possibly underserving in key channels like video. Connecting user identity helps control global frequency and can save literally millions of dollars, while optimizing the effectiveness of cross-channel advertising.
If “right person” technology is enabled as above, the next logical step is to try to get to “right place and right time.” Data management can enable this Holy Grail of branding, helping marketers create relevance for consumers as they embark on the customer journey. What brand marketers have dreamed of is now possible and starting to happen.
Dad, in the auto-intender bucket, is exposed to a 15-second pre-roll ad before logging into his newspaper subscription on his tablet in the morning. The message is reinforced by more equity display ads he sees in the afternoon at work. And while checking messages on his mobile phone on the way home, he receives an offer for $500 off with a qualified test drive. After Dad hits the dealership and checks in through the CRM system, he receives an email thanking him for his visit and reminding him of the $500 coupon he earned.
These tactics are not possible without tying user identity and systems together. Doing so not only enables sequential messaging, but also the ability to test and measure different approaches through A/B testing.
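One way to picture sequential messaging is as a simple stage machine: once identity is resolved, the creative served depends on where the person sits in the journey. A toy sketch of the Dad scenario above, with hypothetical stage names and creatives:

```python
# Toy sequential-messaging stage machine mirroring the auto-intender
# journey described above. Stage names and creatives are hypothetical.

JOURNEY = ["awareness", "consideration", "offer", "post_visit"]

CREATIVE = {
    "awareness": "15s pre-roll brand video",
    "consideration": "equity display ad",
    "offer": "mobile ad: $500 off with a test drive",
    "post_visit": "thank-you email with coupon reminder",
}

def next_message(stage):
    """Pick the creative for the current stage and advance the journey."""
    creative = CREATIVE[stage]
    idx = JOURNEY.index(stage)
    next_stage = JOURNEY[min(idx + 1, len(JOURNEY) - 1)]
    return creative, next_stage

stage = "awareness"
for _ in range(3):
    creative, stage = next_message(stage)

print(creative, "->", stage)
```

A/B testing then becomes a matter of assigning resolved people to alternative journey definitions and comparing downstream conversion.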
How about attribution? It’s impossible to perform cross-channel attribution without knowing who saw what ad. At the end of the day, it’s really about the insights.
Procter & Gamble is famous for spending millions of dollars every year to understand the “moment of truth,” or why people choose Tide over another detergent. Although they know consumer segmentation and behavior better than anyone, even the biggest brand marketers struggle to gain quality insights from digital channels.
Data management is starting to make a more reliable view possible. Brand advertising is just another form of investment. Money is the input. The output is sales and, just as important, the data on what drove those sales. In the past, brand marketers relied on panel-based measurement to judge campaign effectiveness. Now, data management helps brands understand which channels drove results and how each contributed.
It is early days for truly reliable cross-channel attribution modeling, but we are finally starting to see the death of the “last-click” model. Smart marketers use data to author their own flexible attribution models, making sure all channels involved receive variable credit for driving the final action. In the near future, machine learning will help drive dynamic models, which flex over time as new signals are acquired. We will then start to see just how effective – or not – tactics like standard display advertising are for driving upper-funnel engagement.
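As a concrete example of a flexible model, here is a small position-based attribution sketch that gives configurable credit to the first and last touches and splits the remainder among the middle ones. The weights are illustrative assumptions, not an industry standard.

```python
# Position-based multi-touch attribution sketch: a flexible alternative
# to last-click. Weights are illustrative and configurable.

def attribute(touchpoints, first_w=0.4, last_w=0.4):
    """Give fixed credit to the first and last touches; split the
    remainder evenly among the middle touches. Assumes each channel
    appears at most once in the path."""
    if len(touchpoints) == 1:
        return {touchpoints[0]: 1.0}
    credit = {tp: 0.0 for tp in touchpoints}
    credit[touchpoints[0]] += first_w
    credit[touchpoints[-1]] += last_w
    remainder = 1.0 - first_w - last_w
    middle = touchpoints[1:-1]
    if middle:
        for tp in middle:
            credit[tp] += remainder / len(middle)
    else:
        # two-touch path: split the remainder between first and last
        credit[touchpoints[0]] += remainder / 2
        credit[touchpoints[-1]] += remainder / 2
    return credit

path = ["video", "display", "search"]
print(attribute(path))  # video ~0.4, display ~0.2, search ~0.4
```

The dynamic models the paragraph anticipates would go one step further and learn `first_w` and `last_w` from observed conversion data instead of fixing them by hand.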
Is 2015 the year for programmatic branding? For marketers that are leveraging data management to enable the best practices outlined above, the answer is yes. The more accurately marketers can map online user identity and understand results, the more investment will flow from linear to addressable channels.
[This post originally appeared on 1.4.2015 in AdExchanger]
In this increasingly cross-device world, marketers have been steadily losing the ability to connect with consumers in meaningful ways. Being a marketer has gone from three-martini lunches where you commit to a year’s worth of advertising in November to a constant hunt for new and existing customers along a multifaceted “customer journey” where the message is no longer controlled.
Consumers’ attention migrates from device to device, where they spread their limited attention among multiple applications. It has become a technology game to try to track them down, and it’s starting to become a big data game to serve them the “right message, at the right place, at the right time.”
Modern ad tech is supposed to be the marketer’s savior, helping him sort out how to migrate budgets from traditional media, such as TV, radio and print, to the addressable channels where people now spend all of their time. Marketers and their agencies need a technology “stack,” but they end up with a hot mess of different solutions, including various DSPs for multiple channels, content marketing software and ad servers.
Operating and managing all of them is possible, but laborious and difficult to do right. Worse still, these systems are nearly impossible to connect. Am I targeting the same consumer over and over through various channels? How to manage messaging, frequency and sequencing of ads?
Since all of these systems purport to connect marketers to customers on the audience level, the coin of the realm is data. It’s not just “audience data” but actual data on the individuals the marketer wants to target.
Marketing is now a people game.
Yet, in the cross-channel, evolving world of addressable media, connecting people to their various devices is difficult. You need to see a lot of user data, and you have to not only collect web-based event data, but also mobile data where cookies don’t exist. Deterministic data, such as a website’s registration data, can lay the foundation for identity. When blended with probabilistic data and modeled from user behavior and other signals, it becomes possible to find an individual.
Right now, the overlords of the people marketing game are platforms like Google, where people are happy to stay logged in to their email application on desktop, mobile and tablet, or Facebook, which knows everything because we are nice enough to tell it. Regular publishers may be lucky enough to have subscription users who log in across desktop and mobile devices, but most publishers don’t collect such data. Their ability to deliver true one-to-one marketing for their advertisers is limited by their ability to identify users.
This dynamic rapidly makes the big “walled gardens” of the Internet the only place big marketers can go to unlock the customer journey. That might work for Google and Facebook shareholders and employees, but it’s not good for anyone else. In our increasingly data-dependent world, not all marketers are comfortable borrowing the keys to user identity from platforms that sell their customers advertising. Soon, everyone will have to either pay a stiff toll to access such user data, or come up with innovation that enables a different way to unlock people-centric marketing.
What is needed is an independent “truth set” that advertisers can leverage to match their anonymous traffic with rich customer profiles, so they can actually start to unlock the coveted “360-degree view of the user.” Not only does a large truth set of users create better match rates with first-party data to improve targeting, but it also holds the key to making things like lookalike modeling and algorithmic optimization work. Put simply, the more data the machine has to work with, the more patterns it finds and the better it learns. In the case of user identity, the probabilistic models most DMPs deploy today are very similar. Their individual effectiveness depends on the underlying data they can leverage to do their jobs.
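The “match rate” mentioned above is simple arithmetic: the share of a marketer's first-party records that can be found in the truth set. A minimal sketch with made-up hashed IDs:

```python
# Illustrative "match rate" calculation: the fraction of a marketer's
# first-party records found in an identity truth set. The hashed IDs
# below are hypothetical.

def match_rate(first_party_ids, truth_set_ids):
    """Fraction of first-party records matched against the truth set."""
    if not first_party_ids:
        return 0.0
    matched = first_party_ids & truth_set_ids
    return len(matched) / len(first_party_ids)

crm = {"h1", "h2", "h3", "h4", "h5"}     # marketer's hashed CRM IDs
truth = {"h2", "h4", "h5", "h9", "h10"}  # identity provider's truth set

print(match_rate(crm, truth))  # 0.6 -> 3 of 5 records matched
```

A larger truth set raises this fraction directly, which is why scale is the first question marketers ask of any identity partner.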
In the new cross-device reality: If you can’t leverage a huge data set to target users, it’s time to take your toys and go home. Little Johnny doesn’t use his desktop anymore.
Think about the three principal assets most companies have: their brand, their intellectual property and products, and their customer data. Why should a company make a third of its internal value dependent upon a third party, whether or not that party pledges to do “no evil”? Those that offer a “triple play” of mobile, cable television and phone services are also among the few companies that can match a user across various devices. The problem? They all sell, or facilitate the sale of, lots of advertising. Marketers are not sure they want to depend on them for unlocking the puzzle of user identity.
Some of the greatest providers of audience data are independent publishers who, banded together, can create great scale and assemble a truth set as great as Facebook and Google. Maybe it’s time to create a data alliance that breaks the existing paradigm. The “give to get” proposition would be simple: Publishers contribute anonymized audience identity data to a central platform and get access to identity services as a participant. This syndicate could enable the deployment of a universal ID that helps marketers match consumers to their devices and create an alternative to the large walled gardens.
The real truth is that, without banding together, even great premium publishers will have a hard time unlocking the enigma of cross-device identity for marketers. Why not build a garden with your neighbors, rather than play in somebody else’s?
[This post was originally published in AdExchanger on 12.11.14]
According to Blue Kai, I’m a tech-savvy, social media-using bookworm in the New York DMA, currently in the market for “entertainment.” At least that’s what my cookie says about me. Simply by going to the Blue Kai data exchange’s registry page, you can find out what data companies and resellers know about you, and your online behavior and intent.
In this brave new world of data-supported audience buying, every individual with an addressable electronic device has been stripped down to an anonymous cookie and is for sale. My cookie, when bounced off various data providers, also reveals that I’m male (Acxiom), have a competitive income (IXI), three children in my family (V12), a propensity for buying online (Targusinfo) and am in mid-management of a small business (Bizo). I’m also in-market for a car (Exelate) and considered to be a “Country Squire,” according to Nielsen’s Prizm, which is essentially a boring white guy from the suburbs who “enjoys country sports like golf and tennis.” Well, I’m horrible at tennis, but everything else seems to be accurate.
As a marketer, you now have an interesting choice. Instead of finding “Country Squires” or “Suburban Pioneers” on the content-specific sites they’re known to visit, you can simply buy several million of these people and find them wherever they may be lurking on the Web. This explains why you suddenly see ads for BMWs above your Hotmail messages right after you looked at that nice diesel station wagon on the VW.com Web site.
Today’s real-time marketing ecosystem works fast and works smart. But, how do you decide whether to buy the cookie or the site?
Most marketers insist that audience buying is meant for performance campaigns. This is largely a pricing consideration. Obviously, if I want to sell sneakers to young men, it makes sense to buy data and find 18- to 35-year-old males who are “sneaker intenders” based on their online behavior and profile, and reach them at scale across the ad exchanges. Combined data and media will likely come in under a $4 CPM, and probably less, since both the data and media can be bid upon in real time. For most campaigns with a CPA south of $20, you need to buy “cheap and deep” to optimize into that type of performance. It sounds pretty good on paper. There are a few problems with this, however:
What are they doing when you find them?
OK, so you found one of your carefully selected audience members and you know he’s been shopping for shoes. Maybe you even retargeted him after he abandoned his shopping cart at footlocker.com and dynamically presented him with an ad featuring the very sneakers he wanted to buy, and you did it all for a fraction of a cent. The problem is that you reached him on Hotmail and he’s engaged in composing an e-mail. What are the chances that he’s going to break task and get back into the mind-set of purchasing a pair of sneakers? Also, what kind of e-mail is he composing? Work? A condolence note to a friend who has lost a loved one? Obviously, you don’t know.
Maybe you reached that user on a less than savory site, or perhaps on a social media site where he’s engaged in a live chat session with a friend. In any case, you have targeted that user perfectly—and at just the wrong time. This type of “interruption” marketing is exactly what digital has promised us it wouldn’t be.
Perhaps a better conversion rate can be found on ESPN.com, or a content page about basketball, where that user is engaged in content appropriate to the brand.
How do you know where the conversion came from?
Depending on your level of sophistication and your digital analytics tool set, you may not be in the best position to understand exactly where your online sales are coming from. If you’re depending on click-based metrics, that is even more true. As a recent comScore article points out, the click is a somewhat misleading metric. Put simply, clicks on display ads don’t take branding or other Web behavior into account when measuring success.
Personally, I haven’t clicked on a display ad in years, but seeing them still drives me to act. Comparing offline sales lift over a four-week period, comScore reports that pure display advertising provides an average lift of 16 percent and pure SEM provides a lift of 82 percent—but search and display combined provide a sales lift of 119 percent. So you simply can’t look at display alone when judging performance—and you have to question whether you’re seeing performance lift because you’re targeting, or because your buyer has been exposed to a display ad multiple times. If it’s the latter, you may be inclined to save the cost of data and go even more “cheap and deep” to get reach and frequency.
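For readers unfamiliar with the metric, “lift” is the relative increase in conversion for an exposed group over a control group. A small illustration with made-up conversion rates chosen to echo the figures cited:

```python
# Illustrative "lift" arithmetic: relative increase in conversion for
# an exposed group versus control. The rates below are made up to
# echo the comScore figures cited in the text.

def lift(exposed_rate, control_rate):
    """Relative lift of the exposed group over the control group."""
    return (exposed_rate - control_rate) / control_rate

control = 0.010        # hypothetical baseline conversion rate
display_only = 0.0116  # ~16% lift
search_only = 0.0182   # ~82% lift
both = 0.0219          # ~119% lift: more than display and search add up to

print(round(lift(display_only, control) * 100))  # 16
print(round(lift(both, control) * 100))          # 119
```

The point of the cited numbers is the interaction effect: 119 percent combined lift exceeds the 98 percent you would get by simply adding the two channels' standalone lifts.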
How do you value an impression?
Obviously, the metric we all use is CPM, but sometimes the $30 CPM impression on ESPN.com is less expensive than the $2 RTB impression from AdX. Naturally, your analytics tools will tell you which ad and publisher produced the most conversions. Additionally, deep conversion path analysis can also tell you that a “last impression” conversion made at Hotmail might have started on ESPN.com, so you know where to assign value.
But, in the absence of meaningful data, how do we really know how effective our campaign has been? I believe that display creates performance by driving brand value higher, and some good ways to measure that can now be found using rich media. When consumers engage within a creative unit, or spend time watching video content about your brand, they’re making a personal choice to spend time with your message. There’s nothing more powerful than that, and that activity not only drives sales, but helps create lifetime customers.
For today’s digital marketer, great campaigns happen when you understand your customer, find them both across the Web and on the sites for which they have an affinity—and find them when they are engaged in content that’s complementary to your brand message. Hmmm . . . that kind of sounds like what we used to do with print advertising and direct mail. And maybe it really is that simple after all.
Chris O’Hara is svp of sales and marketing for Traffiq. He may be reached through his blog at Chrisohara.com.
A lot of you guys make your living selling technology in the advertising and marketing technology space. It's a great and noble occupation, but not for everyone. Our industry moves very fast, and software is always a stutter step behind. We are trying to solve problems for big brands and media companies, and a lot of what we sell sounds pretty much the same as the competition. Even if you truly have the best product, it's really hard to get people's attention. When you finally get it, it's very hard to truly differentiate yourself and your products.
In first meetings and big pitches, you have to leave the meeting having accomplished three basics: your potential customer should like you enough to work with you, trust you to do the work, and believe that your company can solve their problem. Like, trust, and belief are pretty simple asks, but they are very hard to establish in meetings.
Does your typical one-hour meeting look like this?
- Get the monitor set up and internet access established (10 minutes)
- Go around the room with introductions (5 minutes)
- Salesperson introduces the meeting and explains why you are there (10 minutes)
- Salesperson gives the standard “about the company” pitch (15 minutes)
- Subject matter expert talks about some use cases and benefits (20 minutes)
- Demo (0 minutes. Oops. No time left for demo).
I have been in many of these meetings as a potential buyer, and I have also presided over quite a few of these meetings. Some are better than others, but for the most part, they are pretty terrible. Here are four things you can change up for your next meeting.
Stop the Slides
Here's what happens when you deliver a slide presentation. If you show a slide with text on it, your audience will start reading it. In fact, they will finish reading it well before you stop delivering the content, and then they start thinking about what they are going to do for lunch. Maybe you think you've built the perfect slide, full of compelling content and gleaming with ideas. Perhaps you have, but you've still alienated half of the room: the slide is pitched perfectly for the folks who already get it and way too technical for the newbies (or vice versa). A better approach is to use a good headline and a gigantic picture of something interesting. Show a hammer, an elephant, or a guy jumping out of a plane. The internet is full of great options. "Why is there a picture of a guy jumping out of a plane?" your prospect wonders. Your potential client will listen to you until he figures it out.
Grab a Marker
In the technology space, we sell a lot of complicated stuff, and we have a lot of 'splaining to do in meetings, to borrow the popular Desi Arnaz phrase. Many of our potential customers don't really know how the Internet works, and that's okay. A 23-year-old media planner at an agency isn't immediately required to grok the differences between data integration types, but they still have influence over considerable budget dollars. What they need is some education, and that's where your friend the whiteboard comes in. Why do mediocre actors salvage their careers on the stage? Because it's harder. You have to know your material, deliver your lines, and there's nowhere to hide. People respect that, and they will respect you when you close your laptop, pick up a dry erase marker, and start explaining what your technology does, why it's different, and how it will solve a problem. Plus, the element of theater is fun. People know exactly what you are going to say when you deliver a slide, so you will likely be judged on your delivery and the cut of your suit. Pick up a marker, and you will be judged by the size of your brain.
Show, Don’t Tell
Similar to the educational nature of whiteboarding, there is magic in a good software demo. After explaining all of the wonderful problems you are going to solve over 40 minutes, you will likely have a highly skeptical audience. Every other vendor has rolled in and also promised to solve the age-old “right person, right message, right time” conundrum, and you are just the latest in the pack. Whenever there is an opportunity to go into the software and demonstrate exactly what you are talking about, you should take it. “Did you ask about my integration with Amazon? Great, let me pull that up in our UI and show you exactly what to do.” As an industry, we also seem to suffer from using solutions engineers as a crutch. Guess what? If you need a highly technical person to walk through a few screens, then your client just found out that you have a product that only his most technical people can use. That’s a gigantic loser. If you sell software, you should be capable of giving a basic UI demo.
People are people, and they communicate best with storytelling. You don’t need to be a latter-day Walt Disney at your next meeting, but you do have to be able to tell a story similar to this: “Ron from Big Company has the same exact problem you guys are having. We worked with Ron and his team for 18 months and figured out exactly how to solve it. Ron is now an SVP. Hey, we should get you out to lunch with Ron, and he can tell you all about it.”
An old boss used to tell me that a sale needs to get your client "paid or made." We can certainly help people get paid by saving them money through efficiency, and "make" their careers with a successful implementation. People love to hear that similar people are having the same issues, and they don't want to feel left behind. By golly, if it was good enough for Ron at Big Company, it's good enough for me. A good story should be realistic, inspire, and differentiate your technology, but it must also be referenceable.
Because they will call Ron.