A lot of you guys make your living selling technology in the advertising and marketing technology space. It’s a great and noble occupation, but not for everyone. Our industry moves very fast, and software is always a stutter step behind. We are trying to solve problems for big brands and media companies, and a lot of what we sell sounds pretty much the same as the competition. Even if you truly have the best product, it’s really hard to get people’s attention. When you finally get it, it’s very hard to truly differentiate yourself and your products.
In first meetings and big pitches, you have to leave the meeting accomplishing three basics: your potential customer should like you enough to work with you, trust you to do the work, and believe that your company can solve their problem. Like, trust and belief are pretty simple asks—but very hard to establish in meetings.
Does your typical one-hour meeting look like this?
I have been in many of these meetings as a potential buyer, and I have also presided over quite a few of these meetings. Some are better than others, but for the most part, they are pretty terrible. Here are four things you can change up for your next meeting.
Stop the Slides
Here’s what happens when you deliver a slide presentation. If you show a slide with text on it, your audience will start reading it. In fact, they will finish reading it well before you stop delivering the content, and then they start thinking about what they are going to do for lunch. Maybe you think you’ve built the most perfect slide ever, full of compelling content and gleaming with ideas? Well, perhaps you have, but you’ve alienated half of the room; the slide is pitched at the perfect level for the folks who already get it, and way too technical for the newbies (or vice versa). The approach here is to use a good headline and a gigantic picture of something interesting. Show a hammer, an elephant, or a guy jumping out of a plane. The internet is full of great options. “Why is there a picture of a guy jumping out of a plane?” your prospect wonders. Your potential client will listen to you until he figures it out.
Grab a Marker
In the technology space, we sell a lot of complicated stuff, and we have a lot of ‘splaining to do in meetings, to borrow the popular Desi Arnaz phrase. Many of our potential customers don’t really know how the Internet works, and that’s okay. A 23-year old media planner at an agency isn’t immediately required to grok the differences between data integration types, but they still have influence over considerable budget dollars. What they need is some education, and that’s where your friend the whiteboard comes in. Why do mediocre actors salvage their careers on the stage? Because it’s harder. You have to know your material, deliver your lines, and there’s nowhere to hide. People respect that, and they will respect you when you close your laptop, pick up a dry erase marker and start explaining what your technology does, why it’s different, and how it will solve a problem. Plus, the element of theater is fun. People know exactly what you are going to say when you deliver a slide, so you will likely be judged on your delivery and the cut of your suit. Pick up a marker, and you will be judged by the size of your brain.
Show, Don’t Tell
Similar to the educational nature of whiteboarding, there is magic in a good software demo. After explaining all of the wonderful problems you are going to solve over 40 minutes, you will likely have a highly skeptical audience. Every other vendor has rolled in and also promised to solve the age-old “right person, right message, right time” conundrum, and you are just the latest in the pack. Whenever there is an opportunity to go into the software and demonstrate exactly what you are talking about, you should take it. “Did you ask about my integration with Amazon? Great, let me pull that up in our UI and show you exactly what to do.” As an industry, we also seem to suffer from using solutions engineers as a crutch. Guess what? If you need a highly technical person to walk through a few screens, then your client just found out that you have a product that only his most technical people can use. That’s a gigantic loser. If you sell software, you should be capable of giving a basic UI demo.
People are people, and they communicate best with storytelling. You don’t need to be a latter-day Walt Disney at your next meeting, but you do have to be able to tell a story similar to this: “Ron from Big Company has the same exact problem you guys are having. We worked with Ron and his team for 18 months and figured out exactly how to solve it. Ron is now an SVP. Hey, we should get you out to lunch with Ron, and he can tell you all about it.”
An old boss used to tell me that a sale needs to get your client “paid or made.” We can certainly help people get paid by saving them money through efficiency, and “make” their careers with a successful implementation. People love to hear that similar people are having the same issues, and they don’t want to feel left behind. By golly, if this was good enough for Ron at Big Company, it’s good enough for me. A good story should be realistic, inspiring, and differentiate your technology—but it also has to be referenceable.
Because they will call Ron.
How Granular Data Collection and a Robust Second-Party Data Strategy Change the Game
The world’s largest marketers and media companies have strongly embraced data management technology to provide personalization for customers who demand Amazon-like experiences. As a single, smart hub for all of their owned data (CRM, email, etc.)—and acquired data, such as third-party demographic data—DMPs go a long way towards building a sustainable, modern marketing strategy that accounts for massively fragmented digital audiences.
The good news is most enterprises have taken a technological leap of faith, and embraced a data strategy to help them navigate our digital future. The bad news is, the systems they are using today are deeply flawed and do not produce optimal audience segmentation.
A Little DMP History
Marketers were slower to embrace DMP technology, but they quickly grasped the opportunity once they did. Now, instead of depending on ad networks to aggregate reach for them, they started to assemble their own first-party data asset—overlapping their known users with publishers’ segments, and buying access to those more relevant audiences. The more cookies, mobile IDs, and other addressable keys they could collect, the bigger their potential reach. Since most marketers had relatively small amounts of their own data, they supplemented with third-party data—segments of “intenders” from providers like Datalogix, Nielsen, and Acxiom.
The two primary use cases for DMPs have not changed all that much over the years: both sides want to leverage technology to understand their users (analytics) and grow their base of addressable IDs (reach). Put simply, “who are these people interacting with my brand, and how can I find more of them?” DMPs seemed really efficient at tackling those basic use cases—until you find out that they were doing it the wrong way the whole time.
What’s the Problem?
To dig a bit deeper, the way first-generation DMPs go about analyzing and expanding audiences is through mapping cookies to a predetermined taxonomy, based on user behavior and context. For example, if my 17-year-old son is browsing an article on the cool new Ferrari online, he would be identified as an “auto intender” and placed in a bucket of other auto intenders. The system would not store any of the data associated with that browsing session, or additional context. It is enough that the online behavior met a predetermined set of rules for “auto intender” to place that cookie among several hundred thousand other “auto intenders.”
The problem with a fixed, taxonomy-based collection methodology is just that—it is fixed, and based on a rigid set of rules for data collection. Taxonomy results are stored (“cookie 123 equals auto-intender”)—not the underlying data itself. That is called “schema-on-write,” an approach that writes taxonomy results to an existing table when the data is collected. That was fine for the days when data collection was desktop-based and the costs of data storage were sky-high, but it fails in a mobile world where artificial intelligence systems crave truly granular, attribute-level data collected from all consumer interactions to power machine learning.
There is another way to do this. It’s called “schema-on-read,” which is the opposite of schema-on-write. In these types of systems, all of the underlying data is collected, and the taxonomy result is created upon reading the raw data. In this instance, suppose I collected everything that happened on a popular auto site like Cars.com. I would collect how many pages were viewed, dwell times on ads, all of the clickstream collected in the “build your own” car module, and the data from event pixels that tracked how many pictures a user viewed of a particular car model. I would store all of this data so I could look it up later.
Then, if my really smart data science team told me that users who viewed 15 of the 20 car pictures in the photo carousel in one viewing session were 50% more likely to buy a car in the next 30 days than the average user, I would build a segment of such users by “reading” the attribute data I had stored. This notion—total data storage at the attribute (or “trait”) level, independent of a fixed taxonomy—is called completeness of data. Most DMPs don’t have it.
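The contrast between the two collection approaches can be sketched in a few lines. This is a toy illustration, not any vendor’s implementation; the event fields and the 15-photo rule are taken from the hypothetical example above.

```python
from collections import defaultdict

# Schema-on-write: only the taxonomy result is stored at collection time.
write_store = {}  # cookie_id -> set of segment labels

def collect_schema_on_write(cookie_id, event):
    # A fixed rule fires at collection time; the raw event is then discarded.
    if event.get("page_category") == "auto":
        write_store.setdefault(cookie_id, set()).add("auto-intender")

# Schema-on-read: every raw event is stored; segments are computed later.
read_store = defaultdict(list)  # cookie_id -> list of raw events

def collect_schema_on_read(cookie_id, event):
    read_store[cookie_id].append(event)  # keep all attributes for later analysis

def build_segment(rule):
    # A new segment definition can be applied retroactively to the raw events.
    return {cid for cid, events in read_store.items() if rule(events)}

# The data-science team's new insight: 15+ carousel photos viewed in a session.
collect_schema_on_read("cookie-123", {"type": "photo_view", "count": 17})
collect_schema_on_read("cookie-456", {"type": "photo_view", "count": 3})
likely_buyers = build_segment(
    lambda evs: sum(e.get("count", 0) for e in evs if e["type"] == "photo_view") >= 15
)
```

The key point the sketch makes concrete: `build_segment` can answer a question nobody thought to ask at collection time, while `write_store` can only ever return the labels its original rules produced.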
Why Completeness Matters
Isn’t one auto-intender as good as another, regardless of how that data was collected? No. Think about the other main uses of DMPs: overlap reporting and indexing. Overlap reporting seeks to overlay an enterprise’s first-party data asset with another. This is like taking all the visitors to Ford’s website and comparing that audience to every user on a non-endemic site, like the Wall Street Journal. Every auto marketer would love to understand which high-income WSJ readers were interested in their latest model. But how can they understand the real intent of users if they are just tagged as “auto intenders”? How did the publisher come to that conclusion? What signals qualified those users as “intenders” in the first place? How long ago did they engage with an auto article? Was it a story about a horrific traffic crash, or an article on the hottest new model? Without completeness, these “auto intenders” become very vague. Without all of the attributes stored, Ford cannot put its data science team to work to better understand their true intent.
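At bottom, an overlap report is just set intersection over two first-party ID pools. A minimal sketch, with made-up ID strings standing in for cookies and mobile keys:

```python
# Hypothetical first-party ID pools for the two enterprises.
ford_visitors = {"id-1", "id-2", "id-3", "id-4"}
wsj_readers = {"id-3", "id-4", "id-5", "id-6"}

# The overlap report: which of Ford's IDs also appear in WSJ's audience.
overlap = ford_visitors & wsj_readers
overlap_rate = len(overlap) / len(ford_visitors)
print(f"{len(overlap)} shared IDs, {overlap_rate:.0%} of Ford's audience")
```

Without completeness, this is where the report ends: a count of shared IDs. With attribute-level storage, the overlap could be filtered by the signals that actually qualified each ID (recency, the article’s context, depth of engagement).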
Indexing, the other prominent use case, scores user IDs based on their similarity to a baseline population. For example, a popular women’s publisher like Meredith might have an index score of 150 against a segment of “active moms.” Another way of saying this is that indexing helps understand the “momness” of those women, based on similarity to the overall population. Index scoring is the way marketers have been buying audience data for the last 20 years. If I can get good reach with an index score above 100 at a good price, then I’m buying those segments all day long. Most of this index-based buying happens with 3rd-party data providers who have been collecting the data in the same flawed way for years. What’s the ultimate source of truth for such indexing? What data underlies the scoring in the first place? The fact is, it is impossible to validate these relevancy scores without the granular, attribute-level data available to analyze.
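An index score is simply the segment’s penetration in the publisher’s audience divided by its penetration in the baseline population, multiplied by 100. A sketch with invented counts matching the “active moms” example (the numbers are illustrative, not real Meredith data):

```python
def index_score(segment_in_audience, audience_size, segment_in_baseline, baseline_size):
    """Index = audience penetration relative to baseline penetration, x100.

    100 means the audience looks exactly like the baseline; 150 means the
    segment is 50% more concentrated in this audience than in the population.
    """
    audience_penetration = segment_in_audience / audience_size
    baseline_penetration = segment_in_baseline / baseline_size
    return round(100 * audience_penetration / baseline_penetration)

# 30% of the publisher's audience are "active moms" vs. 20% of the population:
score = index_score(300_000, 1_000_000, 20_000_000, 100_000_000)  # -> 150
```

The formula is trivial; the column’s point is that its inputs are not. If `segment_in_audience` was produced by a fixed taxonomy rule with no stored attributes, the score cannot be audited.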
Therefore, it is entirely fair to say that most DMPs have excellent intentions, but lack the infrastructure to fully perform the most important things DMPs are meant to do: understand IDs, and grow them through overlap analysis and indexing. If the underlying data has been improperly collected (or not collected at all), then any type of audience profiling, by any means, is fundamentally flawed.
What to do?
To be fair, most DMPs were architected during a time when it was unnecessary—and extremely costly—to collect data through a schema-on-read methodology. Today’s unrelenting shift to AI-driven marketing necessitates this approach to data collection and storage, and older systems are tooling up to compete. If you want to create a customer data platform (“CDP”), the hottest new buzzword in marketing, you need to collect data in this way. So, the industry is moving there quickly. That said, many marketers are still stuck in the 1990s. Older DMPs are somewhat like the technology mullet of marketing—businesslike in the front, with something awkward and hideous hidden behind.
Beyond licensing a modern, schema-on-read system for data management so marketers can collect their own data in a granular way, there is another way to do things like indexing and overlap analysis well: license data from other data owners who have collected their data in such a way. This means going well beyond leveraging commoditized third-party data, and looking at the world of second-party data. Done correctly, real audience planning starts with collecting your own data effectively and extends to leveraging similarly collected data from others—second party data that is transparent, exclusive, and unique.
Today’s consumers are highly demanding. They expect curated movie recommendations from Netflix, one-click restaurant reservations from OpenTable, on-demand limousine service from Uber, limitless housing options from AirBnB and the world of commerce available 24/7 from Amazon Prime. It’s a great time to be alive for a consumer, but perhaps the worst possible time for the CMO of any other company. Just think, Uber doesn’t own cars. They are a technology company built from the ground up to deliver personalized service at scale to consumers—that’s what today’s marketing is all about.
Only a few short years ago, CMOs had a difficult, but simpler, remit: build the brand and the consumers follow. Absolut vodka was about as undifferentiated a product as anything on the market, but great packaging and a clever ad campaign made it a power brand. It thrived because the world still worked on the principles of How Brands Grow, Byron Sharp’s 2010 book. Sharp posited that a marketer needed two things to succeed: availability in the consumer’s mind and availability of the product at the shelf. Brands like P&G’s Tide control lots of mindshare with mass media budgets, and P&G ensures it is widely available at every supermarket so a consumer can easily choose between it and Wisk at the “moment of truth.”
That system is dying rapidly, as mass media channels fragment into thousands of websites, apps, streaming media channels and experiences we don’t even understand yet. As a marketer, you can’t “buy eyeballs” today like you used to. This paradigm is largely responsible for the ever-shrinking average CMO tenure (from 44 months last year to only 42 months today). CMOs must be prepared to insert themselves along the steps of a consumer journey that moves from channel to channel, and also have the ability to capture each tiny piece of digital exhaust that consumers’ gadgets and gizmos throw off, helping to inform their understanding of how consumers engage with a brand.
To make it clear, here’s a chart:
| Old CMO | New CMO |
| --- | --- |
| Rents access to people | Owns people data |
| One-to-many marketing | One-to-one engagement |
| Big bets on limited channels | Small bets on dozens of channels |
| One “big tent” message for many | Dozens of messages for segments |
| Panel-based attribution | Real-time feedback |
| Agency defines strategy | Marketer owns strategy, agency executes |
Yesterday’s CMO would “buy eyeballs” with big TV and print campaigns, and use subscriber information as a proxy for targeted reach. Today’s CMO wants to own cookies and mobile keys so they can have a one-to-one conversation. Yesterday’s CMO looked at the performance at the end of a campaign, and optimized for the next one based on results from a survey. Today’s CMOs crave access to real-time performance data so they can optimize at run time. Things couldn’t be more different.
In this new norm, what should CMOs do to ensure they stay ahead of the curve? They have to change the way they think about consumer identity and how that impacts their work as marketers–and redefine the way they think about “marketing” in general.
Identity beyond IDs
A few months ago, I wrote that “identity is the new basis of competition” in marketing. That’s still true—you can’t build meaningful cross-channel experiences if you can’t tie people together with their devices. To that end, I was recently invited to an internal town hall with marketers from a large beauty company, where the CMO announced that they had just eclipsed 500 million addressable IDs in their data management platform. Her staff started clapping. Why? Because these weren’t known buyers, just cookies and mobile IDs—but they represented the ability for marketers to connect and build experiences for anonymous people who interacted with their website, mobile app or an ad. That has real, tangible value.
But, devices don’t buy things, people do. Just because you have a good device graph with billions of cookies, e-mail addresses and mobile keys doesn’t mean you have a good view of the people behind that information. Identity data must be augmented with data from systems of engagement to formulate a true view of the consumer. Every click, download, article read, and video view throws off digital exhaust that is filled with scraps of information that machines can use to paint a truer picture of a consumer’s identity. When marketers start valuing all data as a financial asset, they are starting the process of turning IDs into people.
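Tying devices to people is, at bottom, a graph problem: a deterministic signal (say, the same login observed on two devices) merges those IDs into one identity cluster. A toy union-find sketch, with hypothetical ID strings; real device graphs add probabilistic matching and confidence scores on top of this idea.

```python
class IdentityGraph:
    """Toy union-find that clusters device IDs belonging to the same person."""

    def __init__(self):
        self.parent = {}

    def _find(self, x):
        # Each ID starts as its own cluster root.
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def link(self, a, b):
        # A shared signal (e.g. a login seen on both) merges the two clusters.
        self.parent[self._find(a)] = self._find(b)

    def same_person(self, a, b):
        return self._find(a) == self._find(b)

graph = IdentityGraph()
graph.link("cookie-abc", "email-hash-1")   # site login linked cookie to email
graph.link("email-hash-1", "mobile-id-9")  # app login linked email to device
```

After those two links, the cookie and the mobile ID transitively resolve to one identity, which is exactly what lets a marketer treat three IDs as one consumer.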
For those of us in the industry, we can be forgiven if we think the world revolves around display, social, mobile and video advertising. We’ve gotten really good at delivering personalized digital experiences in real time, and we have a Lumascape full of clever technologies that are moving the needle for brands that are trying to reach connected consumers. CMOs must think outside the Lumascape, and connect these important addressable touchpoints to mass channels like TV, radio and print in order to deliver personalized experiences at scale.
More than Marketing
The problem is that our definition of marketing often misses the concept of touch points that can exist separately from marketing. These touch points can include interactions between salespeople and potential customers, what happens when a product is returned, conversations on community sites and forums where customers talk to each other about a brand, and also within the e-commerce experience when a consumer is making a purchase. These are arguably more valuable interactions with consumers than a digital banner ad or email because these are either people that are existing customers or those about to buy. They’re incredibly valuable to a brand and don’t involve the traditional notion of “marketing” whatsoever.
Let’s take a look at an example. I fly Delta because I love their app, and they reward my loyalty with special phone numbers so I can reach someone no matter how hairy things get throughout my travel experience. Every interaction with their website, app and service representatives is an opportunity for Delta to market to me—and also an opportunity for the brand to learn more about the way I fly and what matters most to me as a consumer. Getting it consistently right keeps me loyal, but getting it even slightly wrong brings me one step closer to tweeting #DeltaStinks. Not fair, but that’s representative of brand relationships today.
To be successful, CMOs must expand their definition of identity. “Identity” is more than just an ID. It’s what is formed after capturing every possible insight from every interaction. And “marketing” is not just about cross-channel messaging, it’s about creating great consumer experiences with every touchpoint that happens including sales, service, commerce and more.
It’s a great time to be a data-driven marketer.
[This article originally appeared in AdExchanger on 10.16.2017]
The Rules of Data-Driven Marketing are Changing as Data Rights Management Takes Center Stage
Unless you’ve been living off the grid, you’ve seen the promise of “data as the new oil” slowly come to fruition over the last five years. Connected devices are producing data at a Moore’s Law-like rate, and companies are building the artificial intelligence systems to mine that data into fuel that will power our ascension into a new paradigm we can’t yet understand. Whether you are in the Stephen Hawking camp (“The development of full artificial intelligence could spell the end of the human race”) or the Larry Page camp (“artificial intelligence [is] the ultimate version of Google”), we can all agree that data is the currency in the AI future.
In our world, we are witnessing an incredible synthesis of fast-moving, data-driven advertising technology coming rapidly together with the slower (yet still data-driven) world of marketing technology. Gartner’s Marty Kihn thinks the only way these two worlds tie the knot for the long term is centered around data management platforms. I think he’s right, but I also think what we know as a DMP today will evolve quickly as the data it manages grows and its applications evolve alongside it.
I think the most immediate changes we will bear witness to in this ongoing evolution are the changes in how data—the lifeblood of modern marketing—will be piped among data owners and those who want to use it. Why? Because the way we have been doing it for the past 20 years is incredibly flawed, and second- and third-party data owners are getting the short end of the stick.
Unless you are Google, Facebook, Amazon or the United States government, you will never have enough data as a marketer. Big CPG companies have been collecting data for years (think of rewards programs and the like), but the tens or even hundreds of millions of addressable IDs they have managed to gather often pale in comparison to the billions of people who interact with their brands every day across the globe. To fill the gaps, they turn to second- and third-party sources of data for segmentation, targeting and analytics.
The real usage of the data was sometimes unknown. Many cookies got hijacked for use in other—even competitive—systems, and there was little transparency into what was happening with the underlying data asset. But the checks still came every single month. The approach worked when the best data owners (quality publishers) had a thriving direct sales channel.
Fast-forward to today, and the game has changed considerably. More than half of enterprise marketers own a DMP, and even smaller mid-market advertisers are starting to license data technology. Data is being valued as a true financial asset and differentiator. On the publisher’s side, manual sales continue to plummet as programmatic evolves and header bidding supercharges the direct model with big data technology. In short, marketers need more and more quality data to feed the machines they are building to compete, and publishers are getting better and more granular control over their data.
More importantly, data owners are beginning to organize around a core principle: Any system that uses my data for insights that doesn’t result in a purchase of that data is theft.
Theft is a strong word but, if we truly value data and agree that it’s a big differentiator, it’s hard to argue with. For years, data owners have accepted a system that allowed wide access to their data for modeling and analytics in return for the occasional check. For every cookie targeted in programmatic that was activated to create revenue, a million more were churned to power analytics in another system. Put simply from the data owner’s perspective, if you are going to use my data for analytics and activation, but only pay me for activation, that’s going to be a problem.
In order to fix this, the systems of the future have to offer the ability for data owners to provision their data in more granular ways. Data owners need complete control of the following:
How is the data being used? Is it for activation, lookalike modeling, analytics in a data warehouse, user matching, cross-device purposes or another use case? Data owners need to be able to approve the exact modalities in which the data are leveraged by their partners.
What is the business model? Is this a trade deal, paid usage, fixed-price or CPM? How long is the term—a single campaign, or a year’s worth of modeling? Data owners should be able to set their own price—directly with the buyer—with full transparency into all fees associated with piping the data to a partner.
What is being shared? What attributes or traits are being shared? Is it just user IDs, or IDs loaded with valuable attributes, such as a device graph that links an individual to all the devices they use? Data owners need powerful tools that offer granular control over data at the attribute level, letting them decide how much of their data they are willing to share—and at what price.
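The three controls above amount to a per-partner policy object that a data-sharing layer would enforce. A minimal sketch of what such a policy might look like; the field names and use-case labels are invented for illustration, not any platform’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class DataProvisioningPolicy:
    """One data owner's terms for one partner (hypothetical fields)."""
    # How the data may be used: the exact modalities the owner approves.
    allowed_uses: set = field(default_factory=set)  # e.g. {"activation"}
    # The business model and term.
    pricing_model: str = "cpm"   # "cpm" | "fixed" | "trade"
    term_days: int = 30          # a single campaign vs. a year of modeling
    # What is shared: which attributes travel with each ID.
    shared_attributes: set = field(default_factory=set)

    def permits(self, use_case: str) -> bool:
        # The enforcement point: any unapproved modality is simply refused.
        return use_case in self.allowed_uses

# Activation only — analytics and lookalike modeling are not part of the deal.
policy = DataProvisioningPolicy(
    allowed_uses={"activation"},
    shared_attributes={"auto_intender_score"},
)
```

The design choice worth noting is the default-deny `permits` check: the “theft” scenario described above is exactly what happens when analytics and modeling are implicitly allowed rather than explicitly priced.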
Outside of big data and blockchain conversations, the phrase “data provisioning” is rarely heard, but it’s about to be a big part of our advertising ecosystem. It is security concerns, above all, that have kept data sharing at scale from becoming a reality. The answer is an ecosystem that offers complete control and transparency–and a smart layer of software-enabled governance tools that can stay ahead of nuances in law, such as the new GDPR requirements. As adtech and marketing tech continue to come together, and systems evolve in parallel with their ability to make the best use of data, the systems of the future must first ensure data security before data innovation can truly happen.
Data may be the new oil, but will it be run by adtech wildcatters, or will the rules be governed by the data owners themselves?
[This was originally published in AdExchanger on 9/26/17]