Classic Wrap Up Article with Typical Next-Year Guru Predictions

 


Everything I predicted came true, but I still cannot grow a manly beard. 

2015 was a fantastic year for many data-driven marketers, with data management platforms (DMPs), consultancies and marketers getting something nice under their trees.

 

Unfortunately, 2015 also saw legacy networks, supply-side platforms (SSPs) and some less nimble agencies receive coal in their respective stockings for failing to keep up with the rapidly changing paradigm as marketing and ad technology merge.

In the great tradition of end-of-year prediction articles, here’s my take on the year’s biggest developments and what we’ll see in 2016, including a rapid technology adoption from big marketers, a continuing evolution of the agency model and an outright revolution in how media is procured.

Agency Ascendant?

I thought 2015 was supposed to herald the “death of the digital agency model.” As agencies struggled to define their value proposition to big marketers that were increasingly bringing “programmatic in house,” agencies were reputed to be on the ropes. Massive accounts with billions of dollars in marketing spend were reviewed, while agencies churned through cash pitching to win new business – or at least trying to keep old business.

The result? Agencies spent a ton of money, but no serious marketer abandoned them. Agencies got a lot smarter, and started spinning up new digital strategy and DMP practices to combat the likes of systems integrators and traditional consultancies. And the band played on.

In 2016, we will continue to see agencies strengthen their digital strategy bench, start moving “marketing automation” practices into the DMP world and offer integration services to help marketers build bespoke “stack” solutions. Trading desks will continue to aggressively pursue unique relationships with big publishers and start to embrace new media procurement methodologies that emphasize their skillset, rather than the bidded approach in open exchanges (more on that below).

Marketers Hug Big Data

Marketers started to “cross the chasm” in 2015 and more widely embrace DMPs. It’s no longer just “early adopters” such as Kellogg’s that are making the market. Massive top-100 firms have fully embraced DMP tech and are starting to treat online data as fuel for growth.

Private equity and activist investors continue to put the squeeze on CPG companies, which have turned to their own first-party data to find media efficiency as they try to control the one line item in the P&L usually immune to risk management: marketing spend.

Media and entertainment companies are wrangling their consumer data to fuel over-the-top initiatives, which put a true first-party relationship with their viewers front and center. Travel companies are starting to marry their invaluable CRM data to the anonymous online world to put “butts in seats” and “heads in beds.”

If 2015 saw 15% of the Fortune 500 engage with DMPs, 2016 is when the early majority will surge and start to make the embrace of DMP tech commonplace. The land grab for 24-month SaaS contracts is on.

Busy Consultants

It used to be that a senior-level digital guy would get sick of his job and leave it (or his job would leave him), ending up a happy consultant advising three or four clients on programmatic strategy. In 2015 that model still existed, but we’ve seen consulting scale up to meet the needs of a rapidly changing digital landscape.

Marketers and publishers are hiring boutique consultancies left and right to get on track (see this excellent, if not comprehensive, list). Also, big boys, including Accenture, Boston Consulting Group and McKinsey, are in the game, as are large, media-centric firms, such as MediaLink.

These shops are advising on data strategy, programmatic media, organizational change management and privacy. They are helping evaluate expensive SaaS technology, including DMPs and yield management solutions, and also doing the large systems integrations required to marry traditional databases with DMPs.

Match Rates (Ugh)

Though little publicized, outside of a few nerdy industry pieces, 2015 saw a huge focus on “match rates,” or marketers’ ability to find matches for their first-party data in other execution systems.

Marketers want to activate their entire CRM databases in the dot-com space, but are finding cookie matches for only 40% to 50% of their valuable lists. When they try to map those cookies to a DSP, more disappointment ensues. As discussed in an earlier article, match rates are hard to get right: they require great “onboarding” services, strong server-to-server connections between DMPs and DSPs (and other platforms), and a relentless, high-frequency approach to user matching.

This was the year that marketers got disappointed in match rates and started to force the industry to find better solutions. Next year, huge marketers will take bold steps to actually share data and create an available identity map of consumers. I think we will see the first real data consortium emerge for the purposes of creating an open identity graph. That’s my big prediction – and hope – for 2016.

Head For The Headers

2015 was the year of “header bidding,” the catch-all phrase for a software appliance that gives publishers the chance to offer favored demand-side partners a “first look” at valuable inventory. I am not sure if “header bidding” will ultimately become the de facto standard for “workflow automation,” but we seem to be relentlessly marching back to a world in which publishers and marketers take control of inventory procurement and get away from the gamesmanship inherent in exchange-based buying.

Big SSPs and networks that have layered bidding tech onto open exchanges are struggling. Demand-side platforms are scrambling to add all sorts of bells and whistles to their “private marketplaces,” but the industry evolves.

Next year, we will see the pace of innovation increase, and we have already seen big trade desks make deals with DMPs to access premium publisher inventory. It’s nice to see premium publisher inventory increase in value – and I believe it will only continue to do so.

2016 will be the year of “second-party data” and the winners will be the ones with the technology installed to easily transact on it.

2015 was a great year for data-driven marketing, and 2016 will be even more fun. Stay safe out there.

This post originally appeared in AdExchanger on 12/17/2015


Five controversial predictions for programmatic advertising in 2016

 


This picture is as bad as my prediction that beacons would provide marketers with scaled, closed-loop attribution in 2016. Didn’t happen. Not even close.  

Programmatic advertising continued to creep into the generalist marketer’s consciousness in 2015.

 

If you’re interested, we recently wrote up a handy digest of some of 2015’s programmatic trends.

But enough of looking backwards, let’s look in the crystal ball and ask ‘What’s in store for 2016?’

Yet again, I’ve recruited two experts to help. Chris O’Hara, VP Strategic Accounts at Krux Digital (and Econsultancy’s programmatic guru – see his articles and research here), and James Bourner, Head of Display at Jellyfish.

Here’s what they had to say…

Real-world attribution may become… well, a reality

Chris O’Hara, VP Strategic Accounts at Krux Digital

One of the biggest gaps with digital media, especially programmatic, is attribution. We still seem to have the Wanamaker problem: “50% of my marketing works, I just don’t know which 50%.”

Attitudinal “brand lift” studies and latent post-campaign sales attribution modeling have been the de facto standard for the last 15 years, but marketers are increasingly insisting on real “closed loop” proof: “Did my Facebook ad move any items off the shelf?”

We are living in a world where technology is starting to shed some light on actual in-store purchases, such that we are going to be able to get ecommerce-like attribution for Corn Flakes soon.

In one real-world example, a CPG company has partnered with 7-Eleven and placed beacon technology in the store.

Consumers can receive a “get 20% off” offer on their mobile device, via notification, when they approach the store; the beacon can verify whether or not they arrive at the relevant shelf or display; and an integration with the point-of-sale (POS) system can tell (immediately) whether the purchase was made.

These marketing fantasies are becoming more real every day.
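The beacon-to-POS flow described above is essentially an event join keyed on a shared user ID. Here's a minimal sketch of that join, with invented IDs and timestamps (the actual CPG/7-Eleven integration is not public):

```python
from datetime import datetime, timedelta

# Hypothetical event logs, each keyed by the same hashed loyalty/device ID:
# the offer push, the beacon "shelf" sighting, and the POS transaction.
offers = {"user123": datetime(2016, 1, 9, 14, 0)}
beacon_hits = {"user123": datetime(2016, 1, 9, 14, 6)}
pos_purchases = {"user123": datetime(2016, 1, 9, 14, 12)}

def closed_loop_conversions(offers, hits, purchases, window=timedelta(hours=1)):
    """Return users whose offer, shelf visit and purchase line up in order."""
    converted = []
    for user, offered_at in offers.items():
        seen, bought = hits.get(user), purchases.get(user)
        if seen and bought and offered_at <= seen <= bought <= offered_at + window:
            converted.append(user)
    return converted

print(closed_loop_conversions(offers, beacon_hits, pos_purchases))  # ['user123']
```

In practice the hard part is not this join but establishing the shared ID across the push notification, the beacon SDK and the POS system.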


Cross-device targeting is important, but so is improving mobile inventory

James Bourner, Head of Display at Jellyfish

2016 will be the year of mobile: Just kidding!

Although on a more serious note, 2016 will be the year of more measured and more integrated mobile activity – we have only really just started to get to grips with cross-device targeting and tracking on a macro level.

While the big companies who are making a play for control of the ad tech industry will put a lot of emphasis on cross-device targeting and tracking in their battle plans, I think there will be a lot of improvements to the quality of inventory, especially in apps.

A lot of mobile supply is from developers, not traditional publishers, which has led to quality issues.

However, as we are now becoming very discerning in what we buy in mobile, hopefully developers will respond to the data and not be tempted to place banners quite so close to where accidental clicks may occur!

Google has been trying to prevent accidental clicks since 2012.


Frequency management will reduce waste and improve UX

Chris O’Hara, VP Strategic Accounts at Krux Digital

Before marketers could effectively map users to all of their various devices (cross-device identity management) and also match users across various execution platforms (hosting a “match table” that assures user #123 in my DMP is the same guy as user #456 in DataXu, as an example), they were helpless to control frequency to an individual.

Recent studies have revealed that, when marketers can only cap frequency at the cookie or device level, they end up serving as many as 100+ ads to individual users every month, and sometimes much, much more.

What if the user’s ideal point of effective frequency is only 10 impressions on a monthly basis? As you can see, there are tremendous opportunities to reduce waste and gain efficiency in communication.

This means big money for marketers, who can finally start to control their messaging – putting recovered dollars back into finding more reach, and starting to influence their bidding strategies to get users into their “sweet spot” of frequency, where conversions happen.

It’s bad news for publishers, who have inadvertently benefited from this “frequency blindness.” Now, marketers understand when to shut off the spigot.
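A toy illustration of the point, with invented IDs: caps applied per cookie can look healthy while the person behind those cookies is far over the intended frequency. This is a sketch of the idea, not any vendor's implementation.

```python
# Child ID -> universal person ID, as a cross-device identity map would hold.
identity_map = {
    "chrome_abc": "person1",
    "idfa_789": "person1",
    "dsp_456": "person1",
}
# Impressions served this month, counted per child ID.
impressions = {"chrome_abc": 9, "idfa_789": 8, "dsp_456": 10}

CAP = 10  # the marketer's intended monthly cap per *person*

# Roll child-ID counts up to the person level via the identity map.
per_person = {}
for child_id, count in impressions.items():
    person = identity_map.get(child_id, child_id)
    per_person[person] = per_person.get(person, 0) + count

over_cap = {p: n for p, n in per_person.items() if n > CAP}
print(over_cap)  # {'person1': 27} -- every cookie under cap, the person ~3x over
```

Without the identity map, each of the three child IDs looks compliant on its own; only the rolled-up view reveals the waste.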


Increased creativity will harness new forms of inventory

James Bourner, Head of Display at Jellyfish

From the buy side we will be looking forward to more video-on-demand inventory and new areas of supply opening up, especially for the UK market, which is hugely exciting.

Closely linked to this will be far more involvement from the creative guys.

There have been rumblings in the programmatic community for some time that we do not exploit creativity enough, and that we need to open our creative counterparts’ eyes to the possibilities of programmatic – whether that means simply more permutations of ads to complement targeting, or a more subtle but fundamental shift in some of the tools used to build creative.

Additionally, 2016 will see more large and impactful formats, skins and take-over placements served programmatically. This is obviously excellent for media planners and buyers, as well as for the creative teams.

On the subject of placements, there will be a proliferation of in-feed display (or native-type placements) becoming available programmatically. 2016 will also see more connected TV and digital radio exchanges added to the programmatic supply line.

Programmatic out-of-home has been on the horizon for a while, but I would predict connected TV will be the faster-growing of the two.

An in-stream Guardian ad format. James expects more in-feed display to be available programmatically.


We should probably let the machines decide

Chris O’Hara, VP Strategic Accounts at Krux Digital

The adoption of advanced data technology is starting to change the way media is actually planned and bought. In the past, planners would make educated guesses about which online audience segments to target, then test and learn their way to more precision.

Marketers basically had to guess the data attributes that comprised the ideal converter. Soon, algorithms will start doing the heavy lifting.

What if, instead of guessing at the type of person who buys something, you could start with the exact composition of that buyer? Today’s machine learning algorithms are starting at the end point in order to give marketers a huge edge in execution.

In other words, now we can look at a small group of 1,000 people who have purchased something, and understand the commonalities or clusters of data attributes they all have in common.

Maybe all buyers of a certain car share 20 distinct data attributes. Marketers can have segments automatically generated from that data, and expand it from there.

This brand new approach to segmentation is a small harbinger of things to come, as algorithms start to take over the processes and assumptions of the past 15 years and truly transform marketing.
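The "start from the buyers" idea above can be sketched in a few lines. The attribute names and the simple intersection-plus-overlap scoring are invented for illustration; production systems use proper clustering and lookalike models rather than a set intersection.

```python
# Known buyers and their data attributes (invented for illustration).
buyers = {
    "u1": {"suv_intender", "parent", "outdoor", "midwest"},
    "u2": {"suv_intender", "parent", "outdoor", "coastal"},
    "u3": {"suv_intender", "parent", "outdoor", "urban"},
}

# Attributes every buyer shares become the automatically generated seed segment.
seed_attrs = set.intersection(*buyers.values())

def lookalikes(prospects, seed, min_overlap=2):
    """Expand the segment: keep prospects sharing enough seed attributes."""
    return [u for u, attrs in prospects.items() if len(attrs & seed) >= min_overlap]

prospects = {"p1": {"suv_intender", "parent"}, "p2": {"urban"}}
print(sorted(seed_attrs))                 # ['outdoor', 'parent', 'suv_intender']
print(lookalikes(prospects, seed_attrs))  # ['p1']
```

The point is the direction of inference: the segment definition falls out of the converters' shared attributes, instead of a planner guessing the attributes up front.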


Econsultancy runs Creative Programmatic, a one-day conference in London, as well as providing training on programmatic advertising.

Trends in Programmatic Buying

 


The digital marketing future we were promised years ago looks pretty lame in retrospect. This is an image of a trading desk supervisor at Razorfish, circa 2013.

2015 has been one of the most exciting years in data-driven marketing to date. Although publishers have been leading the way in terms of building their programmatic “stacks” to enable more efficient selling of digital media, marketers are now catching up. Wide adoption of data management platforms has given rise to a shift in buying behaviors, where data-driven tactics for achieving effectiveness and efficiency rule. Here are some of the interesting trends that have arisen.

 

Purchase-Based Targeting

Remember when finding the “household CEO” was as easy as picking a demographic target? Marketers are still using demographic targeting (women, aged 25-44) to some extent, but we have seen them shift rapidly to behavioral and contextually based segments (“Active Moms”), and now to Purchase-Based Targeting (PBT). This trend has existed in categories like Automotive and Travel, but is now being seen in CPG. Today, marketers are using small segments of people who have actually purchased the product they are marketing (“Special K Moms”) and using lookalike modeling to drive scale and find more of them. These purchase-defined segments are a more precise starting point in digital segmentation—and can be augmented by behavioral and contextual data attributes to achieve scale. The big winners here are the folks who actually have the in-store purchase information, such as Oracle’s Datalogix, 84.51, Nielsen Catalina Solutions, INMAR, and News Corp’s News America Marketing.

Programmatic Direct

For years we have been talking about the disintermediation in the space between advertisers and publishers (essentially, the entire Lumascape map of technology vendors), and how we can find scalable, direct connections between them. It doesn’t make sense that a marketer has to go through an agency, a trading desk, a DSP, an exchange, an SSP, and other assorted technologies to get to space on a publisher website. Marketers have seen $10 CPMs turn into just $2 of working media. Early efforts with “private marketplaces” inside of exchanges created more automation, but ultimately kept much of the cost structure. A nascent, but quickly emerging, movement of “automated guaranteed” procurement is finally starting to take hold. Advertisers can create audiences inside their DMP and push them directly to a publisher’s ad server where they have user matching. This is especially effective where marketers seek an “always on” insertion order with a favored, premium publisher. This trend will grow in line with marketers’ adoption of people-based data technology.

Global Frequency Management

The rise in DMPs has also led to another fast-growing trend: global frequency management. Before marketers could effectively map users to all of their various devices (cross-device identity management, or CDIM) and also match users across various execution platforms (hosting a “match table” that assures user #123 in my DMP is the same guy as user #456 in DataXu, as an example), they were helpless to control frequency to an individual. Recent studies have revealed that, when marketers can only cap frequency at the cookie or device level, they end up serving as many as 100+ ads to individual users every month, and sometimes much, much more. What if the user’s ideal point of effective frequency is only 10 impressions on a monthly basis? As you can see, there are tremendous opportunities to reduce waste and gain efficiency in communication. This means big money for marketers, who can finally start to control their messaging—putting recovered dollars back into finding more reach, and starting to influence their bidding strategies to get users into their “sweet spot” of frequency, where conversions happen. It’s bad news for publishers, who have inadvertently benefited from this “frequency blindness.” Now, marketers understand when to shut off the spigot.

Taking It In-House

More and more, we are seeing big marketers decide to “take programmatic in house.” That means hiring former agency and vendor traders, licensing their own technologies, and (most importantly) owning their own data. This trend isn’t as explosive as one might think, based on the industry trades—but it is real and happening steadily. What brought about this shift in sentiment? Certainly concerns about transparency; there is still a great deal of inventory arbitrage going on with popular trading desks. Also, the notion of control. Marketers want and deserve more of a direct connection to one of their biggest marketing costs, and now the technology is readily available. Even the oldest-school marketer can license their way into a technology stack any agency would be proud of. The only thing really holding back the trend is the difficulty in staffing such an effort. Programmatic experts are expensive, and that’s just the traders! When the inevitable call for data-science-driven analytics comes in, things can really start to get pricey. But this trend will continue for the next several years nonetheless.

Closing the Loop with Data

One of the biggest gaps with digital media, especially programmatic, is attribution. We still seem to have the Wanamaker problem: “50% of my marketing works, I just don’t know which 50%.” Attitudinal “brand lift” studies and latent post-campaign sales attribution modeling have been the de facto standard for the last 15 years, but marketers are increasingly insisting on real “closed loop” proof: “Did my Facebook ad move any items off the shelf?” We are living in a world where technology is starting to shed some light on actual in-store purchases, such that we are going to be able to get ecommerce-like attribution for corn flakes soon. In one real-world example, a CPG company has partnered with 7-Eleven and placed beacon technology in the store. Consumers can receive a “get 20% off” offer on their mobile device, via notification, when they approach the store; the beacon can verify whether or not they arrive at the relevant shelf or display; and an integration with the point-of-sale (POS) system can tell (immediately) whether the purchase was made. These marketing fantasies are becoming more real every day.

Letting the Machines Decide

What’s next? The adoption of advanced data technology is starting to change the way media is actually planned and bought. In the past, planners would use their online segmentation to make guesses about what online audience segments to target, and test-and-learn their way to gain more precision. Marketers basically had to guess the data attributes that comprised the ideal converter. Soon, algorithms will start doing the heavy lifting. What if, instead of guessing at the type of person who buys something, you could start with the exact composition of that buyer? Today’s machine learning algorithms are starting at the end point in order to give marketers a huge edge in execution. In other words, now we can look at a small group of 1,000 people who have purchased something, and understand the commonalities or clusters of data attributes they all have in common. Maybe all buyers of a certain car share 20 distinct data attributes. Marketers can have segments automatically generated from that data, and expand them from there. This brand new approach to segmentation is a small harbinger of things to come, as algorithms start to take over the processes and assumptions of the past 15 years and truly transform marketing.

It’s a great time to be a data-driven marketer!

 

Programmatic trends in 2015

Boy oh boy, 2015 was a big year for advertising debate.

To try and bring some closure to a year of fervid discussion on the Econsultancy blog, we asked two experts on performance marketing to give us their view on programmatic in 2015.

And if you want to learn more on this topic, book yourself a place at our Programmatic Training Course.

Is there a creativity vacuum?

David Carr, Strategy Director at DigitasLBi 

As we all raced to keep up with the exponential increases in options and terminology, maybe new realism began to creep in.

Had media left creative behind? Was programmatic only about cheaper buys and cheaper dynamic creative optimization with production efficiencies, real-time price updates and maybe a “personalized” colour-way and call to action based on someone’s browsing history?

When you asked around the industry for great creative examples the same ones would come back: Axe Brasil’s Romeo Reboot with its 100,000 dynamic videos, Diesel Decoded’s 400 bespoke copylines and the Amanda Foundation’s digital “Pawprint” work.

Yet programmatic is not just about direct response and CPAs. Programmatic is people. Programmatic allows creative to build a tailored story arc for the individual.

This makes brand ideas and human truths more important than ever to stimulate and organize the work making it consistent, relevant and distinct.

It means rethinking storytelling through a lens of data and technology to give personalization at scale and enable a brand relationship that learns – not just buying on a DSP.

Axe Brasil’s Romeo Reboot was an example of dynamic video.

romeo reboot

Brands seek transparency and control

David Carr, Strategy Director at DigitasLBi 

It is this technology lens that means new ways of organizing an agency are needed along with new client-agency relationships.

Creative, media and technology need to be re-integrating or at least work far more closely together. This way savings from spend can be used to create more effective work and technology can give greater transparency.

When at most 45 cents on the dollar reaches publishers, and when even an in-house or managed service on a shared platform incurs unknown ad-tech, DSP sell-side, reseller SSP and primary SSP fees – plus data leakage to competitor algorithms – transparency is vital.

Taking a brand-first approach where clients control the bidding strategy AND the tech roadmap while not being lumbered with platform development and management might be a solution here?

Chris O’Hara, VP Strategic Accounts at Krux Digital (author of Econsultancy’s Programmatic Branding report)

More and more, we are seeing big marketers decide to “take programmatic in house.”

That means hiring former agency and vendor traders, licensing their own technologies, and (most importantly) owning their own data.

This trend isn’t as explosive as one might think, based on the industry trades – but it is real and happening steadily.

What brought about this shift in sentiment? Certainly concerns about transparency; there is still a great deal of inventory arbitrage going on with popular trading desks.

Also, the notion of control. Marketers want and deserve more of a direct connection to one of their biggest marketing costs, and now the technology is readily available.

Even old school marketers can license their way into a technology stack any agency would be proud of.

The only thing really holding back this trend is the difficulty in staffing such an effort. Programmatic experts are expensive, and that’s just the traders!

When the inevitable call for data-science driven analytics comes in, things can really start to get pricey! But, this trend continues for the next several years nonetheless.

Only app development is outsourced more than display advertising (source: Organisational Structures and Resourcing Best Practice Guide)


Users suffer (especially on mobile) without union of creative, data and tech

David Carr, Strategy Director at DigitasLBi 

As all media continued to go mobile in 2015, the underbelly of programmatic was exposed. Hundreds of competing cookies on a page, with JavaScript that bloated page weights above 1MB – if it even allowed the page to render at all – became an all-too-common occurrence.

In this context programmatic became not just the future of ad buying, but perhaps the best advert for ad blockers you could have.

Creative, data and technology consolidation for a mobile world is one potential solution but ultimately the only way that programmatic can live up to its promise is for all three to work together.

That way we can get back to people. Where do they go, what are they interested in, how do they respond to content and messages and how do we offer them something useful, usable and delightful?

Targeting with purchase data improves segmentation

Chris O’Hara, VP Strategic Accounts at Krux Digital

Remember when finding the “household CEO” was as easy as picking a demographic target?

Marketers are still using demographic targeting (women, aged 25-44) to some extent, but we have seen them shift rapidly to behavioral and contextually based segments (“Active Moms”), and now to Purchase-Based Targeting (PBT).

This trend has existed in categories like automotive and travel, but is now being seen in consumer packaged goods.

Today, marketers are using small segments of people who have actually purchased the product they are marketing (“Special K Moms”) and using lookalike modeling to drive scale and find more of them.

These purchase-defined segments are a more precise starting point in digital segmentation – and can be augmented by behavioral and contextual data attributes to achieve scale.

The big winners here are the folks who actually have the in-store purchase information (such as Oracle’s Datalogix, 84.51, Nielsen Catalina Solutions, INMAR, and News Corp’s News America Marketing).


Programmatic direct as a route through complexity

Chris O’Hara, VP Strategic Accounts at Krux Digital

For years we have been talking about the disintermediation in the space between advertisers and publishers (essentially, the entire Lumascape map of technology vendors), and how we can find scalable, direct connections between them.

It doesn’t make sense that a marketer has to go through an agency, a trading desk, a DSP, an exchange, an SSP, and other assorted technologies to get to space on a publisher website.

Marketers have seen $10 CPMs turn into just $2 of working media.

Early efforts with “private marketplaces” inside of exchanges created more automation, but ultimately kept much of the cost structure.

A nascent, but quickly emerging, movement of “automated guaranteed” procurement is finally starting to take hold. Advertisers can create audiences inside their DMP and push them directly to a publisher’s ad server where they have user-matching.

This is especially effective where marketers seek an “always on” insertion order with a favored, premium publisher. This trend will grow in line with marketers’ adoption of people-based data technology.

For more on programmatic in 2015, see other blog posts and research by Chris O’Hara.


Published 7 December, 2015 by Ben Davis @ Econsultancy

Ben Davis is a senior writer at Econsultancy. He lives in Manchester.


Match Game 2015

 


Ask me what my match rates are. I have no clue, and neither do you.

 

If you work in digital marketing for a brand or an agency, and you are in the market for a data management platform, you have probably asked a vendor about match rates. But, unless you are really ahead of the curve, there is a good chance you don’t really understand what you are asking for. This is nothing to be ashamed of – some of the smartest folks in the industry struggle here. With a few exceptions, like this recent post, there is simply not a lot of plainspoken dialogue in the market about the topic.

Match rates are a key factor in deciding how well your vendor can provide cross-device identity mapping in a world where your consumer has many, many devices. Marketers are starting to request “match rate” numbers as a method of validation and comparison among ad tech platforms in the same way they wanted “click-through rates” from ad networks a few years ago. Why?

As a consumer, I probably carry about twelve different user IDs: a few Chrome cookies, a few Mozilla cookies, several IDFAs for my Apple phone and tablets, a Roku ID, an Experian ID, and a few hashed e-mail IDs. Marketers looking to achieve true 1:1 marketing have to reconcile all of those child identities to a single universal consumer ID (UID) to make sure I am the “one” they want to market to. It seems pretty obvious when you think about it, but the first problem to solve before any “matching” takes place whatsoever is a vendor’s ability to match people to the devices and browsers attached to them. That’s the first, most important match!

So, let’s move on and pretend the vendor nailed the cross-device problem—a fairly tricky proposition for even the most scaled platforms that aren’t Facebook and Google. They now have to match that UID against the places where the consumer can be found. The ability to do that is generally understood as a vendor’s “match rate.”

So, what’s the number? Herein lies the problem. Match rates are really, really hard to determine, and they change all the time. Plus, lots of vendors find it easier to say, “Our match rate with TubeMogul is 92%” and just leave it at that—even though it’s highly unlikely to be the truth. So, how do you separate the real story from the hype and discover what a vendor’s real ability to match user identity is? Here are two great questions you should ask:

What am I matching?

This is the first and most obvious question: Just what are you asking a vendor to match? There are actually two types of matches to consider: A vendor’s ability to match a bunch of offline data to cookies (called “onboarding”), and a vendor’s ability to match a set of cookie IDs to another set of cookie IDs.

First, let’s talk about the former. In onboarding—or matching offline personally identifiable information (PII) identities such as an e-mail with a cookie—it’s pretty widely accepted that you’ll manage to find about 40% of those users in the online space. That seems pretty low, but cookies are a highly volatile form of identity, prone to frequent deletion, and dependent upon a broad network of third parties to fire “match pixels” on behalf of the onboarder to constantly identify users. Over time, a strong correlation between the consumer’s offline ID and their website visitation habits—plus rigor around the collection and normalization of identity data—can yield much higher offline-to-online match results, but it takes effort. Beware the vendor who claims they can match more than 40% of your e-mails to an active cookie ID from the get-go. Matching your users is a process, and nobody has the magic solution.
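Mechanically, onboarding boils down to hashing the offline identifier and looking it up in the onboarder's table of hashed IDs attached to live cookies. A sketch with invented emails and cookie IDs; real onboarders layer many match partners and refresh cycles on top of this:

```python
import hashlib

def hashed(email):
    """Normalize and hash an email, the common basis for PII-safe matching."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# A marketer's CRM list (invented addresses).
crm_emails = ["a@x.com", "b@x.com", "c@x.com", "d@x.com", "e@x.com"]

# The onboarder's table: hashed emails it has seen attached to live cookies.
onboarder_table = {hashed("a@x.com"): "cookie_1", hashed("c@x.com"): "cookie_2"}

matched = {e: onboarder_table[hashed(e)]
           for e in crm_emails if hashed(e) in onboarder_table}
match_rate = len(matched) / len(crm_emails)
print(f"{match_rate:.0%}")  # 40% -- roughly the commonly cited starting point
```

The volatile part is the right-hand side of that table: cookies expire and get deleted, which is why the matched share decays without constant re-matching.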

As far as cookie-to-cookie user mapping goes, the ability to match users across platforms has more to do with how frequently your vendors fire match pixels. This happens when one platform (a DMP) calls the other platform (the DSP) and asks, “Hey, dude, do you know this user?” That action is a one-way match. It’s even better when the latter platform fires a match pixel back—“Yes, dude, but do you know this guy?”—creating a two-way identity match. Large data platforms will ask their partners to fire multiple match pixels to make sure they are keeping up with all of the IDs in their ecosystem. As an example, a DMP with a big publisher client that sees most of the US population might fire match pixels for a bunch of DSPs, such as DataXu, TubeMogul and The Trade Desk, at the same time. Every user visiting that publisher’s site would then have the publisher’s DMP master ID matched with three separate DSP IDs. That’s the way it works.
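The bookkeeping behind that scenario can be sketched as a simple match table: the DMP keeps one row per master ID, and each match-pixel exchange with a partner DSP adds that partner’s ID for the user. This is an illustrative sketch only; the vendor names, ID formats and `record_match` helper are all invented for the example.

```python
# Hypothetical sketch of a DMP's cookie-matching table. One page view on
# the publisher's site fires match pixels for several DSPs, linking the
# DMP's master ID to each DSP's own ID for that user.

match_table = {}  # dmp_master_id -> {dsp_name: dsp_user_id}

def record_match(dmp_id, dsp_name, dsp_id):
    """One-way match: a partner DSP reported its ID for this user."""
    match_table.setdefault(dmp_id, {})[dsp_name] = dsp_id

# A single visit fires pixels for three DSPs, so one DMP master ID
# ends up mapped to three separate DSP IDs:
record_match("dmp-123", "dsp_a", "a-987")
record_match("dmp-123", "dsp_b", "b-654")
record_match("dmp-123", "dsp_c", "c-321")

print(match_table["dmp-123"])
# {'dsp_a': 'a-987', 'dsp_b': 'b-654', 'dsp_c': 'c-321'}
```

A two-way match would simply add the mirror-image row on the DSP’s side when it fires its own pixel back.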

Given the scenario I just described, and even accounting for a high degree of frequency over time, match rates in the high 70-percent range are still considered excellent. So consider all of the work that needs to go into matching before you simply buy a vendor’s claim of “90%” match rates in the cookie space. Again, this type of matching is a process—one involving many parties and counterparties—and not something that happens overnight by flipping a switch, so beware of the “no problem” vendor answers.

What number are you asking to match?

Let’s say you are a marketer and you’ve gathered a mess of cookie IDs from your first-party web visitors. Now you want to match those cookies against a bunch of cookie IDs in a popular DSP. Most vendors will come right out and tell you that they have a 90%-plus match rate in such situations. That may be a huge red flag. Let’s think about the reality of the situation. First of all, many of those online IDs are not matchable cookies at all, but Safari IDs that cannot be matched. So eliminate a good 20% of potential matches right off the bat. Next, we have to assume that a bunch of those cookies are expired and no longer matchable, which removes another 20% from the pool. I could go on, but as you can see, I’ve just made a pretty realistic case for eliminating about 40% of possible matches. That means a 60% match rate is pretty damn good.

Lots of vendors are actually talking about their matchable population of users: the cookies you give them that they can actually map to their own users. In the case of a DMP that is firing match pixels all day long, several times a day, with a favored DSP, the match rate at any one time with that vendor may indeed be 90-100%—but only of the matchable population. So always ask what the numerator and denominator represent in a match-rate claim.
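The numerator/denominator point can be shown with two lines of division. The same vendor, with the same matched count, can honestly quote two very different numbers depending on the base it divides by. All figures here are invented for illustration:

```python
# The same matched count yields very different "match rates"
# depending on the chosen denominator. Illustrative numbers only.

def match_rate(matched, denominator):
    return matched / denominator

total_ids = 1_000_000    # every ID you handed the vendor
matchable_ids = 600_000  # after removing Safari/expired IDs
matched = 540_000        # IDs the vendor actually mapped

print(f"{match_rate(matched, matchable_ids):.0%}")  # 90% of matchable
print(f"{match_rate(matched, total_ids):.0%}")      # 54% of everything
```

Both answers are arithmetically true; only one tells you what share of your audience you can actually reach.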

You might ask whether this means the popular DMP/DSP “combo” platforms come with higher match rates, or so-called “lossless integration,” since the DMP and DSP share a single architecture and, therefore, a unified identity. The answer is yes, but that offers little differentiation when two separate DMP and DSP platforms are closely synched on user matching.

In conclusion

Marketers are obsessing over match rates right now, and they should be. There is an awful lot of “FUD” (fear, uncertainty, and doubt) being thrown around by vendors around match rates—and also a lot of BS being tossed around in terms of numbers. The best advice when doing an evaluation?

  • Ask what kind of cross-device graph your vendor supports. Without the fundamental ability to match people to devices, the “match rate” number you get is largely irrelevant.
  • Ask what numbers your vendor is matching. Are we talking about onboarding (matching offline IDs to cookies) or are we talking about cookie matching (mapping different cookie IDs in a match table)?
  • Ask how they are matching. (What is the numerator and what is the denominator?)
  • Never trust a number without an explanation. If your vendor tells you “94.5%,” be paranoid!
  • And ask for a match test. The proof is in the pudding!