Five Principles of Modern Marketing

Every marketer and media company these days is trying to unlock the secret to personalization. Everyone wants to be the next Amazon, anticipating customer wants and desires and delivering real-time customization.

Actually, everyone might need to be an Amazon going forward; Harris Interactive and others tell us that getting customer experience wrong means up to 80% of customers will leave your brand and try another – and it can take seven times more money to reacquire that customer than it cost to acquire them in the first place.

How important is personalization? In a recent study, 75% of marketers said that there’s no such thing as too much personalization for different audiences, and 94% agreed that delivering personalized content is important to reaching their audiences.

People want and expect personalization and convenience today, and brands and publishers that cannot deliver them will suffer a similar fate. But beyond advanced technology, what do you need to believe to make this transformation happen? What are the core principles a company needs to adhere to in order to have a shot at transforming itself into a customer-centric enterprise?

Here are five:

Put People First

It’s a rusty old saw but, like any cliché, it’s fundamentally true. For years, we have taken a very channel-specific view of engagement, thinking in terms of mobile, display, social and video. But those are channels, apps and browsers. Browsers don’t buy anything; people do.

A people-centric viewpoint is critical to being a modern marketer. True people-based marketing needs to extend beyond advertising and start to include things like sales, service and ecommerce interactions – every touchpoint people have with brands.

People – customers and consumers – must reside at the center of everything, and the systems of engagement we use to touch them must be secondary. This makes the challenge of identity resolution the new basis of competition going forward.

Collect Everything, Measure Everything

A true commitment to personalized marketing means that you have to understand people. For many years, we have assigned outsized importance to small scraps of digital exhaust such as clicks, views and likes as signals of brand engagement and intent. Mostly, they’ve lived in isolation, never informing a holistic view of people and their wants and desires.

Now we can collect more of this data and do so in real time. Modern enterprises need to become more obsessive about valuing data. Every scrap of data becomes a small stitch in a rich tapestry that forms a view of the customer.

We laughed at the “data is the new oil” hyperbole a few years back – simply because nobody had a way to store and extract real value from the sea of digital ephemera. Today is vastly different: we have both the technology and the processes to ingest signals at scale – and use artificial intelligence to refine them into gold. Businesses that let valuable data fall to the floor without measuring it might already be dead; they just don’t know it yet.

Be A Retailer

A lot of brands aren’t as lucky as popular hotel booking sites. To book a room, you need to sign up with your email. Once you become a user, the company collects data on where you like to go, how often you travel, how much you pay for a room and even what kind of mattress you prefer. Any brand would kill for that kind of one-to-one relationship with a customer.

Global CPG brands touch billions of lives every day, yet often have to pay other companies to learn how their marketing spend affected sales. Brands must start to own customer relationships and create one-to-one experiences with buyers. We are seeing the first steps with things like Dash buttons and voice ordering – still mediated by a partner – but this will extend even further as brands change their entire business models to own the retail relationship with people. The key pivot point will come when brands actually value people data as an asset on their balance sheets.

See The World Dynamically

The ubiquity of data has led to an explosion of microsegmentation. I know marketers and publishers that can define a potential customer by 20 individual attributes. But people can go from a “Long Island soccer mom” on Monday to an “EDM music lover” on Friday night. Today’s segmentation is very much static – and very ineffective for a dynamic world where things change all the time.

To get the “right message, right place, right time” dynamic right, we need to understand things like location, weather, time of day and context – and make those dynamic signals part of how we segment audiences. To be successful, marketers and media companies must commit to thinking of customers as the dynamic and vibrant people they are and enable the ability to collect and activate real-time data into their segmentation models.

Think Like A Technologist

Finally, creating the change described above requires a commitment to understanding technology. You can’t do “people data” without truly understanding data management technology. You can’t measure everything without technology that can parse every signal. To be a retailer, you have to give customers a reason to buy directly from you. Thinking about customers dynamically requires real-time systems of collection and activation.

But technology and the people to run it are expensive investments, often taking months and years to show ROI, and the technology changes at the velocity of Moore’s Law. It’s a big commitment to change from diaper manufacturer to marketing technologist, but we are starting to understand that it is the change required to survive an era where people are in control.

Some say that it wasn’t streaming media technology that killed Blockbuster, but the fact that people hated its onerous late fees. It was probably both. Tomorrow’s Blockbusters will be the companies that cannot apply these principles of modern, personalized marketing – or do not want to make the large investments to do so.

[This article originally appeared in AdExchanger on 8/7/2017.]

DMPs are Dead. Long Live DMPs.

 


Much like the latter-day King of Rock and Roll, Elvis Presley, today’s ubiquitous data management platforms will eventually die as an independent buying category but live on in the greater consciousness. And karate.

Gartner’s Marty Kihn recently made an argument that ad tech and mar tech would not come together, contrary to what he had predicted a few years ago. When Marty speaks about ad tech, people listen.

 

Like many people, when I read the headline, I thought to myself, “That makes no sense!” But those who read the article more closely understand that the disciplines of ad tech and mar tech will certainly be bound closer together as systems align – it is the business models that are totally incompatible.

Advertising technology and the ecosystem that supports it – from its commercial model (a percentage of media spend, billed in arrears) to the strong influence of agencies in the execution process – mean that alignment with software-as-a-service (SaaS) marketing technology is not just an engineering problem to solve.

Marketing leaders and brands need to change the way they handle P&L and budgeting, and reevaluate business process flows both internally and with outside entities such as agencies; even if the technology is right, the execution also needs to be optimal to achieve the desired results.

There are also plenty of technical hurdles to overcome to truly integrate mar tech and ad tech – most notably, finding a way to let personally identifiable information and anonymous data flow from system to system securely. While those technical problems may be overcome through great software engineering, the business model challenge is a more significant hurdle.

I remember getting some advice from AdExchanger contributor Eric Picard when we worked together some years ago. I was working at a company that had a booming ad tech business with lots of customers and a great run rate, operating on the typical ad network/agency percentage-of-spend model.

At the time, we were facing competition from every angle and getting disrupted quickly. Eric’s suggestion was to transform the company to a platform business, license our technology for a fixed monthly fee and begin to build more predictable revenues and a dedicated customer base. That would have meant parting ways with our customers who would not want to pay us licensing fees and rebuilding the business from scratch.

Not an easy decision, but one we should have taken at the time. Eric was 100% right, but transforming a “run rate” revenue ad tech business into a SaaS business takes a lot of guts, and most investors and management didn’t sign up for that in the first place.

This is a long way of saying that Marty is right. There are tons of ad tech businesses that simply cannot transform themselves into marketing software stacks, simply because it requires complete change – from a structural financial perspective (different business model) and a people perspective (different sales skills required).

[This post appeared in AdExchanger on 5/9/2017]

What is the future of DMPs?

In the 1989 film “Back to the Future II,” Marty McFly traveled to Oct. 21, 2015, a future with flying cars, auto-drying clothes and shoes that lace automatically.

What is the future of data management platforms? This is a question I get asked a lot.

The short answer is that DMPs are now part of larger marketing stacks, and brands realize that harnessing their data is a top priority in order to deliver more efficient marketing.

This is a fast-moving trend in which companies are licensing large enterprise stacks and using systems integrators to manage all marketing—not just online advertising.

As detailed in Ad Age (Marketing clouds loom), the days of turning to an agency trade desk or demand side platform (DSP) to manage the “digital” portions of advertising are fading rapidly as marketers are intent on having technology that covers more than just advertising.

Building consumer data platforms

A few years ago, a good “stack” might have been a connected DMP, DSP and ad server. A really good stack would also feature a viewability vendor and perhaps dynamic creative optimization (DCO). The focus then was on optimizing for the world of programmatic buying and getting the most out of digital advertising as consumers’ attention shifted from television to online, mobile and social.

Fast forward a few years, and the conversations we are having with marketers are vastly different. As reported in AdExchanger, more than 40% of enterprise marketers license a DMP, and another 20% will do so within the next 12 months. DMP owners and those in the market for one are increasingly talking about more than just optimizing digital ads. They want to know how to put email marketing, customer service and commerce data inside their systems. They also want data to flow from their systems to their own data lakes.

Many are undertaking the process of building internal consumer data platforms (CDPs), which can house all of their first-party data assets—both known and pseudonymous user data.

We are moving beyond ad tech. Quickly.

Today, when those in the market are considering licensing a “DMP” they are often thinking about “data management” more broadly. Yes, they need a DMP for its identity infrastructure, ability to connect to dozens of different execution systems and its analytical capabilities. But they also need a DMP to align with the systems they use to manage their CRM data, email data, commerce systems, and marketing automation tools.

Data-driven marketing no longer lives in isolation. After I acquire a “luxury sedan intender” online, I want to retarget her—but I also want to show her a red sedan on my website, e-mail her an offer to come to the dealership, serve her an SMS message when she gets within range of the dealership to give her a test drive incentive, and capture her e-mail address when she signs up to talk to a salesperson. All of that needs to work together.
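To make the sequencing concrete, here is a minimal sketch of how such cross-channel rules might be expressed once every touchpoint is keyed to one unified profile. The profile fields, segment name and action strings below are hypothetical rather than any particular vendor’s API.

```python
# Hypothetical sketch of cross-channel sequencing for a "luxury sedan intender",
# assuming every touchpoint is keyed to one universal profile ID. The profile
# fields, segment name and action strings are invented for illustration.

def next_actions(profile):
    """Return the channel actions to trigger for one unified customer profile."""
    actions = []
    if "luxury_sedan_intender" in profile.get("segments", []):
        actions.append("display:retarget_sedan_creative")            # programmatic retargeting
        if profile.get("visited_site"):
            actions.append("web:personalize_homepage=red_sedan")     # site-side personalization
        if profile.get("email_hash"):
            actions.append("email:send_dealership_offer")            # CRM / email system
        if profile.get("distance_to_dealer_km", 999) < 5:
            actions.append("sms:test_drive_incentive")               # proximity-triggered SMS
    return actions

profile = {
    "universal_id": "u-123",
    "segments": ["luxury_sedan_intender"],
    "visited_site": True,
    "email_hash": "5f4dcc3b...",
    "distance_to_dealer_km": 3.2,
}
print(next_actions(profile))
```

In practice each action would be handed off to a separate execution system (DSP, CMS, email platform, SMS gateway); the point is simply that one profile drives all of them.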

Personalization demands adtech and martech come together

We live in a world that demands Netflix- and Amazon-like instant gratification at all times. To a Millennial or Generation Z consumer, it is nearly inconceivable for a brand to forget that they are a loyal customer; they have so many choices, and so many other brands to switch to when they have a bad experience.

This is a world that requires adtech and martech to come together to provide personalized experiences—not simply to create more advertising lift, but as the price of admission for customer loyalty.

So, when I am asked, what is the future of DMPs, I say that the idea of licensing something called a “DMP” will not exist in a few years.

DMPs will be completely integrated into larger stacks that offer a layer of data management (for both known and unknown data) to answer the “right person” question; an orchestration layer of connected execution systems that tackles the “right message, right time” quandary; and an artificial intelligence layer, the brains of the operation, which figures out how to stitch billions of individual data points together in real time.

DMPs will never be the same, but only in the sense that they are so important that tomorrow’s enterprise marketing stacks cannot survive without integrating them completely, and deeply.

[This post was originally published 11 May 2017 by Chris O’Hara on the Econsultancy blog.]

The Technology Layer Cake

I saw a great presentation at this year’s Industry Preview where Brian Anderson of LUMA Partners presented on the future of marketing clouds. His unifying marketechture drawings looked like an amalgamation of various whiteboarding sessions I have had recently with big enterprise marketers, many of whom are building the components of their marketing “stacks.” Marketers are feverishly licensing offerings from all kinds of big software companies and smaller adtech and martech players to build a vision that can be summed up like this:

The Data Management Layer

Today’s “stack” really consists of three individual layers when you break it down. The first layer, Data Management (DM), contains all of the “pipes” used to stitch people’s identities together. Every cloud needs to take data in from all kinds of sources, such as internet cookies, mobile IDs, hashed e-mail identity keys, purchase data, and the like. Every signal we can collect results in a richer understanding of the customer, and the DM layer needs access to rich sets of first-, second-, and third-party data to paint the clearest picture.

The DM layer also needs to tie every single ID and attribute collected to an individual, so all the signals collected can be leveraged to understand their wants and desires. This identity infrastructure is critical for the enterprise; knowing that you are the same guy who saw the display ad for the family minivan and visited the “March Madness Deals” page on the mobile app goes a long way toward attribution. But the DM layer cannot be constrained to anonymous data. Today’s marketing stacks must leverage DMPs to understand pseudonymous identity, but must also find trusted ways to mix in PII-based data from e-mail and CRM systems. This latter notion has created a new category—the “Customer Data Platform” (CDP)—and also resulted in the rush to build data lakes as a method of collecting a variety of differentiated data for analytics purposes.

Finally, the DM layer must be able to seamlessly connect the data out to all kinds of activation channels, whether they are e-mail, programmatic, social, mobile, OTT, or IoT-based. Just as people have many different ID keys, they have different IDs inside of Google, Facebook, Pinterest, and the Wall Street Journal. Connecting those partner IDs to an enterprise’s universal ID solves problems with frequency management and attribution, and offers the ability to sequence messages across various addressable channels.
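As an illustration of what that identity infrastructure does, here is a toy sketch (not any DMP’s actual implementation) that stitches cookies, mobile IDs and hashed emails into one person with a union-find structure, then maps the resulting universal ID to partner IDs for activation.

```python
# Illustrative sketch only: stitching cookies, mobile IDs and hashed emails into
# one person with union-find, then mapping that person to partner IDs.

class IdentityGraph:
    def __init__(self):
        self.parent = {}

    def _find(self, key):
        self.parent.setdefault(key, key)
        while self.parent[key] != key:
            self.parent[key] = self.parent[self.parent[key]]  # path compression
            key = self.parent[key]
        return key

    def link(self, key_a, key_b):
        """Record that two identifiers were observed on the same person."""
        self.parent[self._find(key_a)] = self._find(key_b)

    def person_of(self, key):
        return self._find(key)

graph = IdentityGraph()
graph.link("cookie:abc123", "mobile:idfa-789")        # same login seen on both
graph.link("mobile:idfa-789", "email_sha256:5f4dcc")  # hashed email from CRM

# Partner ID syncs: universal ID -> IDs inside each walled garden / publisher
partner_ids = {graph.person_of("cookie:abc123"): {"facebook": "fb-001", "wsj": "wsj-042"}}
print(graph.person_of("email_sha256:5f4dcc"), partner_ids)
```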

You can’t have a marketing cloud without data management. This layer is the “who” of the marketing cloud—who are these people and what are they like?

The Orchestration Layer

The next thing marketers need to have (and they often build it first, in pieces) is an orchestration layer. This is the “When, Where, and How” of the stack. E-mail systems can determine when to send that critical e-mail; marketing automation software can decide whether to put someone in a “nurture” campaign, or have a salesperson call them right away; DSPs decide when to bid on a likely internet surfer, and social management platforms can tell us when to Tweet or Snap. Content management systems and site-side personalization vendors orchestrate the perfect content experience on a web page, and dynamic creative optimization systems have gotten pretty good at guessing which ad will perform better for certain segments (show the women the high-heeled shoe ad, please).

The “when” layer is critical for building smart customer journeys. If you get enough systems connected, you start to realize the potential for executing on the “right person, right message, right time” dynamic that has been promised for many years, but never quite delivered at scale. Adtech has been busy nailing the orchestration of display and mobile messages, and the big social platforms have been leveraging their rich people data to deliver relevant messages. However, with lots of marketing money and attention still focused on e-mail and broadcast, there is plenty of work to be done before marketers can build journeys that feature every touchpoint their customers are exposed to.

Marketers today are busy building connectors to their various systems and getting them to talk to each other to figure out the “when, where, and how” of marketing.

The Artificial Intelligence Layer

When every single marketer and big media company owns a DMP, and has figured out how to string their various orchestration platforms together, it is clear that the key point of differentiation will reside in the AI layer. Artificial intelligence represents the “why” problem in marketing—why am I e-mailing this person instead of calling her? Should I be targeting this segment at all? Why does this guy score highly for a new car purchase, and this other guy who looks similar doesn’t? What is the lifetime value of this new business traveler I just acquired?

While the stacks have tons of identity data, advertising data, and sales data, they need a brain to analyze all of that data and decide how to use it most effectively. As marketing systems become more real-time and more connected to on-the-go customers than ever before, artificial intelligence must drive millions of decisions quickly, gleaned from billions of individual data points. How does the soda company know when to deliver an ad for water instead of diet soda? It requires understanding location, the weather, the person, and what they are doing in the moment. AI systems are rapidly building their machine learning capabilities and connecting into orchestration systems to help with decisioning.
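To make that concrete, here is a deliberately tiny, invented example of the kind of decision the AI layer has to make millions of times a day: score the context signals and pick the creative. The weights and thresholds are made up for the sketch.

```python
# Toy illustration of in-the-moment decisioning: score context signals and pick
# the creative. The weights and thresholds are invented for this sketch.

def choose_creative(temp_f, activity, prefers_diet):
    score_water, score_diet = 0.0, 0.0
    if temp_f > 85:
        score_water += 2.0            # hot day: lead with hydration
    if activity in {"basketball", "running"}:
        score_water += 1.5            # exercising right now
    if prefers_diet:
        score_diet += 1.0             # historical purchase preference
    return "water_ad" if score_water >= score_diet else "diet_soda_ad"

print(choose_creative(temp_f=92, activity="basketball", prefers_diet=True))  # water_ad
```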

All Together Now

The layer cake is a convenient way to look at what is happening today. The vision for tomorrow is to squish the layer cake together in such a way that enterprises get all of that functionality in a single cake. In four or five years, every marketing orchestration system will have some kind of built-in DMP—or seamless connections to any number of them. We see this today with large DSPs; they all need an internal data management system for segmentation. Tomorrow’s orchestration systems will all have built-in artificial intelligence as a means for differentiation. Look at e-mail orchestration today. It is not sold on its ability to deliver messages to inboxes, but rather on its ability to provide that service in a smarter package to increase open rates and provide richer analytics.

It will be fun to watch as these individual components come together to form the marketing clouds of the future. It’s a great time to be a data-driven marketer!

[This post was originally published April 4, 2017 on the Econsultancy blog.]

Deepening The Data Lake: How Second-Party Data Increases AI For Enterprises


I have been hearing a lot about data lakes lately. Progressive marketers and some large enterprise publishers have been breaking out of traditional data warehouses, mostly used to store structured data, and investing in infrastructure so they can store tons of their first-party data and query it for analytics purposes.

“A data lake is a storage repository that holds a vast amount of raw data in its native format until it is needed,” according to Amazon Web Services. “While a hierarchical data warehouse stores data in files or folders, a data lake uses a flat architecture to store data.”

A few years ago, data lakes were thought to be limited to Hadoop applications (object storage), but the term is now more broadly applied to an environment in which an enterprise can store both structured and unstructured data and have it organized for fast query processing. In the ad tech and mar tech world, this is almost universally about first-party data. For example, a big airline might want to store transactional data from ecommerce alongside beacon pings to understand how often online ticket buyers in its loyalty program use a certain airport lounge.
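A minimal sketch of that airline example, assuming the lake exposes purchases and beacon pings as queryable tables (the table layouts and column names here are invented), can be as simple as a filter, a join and a group-by:

```python
# Hypothetical sketch of the airline example: join ecommerce purchases with
# beacon pings stored in the lake to count lounge visits by online buyers.
# Table layouts and column names are invented.

import pandas as pd

purchases = pd.DataFrame({
    "loyalty_id": ["L1", "L2", "L3"],
    "bought_online": [True, True, False],
})
beacon_pings = pd.DataFrame({
    "loyalty_id": ["L1", "L1", "L2"],
    "location": ["JFK_lounge", "JFK_lounge", "SFO_gate"],
})

lounge_visits = (
    beacon_pings[beacon_pings["location"] == "JFK_lounge"]
    .groupby("loyalty_id").size().rename("lounge_visits").reset_index()
)
report = purchases.merge(lounge_visits, on="loyalty_id", how="left").fillna({"lounge_visits": 0})
print(report[report["bought_online"]])  # online ticket buyers and their lounge visit counts
```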

However, as we discussed earlier this year, there are many marketers with surprisingly sparse data, like the food marketer who does not get many website visitors or authenticated customers downloading coupons. Today, those marketers face a situation where they want to use data science to do user scoring and modeling but, because they only have enough of their own data to fill a shallow lake, they have trouble justifying the costs of scaling the approach in a way that moves the sales needle.


Figure 1: Marketers with sparse data often do not have enough raw data to create measurable outcomes in audience targeting through modeling. Source: Chris O’Hara.

In the example above, we can think of the marketer’s first-party data – media exposure data, email marketing data, website analytics data, etc. – as the water that fills a data lake. That data is pumped into a data management platform (pictured here as a hydroelectric dam), flows like electricity through ad tech pipes (demand-side platforms, supply-side platforms and ad servers) and is finally delivered to the places where it is activated (the town, where people live).

As becomes apparent, this infrastructure can exist with even a tiny bit of water but, at the end of the cycle, not enough electricity will be generated to create decent outcomes and sustain a data-driven approach to marketing. This is a long way of saying that the data itself, both in quality and quantity, is needed in ever-larger amounts to create the potential for better targeting and analytics.

Most marketers today – even those with lots of data – find themselves overly reliant on third-party data to fill in these gaps. However, even if they have the rights to model it in their own environment, there are loads of restrictions on using it for targeting. It is also highly commoditized and can be of questionable provenance. (Is my Ferrari-browsing son really an “auto intender”?) While third-party data can be highly valuable, adding it is akin to adding sediment to a data lake, creating murky visibility when trying to peer into the bottom for deep insights.

So, how can marketers fill data lakes with large amounts of high-quality data that can be used for modeling? I am starting to see the emergence of peer-to-peer data-sharing agreements that help marketers fill their lakes, deepen their ability to leverage data science and add layers of artificial intelligence through machine learning to their stacks.


Figure 2: Second-party data is simply someone else’s first-party data. When relevant data is added to a data lake, the result is a more robust environment for deeper data-led insights for both targeting and analytics. Source: Chris O’Hara.

In the above example (Figure 2), second-party data deepens the marketer’s data lake, powering the DMP with more rich data that can be used for modeling, activation and analytics. Imagine a huge beer company that was launching a country music promotion for its flagship brand. As a CPG company with relatively sparse amounts of first-party data, the traditional approach would be to seek out music fans of a certain location and demographic through third-party sources and apply those third-party segments to a programmatic campaign.

But what if the beer manufacturer teamed up with a big online ticket seller and arranged a data subscription for “all viewers or buyers of a Garth Brooks ticket in the last 180 days”? Those are exactly the people I would want to target, and they are unavailable anywhere in the third-party data ecosystem.

The data is also of extremely high provenance, and I would also be able to use that data in my own environment, where I could model it against my first-party data, such as site visitors or mobile IDs I gathered when I sponsored free Wi-Fi at the last Country Music Awards. The ability to gather and license those specific data sets and use them for modeling in a data lake is going to create massive outcomes in my addressable campaigns and give me an edge I cannot get using traditional ad network approaches with third-party segments.

Moreover, the flexibility around data capture enables marketers to use highly disparate data sets, combine and normalize them with metadata – and not have to worry about mapping them to a predefined schema. The associative work happens after the query takes place. That means I don’t need a predefined schema in place for that data to become valuable – a way of saying that the inherent observational bias in traditional approaches (“country music fans love mainstream beer, so I’d better capture that”) never hinders the ability to activate against unforeseen insights.
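Here is a small, hypothetical sketch of that schema-on-read idea: raw events from different sources land as-is, and the associative rule is written at query time, long after ingestion. The field names are illustrative only.

```python
# Sketch of schema-on-read: raw events from different sources land as-is, and
# the associative rule (here, country-music affinity) is defined at query time.

raw_events = [
    {"src": "ticketing", "user": "u1", "artist": "Garth Brooks", "act": "purchase"},
    {"src": "web",       "user": "u2", "page": "/recipes/bbq"},
    {"src": "wifi",      "user": "u1", "venue": "country_music_awards"},
]

def country_music_affinity(event):
    # No predefined schema required; each source keeps its own fields.
    return (
        event.get("artist") == "Garth Brooks"
        or event.get("venue") == "country_music_awards"
    )

fans = {e["user"] for e in raw_events if country_music_affinity(e)}
print(fans)  # {'u1'}
```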

Large, sophisticated marketers and publishers are just starting to get their lakes built and begin gathering the data assets to deepen them, so we will likely see a great many examples of this approach over the coming months.

It’s a great time to be a data-driven marketer.

Follow Chris O’Hara (@chrisohara) and AdExchanger (@adexchanger) on Twitter.

How AI will Change UX

In 1960, the US Navy coined a design principle: Keep it simple, stupid.

When it comes to advertising and marketing technology, we haven’t enjoyed a lot of “simple” over the last dozen years or so. In an increasingly data-driven world where delivering a relevant customer experience makes all the difference, we have embraced complexity over simplicity, dealing in acronyms, algorithms and now machine learning and artificial intelligence (AI).

When the numbers are reconciled and the demand side pays the supply side, what we have mostly been doing is pushing a lot of data into digital advertising channels and nibbling around the edges of performance, trying to optimize sub-1% click-through rates.

That minimal uptick in performance has come at the price of some astounding complexity: ad exchanges, third-party data, second-price auctions and even the befuddling technology known as header bidding. Smart, technical people struggle with these concepts, but we have embraced them as the secret handshake in a club that pays its dues by promising to manage that complexity away.

Marketers, however, are not stupid. They have steadily been taking ownership of their first-party data and starting to build marketing tech stacks that attempt to add transparency and efficiency to their outbound marketing, while eliminating many of the opaque ad tech taxes levied by confusing and ever-growing layers of licensed technology. Data management platforms, at the heart of this effort to take back control, have seen increased penetration among large marketers – and this trend will not stop.

This is a great thing, but we should remember that we are in the third inning of a game that will certainly go into extra innings. I remember what it was like to save a document in WordPerfect, send an email using Lotus Notes and program my VCR. Before point-and-click interfaces, such tasks were needlessly complex. Ever try to program the hotel’s alarm clock just in case your iPhone battery runs out? In a world of delightful user experience and clean, simple graphical interfaces, such a task becomes complex to the point of failure.

Why Have We Designed Such Complexity Into Marketing Technology?

We are, in effect, giving users who want big buttons and levers the equivalent graphical user interface of an Airbus A380: tons of granular and specific controls that may take a minute to learn, but a lifetime to master.

How can we change this? The good news is that change has already arrived, in the form of machine learning and artificial intelligence. When you go on Amazon or Netflix, do you have to program any of your preferences before getting really amazing product and movie recommendations? Of course not. Such algorithmic work happens on the back end where historical purchases and search data are mapped against each other, yielding seemingly magical recommendations.

Yet, when airline marketers go into their ad tech platform, we somehow expect them to inform the system of myriad attributes which comprise someone with “vacation travel intent” and find those potential customers across multiple channels. Companies like Expedia tell us just what to pay for a hotel room with minimal input, but we expect marketers to have internal data science teams to build propensity models so that user scores can be matched to a real-time bidding strategy.

One of the biggest trends we will see over the next several years is what could be thought of as the democratization of data science. As data-driven marketing becomes the norm, the winners and losers will be sorted out by their ability to build robust first-party data assets and leverage data science to sift the proverbial wheat from the chaff.

This capability will go hand-in-hand with an ability to map all kinds of distinct signals – mobile phones, tablets, browsers, connected devices and beacons – to an actual person. This is important for marketers because browsers and devices never buy anything, but customers do. Leading-edge companies will depend on data science to learn more about increasingly hard-to-find customers, understand their habits, gain unique insights about what prompts them to buy and leverage those insights to find them in the very moment they are going to buy.

In today’s world, that starts with data management and ends with finding people on connected devices. The problem is that execution is quite difficult to automate and scale. Systems still require experts who understand data strategy, specific use cases and the value of an organization’s siloed data when stitched together. Plus, you need great internal resources and a smart agency capable of execution once that strategy is actually in place.

However, the basic data problems we face today are not actually that complicated. Thomas Bayes worked them out more than 250 years ago with a series of probabilistic equations we still depend on today. The real trick involves packaging that Bayesian magic in such a way that the everyday marketer can go into a system containing “Hawaiian vacation travel intenders” for a winter travel campaign and push a button that says, “Find me more of these – now!”
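For illustration only, a toy version of that “find me more of these” button might score the rest of the audience by how closely their attributes resemble a seed segment of known intenders. The attributes and smoothing below are invented; a production system would be far more involved.

```python
# Toy illustration of the Bayesian idea behind "find me more of these": score
# candidates by how much their attributes resemble a seed segment of intenders.

from collections import Counter

seed = [  # known Hawaiian-vacation intenders, as sets of observed attributes
    {"searched_flights", "beach_content", "winter_escape"},
    {"searched_flights", "beach_content"},
    {"beach_content", "winter_escape"},
]
candidate = {"beach_content", "winter_escape", "sports_content"}

attr_counts = Counter(a for person in seed for a in person)

def intender_likeness(attrs, counts, n, alpha=1.0):
    """Product of per-attribute probabilities with Laplace smoothing."""
    score = 1.0
    for a in attrs:
        score *= (counts[a] + alpha) / (n + 2 * alpha)
    return score

print(intender_likeness(candidate, attr_counts, len(seed)))  # higher = more intender-like
```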

Today’s problem is that we depend on either a small number of “power users” – or the companies themselves – to put all of this amazing technology to work, rather than simply serving up the answers and offering a big red button to push.

A Simpler Future For Marketers?

Instead of building high-propensity segments and then waiting to target the users who fall into them, tomorrow’s platforms will offer preselected lists of segments to target. Instead of having an agency’s media guru perform a marketing-mix model to determine channel mix, mar tech stacks will automatically allocate expenditures across channels based on the people data available. Instead of setting complex bid parameters by segment, artificial intelligence layers will automatically control pricing based on bid density, frequency of exposure and propensity to buy – while automatically suppressing users who have converted from receiving that damn shoe ad again.
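Two of those automations are simple enough to sketch with invented numbers: splitting a budget across channels in proportion to modeled response, and suppressing converted users from retargeting.

```python
# Hypothetical sketch of two of these automations. All numbers are invented.

def allocate_budget(total, response_rates):
    """Split a budget across channels in proportion to modeled response rates."""
    weight_sum = sum(response_rates.values())
    return {ch: round(total * r / weight_sum, 2) for ch, r in response_rates.items()}

def targetable(audience, converted):
    return audience - converted   # no more shoe ads once the shoes are bought

print(allocate_budget(100000, {"display": 0.8, "social": 1.5, "email": 2.2}))
print(targetable({"u1", "u2", "u3"}, converted={"u2"}))
```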

This is all happening today, and it is happening right on time. In a world with only tens of thousands of data scientists and enough jobs for millions of them, history will be written by the companies clever enough to hide the math on the server side and give users the elegance of a simple interface where higher-level business decisions will be made.

We are entering a unique epoch in our industry, one in which the math still rules, but the ability of designers to make it accessible to the English majors who run media will reign supreme.

It’s a great time to be a data-driven marketer! Happy New Year.

Follow Chris O’Hara (@chrisohara) and AdExchanger (@adexchanger) on Twitter. 

(Interview) On Beacons and DMPs

How Beacons Might Alter The Data Balance Between Manufacturers And Retailers

As Salesforce integrates DMP Krux, Chris O’Hara considers how proximity-based personalization will complement access to first-party data. For one thing, imagine how coffeemakers could form the basis of the greatest OOH ad network.

How CRM and a DMP can combine to give a 360-degree view of the customer

For years, marketers have been talking about building a bridge between their existing customers and the potential or yet-to-be-known customer.

Until recently, the two have rarely been connected. Agencies have separate marketing technology, data and analytics groups. Marketers themselves are often separated organizationally between “CRM” and “media” teams – sometimes even by a separate P&L.

Of course, there is a clear dividing line between marketing tech and ad tech: personally identifiable information, or PII. Marketers today have two different types of data, from different places, with different rules dictating how each can be used.

In some ways, it has been natural for these two marketing disciplines to be separated, and some vendors have made a solid business from the work necessary to bridge PII data with web identifiers so people can be “onboarded” into cookies.

After all, marketers are interested in people, from the very top of the funnel, when they visit a website as an anonymous visitor, all the way down to the bottom of the funnel, after they have registered as a customer and we want to make them a brand advocate.

It would be great — magic even — if we could accurately understand our customers all the way through their various journeys (the fabled “360-degree view” of the customer) and give them the right message, at the right place and time. The combination of a strong CRM system and an enterprise data management platform (DMP) brings these two worlds together.

Much of this work is happening today, but it’s challenging with lots of ID matching, onboarding, and trying to connect systems that don’t ordinarily talk to one another. However, when CRM and DMP truly come together, it works.

What are some use cases?

Targeting people who haven’t opened an email

You might be one of those people who don’t open or engage with every promotional email in your inbox, or who use a smart filter to capture all of the marketing messages you receive every month.

To an email marketer, these people represent a big chunk of their database. Email is without a doubt one of the most effective digital marketing channels, even though as few as 5% of the people who engage are active buyers. It’s also a relatively straightforward way to predict return on advertising spend, based on historical open and conversion rates.

The connection between CRM and DMP enables the marketer to reach the 95% of their database everywhere else on the web, by connecting that (anonymized) email ID to the larger digital ecosystem: places like Facebook, Google, Twitter, advertising exchanges, and even premium publishers.

Understanding where the non-engaged email users are spending their time on the web, what they like, their behavior, income and buying habits is all now possible. The marketer has the “known” view of this customer from their CRM, but can also utilise vast sets of data to enrich their profile, and better engage them across the web.
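Mechanically, that connection usually starts with hashing. The sketch below shows the assumed flow in miniature; the record layout and field names are illustrative, not a specific vendor’s API.

```python
# Minimal sketch of the assumed onboarding step: hash the email addresses of
# non-openers so the anonymized IDs can be matched to web and social identities
# for targeting outside the inbox.

import hashlib

crm_records = [
    {"email": "anna@example.com",  "opened_last_90d": False},
    {"email": "bilal@example.com", "opened_last_90d": True},
]

def hashed_id(email):
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

non_openers = [hashed_id(r["email"]) for r in crm_records if not r["opened_last_90d"]]
print(non_openers)  # this anonymized list is what gets synced for matching
```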

Combining commerce and service data for journeys and sequencing

When we think of the customer journey, it gets complicated quickly. A typical ad campaign may feature thousands of websites, multiple creatives, different channels, a variety of different ad sizes and placements, delivery at different times of day and more.

When you map these variables against a few dozen audience segments, the combinatorial values get into numbers with a lot of zeros on the end. In other words, the typical campaign may have hundreds of millions of activities — and tens of millions of different ways a customer goes from an initial brand exposure all the way through to a purchase and then to becoming a brand advocate.

How can you automatically discover the top 10 performing journeys?

Understanding which channels go together, and which sequences work best, can add up to tremendous lift for marketers.

For example, a media and entertainment company promoting a new show recently discovered that doing display advertising all week and then targeting the same people with a mobile “watch it tonight” message on the night it aired produced a 20% lift in tune-in compared to display alone. Channel mix and sequencing work.
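One simple way to surface those winning journeys, sketched here with invented data, is to group users by the sequence of channels they saw and rank the sequences by conversion rate.

```python
# Illustrative sketch: rank observed channel sequences by conversion rate to
# surface the best-performing journeys. The journey data is invented.

from collections import defaultdict

journeys = [                                   # (channel sequence, converted?)
    (("display", "display", "mobile"), True),
    (("display", "display", "mobile"), True),
    (("display",), False),
    (("display", "email"), True),
    (("display",), True),
]

stats = defaultdict(lambda: [0, 0])            # path -> [conversions, exposures]
for path, converted in journeys:
    stats[path][1] += 1
    stats[path][0] += int(converted)

ranked = sorted(stats.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for path, (conv, total) in ranked[:10]:
    print(" > ".join(path), f"{conv / total:.0%} conversion")
```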

And that’s just the tip of the iceberg — we are only talking about web data.

What if you could look at a customer journey and find out that the call-to-action message resonated 20% higher one week after a purchase?

A pizza chain that tracks orders in its CRM system can start to understand the cadence of delivery (e.g. Thursday night is “pizza night” for the Johnson family) and map its display efforts to the right delivery frequency, ensuring the Johnsons receive targeted ads during the week, and a mobile coupon offer on Thursday afternoon, when it’s time to order.

How about a customer that has called and complained about a missed delivery, or a bad product experience? It’s probably a terrible idea to try and deliver a new product message when they have an outstanding customer ticket open. Those people can be suppressed from active campaigns, freeing up funds for attracting net new customers.

There are a lot of obvious use cases that come to mind when CRM data and web behavioral data are aligned at the people level. It’s simple stuff, but it works.

As marketers, we find ourselves seeking more and more precise targeting but, half the time, knowing when not to send a message is the more effective action.

As we start to see more seamless connections between CRM (existing customers) and DMPs (potential new customers), we imagine a world in which artificial intelligence can manage the cadence and sequence of messages based on all of the data — not just a subset of cookies, or email open rate.

As the organizational and technological barriers between CRM and DMP break down, we are seeing the next phase of what Gartner says is the “marketing hub” of interconnected systems or “stacks” where all of the different signals from current and potential customers come together to provide that 360-degree customer view.

It’s a great time to be a data-driven marketer!

Chris O’Hara is the head of global marketing for Krux, the Salesforce data management platform.

(Coverage) Salesforce Bolsters Einstein AI With Heavy-Duty Data Management

Through its acquisition of Krux, Salesforce is combining its artificial intelligence (AI) layer with deeper data management in Salesforce Marketing Cloud.

Customer Relationship Management and Data Management come together in a delicious way.

Today at its Salesforce World Tour stop in New York, the company began to roll back the curtain on how its AI and data layers will work together. Salesforce announced new AI, audience segmentation, and targeting features for Marketing Cloud based on its recent acquisition of data management platform Krux. The company’s new Marketing Cloud features, available today, add more data-driven advertising tools and an Einstein Journey Insights dashboard for monitoring end-to-end customer engagement in everything from e-commerce to email marketing.

Salesforce unveiled its Einstein AI platform this year, baking predictive algorithms, machine and deep learning, as well as other data analysis features throughout its Software-as-a-Service (SaaS) cloud. Einstein is essentially an AI layer between the data infrastructure underneath and the Salesforce apps and services on top. The CRM giant is no stranger to big money acquisitions, most recently scooping up Demandware for $2.8 billion and making a play for LinkedIn before Microsoft acquired it. The Krux acquisition gives Salesforce a new, data-driven customer engagement vector.

“We’re working to apply AI to all our applications,” said Eric Stahl, Senior Vice President of Marketing Cloud. “In Marketing Cloud, Krux now gives us the ability to do things like predictive journeys to help the marketer figure out which products to recommend. We can do complex segmentation, inject audiences into various ad networks, and do large-scale advertising informed by Sales Cloud and Service Cloud data.”

As Salesforce and Krux representatives demonstrated Krux and how it fits into the Marketing Cloud, the data management platform acted more like a business intelligence (BI) or data visualization tool than a CRM or marketing platform. Chris O’Hara, head of Global Data Strategy at Krux, talked about the massive quantities of data the platform manages, including an on-demand analytics environment of 20 petabytes (PB)—the entire Internet Archive is only 15 PB.

“This is our idea of democratizing data for business users who don’t have a PhD in data science,” said O’Hara. “You can use Krux machine-learned segments to find out something you don’t know about your audience, or do a pattern analysis [screenshot above] to understand the attributes of those users that correlate greatly. We’re hoping to use those kinds of signals to power Einstein and do things like user scoring and propensity modeling.”

The Einstein Journey Insights feature is designed to analyze “hundreds of millions of data points” to identify an optimal customer conversion path. In addition to its Krux-powered Marketing Cloud features, Salesforce also announced a new conversational messaging service called LiveMessage this week for its Salesforce Service Cloud. LiveMessage integrates SMS text and Facebook Messenger with the Service Cloud console for interactions between customers and a company’s helpdesk bots.

The more intriguing implications here are what Salesforce might do with massively scaled data infrastructure like Krux beyond the initial integration. According to O’Hara, in addition to its analytics environment, Krux also processes more than 5 billion monthly CRM records and 4.5 million data capture events every minute, and maintains a native device graph of more than 3.5 billion active devices and browsers per month. Without getting into specifics, Salesforce’s Stahl said there will be far more cross-over between Krux data management and Einstein AI to come. In the data plus AI equation, the potential here is exponential scale.

 

CMOs and CIOs need to be more aligned

A survey of both senior marketing and IT professionals has revealed that there are significant differences between these two core business functions in their perception of organizational priorities and the quality of digital infrastructure. Governance frameworks to ensure better alignment between the CMO and CIO are often lacking.

The Backbone of Digital report, freely available from ClickZ (registration required), has also found that, compared to their colleagues in marketing,  IT professionals have a much rosier view of the customer experience their companies are delivering across digital channels.

Below I have outlined more detail around three key findings from the research which is sponsored by communications infrastructure services company Zayo.

IT pros have an exaggerated view of the quality of their companies’ current infrastructure

According to the research, 88% of IT respondents describe their company’s infrastructure as ‘cutting-edge’ or ‘good’, compared to only 61% of marketing-focused respondents, a massive difference of 27 percentage points.

The research also looks at the ability of tech infrastructure to deliver across a range of marketing communications channels, with IT respondents and marketers both asked to rate performance.

Both marketers and IT professionals felt that the best engagement and experience is delivered on desktop, cited as ‘excellent’ or ‘good’ by 71% and 93% of these groups respectively, with other channels – including mobile website, mobile app, desktop display, mobile display, social and push messaging – trailing behind.


Across the board, it is evident that those working in IT have a much more optimistic view of how well they are delivering across the full gamut of digital channels compared to their marketing counterparts.

It seems likely that those working in more customer-facing departments, i.e. marketers (generally), are much more likely to be aware of deficiencies impacting customer experience which can adversely affect business performance and brand reputation (and often their own bonuses).

A lack of co-operation is undermining excellence in digital delivery

Just 19% of marketers strongly agreed with the statement “marketing and IT work closely together to ensure the best possible delivery of product/service”, and only 11% strongly agreed that they “have a clear governance framework to ensure that CIOs/CTOs and CMOs work together effectively”, suggesting a lack of alignment around marketing and IT business objectives.

This compares to 45% of IT professionals who strongly agreed that “marketing and IT work closely together to ensure the best possible network performance”, and a similar percentage (46%) who strongly agreed that they “have a clear governance framework to ensure that front-end business applications and back-end infrastructure work together effectively”.

While there are differing perceptions about the extent of marketing and IT co-operation, the report concludes that business objectives need to be much better aligned to ensure closer harmony across these core business functions. If a framework to facilitate this is not put in place at the top of the organization, it becomes exponentially more difficult to implement lower down.

Speed of data-processing is crucial – real-time means real-time

Marketers are increasingly aware that the proliferation of data sources at their disposal is only of use to their businesses if they can analyse that information at high speed and transform it into the kind of intelligence that can then manifest itself as the most relevant and personalized messaging or call to action for any given site visitor.

According to Mike Plimsoll, Product and Industry Marketing Director at Adobe:

“A couple of years ago the marketing leaders at our biggest clients typically expected that data could be processed within 24 hours and that was fine.

“Now when we talk to our clients the expectation is that data is processed instantly so that when, for example, a customer engages with them on the website, the offer has been instantly updated based on something they’ve just done on another channel. All of a sudden ‘real-time’ really does mean ‘real-time’.”

The ability to harness ‘big data’ has become a pressing concern for IT departments as their colleagues in marketing departments seek to ensure they can take advantage of both structured and unstructured data and ensure the requisite speeds for real-time optimization of targeting, messaging and pricing.

More than half of IT respondents (56%) said that the ability to manage and optimize for big data was currently a ‘very relevant’ topic for their organization, in addition to 37% who said it was ‘quite relevant’.


According to Chris O’Hara, Head of Global Data Strategy at Krux Digital:

“Today, consumers that are used to perfect product recommendations from Amazon and movie recommendations from Netflix expect their online experiences to be personal, email messages to be relevant, and web experiences customized.

“Delivering good customer experience has the dual effect of increasing sales lift, and also reducing churn by keeping customers happy. Things like latency, performance, and data management are all part and parcel of delivering on that concept.”

Please download our Backbone of Digital research which, as well as a survey of marketing and IT professionals, is also based on in-depth interviews with senior executives at a number of well-known organizations.

Dynamic Real Time Segmentation

The term “real time” is bandied about in the ad technology space almost as heavily as the word “programmatic.”

Years later, the meaning of programmatic is finally starting to be realized, but we are still a few years away from delivering truly real-time experiences. Let me explain.

Real-Time Programmatic

The real-time delivery of targeted ads basically comes down to user matching. Here is a common use case: A consumer visits an auto site, browses a particular type of minivan, leaves the site and automatically sees an ad on the very next site he or she visits. That’s about as “real-time” as it gets.

How did that happen? The site updated the user segment to include “minivan intender,” processed the segment immediately and sent that data into a demand-side platform (DSP) where the marketer’s ID was matched with the DSP’s ID and delivered with instructions to bid on that user. That is a dramatic oversimplification of the process but clearly many things must happen very quickly – within milliseconds – and perfectly for this scenario to occur.
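For illustration, here is a stripped-down sketch of that handoff with hypothetical IDs; it shows only the ID translation and segment update, not real bidding logic.

```python
# Simplified sketch of the handoff described above: the DMP adds the user to a
# segment, translates its own ID to the DSP's ID through the cookie-sync match
# table, and pushes targeting instructions. Names and IDs are hypothetical.

match_table = {"dmp-u-42": "dsp-u-981"}        # built up by prior cookie syncs
dsp_segments = {"minivan_intender": set()}     # the DSP's targeting lists

def push_segment(dmp_user_id, segment, max_cpm):
    dsp_id = match_table.get(dmp_user_id)
    if dsp_id is None:
        return None                            # unmatched user: cannot be targeted
    dsp_segments[segment].add(dsp_id)
    return {"dsp_user": dsp_id, "segment": segment, "bid_up_to_cpm": max_cpm}

print(push_segment("dmp-u-42", "minivan_intender", max_cpm=4.50))
```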

Rocket Fuel, Turn and other big combo platforms have an advantage here because they don’t need to match users across an integrated data-management platform (DMP) and DSP. As long as marketers put their tags on their pages and stay within the confines of a single execution system, this type of retargeting gets close to real time.

However, as soon as the marketer wants to target that user through another DSP or in another channel, user matching comes back into play. That means pushing the “minivan intender” ID into a separate system, and the “real-time” nature of marketing starts to break down. That’s a big problem because today’s users move quickly between channels and devices and are not constrained by the desktop-dominated world of 10 years ago.

User matching has its own set of challenges, from a marketer’s ability to match users across their devices to how platforms like DMPs match their unique IDs to those of execution platforms like DSPs. Assuming the marketer has mapped the user to all of his or her device IDs, which is a daunting challenge, the marketer’s DMP has to match that user as quickly as possible to the execution platform where the ads are going to be targeted and run.

Let’s think about how that works for a second. Let’s say the marketer has DMP architecture in the header of the website, which enables a mom to be placed in the “minivan” segment as soon as the page loads. After processing the segment, it must be immediately sent to the DSP. Now the DSP has to add that user (or bunch of users) to their “minivan moms” segment. If you picture the internet ID space as a big spreadsheet, what is happening is that all the new minivan moms are added to the DSP’s big existing table of minivan moms so they are part of the new targeting list.

Some DSPs, such as The Trade Desk, TubeMogul and Google’s DBM, do this within hours or minutes. Others manage this updating process nightly by opening up a “window” where they accept new data and process it in “batches.” Doesn’t sound very “real-time” at all, does it?

While many DMPs can push segments in real time, the practical issue remains the ability of all the addressable channels a marketer wants to target to “catch” that data and make it available. The good news is that the speed at which execution channels are starting to process data is increasing every day as older ad stacks are re-engineered with real-time back-end infrastructure. The bad news is that until that happens, things like global delivery management and message sequencing across channels will remain overly dependent upon how marketers choose to provision their “stacks.”

The Future Is Dynamic

Despite the challenges in the real-life execution of real-time marketing, there are things happening that will put the simple notion of retargeting to shame. Everything we just discussed depends on a user being part of a segment. I probably exist as a “suburban middle-aged male sports lover with three kids” in a variety of different systems. Sometimes I’m an auto intender and sometimes I’m a unicorn lover, depending on who is using the family desktop, but my identity largely remains static. I’m going to be middle aged for a long time, and I’m always going to be a dad.

But marketers care about a lot more than that. The beer company wants to understand why sometimes I buy an ice-cold case of light beer (I’m about to watch a football game, and I might drink three or four of them with friends) and why other times I buy a six-pack of their craft-style ale (I’m going to have one or two at the family dinner table).

The soda company is competing for my “share of thirst” with everything from coffee to the water fountain. They want to know what my entry points are for a particular brand they sell. Is it their sports drink because I’m heading to the basketball court on a hot day, or is it a diet cola because I’m at the baseball game? The coffee chain wants to know whether I want a large hot coffee (before work) or an iced latte macchiato (my afternoon break).

This brings up the idea of dynamic segmentation: Although I am always part of a static segment, the world changes around me in real time. The weather changes, my location changes, the time changes and the people around me change constantly. What if all of that dynamic data could be constantly processed in the background and appended to static segments at the moment of truth?

In a perfect world, where the machines all talked to each other in real time and spoke the same language, this might be called real-time dynamic segmentation.
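A toy sketch of what that might look like in code, with invented rules and signals: the static segment stays put, and the real-time context is appended at the moment of decision.

```python
# Toy sketch of dynamic segmentation: append real-time context (weather,
# location, time) to a static segment to choose the offer. Rules are invented.

from datetime import datetime

static_profile = {"segment": "suburban_dad_sports_fan"}

def dynamic_segment(profile, temp_f, place, now):
    context = {
        "hot_day": temp_f > 85,
        "at_stadium": place == "stadium",
        "evening": now.hour >= 17,
    }
    offer = "light_beer_case" if context["at_stadium"] and context["evening"] else "craft_ale_sixpack"
    return {**profile, **context, "offer": offer}

print(dynamic_segment(static_profile, temp_f=88, place="stadium", now=datetime(2017, 9, 10, 18, 0)))
```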

This is the future of “programmatic,” whatever that means.

[This originally appeared in AdExchanger on 8/31/2016]

Data Science is the New Measurement

It’s a hoary old chestnut, but “understanding the customer journey” in a world of fragmented consumer attention and multiple devices is not just an AdExchanger meme. Attribution is a big problem, and one that marketers pay dearly for. Getting away from last-touch models is hard to begin with. Add in the fact that many of the largest marketers have no actual relationship with the customer (such as CPG, where the customer is actually a wholesaler or retailer), and it gets even harder. Big companies are selling big-money solutions to marketers for multi-touch attribution (MTA) and media-mix modeling (MMM), but some marketers feel light years away from a true understanding of what actually moves the sales needle.

As marketers are taking more direct ownership of their own customer relationships via data management platforms, “consumer data platforms” and the like, they are starting to obtain the missing pieces of the measurement puzzle: highly granular, user-level data. Marketers are now starting to pull in not just media exposure data but also offline data such as beacon pings, point-of-sale data (where they can get it), modeled purchase data from vendors like Datalogix and IRI, weather data and more to build a true picture. When that data can be associated with a person through a cross-device graph, it’s like going from a blunt 8-pack of Crayolas to a full set of Faber-Castells.

Piercing the Retail Veil

Think about the company that makes single-serve coffee machines. Some make their money on the coffee they sell, rather than the machine – but they have absolutely no idea what their consumers like to drink. Again, they sell coffee but don’t really have a complete picture of who buys it or why. Same problem for the beer or soda company, where the sale (and customer data relationship) resides with the retailer. The default is to go to panel-based solutions that sample a tiny percentage of consumers for insights, or to wait for complicated and expensive media-mix models to reveal what drove sales lift. But what if a company could partner with a retailer and a beacon company to understand how in-store visitation – even an offline visit to a store shelf – compared with online media exposure? The marketer could use geofencing to understand where else consumers shopped, offer a mobile coupon so the user could authenticate upon redemption, get access to POS data from the retailer to confirm purchase and understand basket contents – and ultimately tie that data back to media exposure. That sounds a lot like closed-loop attribution to me.

Overcoming Walled Gardens

Why do specialty health sites charge so much for media? Like any other walled garden, they are taking advantage of a unique set of data – and their own data science capabilities – to better understand user intent. (There’s nothing wrong with that, by the way.) If I’m a maker of allergy medicine, the most common trigger for purchase is probably the onset of an allergy attack, but how am I supposed to know when someone is about to sneeze? It’s an incredibly tough problem, but one that the large health site can solve, largely thanks to people who have searched for “hay fever” online. Combine that with a seven-day weather forecast, pollen indices and past search intent behavior, and you have a pretty good model for finding allergy sufferers. However, almost all of that data – plus past purchase data – can be ingested and modeled inside a marketer’s DMP, enabling the allergy medicine manufacturer to segment those users in a similar way – and then use an overlap analysis to find them on sites with $5 CPMs, rather than $20. That’s the power of user modeling. Why don’t sites like Facebook give marketers user-level media exposure data? The question answers itself.
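As a rough illustration of the kind of model described above – not anyone’s production algorithm – here is a toy Python scorer that blends declared search intent with a pollen index and the weather forecast. The weights and the segmentation threshold are made up.

```python
# An illustrative scoring model for finding likely allergy sufferers by
# combining search intent, a pollen index and the weather forecast, in the
# spirit described above. Weights and thresholds are invented for the sketch.

def allergy_score(searched_hay_fever: bool, pollen_index: float,
                  dry_windy_days_next_7: int) -> float:
    """Blend declared intent and environmental signals into a 0-1 score."""
    score = 0.0
    if searched_hay_fever:          # strongest signal: declared search intent
        score += 0.5
    score += 0.3 * min(pollen_index / 10.0, 1.0)   # pollen index on a 0-10 scale
    score += 0.2 * min(dry_windy_days_next_7 / 7.0, 1.0)
    return round(score, 2)

# Segment anyone above an (assumed) 0.6 threshold, then run an overlap
# analysis against cheaper inventory to find the same users off-site.
print(allergy_score(True, 8.5, 4))   # 0.87 -> in segment
print(allergy_score(False, 3.0, 1))  # 0.12 -> out
```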

Understanding the Full Journey

Building journeys always falls down due to one missing piece of the puzzle or another. Panel-based models continually overemphasize the power of print and linear television. CRM-based models always look at the journey from the email perspective and value declared user data above all else. Digital journeys can get pretty granular with media exposure data, but miss big pieces of data from social networks, website interactions and things that are hard to measure (like location data from beacon exposure). What we are starting to see today is that, by ingesting highly differentiated signals, marketers can combine granular attribute data to complete the picture. Think about the data a marketer can ingest: all addressable media exposure (ad logs), all mobile app data (SDKs), location data (beacon or third party), modeled sales data (IRI or DLX), actual sales data (POS systems), website visitation data (JavaScript on the site), media performance data (through click and impression trackers), real people data through a CRM (that’s been hashed and anonymized), survey data that has been mapped to a user (pixel-enabled online surveys) and even addressable TV exposure (think Comscore’s Rentrak data set). Wow.
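A toy sketch of what that looks like in practice: every signal type gets keyed to a universal consumer ID so it can be queried as a single profile. The field names and sources below are placeholders, not a real DMP schema.

```python
# A toy illustration of stitching the signal types listed above onto one
# universal consumer ID. Field names and sources are placeholders.

from collections import defaultdict

EVENTS = [
    {"uid": "u1", "source": "ad_logs", "data": {"campaign": "fall_promo", "impressions": 3}},
    {"uid": "u1", "source": "beacon",  "data": {"store": "midtown", "aisle": "cereal"}},
    {"uid": "u1", "source": "pos",     "data": {"sku": "corn_flakes", "amount": 4.99}},
    {"uid": "u2", "source": "crm",     "data": {"email_hash": "ab12cd", "loyalty_tier": "gold"}},
]

def build_profiles(events):
    """Group every ingested signal under its universal consumer ID."""
    profiles = defaultdict(lambda: defaultdict(list))
    for event in events:
        profiles[event["uid"]][event["source"]].append(event["data"])
    return profiles

profiles = build_profiles(EVENTS)
# With exposure, in-store and purchase signals on one ID, attribution questions
# ("did the ad exposure precede the store visit and the sale?") become queries.
print(dict(profiles["u1"]))
```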

Why is “data science the new measurement”? Because when a marketer has all of that data at their fingertips, something close to true attribution becomes possible. Now that marketers have the right tools to draw with, the winners are going to be the ones with the most artists (data scientists).

It’s a really interesting space to watch. More and more data is becoming available to marketers, who are increasingly owning the data and technology to manage it, and the models are growing more powerful and accurate with every byte of data that enters their systems.

It’s a great time to be a data-driven marketer!

[This post originally appeared in AdExchanger on 8/12/16]

Banning the Banner Ad

 

As a longtime digital practitioner, I sometimes feel ashamed that I haven’t clicked on many banner ads in the last 10 years or so. It’s not that I don’t like banner ads. I recognize that advertising is the thing that supports all of the great content I read. I don’t even mind lots of ads in my paid, expensive, print and online versions of The Wall Street Journal – I sometimes even read them.

But standard banners rarely get any consideration or clicks from me, unless they are incredibly relevant. Standard banner ads aren’t particularly engaging – and the marketers buying them are getting frustrated with an ecosystem rife with fraud, technology taxes and nonhuman traffic.

Some of the world’s top marketers are actively working on “ban-the-banner” initiatives, driven by the theory that nothing but engagement matters – a KPI more easily correlated to watching an entire video than the much-maligned click. They believe great brands should tell great stories, so it seems obvious that the scant real estate and functionality offered by banner slots makes creating consumer engagement difficult, if not impossible.

At the intersection of an amazing technology-driven programmatic buying landscape and the increasingly creative social-led atmosphere of the new web is online video. It kind of snuck up on us, steadily creeping into our social feeds, blogs and favorite website destinations. That’s a very good thing: The reason linear television continues to command the lion’s share of media dollars is because people like to be entertained, and watching something is so much easier than reading something. Watching is a passive experience, but an emotional one.

Video is a place where brands can tell amazing stories, make a great pitch and drive consumer engagement. After years of perfecting 15-second, 30-second and one-minute spots, media agencies are eager to leverage their linear creative in new formats to reach audiences that seem to be abandoning traditional television in droves. This, coupled with a few other factors, is causing advertisers to rapidly move away from animated banners to video.

Millennials Don’t Like Television

Perhaps the most pressing dynamic forcing more video adoption among marketers is that millennials – who comprise an estimated 80 million-plus US consumers and will spend $200 billion next year – don’t really watch television anymore.

This is both from a delivery and physical dynamic; they do not watch video on television sets as much as they consume it on tablets, phones and other devices, and they also prefer on-demand viewing to scheduled programming. It makes sense. Even a 13” laptop with a retina display beats a 70” HDTV when held several inches in front of one’s face.

And streaming services have matured, giving consumers a legitimate option to “unplug” from traditional cable services and consume all the content they want on demand. Marketers must adapt to a reality that makes mobile the top priority for younger consumers, and adjust to the fact that many of the places millennials consume their content are relatively ad-free zones – at least in terms of traditional advertising units. It just so happens that video advertising fits into this new world nicely.

Time-Spent: The New Currency

If video ad delivery is going to be the mainstream unit, then it also follows that things like impressions and clicks are quickly becoming irrelevant. For video, the coin of the realm is time spent, which is actually a pretty strong sign of engagement and a valuable proxy for brand sentiment.

While it may be true that we are forced to consume some pre-roll before engaging with organic content, the best part of online video is that consumers waiting for content on their iPhones are much less likely to take a trip to the kitchen for a snack, as they do when standard commercials come on the tube. Instead of a solid three-minute block of commercials, they only have to engage with a single ad. Also, that ad can be tailored to individual preferences. That means even more engagement, less ad abandonment and a lot of measurability.

Data management platforms are helping marketers segment audiences that are prone to engage through an entire length of video and understand the types of content that produce longer viewing times and true engagement – and modeling those audiences to find lookalikes.

Linear And Online Video Must Connect For True Attribution

Probably the greatest thing about online video is the hope of leveraging data to connect online audiences with linear ones, and getting a better sense of media mix modeling and multitouch attribution.

Comcast certainly gets it. Connecting set-top box data with online ad serving means being able to touch a consumer with video across multiple screens – and bring real measurement of audiences that are increasingly device agnostic. Large telecoms, such as Verizon, are acquiring companies that provide the “last mile” of value from their broadband pipes, and that mile is as much about online video ad delivery as it is about website content.

The battle to do this correctly will be won at the “people level,” which is why we are seeing such a pitched battle of cross-device graphs; unless marketers can connect people with all of their devices, true attribution is simply impossible, rather than just being hard.

It’s an interesting time for modern marketers and publishers as they try and grow out of what we will see as the very early days of addressable advertising, and into a world dominated by on-demand content across a multitude of screens. The common denominator is video advertising, and I’m going long on the companies in the ecosystem that are going to power this new reality.

[This article originally appeared in AdExchanger on 6.30.16]

New Whitepaper: Agencies and DMP!

We’ve just published our latest best practice guide, entitled ‘The Role of the Agency in Data Management.’

The report looks at the challenges and opportunities for agencies that want to become trusted stewards of their clients’ data.

I sat down with the author, Chris O’Hara, to find out more.

Q. It seems like the industry press is continually heralding the decline of media agencies, but they seem to be very much alive. What’s your take on the current landscape?

For a very long time, agencies have been dependent upon using low-cost labor for media planning and other low-value operational tasks.

While there are many highly-skilled digital media practitioners – strategists and the like – agencies still work against “cost-plus” models that don’t necessarily map to the new realities in omnichannel marketing.

Over the last several years as marketers have come to license technology – data management platforms (DMP) in particular – agencies have lost some ground to the managed services arms of ad tech companies, systems integrators, and management consultancies.

Q. How do agencies compete?

Agencies aren’t giving up the fight to win more technical and strategic work.

Over the last several years, we have seen many smaller, data-led agencies pop up to support challenging work – and we have also seen holding companies up-level staff and build practice groups to accommodate marketers that are licensing DMP technology and starting to take programmatic buying “in-house.”

It’s a trend that is only accelerating as more and more marketer clients are hiring Chief Data Officers and fusing the media, analytics, and IT departments into “centers of excellence” and the like.

Not only are agencies starting to build consultative practices, but it looks like traditional consultancies are starting to build out agency-like services as well.

Not long ago, you wouldn’t have associated names like Accenture, McKinsey, Infinitive, and Boston Consulting Group with digital media, but they are now working closely with a lot of Fortune 500 marketers to do things like DMP and DSP (demand-side platform) evaluations, programmatic strategy, and even creative work.

We are also seeing CRM-type agencies like Merkle and Epsilon acquire technologies and partner with big cloud companies as they start to work with more of a marketer’s first-party data.

As services businesses, they would love to take share away from traditional agencies.

Q. Who is winning?

I think it’s early days in the battle for supremacy in data-driven marketing, but I think agencies that are nimble and willing to take some risk upfront are well positioned to be successful.

They are the closest to the media budgets of marketers, and those with transparent business models are strongly trusted partners when it comes to bringing new products to market.

Also, as creative starts to touch data more, this gives them a huge advantage.

You can be as efficient as possible in terms of reaching audiences through technology, but at the end of the day, creative is what drives brand building and ultimately sales.

Q. Why should agencies embrace DMPs? What is in it for them? It seems like yet another platform to operate, and agencies are already managing DSPs, search, direct buys, and things like creative optimization platforms.

Ultimately, agencies must align with the marketer’s strategy, and DMPs are starting to become the single source of “people data” that touch all sorts of execution channels, from email to social.

That being said, DMP implementations can be really tough if an agency isn’t scoped (or paid) to do the additional work that the DMP requires.

Think about it: A marketer licenses a DMP and plops a pretty complicated piece of software on an agency team’s desk and says, “get started!”

That can be a recipe for disaster. Agencies need to be involved in scoping the personnel and work they will be required to do to support new technologies, and marketers are better off involving agencies early on in the process.

Q. So, what do agencies do with DMP technology? How can they succeed?

As you’ll read in the new guide, there are a variety of amazing use cases that come out of the box that agencies can use to immediately make an impact.

Because the DMP can control for the delivery of messages against specific people across all channels, a really low-hanging fruit is frequency management.

Doing it well can eliminate anywhere from 10% to 40% of wasteful spending on media that reaches consumers too many times.

Doing analytics around customer journeys is another use case – and one that attribution companies get paid handsomely for.

With this newly discovered data at their fingertips, agencies can start proving value quickly, and build entire practice groups around media efficiency, analytics, data science – even leverage DMP tech to build specialized trading desks. There’s a lot to take advantage of.

Q. You interviewed a lot of senior people in the agency and marketer space. Are they optimistic about the future?

Definitely. It’s sort of a biased sample, since I interviewed a lot of practitioners that do data management on a daily basis.

But I think ultimately everyone sees the need to get a lot better at digital marketing and views technology as the way out of what I consider to be the early and dark ages of addressable marketing.

The pace of change is very rapid, and I think we are seeing that people who really lean into the big problems of the moment like cross-device identity, location-based attribution, and advanced analytics are future-proofing themselves.

CX: The CFO’s Best Friend!


When CFOs embrace data and use it to drive customer experience, good times ensue.

 

Although it’s starting to become a well-worn aphorism, “data is the new oil” resonates more than ever. Like oil, data is an abundant resource, but it doesn’t become useful until it is refined for use and turned into fuel.

Without the proper refinement, big data may be worthless. The stock of big data unicorn Palantir, for example, sank on news that it lost key client relationships due to a lack of perceived value. The company collected abundant data from CPG companies but was unable to apply it to practical use cases, according to a recent article.

Marketers are starting to turn away from using abundant, yet commoditized, third-party data sources in exchanges and move toward creating peer-to-peer data relationships and leveraging second-party data for targeting. This speaks to the refinement of targeting data: Better quality in the raw materials always yields more potent fuel for performance. Not all data is the same, and not every technology platform can spin data straw into gold.

Marketers have been using available data for addressable marketing for years, but now are starting to mine their own data and get value from the information they collect from registrations, mobile applications, media performance and site visitation. Data management platforms (DMPs) are helping them collect, refine, normalize and associate their disparate first-party data with actual people for targeting.

This is a beautiful thing. Technology is enabling marketers to mine their own data and own it. Yet many marketers are still just scraping the surface of what they can do, and using data primarily for the targeting of addressable media.

Some, however, are starting to deliver customer experiences that go beyond targeting display advertising by using data to shape the way consumers interact with their brands beyond media.

The case for personalization – customer experience management, or CX – is palpable. When the Watermark Group studied [PDF] the cumulative stock performance of Forrester Research-rated “leaders” or “laggards” in customer experience, the results were staggering. During a period in which the S&P 500 grew by 72%, those focused on personalized experiences outperformed the market by 35%, and the laggards underperformed by 45% on average. That’s a delta of nearly 80% in stock price performance between the winners and losers.

Moreover, 89% of customers who have an unsatisfactory experience will leave a brand, according to a recent study; the cost of reacquiring a churned customer can run up to seven times the amount it took to win a new customer.

The stakes could not be higher for marketers and publishers looking to drive bottom-line performance. For many companies, whether they are marketing print or online subscriptions, promoting their content or selling products off the shelf, it’s hard to justify to their CFOs the heavy costs of licensing platforms to gather the right data and use it to drive relevant customer experiences. Yet, looking at big-company priorities across multiple surveys, the desire to “create more relevant customer experiences” is right up there with “earn more revenue” and “increase profits.” Why?

The simple answer is that customer experience has an enormous impact on both revenue and profitability. Giving new customers the right experience provides a higher probability of winning them, and giving existing customers relevant experiences reduces churn – and creates opportunities to sell them more products, more often. When both top-line revenue and profitability can be driven through a single initiative, most CFOs start to invest and will continue to invest as results confirm the initial thesis.

Take the “heavy user” of a quick-service restaurant who dines several times a week and consistently spends above the average per visit. QSRs understand the impact these valued customers have on the bottom line. These users provide a strong baseline of predictable revenue, are usually the first to try new product offerings and respond to market-facing initiatives, such as discounting and couponing, which can strategically increase short-term receipts. Smart marketers should not be content to sit back and let this valuable segment stagnate or drift toward a competing restaurant’s new offerings. They must show these users that they are valued, ensure they retain or increase store visits and keep them away from the hamburger next door.

That can be as simple as offering a coupon for a regular’s favorite order. Or it can be as complex as developing a mobile application that enables the customer to order his food in advance and pick it up as soon as it’s ready.

Since the restaurant collects point-of-sale data and has authenticated user registration data from the mobile app, it can now personalize the customer’s order screen with his most popular orders to shorten the mobile ordering experience. Perhaps the app can offer special discounts to frequent diners for trying – and rating – new menu items. When on the road, the app can recommend other locations and direct him right to the drive-in window through popular map APIs. The possibilities are endless when you start to imagine how data can drive your next customer interaction.

Marketers and publishers are quickly embracing their first-party data and aligning it with powerful applications that drive customer experience, increase profits, reduce customer churn and boost lifetime value.

It’s a great time to be a data-driven marketer.

[This post originally appeared in AdExchanger on 5/23/16]

DMPs Go Way Beyond Segmentation

Any AdExchanger reader probably knows more about data management technology than the average Joe, but many probably associate data management platforms (DMPs) with creating audience segments for programmatic media.

While segmentation, audience analytics, lookalike modeling and attribution are currently the primary use cases for DMP tech, there is so much more that can be done with access to all that user data in one place. These platforms sitting at the center of a marketer’s operational stack can make an impact far beyond paid media.

As data platforms mature, both publishers and marketers are starting to think beyond devices and browsers, and putting people in the center of what they do. Increasingly, this means focusing on giving the people what they want. In some cases that means no ads at all, while in others it’s the option to value certain audiences over others and serve them an ad first or deliver the right content – not just ads – based on their preferences.

Beyond personalization, there are DMP plays to be made in the areas of ad blocking and header bidding.

Ad Blocking

DMPs see a lot of browsers and devices on a monthly basis and strive to aggregate those disparate identities into a single user or universal consumer ID. They are also intimately involved in the serving of ads by either ingesting ad logs, deploying pixels or having a server-to-server connection with popular ad servers. This is great for influencing the serving of online ads across channels, but maybe it can help with one of the web’s most perplexing problems: the nonserving of ads.

With reports of consumers using applications to block as many as 10% of ads, wouldn’t it be great to know exactly who is blocking those ads? For publishers, that might mean identifying those users and suppressing them from targeting lists so they can help marketers get a better understanding of how much reach they have in certain audience segments. Once the “blockers” are segmented, publishers can get a fine-grained understanding of their composition, giving them insights about what audiences are more receptive to having ad-free or paid content experiences.

A lot of these issues are being solved today with specialized scripts that either aren’t very well coded, leading to page latency, or are scripted in-house, adding to complexity. Scripts trigger the typical “see ads or pay” notifications, which publishers have seen become more effective over time. The DMP, already installed and residing in the header across the enterprise, can provision this small feature alongside the larger application.
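Here is a hedged sketch of that use case in Python. The detection signal – a requested-versus-rendered flag per ad slot – is an assumption for illustration, not how any particular vendor actually does it.

```python
# A sketch of the ad-blocking use case: flag users whose ad slots never
# render, place them in a "blocker" segment, and suppress that segment from
# paid targeting while routing them to an ad-free or paid-content experience.
# The detection signal (requested vs. rendered) is assumed, not a real API.

def classify_blockers(slot_logs):
    """Return the set of user IDs whose ad slots were requested but never rendered."""
    blockers = set()
    for log in slot_logs:
        if log["requested"] and not log["rendered"]:
            blockers.add(log["uid"])
    return blockers

def targetable(audience, blockers):
    """Suppress blockers so marketers see true addressable reach for a segment."""
    return [uid for uid in audience if uid not in blockers]

slot_logs = [
    {"uid": "u1", "requested": True, "rendered": False},   # likely running a blocker
    {"uid": "u2", "requested": True, "rendered": True},
]
blockers = classify_blockers(slot_logs)
print(targetable(["u1", "u2", "u3"], blockers))  # ['u2', 'u3']
```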

Header Bidding

Speaking of DMP architecture being in the header, I often wonder why publishers who have a DMP installed insist on deploying a different header-bidding solution to manage direct deals. Data management tech essentially operates by placing a control tag within the header of a publisher website, enabling a framework that gives direct and primary access to users entering the page. Through an integration with the ad server, the DMP can easily and quickly decide whether or not to deliver high-value “first looks” at inventory.

Today, the typical large publisher has a number of supply-side platforms (SSPs) set up to handle yield management, along with possibly several pieces of infrastructure to manage that critical programmatic direct sale. Publishers can reduce complexity and latency by simply using the pipes that have already been deployed for the very reason header bidding exists: understanding and managing the serving of premium ads to the right audiences.

Maybe publishers should be thinking about header bidding in a new way. Header-bidding tags are just another tag on the page. Those with tag management-enabled DMPs could have their existing architecture handle that – a salient point made recently by STAQ’s James Curran.

Curran also noted that the DMP has access, through ad log ingestion, to how much dough publishers get from every drop in the waterfall, including from private marketplace, programmatic direct header and the open exchanges. Many global publishers are looking at the DMP inside their stack as a hub that can see the pricing landscape at an audience level and power ad servers and SSPs with the type of intelligent decisioning that supercharges yield management.

Personalization

In ad technology, we talk a lot about the various partners enabling “paid, owned and earned” exposures to consumers, but we usually think of DMPs as essential only for the paid part.

But the composition of a web page, for example, is filled with dozens of little boxes, each capable of serving a display ad, video ad, social widget or content. Just as the DMP can influence the serving of ads into those little boxes, it can also influence the type of content that appears to each user. The big automaker might want to show a muscle car to that NASCAR Dad when he hits the page or a shiny new SUV to the Suburban Mom who shuttles the kids around all day.

Or, a marketer with a lot of its own content (“brands are publishers,” right?) may want to recommend its own articles or videos based on the browsing behavior of an anonymous user. The big global publisher may want to show a subscriber of one magazine a series of interesting articles from its other publications, possibly outperforming the CPA deals it has with third parties for subscription marketing.

This one-to-one personalization is possible because DMPs can capture not only the obvious cookie data but also the other 60% of user interactions and data, including mobile apps, mobile web, beacon data and even modeled propensity data from a marketer or publisher’s data warehouse.

Wouldn’t it be cool to serve an ad for a red car when the user has a statistically significant overlap with 10,000 others who have purchased red cars in the past year? That’s how to apply data science to drive real content personalization, rather than typical retargeting.
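A simplified version of that overlap logic might look like the following, where we compare how common a user’s attributes are among known red-car buyers versus the general population and only personalize when the lift clears an arbitrary threshold. The profiles and the 1.5 cutoff are invented for the sketch.

```python
# Compare the rate of a user's attributes among past buyers vs. the general
# population; lift well above 1.0 suggests the user resembles past buyers.

def attribute_lift(user_attrs, buyer_profiles, population_profiles):
    """Average lift of the user's attributes among buyers vs. the population."""
    lifts = []
    for attr in user_attrs:
        buyer_rate = sum(attr in p for p in buyer_profiles) / len(buyer_profiles)
        pop_rate = sum(attr in p for p in population_profiles) / len(population_profiles)
        if pop_rate > 0:
            lifts.append(buyer_rate / pop_rate)
    return sum(lifts) / len(lifts) if lifts else 0.0

buyers = [{"nascar", "fitness", "suv_owner"}, {"nascar", "gaming"}, {"fitness", "gaming"}]
population = [{"nascar"}, {"cooking"}, {"gaming"}, {"travel"}, {"fitness", "gaming"}]

user = {"nascar", "gaming"}
lift = attribute_lift(user, buyers, population)
print(f"lift={lift:.2f}")            # 2.50 in this toy example
serve_red_car_ad = lift > 1.5        # threshold is illustrative
```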

These are just some of the possibilities available when you start to think of the DMP not just as a central part of the ad technology “stack” but as the brains behind everything that can be done with audiences. This critical infrastructure is where audience data gets ingested in real time, deployed to the right channels at speed and turned into insights about people. Before long, the term “DMP” will likely be shorthand for just the simple audience targeting use case inside a much broader data-driven marketing hub.

It’s a great time to be a data-driven marketer.

Follow Chris O’Hara (@chrisohara) and AdExchanger (@adexchanger) on Twitter. 

Big Data (for Marketing) is Real!

We’ve been hearing about big data driving marketing for a long time and, to be honest, most of it is purely aspirational.

Using third-party data to target an ad in real time does deploy some back-end big-data architecture for sure. But the real promise of data-driven marketing has always been that computers, which can crunch more data than people and do it in real time, could find the golden needle of insight in the proverbial haystack of information.

This long-heralded capability is finally moving beyond the early adopters and starting to “cross the chasm” into early majority use among major global marketers and publishers. 

Leveraging Machine Learning For Segmentation 

Now that huge global marketers are embracing data management technology, they are finally able to start activating their carefully built offline audience personas in today’s multichannel world.

Big marketers were always good at segmentation. All kinds of consumer-facing companies already segment their customers along behavioral and psychographic dimensions. Big Beer Company knows how different a loyal, light-beer-drinking “fun lover” is from a trendsetting “craft lover” who likes new music and tries new foods frequently. The difference is that now they can find those people online, across all of their devices.

The magic of data management, however, is not just onboarding offline identities to the addressable media space. Think about how those segments were created. Basically, an army of consultants and marketers took loads of panel-based market data and gut instincts and divided their audience into a few dozen broad segments.

There’s nothing wrong with that. Marketers were working with the most, and best, data available. Those concepts around segmentation were taken to market, where loads of media dollars were applied to find those audiences. Performance data was collected and segments refined over time, based on the results.

In the linear world, those segments are applied to demographics, where loose approximations are made based on television and radio audiences. It’s crude, but the awesome reach power of broadcast media and friendly CPMs somewhat obviate the need for precision.

In digital, those segments find closer approximation with third-party data, similar to Nielsen Prizm segments and the like. These approximations are sharper, but in the online world, precision means more data expense and less reach, so the habit has been to translate offline segments into broader demographic buckets, such as “men who like sports.”

What if, instead of guessing which online attributes approximated the ideal audience and creating segments from a little bit of data and lot of gut instinct, marketers could look at all of the data at once to see what the important attributes were?

No human being can take the entirety of a website’s audience, which probably shares more than 100,000 granular data attributes, and decide what really matters. Does gender matter for the “Mom site”? Obviously. Having kids? Certainly. Those attributes are evident, and they’re probably shared widely across a great portion of the audience of Popular Mom Site.

But what really defines the special “momness” of the site that only an algorithm can see? Maybe there are key clusters of attributes among the most loyal readers that are the things really driving the engagement. Until you deploy a machine to analyze the entirety of the data and find out which specific attributes cluster together, you really can’t claim a full understanding of your audience.

It’s all about correlations. Of course, it’s pretty easy to find a correlation between only two distinct attributes, such as age and income. But think about having to do a multivariable correlation on hundreds of different attributes. Humans can’t do it. It takes a machine-learning algorithm to parse the data and find the unique clusters that form among a huge audience.

Welcome to machine-discovered segmentation.

Machines can quickly look across the entirety of a specific audience and figure out how many people share the same attributes. Any time folks cluster together around more than five or six specific data attributes, you arguably have struck gold.

Say I’m a carmaker that learned that some of my sedan buyers were men who love NASCAR. But I also discovered that those NASCAR dads loved fitness and gaming, and I found a cluster of single guys who just graduated college and work in finance. Now, instead of guessing who is buying my car, I can let an algorithm create segments from the top 20 clusters, and I can start finding people predisposed to buy right away.
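For the curious, here is a minimal sketch of machine-discovered segmentation using off-the-shelf k-means clustering on one-hot encoded attributes. Real DMPs work with orders of magnitude more data and more sophisticated models; the attribute names and the choice of two clusters are purely illustrative.

```python
# A minimal sketch of machine-discovered segmentation: one-hot encode each
# user's attributes and let a clustering algorithm surface attribute
# combinations that co-occur across the audience.

import numpy as np
from sklearn.cluster import KMeans

ATTRS = ["nascar", "fitness", "gaming", "recent_grad", "finance_job", "has_kids"]
users = [
    {"nascar", "fitness", "has_kids"},
    {"nascar", "fitness", "gaming", "has_kids"},
    {"recent_grad", "finance_job", "gaming"},
    {"recent_grad", "finance_job"},
    {"nascar", "has_kids"},
]

X = np.array([[1 if a in u else 0 for a in ATTRS] for u in users])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Inspect which attributes dominate each discovered cluster.
for cluster in set(labels):
    members = X[labels == cluster]
    top = [ATTRS[i] for i, rate in enumerate(members.mean(axis=0)) if rate >= 0.5]
    print(f"cluster {cluster}: {top}")
```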

This trend is just starting to happen in both publishing and marketing, and it has been made possible by the wider adoption of real big-data technologies, such as Hadoop, MapReduce and Spark.

This also opens up a larger conversation about data. If I can look at all of my data for segmentation, is there really anything off the table?

Using New Kinds Of Data To Drive Addressable Marketing 

That’s an interesting question. Take the company that’s manufacturing coffee machines for home use. Its loyal customer base buys a machine every five years or so and brews many pods every day.

The problem is that the manufacturer has no clue what the consumer is doing with the machine unless that machine is data-enabled. If a small chip enabled it to connect to the Internet and share data about what was brewed and when, the manufacturer would know everything their customers do with the machine.

Would it be helpful to know that a customer drank Folgers in the morning, Starbucks in the afternoon and Twinings Tea at night? I might want to send the family that brews 200 pods of coffee every month a brand-new machine after a few years for free and offer the lighter-category customers a discount on a new machine.

Moreover, now I can tell Folgers exactly who is brewing their coffee, who drinks how much and how often. I’m no longer blind to customers who buy pods at the supermarket – I actually have hugely valuable insights to share with manufacturers whose products create an ecosystem around my company. That’s possible with real big-data technology that collects and stores highly granular device data.
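A back-of-the-envelope illustration of that idea: aggregate brew events per machine per month and bucket households into offers. The event fields and the thresholds are assumptions made for the sketch.

```python
# Aggregate connected-machine telemetry into monthly pod counts, then map
# each household to an offer. Thresholds and offers are illustrative.

from collections import Counter

brew_events = [
    {"machine_id": "m1", "brand": "Folgers"},
    {"machine_id": "m1", "brand": "Starbucks"},
    {"machine_id": "m1", "brand": "Twinings"},
    {"machine_id": "m2", "brand": "Folgers"},
] * 70   # pretend this is a month of telemetry

monthly_pods = Counter(e["machine_id"] for e in brew_events)

def offer_for(pods_per_month: int) -> str:
    if pods_per_month >= 200:
        return "free replacement machine"      # protect the heaviest households
    if pods_per_month >= 50:
        return "loyalty coupon"
    return "discount on a new machine"         # nudge the light-category customer

for machine, pods in monthly_pods.items():
    print(machine, pods, "->", offer_for(pods))
```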

Marketers are embracing big-data technology, both for segmentation and to go beyond the cookie by using real-world data from the Internet of Things to build audiences.

It’s creating somewhat of a “cluster” for companies that are stuck in 2015.

How Political Campaigns are Putting DMPs to Work

We have all heard about the Democratic Party’s skill with data, and there is no doubt the Obama campaign’s masterful use of first-party registration data to drive online engagement, raise funds and influence political newbies helped put him over the line.

Four years later, the dynamics are mostly similar, but we have moved into a world where mobile is dominant, more young new voters are highly engaged and the standard segmentation – at least on the Republican side – might as well be thrown out the window.

In other words, everyone is getting influenced on their mobile phone, especially through news and social channels. There are a ton more mobile-first, new voters out there, and nobody is really sure which voters make up this weird new Trump segment.

To get a handle on this, political advertisers need to properly onboard and analyze their data to identify who they should target, where they live and what they like.

Understand Voter Identity

In politics, a strong “ground game” is key. That means real, old-school retail politics, such as knocking on doors and getting voters in specific precincts out on Election Day. All campaigns have the voter rolls and can do their fill of direct mail, robocalls and door knocking.

But how to influence voters well before Election Day who are tethered to their devices all day and night? It requires a digital strategy that can reach voters across the addressable channels they are on, including display, video, mobile and email. This strategy should leverage an identity graph to ensure the right messaging is hitting the same voter – at the right cadence.

Maybe “Joe the Firefighter,” a disaffected moderate Democrat who has had it with the Clintons, visited the Donald’s website and is ready to “Make America great again.” Before cross-device capabilities were strong, you could only retarget Joe the next time you saw his cookie online.

Today, Joe can get an equity message reinforced on display (“Make America great again!”), a mobile “nudge” to take action when we see Joe on his tablet at night (“Donate now!”) and follow up with an email a few days before the big rally (“Come see the Donald at the Civic Center!”).

Beyond this capability is the incredibly important task of laddering up individual identity into householding, so we can understand the composition of Joe’s family, since households often vote together and contain more than one registered voter.

Nail Geographic Targeting by County and District

Since “all politics is local,” it follows that all digital advertising should be locally targeted. This is table stakes for digital providers that work with campaigns, and targeting down to the ZIP+4 level has brought a level of precision to district-level outreach that approaches direct mail.

But direct mail (household targeting) is the crown jewel, and digital is still trying to cross that divide, held back by a fragmented ecosystem of identity and, more importantly, privacy considerations.

This has always been a key consideration, given the fact that a small percentage of key districts can flip the presidency to one party or another.

Affiliation Modeling Through Behavior

Sometimes getting an understanding of someone’s party affiliation is super obvious, such as “liking” a specific candidate on social media. But sometimes a user’s affinity has to be derived from attributes based on his or her behavior and the context of content consumed over time.

Data management platforms are bringing more precision to this type of modeling. Functionality, such as algorithmic segmentation, is helping digital analysts go beyond the basics. It’s fairly easy to correlate two or three attributes, such as income and gender, to estimate party affiliation. In this cycle, for example, we have seen a strong bias toward Trump from lower-income males with less than a college degree.

However, it’s hard for humans to correlate eight or more distinct attributes. Maybe those lower-education, low-income, rural males who love NASCAR actually lean toward Bernie Sanders in certain districts. Letting the machines crunch the numbers can give digital campaign managers an unseen advantage, and that capability has just now become available at scale.
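To show what “letting the machines crunch the numbers” might look like at its simplest, here is a toy propensity model that scores affiliation across eight attributes at once. The training rows, labels and attribute names are entirely fabricated for the example.

```python
# A toy affiliation model over eight attributes at once, rather than the two
# or three a human can eyeball. All data here is made up for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression

ATTRS = ["rural", "college_degree", "nascar_fan", "union_member",
         "under_30", "small_business_owner", "veteran", "frequent_churchgoer"]

X_train = np.array([
    [1, 0, 1, 0, 0, 1, 1, 1],
    [0, 1, 0, 1, 1, 0, 0, 0],
    [1, 0, 1, 1, 0, 0, 1, 0],
    [0, 1, 0, 0, 1, 0, 0, 1],
])
y_train = np.array([1, 0, 0, 1])   # 1 = leans candidate A (toy labels)

model = LogisticRegression().fit(X_train, y_train)

# Score a new voter profile across all eight attributes simultaneously.
voter = np.array([[1, 0, 1, 0, 0, 0, 1, 1]])
print(f"P(leans candidate A) = {model.predict_proba(voter)[0, 1]:.2f}")
```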

“In 2016, relying on TV advertising to sway voters is no longer a solid campaign tactic,” JC Medici, Rocket Fuel’s national director of politics and advocacy, told me via email. “To secure the White House in November, candidates must now add a strong digital media strategy by utilizing best-in-class AI, correlated with strong voter and propensity data assets to ensure they are delivering ads to the right voter, on the right screen, at the right time.”

Social Affinity

One of the hot new areas for political campaign targeting is social affinity, the idea that there is a mutual affinity that can be measured between interests.

Yes, when someone “likes” Hillary, you have an obvious target. But, how about those folks who haven’t stated an obvious choice? Maybe 80% of Hillary fans also liked cat shelters, yellow dresses and Chris Rock.

When strong correlations between deterministic social behavior are shown, it becomes fairly easy to leverage that data for targeting – and make informed choices regarding media. People who liked Hillary also like certain TV shows, actors, causes and websites. Campaign managers can leverage data from Affinity Answers, Affinio and other companies to understand these relationships and exploit them to build support for candidates, while leveraging the ability to geotarget at very granular levels on Facebook.
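At its core, that affinity measure is just lift: how much more likely a fan of one thing is to also like another, compared with the base rate. A minimal sketch, with fabricated like data:

```python
# Social affinity as lift: P(likes b | likes a) divided by P(likes b).

def affinity(likes_by_user, a, b):
    """Lift of liking b given a like of a, versus the base rate of b."""
    a_fans = [likes for likes in likes_by_user.values() if a in likes]
    if not a_fans or not likes_by_user:
        return 0.0
    p_b_given_a = sum(b in likes for likes in a_fans) / len(a_fans)
    p_b = sum(b in likes for likes in likes_by_user.values()) / len(likes_by_user)
    return p_b_given_a / p_b if p_b else 0.0

likes_by_user = {
    "u1": {"candidate_x", "cat_shelters", "chris_rock"},
    "u2": {"candidate_x", "cat_shelters"},
    "u3": {"cat_shelters"},
    "u4": {"chris_rock"},
    "u5": {"yellow_dresses"},
}

# Affinity well above 1.0 suggests cat-shelter fans are a promising proxy audience.
print(round(affinity(likes_by_user, "cat_shelters", "candidate_x"), 2))  # 1.67
```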

The Free State Project, an organization committed to getting 20,000 “liberty-loving” people to move to New Hampshire and work toward limited government, just reached its goal – talk about a tough conversion. President Carla Gericke credits the use of data-driven targeting on Facebook for the achievement.

Speaking of social, it is also highly important to get the context right.

“Programmatic has introduced two new challenges: bots (who don’t vote) and brand safety,” Trust Metrics CRO Marc Goldberg told me. “In the age of immediate and shocking news, it has become more important that a political ad does not end up next to porn, hate or issues that are contradictory to the politician’s beliefs. One screen shot and bam, you are on Twitter.”

Onboarding And Offboarding 

Perhaps the most critical functionality for digital political campaigns continues to be the ability to “onboard” offline data, such as phone numbers, email addresses and party affiliation, and match it to an online ID for targeting purposes. This is essentially table stakes, considering the years of political investment in collecting offline records for phone banks and direct mail campaigns.

Previously, the onboarding of such data was limited to associating it with an active cookie for retargeting use. But with the emergence of real cross-channel device graphs, this data can now be tied to a universal consumer ID that is persistent and collects attributes over time.

Simply put, that onboarded email – now a UID – can be mapped to a number of identities, including Apple and Android mobile identifiers, third-party IDs from Experian and the like and device IDs from Roku and other OTT devices. In other words, the device graph enables that email to be associated with the voter’s omnichannel footprint, giving campaigns the ability to sequentially target messages, map creative to execution channels and truly understand attribution.
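In simplified form, the plumbing looks something like this: hash the offline email, look it up in a match table to get a universal ID, then pull every linked device from the graph. The tables and identifiers below are invented for the example.

```python
# Onboarding sketch: hashed email -> universal consumer ID -> device graph.
# MATCH_TABLE and DEVICE_GRAPH are assumed stand-ins, not real services.

import hashlib

def hash_email(email: str) -> str:
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Assumed match table: hashed email -> universal consumer ID.
MATCH_TABLE = {hash_email("joe@example.com"): "uid_42"}

# Assumed device graph: universal ID -> addressable identifiers.
DEVICE_GRAPH = {
    "uid_42": {"cookie": "abc123", "idfa": "IDFA-EXAMPLE", "ott_device": "roku_777"},
}

def onboard(record: dict):
    """Map an offline voter record onto its omnichannel footprint, if matched."""
    uid = MATCH_TABLE.get(hash_email(record["email"]))
    return {"uid": uid, "devices": DEVICE_GRAPH.get(uid, {})} if uid else None

print(onboard({"email": "Joe@Example.com", "party": "independent"}))
```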

What’s even more exciting is the idea of offboarding some digital data back into the CRM. How valuable would it be to know that a potential voter watched an entire YouTube video on a candidate after being reached by the phone bank? Certain types of behavioral data, depending on compliance with privacy policies, can be brought back into the CRM to impact the effectiveness of offline voter outreach.

It is fair to say that 2016 is the most exciting campaign season we’ve had in a generation – and it’s only the primary season. As data-driven marketers, we will see campaigns push the limit in applying big marketing dollars to digital channels, trying to unlock new, mobile-first millennial voters while persuading independents through more addressable advertising than ever.

It’s a great time to be a data-driven marketer.

CPG goes DMP

If you think about the companies with perhaps the least amount of consumer data, you would automatically think of consumer packaged goods (CPG) manufacturers. Hardly anybody registers for their websites or joins their loyalty clubs; moms don’t flock to their branded diaper sites; and they are at arm’s length from any valuable transaction data (store sales) until well after the fact. So, with little registration, website, or offline sales data, why are so many large CPG firms licensing an expensive first-party data management platform?

While CPG companies will never have the vast amounts of point-of-sale, loyalty-card, app, and website data that a big box retailer might have, they do spend a ton of dough on media. And, as we all know, with large media expenditures come tons of waste. Combine this with the increasingly large investment and influence that activist investors and private equity companies have in CPG, and you can see where this leads. PE firms have installed zero-based budgeting that forces CPG concerns to rationalize every penny of the marketing budget – which, until lately, has been subject to the Wanamaker Rule (“I know half of my budget is working, but not which half”). Enter the DMP for measurement and global frequency control, cutting off and reallocating potentially millions of dollars in “long tail” spending. Now the data that the CPG marketer actually has in abundance (media exposure data) can be leveraged to the hilt.

This first and most obvious CPG use case has been discussed extensively in past articles. But there is much more to data management for CPG companies. Here are just a few tactics big consumer marketers have written into their data-driven playbooks:

 

The Move to Purchase-Based Targeting (PBT)

Marketers have come a long way from demographic targeting. Yes, gender, age and income are all reliable proxies for finding those “household CEOs,” but we live in complicated times, and “woman, aged 25-54, with 2 children in household” is still a fairly broad way to target media in 2016. Today, men are increasingly likely to go grocery shopping on a Thursday night. Marketers saw this and shifted more budget to behavioral, psychographic, and contextual targeting – but finding cereal buyers using proxies such as site visitation sharpened the targeting arrow only slightly more than demography.

Packaged goods marketers have long understood the value of past purchases (loyalty cards and coupons), but until the emergence of data management technologies, they have struggled to activate audiences based on such data. Now, big marketers can look at online coupon redemption or build special store-purchase segments (Datalogix, Nielsen Catalina, News America Marketing) and create high-value purchase-based segments. The problem? Such seed segments are small and must be modeled to achieve scale. Also, by the time the store sales data comes in, it’s often far too late to optimize a media plan. That said, CPG marketers are finding that product purchasers share key data attributes that reveal much about their household composition, behavior, and – most interestingly – affinity for a company’s other products. It may not seem obvious that a shopping basket contains diapers and beer – until you understand that Mom sent Dad out to the store to pick up some Huggies, and he took the opportunity to grab a cold six-pack of Bud Light. These insights are shaping modern digital audience segmentation strategy, and those tactics are becoming more and more automated through the use of algorithmic modeling and machine learning. CPG has seen the future, and it is using PBT to increase relevant reach.

Optimizing Category Reach

CPG marketers are constantly thinking about how to grow the amount of product they sell, and those thoughts typically vary between focusing on folks who are immensely loyal (“heavy” category buyers) versus those who infrequently purchase (“light” or “medium” category buyers). Who to target? It’s an interesting question, and one answered more decisively with purchase-based sales data.

Take the large global soda company as an example. Its average customer consumes 15 colas a year, but that is an immensely deceptive number. The truth is that the company has a good number of “power users” who drink 900 colas a year (roughly two and a half per day), and a lot of people who may only drink two or three colas during the entire year. Using the age-old “80/20 rule” as a guideline, you would perhaps be inclined to focus most of the marketing budget on the 20% of users who supposedly make up 80% of sales volume. However, closer examination reveals that heavy category buyers may be driving as little as 50% of total purchase volume. So, the marketer’s quandary is, “Do I try and sell the heavy buyer his 901st cola, or do I try and get the light buyer to double his purchases from two to four colas a year?”

Leveraging data helps CPG companies avoid having to decide. Increasingly, companies are adopting frequency approaches that identify the right amount of messaging to nurture heavy users (maybe two to three messages per user, per month) and bring light buyers to higher levels of purchase consideration (up to 20 messages per month). Moreover, by using DMP technology to segment these buyers based on their category membership, creatives can be adjusted based on the audience. Heavy buyers get messages that reinforce why they love the brand (“share the love”), and light buyers receive more persuasive messages (“tastes better”).
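A back-of-the-envelope version of that frequency strategy, with the caps and creative labels taken from the example above and everything else assumed:

```python
# Cap heavy category buyers at a light reminder cadence and give light buyers
# more convincing impressions; caps and creatives follow the example above.

FREQUENCY_PLAN = {
    "heavy": {"max_msgs_per_month": 3,  "creative": "share the love"},
    "light": {"max_msgs_per_month": 20, "creative": "tastes better"},
}

def next_message(segment: str, msgs_served_this_month: int):
    """Return the creative to serve, or None once the segment's cap is reached."""
    plan = FREQUENCY_PLAN[segment]
    if msgs_served_this_month >= plan["max_msgs_per_month"]:
        return None   # suppressed: this is where wasted long-tail spend gets cut
    return plan["creative"]

print(next_message("heavy", 1))   # "share the love"
print(next_message("heavy", 3))   # None -> impression reallocated
print(next_message("light", 8))   # "tastes better"
```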

Increasing Lift through Cross-Channel Messaging

CPG marketers have some highly evolved models that show just how much lift a working media dollar has on sales, and they use them to guide media investment decisions by both channel and partner. With the power of DMPs for cross-channel measurement, CPG companies are finally able to apply even small insights to improve sales lift.

What if the data reveal that a 50% mixture of equity and direct response ad creatives lifts coupon downloads by 200%? In other words, instead of just showing “Corn Flakes are Yummy” ads, you mixed in a few “Buy Flakes now at Kroger and save!” creatives afterwards, and you saw a huge impact on your display performance? Sadly, this simple insight was not available before data management platforms corralled cross-channel spending and associated it with an individual, but now these small insights are adding up to appreciable sales lift.

In another example, a large CPG company sees massive lift in in-store coupon redemptions by running branded display ads on desktop all throughout the week—but giving a “mobile nudge” on the smartphone on Friday night when it’s time to fill the pantry. This cross-channel call-to-action has seen real results, and only involves grabbing a brand-favorable consumer’s attention on another device to create a big impact. Again, a simple tactic—but also impossible without the power of a DMP.

CPG marketers have been able to achieve a ton of progress by working with relatively sparse amounts of data. What can you do with yours?

 

 

 

Data Triangulation: How Second-Party Data Will Eat The Digital World

Marketers are getting frustrated with spending up to 60% of their working media dollars to fund intermediaries between themselves and their publishing partners. By the time a marketer pays his agency, trading desk, exchange, third-party data provider, and subsidizes the publisher’s ad serving stack, dollars turn into dimes. Marketers want less fraud, more people, less ad tech, and to put more media dollars to work to drive performance. Quality publishers, who for so long sacrificed control for access to an always-on stream of programmatic cash, are now seeing balance return, as shady sources of inventory leave the ecosystem and start to create scarcity for premium supply.

Publishers with desired audiences are starting to leverage hacks like “header bidding” and private marketplaces to get more control and capture more revenue from transactions. But they are also starting to look at data-only transactions among trusted demand-side partners. Now that marketers are catching up with DMP technology, securely sharing audiences becomes possible, opening up a new era where “second party” data is poised to reign supreme. Before we talk about how that happens, let’s first define some data terms:

A Primer

First-party data is proprietary data that marketers and publishers have collected – with permission, of course – and, therefore, own. It can be cookies collected from a site visit, offline data onboarded into addressable IDs and even data from marketing campaigns. Second-party data is simply someone else’s first-party data. Second-party data gets created any time two companies strike up a deal for data that is not publicly available. The most common use case is that of a marketer – say, a big airline – getting access to data for a publisher’s frequent travelers. Big Airline might say to Huge News Site with business travelers, “Let’s user match, so every time I see one of my frequent flyers on your site, I can serve him an ad.” Huge News Site may decide to allow Big Airline to target its users wherever they are found (a “bring your own data” deal) or make such a deal incumbent upon buying media. Either way, Big Airline now has tons of really valuable Huge News Site reader data available in its data management platform (DMP) for modeling, analysis and targeting.

Despite the much-heralded death – or mere diminution – of third-party data, it is still a staple of addressable media buying. This is data that is syndicated and made available for anyone to buy. It could describe user behavior (Polk “auto intenders” of various stripes), bucket people into interesting addressable segments based on their life circumstances (Nielsen “Suburban Strivers”), describe a user’s income level (Acxiom or Experian) or tell you where a user likes to go via location data (PlaceIQ or Foursquare). Most demand-side platforms (DSPs) make a wide variety of this data available within their platforms for targeting, and DMPs enable users to leverage third-party data for segment creation – usually allowing free usage for analytics and modeling purposes and getting paid upon successful activation.

Data Quality And Scale

So, which kind of data is the best? When a marketer asks that question, the answer is inevitably “all of it.” But, since that’s an annoying answer, let’s talk about the relative scale and value of each type of data. It’s easily visualized by this wonderfully over-simplified triangle:

[Figure: the “triangle of data” – first-, second- and third-party data ranked by scale and value]

First-party data is the most limited in scope, yet the most powerful. For marketers – especially big CPG marketers who don’t get a lot of site traffic – first-party data is incredibly sparse but is still the absolute most valuable signal to use for modeling. Marketers can analyze first-party data attributes to understand what traits and behaviors consumers have in common and expand their reach using second- or third-party data. Retail and ecommerce players are more fortunate. A Big Box Store has first-party data out the wazoo: loyalty card data, point-of-sale system data, app data, website registration data, site visit data and maybe even credit card data if it owns and operates a finance arm. It can leverage a DMP to understand how media exposure drove a store visit, where customers were in the store (beacons!), what was purchased, how many coupons were remitted and whether or not they researched their purchase on the site. Talk about getting “closed loop” sales attribution. The power of first-party data is truly amazing.

The biggest problem with third-party data is that all of my competitors have it. In programmatic marketing, that means both Ford and Chevy are likely bidding on the same “auto intender” and driving prices up. The other problem is that I don’t know how the data was created. What attributes went into deciding whether or not this “auto intender” is truly in-market for a car? There are no real rules about this stuff. A guy who read the word “car” in an article might be labeled an “auto intender,” just like someone who looked at four-door sedans three times in the last 30 days on reputable auto sites. Quality varies.

That being said, there is huge value in having third-party data at your disposal. Ginormous Music App, for example, has built a service that is essentially a DMP for music; it knows how to break down every song, assign very granular attributes to it and deliver highly customized listening experiences for free and paid users. Those users are highly engaged, have demonstrated a willingness to buy premium services and are, by virtue of their mobile devices, easily found at precise geolocations. Yet, for all of that, the value to a marketer of a Maroon Five segment is rather small. Everyone likes Maroon Five, from grandmothers to tweens to dads, so a Maroon Five segment provides little value to an advertiser. But if Ginormous Music App could push its app-based user data (IDFAs) into the cookie space and find a user match, it could effectively use third-party data to understand the income, behavior and general profile of many Maroon Five fans. And that’s what their advertisers like to buy. That’s pretty damn valuable.

So, how about “second-party” data? These are the “frequent business travelers” on Huge News Site and the “Mitsubishi intenders” on Large Auto Site. These are real users, with true demonstrated intent and behavior that has been validated on real properties. One of the most valuable things about audiences built on second-party data is that there is usually transparency regarding how those users found their way into a segment.

The ironic and kind of beautiful thing about the emergence of second-party data is that it is most often merely a connection to a premium publisher’s users. However, it can be uncoupled from a publisher’s media sales practice. Marketers, increasingly sick of all the fraud and junk in the programmatic ecosystem, are turning toward second-party data to access the same audiences they bought heavily in print 30 years ago. This time, however, they are starting to get both the quality – and the quantitative results – they were looking for. On the flip side, quality publishers are starting to understand that, when offered in a strict, policy-controlled environment that protects their largest asset – audience data – they can make way more money with data deals than media deals.

Put simply, second-party data is heralding a return to the good old days when big marketers depended on relationships with big publishers as the stewards of audiences, and they created deep, direct relationships to ensure an ongoing value exchange. Today, that exchange increasingly happens through web-based software rather than martini lunches.

[This article originally appeared in AdExchanger on 1/25/16]

Classic Wrap Up Article with Typical Next-Year Guru Predictions

2015 was a fantastic year for many data-driven marketers, with data management platforms (DMPs), consultancies and marketers getting something nice under their trees.

Unfortunately, 2015 also saw legacy networks, supply-side platforms (SSPs) and some less nimble agencies receive coal in their respective stockings for failing to keep up with the rapidly changing paradigm as marketing and ad technology merge.

In the great tradition of end-of-year prediction articles, here’s my take on the year’s biggest developments and what we’ll see in 2016, including a rapid technology adoption from big marketers, a continuing evolution of the agency model and an outright revolution in how media is procured.

Agency Ascendant?

I thought 2015 was supposed to herald the “death of the digital agency model.” As agencies struggled to define their value proposition to big marketers that were increasingly bringing “programmatic in house,” agencies were reputed to be on the ropes. Massive accounts with billions of dollars in marketing spend were reviewed, while agencies churned through cash pitching to win new business – or at least trying to keep old business.

The result? A ton of money swapped agencies, but no serious marketers abandoned the model. Agencies got a lot smarter, and started spinning new digital strategies and DMP practices to combat the likes of system integrators and traditional consultancies. And the band played on.

In 2016, we will continue to see agencies strengthen their digital strategy bench, start moving “marketing automation” practices into the DMP world and offer integration services to help marketers build bespoke “stack” solutions. Trading desks will continue to aggressively pursue unique relationships with big publishers and start to embrace new media procurement methodologies that emphasize their skillset, rather than the bidded approach in open exchanges (more on that below).

Marketers Hug Big Data

Marketers started to “cross the chasm” in 2015 and more widely embrace DMPs. It’s no longer just “early adopters” such as Kellogg’s that are making the market. Massive top-100 firms have fully embraced DMP tech and are starting to treat online data as fuel for growth.

Private equity and activist investors continue to put the squeeze on CPG companies, which have turned to their own first-party data to find media efficiency as they try to control the one line item in the P&L usually immune to risk management: marketing spend.

Media and entertainment companies are wrangling their consumer data to fuel over-the-top initiatives, which put a true first-party relationship with their viewers front and center. Travel companies are starting to marry their invaluable CRM data to the anonymous online world to put “butts in seats” and “heads in beds.”

If 2015 saw 15% of the Fortune 500 engage with DMPs, 2016 is when the early majority will surge and start to make the embrace of DMP tech commonplace. The land grab for 24-month SAAS contracts is on.

Busy Consultants

It used to be that a senior-level digital guy would get sick of his job and leave it (or his job would leave him), leading to a happy consultant walking around advising three or four clients on programmatic strategy. In 2015, that still exists, but we’ve seen a rise in scale to meet the needs of a rapidly changing digital landscape.

Marketers and publishers are hiring boutique consultancies left and right to get on track (see this excellent, if not comprehensive, list). Also, big boys, including Accenture, Boston Consulting Group and McKinsey, are in the game, as are large, media-centric firms, such as MediaLink.

These shops are advising on data strategy, programmatic media, organizational change management and privacy. They are helping evaluate expensive SAAS technology, including DMPs and yield management solutions, and also doing large systems integrations required to marry traditional databases with DMPs.

Match Rates (Ugh)

Though largely unpublicized, with the exception of a few nerdy industry pieces, 2015 saw a huge focus on “match rates,” or the ability for marketers to find matches for their first-party data in other execution systems.

Marketers want to activate their entire CRM databases in the dot-com space, but are finding that only 40% to 50% of cookies map to their valuable lists. When they try to map those cookies to a DSP, more disappointment ensues. As discussed in an earlier article, match rates are hard to get right, and require a relentless focus on user matching, great “onboarding services,” strong server-to-server connections between DMPs and DSPs (and other platforms) and a high frequency of user matching.

This was the year that marketers got disappointed in match rates and started to force the industry to find better solutions. Next year, huge marketers will take bold steps to actually share data and create an available identity map of consumers. I think we will see the first real data consortium emerge for the purposes of creating an open identity graph. That’s my big prediction – and hope – for 2016.

Head For The Headers

2015 was the year of “header bidding,” the catch-all phrase for a software appliance that gives publishers the chance to offer favored demand-side partners a “first look” at valuable inventory. I am not sure if “header bidding” will ultimately become the de facto standard for “workflow automation,” but we seem to be relentlessly marching back to a world in which publishers and marketers take control of inventory procurement and get away from the gamesmanship inherent in exchange-based buying.

Big SSPs and networks that have layered bidding tech onto open exchanges are struggling. Demand-side platforms are scrambling to add all sorts of bells and whistles to their “private marketplaces,” but the industry evolves.

Next year, we will see the pace of innovation increase, and we have already seen big trade desks make deals with DMPs to access premium publisher inventory. It’s nice to see premium publisher inventory increase in value – and I believe it will only continue to do so.

2016 will be the year of “second-party data” and the winners will be the ones with the technology installed to easily transact on it.

2015 was a great year for data-driven marketing, and 2016 will be even more fun. Stay safe out there.

This post originally appeared in AdExchanger on 12/17/2015

Trends in Programmatic Buying

2015 has been one of the most exciting years in data-driven marketing to date. Although publishers have been leading the way in terms of building their programmatic “stacks” to enable more efficient selling of digital media, marketers are now catching up. Wide adoption of data management platforms has given rise to a shift in buying behaviors, where data-driven tactics for achieving effectiveness and efficiency rule. Here are some interesting trends that have arisen.

Purchase-Based Targeting

Remember when finding the “household CEO” was as easy as picking a demographic target? Marketers are still using demographic targeting (women, aged 25-44) to some extent, but we have seen them shift rapidly to behavioral and contextually based segments (“Active Moms”), and now to Purchase-Based Targeting (PBT). This trend has existed in categories like Automotive and Travel, but is now being seen in CPG. Today, marketers are using small segments of people who have actually purchased the product they are marketing (“Special K Moms”) and using lookalike modeling to drive scale and find more of them. These purchase-defined segments are a more precise starting point in digital segmentation—and can be augmented by behavioral and contextual data attributes to achieve scale. The big winners here are the folks who actually have the in-store purchase information, such as Oracle’s Datalogix, 84.51, Nielsen Catalina Solutions, INMAR, and News Corp’s News America Marketing.
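To make the mechanics concrete, here is a minimal Python sketch of how a purchase-defined seed segment might be expanded with a crude lookalike score. The attribute names, user IDs and the 0.3 threshold are all invented for illustration; a production DMP would model thousands of attributes with far more sophisticated math, but the basic shape is the same.

    from collections import Counter

    # Seed: users with verified purchases of the product, each carrying data attributes.
    seed_users = {
        "u1": {"has_kids", "buys_cereal", "coupon_user"},
        "u2": {"has_kids", "buys_cereal", "fitness_interest"},
        "u3": {"buys_cereal", "coupon_user", "fitness_interest"},
    }

    # Prospects: the wider addressable pool we want to rank for lookalike expansion.
    prospects = {
        "p1": {"has_kids", "buys_cereal"},
        "p2": {"fitness_interest"},
        "p3": {"has_kids", "coupon_user", "fitness_interest"},
    }

    # Weight each attribute by how often it appears among the purchasers.
    attribute_weights = Counter(a for attrs in seed_users.values() for a in attrs)
    max_score = sum(attribute_weights.values())

    def lookalike_score(attrs):
        """Share of the total seed weight this prospect's attributes cover."""
        return sum(attribute_weights[a] for a in attrs if a in attribute_weights) / max_score

    ranked = sorted(prospects, key=lambda p: lookalike_score(prospects[p]), reverse=True)
    expanded_segment = [p for p in ranked if lookalike_score(prospects[p]) >= 0.3]
    print(expanded_segment)   # ['p3', 'p1'] -- p2 does not look enough like a purchaser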

Programmatic Direct

For years we have been talking about the disintermediation in the space between advertisers and publishers (essentially, the entire Lumascape map of technology vendors), and how we can find scalable, direct connections between them. It doesn’t make sense that a marketer has to go through an agency, a trading desk, a DSP, an exchange, an SSP and other assorted technologies to get to space on a publisher website. Marketers have seen $10 CPMs turn into just $2 of working media. Early efforts with “private marketplaces” inside of exchanges created more automation, but ultimately kept much of the cost structure. A nascent, but quickly emerging, movement of “automated guaranteed” procurement is finally starting to take hold. Advertisers can create audiences inside their DMP and push them directly to a publisher’s ad server with which they have user matching. This is especially effective where marketers seek an “always on” insertion order with a favored, premium publisher. This trend will grow in line with marketers’ adoption of people-based data technology.

Global Frequency Management

The rise in DMPs has also led to another fast-growing trend: global frequency management. Before marketers could effectively map users to all of their various devices (cross-device identity management, or CDIM) and also match users across various execution platforms (hosting a “match table” that assures user #123 in my DMP is the same guy as user #456 in DataXu, as an example), they were helpless to control frequency against an individual. Recent studies have revealed that, when marketers are only frequency capping within each individual platform, they are serving as many as 100+ ads to individual users every month, and sometimes much, much more. What if the user’s ideal point of effective frequency is only 10 impressions on a monthly basis? As you can see, there are tremendous opportunities to reduce waste and gain efficiency in communication. This means big money for marketers, who can finally start to control their messaging—putting recovered dollars back into finding more reach, and starting to influence their bidding strategies to get users into their “sweet spot” of frequency, where conversions happen. It’s bad news for publishers, who have inadvertently benefited from this “frequency blindness.” Now, marketers understand when to shut off the spigot.
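As a rough illustration of why global frequency management needs a cross-platform match table first, here is a hedged Python sketch. The platform names, IDs and the 10-impression effective frequency are all hypothetical.

    from collections import Counter

    # Match table: platform-specific IDs resolved to one universal ID (all invented).
    match_table = {
        ("display_dsp", "123"): "person_A",
        ("video_dsp",   "456"): "person_A",
        ("publisher",   "789"): "person_A",
        ("display_dsp", "321"): "person_B",
    }

    # Monthly impression counts as each platform reports them, in isolation.
    platform_impressions = {
        ("display_dsp", "123"): 40,
        ("video_dsp",   "456"): 35,
        ("publisher",   "789"): 30,
        ("display_dsp", "321"): 8,
    }

    EFFECTIVE_FREQUENCY = 10   # assumed monthly sweet spot for this campaign

    global_frequency = Counter()
    for platform_id, count in platform_impressions.items():
        global_frequency[match_table[platform_id]] += count

    for person, impressions in global_frequency.items():
        wasted = max(0, impressions - EFFECTIVE_FREQUENCY)
        print(f"{person}: {impressions} impressions, {wasted} beyond the effective frequency")
    # person_A sees 105 ads in a month even though no single platform served more than 40.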

Taking it in-House

More and more, we are seeing big marketers decide to “take programmatic in house.” That means hiring former agency and vendor traders, licensing their own technologies, and (most importantly) owning their own data. This trend isn’t as explosive as one might think, based on the industry trades—but it is real and happening steadily. What brought along this shift in sentiment? Certainly concerns about transparency; there is still a great deal of inventory arbitrage going on with popular trading desks. Also, the notion of control. Marketers want and deserve more of a direct connection to one of their biggest marketing costs, and now the technology is readily available. Even the oldest-school marketer can license their way into a technology stack any agency would be proud of. The only thing really holding back the trend is the difficulty in staffing such an effort. Programmatic experts are expensive, and that’s just the traders! When the inevitable call for data-science-driven analytics comes in, things can really start to get pricey! But this trend will continue for the next several years nonetheless.

Closing the Loop with Data

One of the biggest gaps with digital media, especially programmatic, is attribution. We still seem to have the Wanamaker problem, where “50% of my marketing works, I just don’t know which 50%.” Attitudinal “brand lift” studies and latent post-campaign sales attribution modeling have been the de facto standard for the last 15 years, but marketers are increasingly insisting on real “closed loop” proof. “Did my Facebook ad move any items off the shelf?” We are living in a world where technology is starting to shed some light on actual in-store purchases, such that we are going to be able to get eCommerce-like attribution for corn flakes soon. In one real-world example, a CPG company has partnered with 7-11 and placed beacon technology in the store. Consumers can receive a “get 20% off” offer on their mobile device, via notification, when they approach the store; the beacon can verify whether or not they arrive at the relevant shelf or display; and an integration with the point-of-sale (POS) system can tell (immediately) whether the purchase was made. These marketing fantasies are becoming more real every day.
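Here is a deliberately simplified Python sketch of what that closed loop might look like as a data join: ad exposures, beacon-verified shelf visits and POS transactions tied together on a shared loyalty ID. Every ID, timestamp and field name is invented; a real pipeline would add hashing, consent handling and far messier matching.

    from datetime import datetime

    exposures = [{"loyalty_id": "L1", "ts": datetime(2015, 11, 3, 17, 40), "offer": "20_pct_off"}]
    shelf_visits = [{"loyalty_id": "L1", "ts": datetime(2015, 11, 3, 17, 55), "aisle": "cereal"}]
    purchases = [{"loyalty_id": "L1", "ts": datetime(2015, 11, 3, 18, 5), "sku": "corn_flakes"}]

    def closed_loop(exposures, shelf_visits, purchases):
        """Yield (offer, sku) pairs where a shelf visit and a purchase followed the ad."""
        for e in exposures:
            visited = any(v["loyalty_id"] == e["loyalty_id"] and v["ts"] > e["ts"]
                          for v in shelf_visits)
            for p in purchases:
                if visited and p["loyalty_id"] == e["loyalty_id"] and p["ts"] > e["ts"]:
                    yield e["offer"], p["sku"]

    print(list(closed_loop(exposures, shelf_visits, purchases)))
    # [('20_pct_off', 'corn_flakes')] -- the exposure can be credited with the sale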

Letting the Machines Decide

What’s next? The adoption of advanced data technology is starting to change the way media is actually planned and bought. In the past, planners would use their online segmentation to make guesses about what online audience segments to target, and test-and-learn their way to more precision. Marketers basically had to guess the data attributes that comprised the ideal converter. Soon, algorithms will start doing the heavy lifting. What if, instead of guessing at the type of person who buys something, you could start with the exact composition of that buyer? Today’s machine learning algorithms are starting at the end point in order to give marketers a huge edge in execution. In other words, now we can look at a small group of 1,000 people who have purchased something, and understand the commonalities or clusters of data attributes they all have in common. Maybe all buyers of a certain car share 20 distinct data attributes. Marketers can have a segment automatically generated from that data, and expand it from there. This brand-new approach to segmentation is a small harbinger of things to come, as algorithms start to take over the processes and assumptions of the past 15 years and truly transform marketing.
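A hedged Python sketch of that “start with the buyers” idea: take a small set of known purchasers, compare their attribute frequencies against the general population, and let the over-indexed attributes define the segment automatically. The attributes, users and the 2x over-index threshold are invented for illustration.

    from collections import Counter

    buyers = [
        {"has_kids", "suv_owner", "outdoor_interest"},
        {"has_kids", "suv_owner", "coupon_user"},
        {"has_kids", "outdoor_interest"},
    ]
    population = [
        {"has_kids"}, {"suv_owner"}, {"coupon_user"}, {"outdoor_interest"},
        {"has_kids", "coupon_user"}, {"urban_renter"},
    ]

    def rates(users):
        """Share of users carrying each attribute."""
        counts = Counter(a for u in users for a in u)
        return {a: counts[a] / len(users) for a in counts}

    buyer_rates, pop_rates = rates(buyers), rates(population)

    # Keep attributes at least 2x more common among buyers than in the population;
    # attributes never seen in the population get a floor of one user.
    segment_definition = sorted(
        a for a, r in buyer_rates.items()
        if r / pop_rates.get(a, 1 / len(population)) >= 2
    )
    print(segment_definition)   # ['has_kids', 'outdoor_interest', 'suv_owner']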

It’s a great time to be a data-driven marketer!

 

Match Game 2015

If you work in digital marketing for a brand or an agency, and you are in the market for a data management platform, you have probably asked a vendor about match rates. But, unless you are really ahead of the curve, there is a good chance you don’t really understand what you are asking for. This is nothing to be ashamed of – some of the smartest folks in the industry struggle here. With a few exceptions, like this recent post, there is simply not a lot of plainspoken dialogue in the market about the topic.

Match rates are a key factor in deciding how well your vendor can provide cross-device identity mapping in a world where your consumer has many, many devices. Marketers are starting to request “match rate” numbers as a method of validation and comparison among ad tech platforms in the same way they wanted “click-through rates” from ad networks a few years ago. Why?

As a consumer, I probably carry about twelve different user IDs: a few Chrome cookies, a few Mozilla cookies, several IDFAs for my Apple phone and tablets, a Roku ID, an Experian ID, and also a few hashed e-mail IDs. Marketers looking to achieve true 1:1 marketing have to reconcile all of those child identities to a single universal consumer ID (UID) to make sure I am the “one” they want to market to. It seems pretty obvious when you think about it, but the first problem to solve before any “matching” takes place whatsoever is a vendor’s ability to match people to the devices and browsers attached to them. That’s the first, most important match!

So, let’s move on and pretend the vendor nailed the cross-device problem—a fairly tricky proposition for even the most scaled platforms that aren’t Facebook and Google. They now have to match that UID against the places where the consumer can be found. The ability to do that is generally understood as a vendor’s “match rate.”

So, what’s the number? Herein lies the problem. Match rates are really, really hard to determine, and they change all the time. Plus, lots of vendors find it easier to say, “Our match rate with TubeMogul is 92%” and just leave it at that—even though it’s highly unlikely to be the truth. So, how do you separate the real story from the hype and discover what a vendor’s real ability to match user identity is? Here are two great questions you should ask:

What am I matching?

This is the first and most obvious question: Just what are you asking a vendor to match? There are actually two types of matches to consider: A vendor’s ability to match a bunch of offline data to cookies (called “onboarding”), and a vendor’s ability to match a set of cookie IDs to another set of cookie IDs.

First, let’s talk about the former. In onboarding—or matching offline personally identifiable information (PII) identities such as an e-mail with a cookie—it’s pretty widely accepted that you’ll manage to find about 40% of those users in the online space. That seems pretty low, but cookies are a highly volatile form of identity, prone to frequent deletion, and dependent upon a broad network of third parties to fire “match pixels” on behalf of the onboarder to constantly identify users. Over time, a strong correlation between the consumer’s offline ID and their website visitation habits—plus rigor around the collection and normalization of identity data—can yield much higher offline-to-online match results, but it takes effort. Beware the vendor who claims they can match more than 40% of your e-mails to an active cookie ID from the get-go. Matching your users is a process, and nobody has the magic solution.
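For illustration, here is a minimal Python sketch of the onboarding math: normalize and hash the CRM e-mails, look the hashes up in an onboarder’s (entirely hypothetical) hashed-e-mail-to-cookie map, and report the share you actually found online.

    import hashlib

    def normalize_and_hash(email):
        """Lowercase, trim and SHA-256 an e-mail so it can be matched without exposing PII."""
        return hashlib.sha256(email.strip().lower().encode()).hexdigest()

    crm_emails = ["mom1@example.com", "Mom2@Example.com ", "dad3@example.com"]

    # Pretend map built by an onboarder's pixel network: hashed e-mail -> active cookie.
    onboarder_map = {
        normalize_and_hash("mom1@example.com"): "cookie_abc",
        normalize_and_hash("mom2@example.com"): "cookie_def",
    }

    matched = [e for e in crm_emails if normalize_and_hash(e) in onboarder_map]
    print(f"onboarding match rate: {len(matched) / len(crm_emails):.0%}")   # 67%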

As far as cookie-to-cookie user mapping, the ability to match users across platforms has more to do with how frequently your vendors fire match pixels. This happens when one platform (a DMP) calls the other platform (the DSP) and asks, “Hey, dude, do you know this user?” That action is a one-way match. It’s even better when the latter platform fires a match pixel back—“Yes, dude, but do you know this guy?”—creating a two-way identity match. Large data platforms will ask their partners to fire multiple match pixels to make sure they are keeping up with all of the IDs in their ecosystem. As an example, this would consist of a DMP with a big publisher client who sees most of the US population firing a match pixel for a bunch of DSPs like DataXu, TubeMogul, and the Trade Desk at the same time. Therefore, every user visiting a big publisher site would get that publisher’s DMP master ID matched with the three separate DSP IDs. That’s the way it works.

Given the scenario I just described, and even accounting for a high degree of frequency over time, match rates in the high 70-percent range are still considered excellent. So consider all of the work that needs to go into matching before you simply buy a vendor’s claim to have “90%” match rates in the cookie space. Again, this type of matching is also a process—and one involving many parties and counterparties—and not just something that happens overnight by flipping a switch, so beware of the “no problem” vendor answers.

What number are you asking to match?

Let’s say you are a marketer and you’ve gathered a mess of cookie IDs through your first-party web visitors. Now, you want to match those cookies against a bunch of cookie IDs in a popular DSP. Most vendors will come right out and tell you that they have a 90%+ match rate in such situations. That may be a huge sign of danger. Let’s think about the reality of the situation. First of all, many of those online IDs are not cookies at all, but Safari IDs that cannot be matched. So eliminate a good 20% of matches right off the bat. Next, we have to assume that a bunch of those cookies are expired, and no longer matchable, which adds another 20% to the equation. I could go on and on but, as you can see, I’ve just made a pretty realistic case for eliminating about 40% of possible matches right off the bat. That means a 60% match rate is pretty damn good.

Lots of vendors are actually talking about their matchable population of users, or the cookies you give them that they can actually map to their users. In the case of a DMP that is firing match pixels all day long, several times a day with a favored DSP, the match rate at any one time with that vendor may indeed be 90-100% – but only of the matchable population. So always ask what the numerator and denominator represent in a match question.
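A quick, hypothetical Python sketch of why the numerator and denominator matter so much: the same matched count can be reported as a 55% rate or a 92% rate depending on whether you divide by everything you sent or only by the “matchable” population. All counts are invented.

    total_ids_sent = 1_000_000        # everything the marketer handed over
    unmatchable_safari = 200_000      # assumed Safari IDs that cannot be cookie-matched
    expired_cookies = 200_000         # assumed stale or deleted cookies
    matched = 550_000                 # IDs the partner platform actually recognized

    matchable = total_ids_sent - unmatchable_safari - expired_cookies

    print(f"rate vs. everything sent:      {matched / total_ids_sent:.0%}")   # 55%
    print(f"rate vs. matchable population: {matched / matchable:.0%}")        # 92%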

You might ask whether or not this means the popular DMP/DSP “combo” platforms come with higher match rates, or so-called “lossless integration,” since both the DMP and DSP share a single architecture and, therefore, a unified identity. The answer is yes, but that offers little differentiation over two separate DMP and DSP platforms that are closely synched and user matching frequently.

In conclusion

Marketers are obsessing over match rates right now, and they should be. There is an awful lot of “FUD” (fear, uncertainty, and doubt) being thrown around by vendors around match rates—and also a lot of BS being tossed around in terms of numbers. The best advice when doing an evaluation?

  • Ask what kind of cross-device graph your vendor supports. Without the fundamental ability to match people to devices, the “match rate” number you get is largely irrelevant.
  • Ask what numbers your vendor is matching. Are we talking about onboarding (matching offline IDs to cookies) or are we talking about cookie matching (mapping different cookie IDs in a match table)?
  • Ask how they are matching (what is the numerator and what is the denominator?).
  • Never trust a number without an explanation. If your vendor tells you “94.5%,” be paranoid!
  • And, ask for a match test. The proof is in the pudding!

DMP 4-5-6

As I’ve previously discussed, there are several basic use cases of the modern data management platform (DMP) for marketers. They include getting “people data” from addressable devices into a single system, controlling how it’s matched with different execution platforms and managing the frequency of messaging across devices.

In a world of ultra-fragmented device identity and multiple addressable media channels, you should be able to tie them together and make sure consumers get the optimal amount of messages. Big marketers use these tactics to save tons of money by chopping off the “long tail” of impressions, such as when marketers deliver more than 30 impressions per user each month, and reinvesting to find more deduplicated reach.

There is so much more to the successful application of a DMP, though. The most cutting-edge marketers are taking DMPs to the next level, after investing the time in building consumer identity graphs and getting their match rates with execution platforms as high as possible.

There are several plays you can run when you start to dig in and put the data to work. 

Supercharge The Bidding Strategy

After identifying the long tail of impression frequency and diverting that investment into reach, where users are served up to three impressions per month, the key is driving users down into the sweet spot of frequency. This is where users are more likely to download more coupons, for example, or complete more video views.

If that sweet spot is between four and 20 impressions, marketers can adjust their strategy in biddable environments to ensure they are willing to pay more to “win” users who have only been exposed to three impressions so far. DMPs can match users with fidelity and deliver these types of targeting sets in near real time to multiple execution platforms, including those for display, video and search.
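As an illustration, here is a hedged Python sketch of a frequency-aware bidding rule: bid up users who have not yet reached the assumed sweet spot of four to 20 impressions, and suppress users who have sailed past it. The multipliers and thresholds are invented; a real strategy would be tuned against conversion data.

    def bid_multiplier(impressions_seen, sweet_spot=(4, 20)):
        low, high = sweet_spot
        if impressions_seen >= high:
            return 0.0   # past the sweet spot: suppress further bidding
        if impressions_seen < low:
            return 1.5   # bid up to push lightly exposed users into the sweet spot
        return 1.0       # already in the sweet spot: bid normally

    base_bid_cpm = 4.00
    for seen in (0, 3, 10, 25):
        print(seen, "impressions seen ->", round(base_bid_cpm * bid_multiplier(seen), 2), "CPM bid")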

Optimize Partner Investment Through Reach Analysis  

It’s a great start to manage addressable media delivery on a global basis, but what happens after you have identified all of those wasted impressions?

Naturally, the money marketers are spending reaching consumers for the 100th time can be better spent looking for net new consumers. But how do you get them?

For a diaper manufacturer that wants to reach the estimated 6 million new mothers in market every year, it’s critically important to get to 100% reach against that audience. Many marketers start with a single, broad reach partner, such as Yahoo, and see how close they can get to total reach.

It’s fantastic to leverage big spending power to drive down prices and get massive customer service attention to spread a message to as many unique users as possible. But no single partner can get a marketer to 100%. That’s where the DMP comes in.

It’s not just about filling in the missing 25% of an audience; the diaper manufacturer wants to hit those incremental moms across quality, well-lit sites. Determining where you can get a few more million deduplicated moms is the first step. The key is to then decide where to find them more effectively from an investment standpoint, which requires an overlap analysis.

Enhance Partner Selection Through Overlap Analysis 

Say our diaper manufacturer found 4 million new moms on Yahoo at a reasonable CPM. The DMP can then look across all addressable media investments and run a “Where are my people?” type of analysis.

Maybe this advertiser has another 20 partners on the plan after getting the bulk of unique reach from a single partner. How many more unique moms were found on Meredith? Moreover, how about finding moms on classic news and entertainment sites, such as NBC or Turner properties, or even non-endemic sites? Maybe there is an incremental 500,000 first-party “diaper moms” on a particular site, but now the advertiser can decide, based on performance KPIs and price, how valuable those particular moms are.

If those moms on a popular news site can be had for $5 CPM, maybe they are a more valuable reach vehicle than those found on the obvious “Moms.com” site. Without the DMP, they’ll never know.
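To show the shape of that overlap analysis, here is a small Python sketch that computes each additional partner’s incremental deduplicated reach on top of the anchor buy, and an effective CPM per incremental person. The audiences and CPMs are stand-ins, not real data.

    # Deduplicated UID sets per partner and assumed CPMs (all invented stand-ins).
    partner_audiences = {
        "AnchorPortal": set(range(0, 4_000)),
        "WomensSite":   set(range(3_000, 5_500)),
        "NewsSite":     set(range(3_500, 6_000)),
    }
    cpms = {"AnchorPortal": 6.0, "WomensSite": 9.0, "NewsSite": 5.0}

    reached = set(partner_audiences["AnchorPortal"])   # the anchor buy comes first
    for partner in ("WomensSite", "NewsSite"):
        incremental = partner_audiences[partner] - reached
        # Cost of the whole buy spread over only the *incremental* people, as a CPM.
        effective_cpm = cpms[partner] * len(partner_audiences[partner]) / max(len(incremental), 1)
        print(f"{partner}: {len(incremental)} incremental users at an effective ${effective_cpm:.2f} CPM")
        reached |= partner_audiences[partner]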

Plus, marketers are also starting to optimize the way they procure such audiences, by leapfrogging over the existing ad tech ecosystem and doing audience-based programmatic direct buying using their new DMP pipes.

Understand KPI Drivers Through Journey Building

Marketers that have deduplicated their audience and built an effective reach strategy can now go to the next level and start finding how those diaper moms moved from their first touch point in the customer journey to an actual action, such as downloading a retail coupon or requesting a sample package. When an audience is unified through a DMP, it’s possible to see the channels through which people move across their “customer journey” from awareness to action.

As an example, more large CPG companies are putting more investment into online video and, in fact, one of the world’s largest marketers has embraced a “ban the banner” approach and values engagement more than any other KPI – a metric more easily understood with video. With that in mind, a journey analysis can show marketers if seeing a few search impressions helped drive more completed views on (expensive) video and drive more brand engagement.

Did consumers download more coupons after viewing two equity (branding) impressions or before seeing the “buy now” (direct-response) message? The ability to understand how messages work together sequentially is the ultimate guide to being able to inform media investment strategy.

These are just a few of the next-level media use cases that can be accomplished once DMP fundamentals are put in place. DMPs are starting to shine a light on the “people data” that will drive the next decade of smart media investment. I think we will look back on the last 15 years of addressable marketing and wonder how we ever made such decisions without a clear view of audience first.

DMPs are starting to shine a light on the effectiveness of marketing, and giving marketers lots of new knobs and levers to pull.

It’s a great time to be a data-driven marketer.

Follow Chris O’Hara (@chrisohara) and AdExchanger (@adexchanger) on Twitter

Leapfrogging the Lumascape

Marketers have always craved access to quality audiences at scale. That was once as easy as scheduling buys on the top three broadcast networks and buying full-page ads in national newspapers. Today, the world is more complicated, with attention shifting into a splintered digital universe of thousands of channels across multiple media types.

Ad tech companies have tried to corral a massively expanding world of inventory in ad exchanges, along with the means to bid inside them. This “programmatic” world of inventory procurement is deeply flawed, yet still the best thing we have at the moment.

It’s flawed because it mostly offers access to commoditized “display” ad units of dubious value and struggles to deliver real audiences, rather than robots. But it’s also good because we have taken the first steps past a ridiculous paradigm of buying media through relationships and fax machines, while starting to bring an analytical discipline to media investment that is based on measurement.

So, as we sled down the slope of the programmatic buying Hype Cycle, we are starting to see some new trends in inventory procurement – namely, a strategy that involves replacing some or all of the licensed programmatic architecture, as well as a growing reliance on one’s own data.

But first, before we get into the nuts and bolts of how that works, some history:

The Monster We Created

After convincing ourselves of the lack of scalability in the direct model, where we would call an ad rep, we have set up a lot of distance between a marketer and their desired audience.


Imagine I am a cereal manufacturer and have discovered through media mix modeling that digital moms on Meredith sites drive a lot of offline purchases. They are the “household CEOs” that drive grocery store purchasing, try new things and are influential among their peer group, in terms of recommending new products. In today’s new media procurement paradigm, there are many “friends” standing between my target and me:

  • Media agency: This is a must-have, unless marketers want to add another 100 people to their headcount with an expertise in media, but this adds 5% to 10% in costs to media buys.
  • Trading desk: Although many marketers are starting to take this functionality in-house, whether you trade internally or leverage an agency trading desk, you can expect 10% to 15% of media costs to go to the personnel needed to run this type of operation.
  • Demand-side platform (DSP): Don’t forget about the technology. A 15% bid reduction fee is usually required to leverage the smart tools necessary to find your inventory at scale across exchanges.
  • Private marketplace: But wait! We use private marketplaces to make exclusive deals among a small pool of preferred vendors. Yes, but they operate inside DSPs and carry transactional fees that can add between 5% and 10% extra.
  • Third-party data: You can’t target effectively without adding a nice layer of audience data on your buy, but expect to pay at least $1 CPM for the most basic demographic targeting – a significant percentage of cost even on premium buys.
  • Exchanges: Maybe you pay for this via your DSP, but someone is paying for a seat on an ad exchange and that cost is passed through a provider, which can add another several percentage points.
  • Supply-side platform (SSP): It’s not just the demand side that needs to leverage expensive technology to navigate the new world of digital media. Publishers pay up to 15% in fees to deploy SSPs, a smart inventory management technology to help them manage their “daisy chain” of networks and channel sales providers to get the best yield. This is baked into the media cost and passed along to the advertiser.
  • Ad server: Finally, the publisher pays a fee to get the ad delivered to the site. It is a somewhat small price, but one that is passed along to the advertiser, usually baked in to the media cost.

This is essentially the middle of a crowded LUMAscape, a bunch of different disintermediating technologies that stand between an advertiser and the publisher. Marketers pay for everything I just described. They may not license the publisher’s SSP on its behalf, but they are subsidizing it. After running this gauntlet, marketers with $10 to spend on “cereal moms” end up with much less than half of that in media value – the amount the publisher ends up with after the disintermediation takes place can be anywhere from 10% to 40% of the original spend.
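For a back-of-the-envelope feel for that gauntlet, here is a small Python sketch that applies rough midpoints of the fee ranges listed above to a $10 CPM. The percentages are illustrative assumptions, and real fee stacks often compound on net rather than gross, but the punchline is similar.

    # Rough midpoints of the fee ranges described above (illustrative assumptions only).
    fees = {
        "media agency": 0.075,
        "trading desk": 0.125,
        "DSP": 0.15,
        "private marketplace": 0.075,
        "exchange seat": 0.03,
        "SSP": 0.15,
        "ad server": 0.02,
    }
    third_party_data_cpm = 1.00   # flat $1 CPM assumed for basic demographic data

    spend = 10.00
    remaining = spend - third_party_data_cpm
    for layer, pct in fees.items():
        remaining -= spend * pct   # each layer takes its cut of the gross spend

    print(f"working media left from a ${spend:.2f} CPM: ${remaining:.2f}")   # $2.75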

That’s probably the biggest problem in ad tech right now.

We’ve essentially created a layer of technology between marketers and audiences so gigantic that 60% to 70% of media investment dollars end up in venture-funded technology companies’ hands, rather than with the media owner creating the perceived value. How do we change that paradigm?


Leapfrogging the Middleware

Data management technology is increasingly replacing some of the middleware in this procurement equation, effectively writing the third chapter in the saga we know as programmatic direct.

Here is a bit of background.

What I call “Programmatic Direct 1.0” was the short-lived period in which companies leveraging the DoubleClick for Publishers (DFP) ad-serving API built static marketplaces of premium inventory.

For example, a premium publisher like Forbes might decide to place a chunk of 500,000 home page impressions in a marketplace at a $15 CPM. Buyers could go into an interface, transact directly with the publisher and secure the inventory. The problem was that inventory owners had a hard time valuing their future inventory and buyers weren’t keen to log into yet another platform to buy media. This phase effectively ended with the Rubicon Project buying two of the leaders in the space, ShinyAds and iSocket, and AdSlot taking over workflow automation software provider Facilitate Media. Suddenly, “programmatic direct” platforms started to live inside systems where media planners actually bought things.

Programmatic direct’s second act (2.0) is prevalent today. Companies use deal IDs or build PMPs within real-time systems and exchanges to have more control over procurement than what is available in an auction environment. Sellers can set prices and buyers can secure rights to inventory at a set, transparent cost. This works pretty well, but comes with the same gigantic stack of providers as before and includes additional transaction fees. This is akin to making a deal to buy a house directly from the owner, but agreeing to pay the real estate broker fee anyway. The thing about programmatic direct transactions is that they are fundamentally different than RTB because they don’t have to take place in “real time,” nor do they involve bidding. A brand-new set of pipes is required.

“Programmatic Direct 3.0” – or whatever we decide to call it – looks a bit different. Let’s say the big cereal company uses a data-management platform (DMP) to collect its first-party data and creates segments of users from both offline user attributes and page-level attributes from site visitation behavior. The marketer has created a universal ID (UID) for every user. Let’s imagine they discovered 200,000 were females, 24 to 40 years old, living in two-child households with income greater than $150,000 and interested in health and fitness. Great.

Now imagine that a huge women’s interest site deployed its own first-party DMP and collected similar attributes about its users, who were assigned UIDs. If the marketer and publisher have the same enterprise data architecture, they could match their users, make a deal and discover that there’s an overlap of 125,000 users on the site. Maybe the marketer agrees to spend $7 CPM to target those users, along with users who are statistically similar, every time they are seen on the site for November.

The DMP can push that segment directly into the publisher’s DFP. No trading desk fees, DSP fees, third-party data costs or SSPs involved. The same is true for a variety of companies that have built header bidding solutions, although they see less data than first-party DMPs.

With this 3.0 approach, most of the marketer’s $7 is spent on media, rather than a basket of technologies, and the publisher gets to keep quite a bit of that revenue.

Sounds like a good deal.

Follow Chris O’Hara (@chrisohara) and AdExchanger (@adexchanger) on Twitter. 

DMP 1-2-3

Almost every marketer is starting to lean into data management technology. Whether they are trying to build an in-house programmatic practice, use data for site personalization, or trying to obtain the fabled “360 degree user view,” the goal is to get a handle on their data and weaponize it to beat their competition.

In the right hands, a data management platform (DMP) can do some truly wonderful things. With so many use cases, different ways to leverage data technology, and fast moving buzzwords, it’s easy for early conversations to get way too “deep in the weeds” and devolve into discussions of “match rates” and how cross-device identity management works. The truth is that data management technology can be much simpler than you think.

At its most basic level, DMP comes down to “data in” and “data out.” While there are many nuances around the collection, normalization, and activation of the data itself, let’s look at the “data in” story, the “data out” story, and an example of how those two things come together to create an amazing use case for marketers.

The “Data In” Story

To most marketers, the voodoo that happens inside the machine isn’t the interesting part of the DMP, but it’s really where the action happens. Understanding the “truth” of user identity (who are all these anonymous people I see on my site and apps?) is what makes the DMP useful in the first place, making one-to-one marketing and understanding customer journeys something that goes beyond AdExchanger article concepts, and starts to really make a difference!

  • Not Just Cookies: Early DMPs focused on mapping cookie IDs to a defined taxonomy and matching those cookies with execution platforms. Most DMPs—from lightweight “media DMPs” inside of DSPs to full-blown “first-party” platforms—handle this type of data collection with ease. Most first-generation DMPs were architected as cookie collection and distribution platforms, meant to associate a cookie with an audience segment, and pass it along to a DSP for targeting. The problem is that people are spending more time in cookie-less environments, and more time on mobile (and other devices). That means today’s DMPs have to do more than organize cookies; they must also be able to capture a large variety of disparate identity data, which can include hashed CRM data, data from a point-of-sale (POS) system, and maybe even data from a beacon signal.
  • Ability to Capture Device Data: To a marketer, I look like eight different “Chris O’Haras”: three Apple IDFAs, several Safari unique browser signatures, a Roku device ID, and a hashed e-mail identity or two. These “child identities” must be reconciled to a “Universal ID” that is persistent and collects attributes over time. Most DMPs were architected to store and manage cookies for display advertising, not cross-device applications, so platforms’ ability to ingest highly differentiated structured and unstructured data is all over the map. Yet, with more and more time dedicated to devices instead of desktop, cookies only cover 40% of today’s pie.
  • Embedded Device Graph: Cross-device identification is notoriously difficult, requiring either the ability to identify people through deterministic data (users who authenticate across mobile and desktop devices) or the skill to apply smart algorithms across massive datasets to make probabilistic guesses that match users and their devices. Over the next several years, the term “device graph” will figure prominently in our industry, as more companies try to innovate a path to cross-device user identity—without data from “walled garden” platforms like Google and Facebook. Since most algorithms operate in the same manner, look for scale of data; the bigger the user set, the more “truth” the algorithms can identify and model to make accurate guesses of user identity.

The “data in” story is the fundamental part of the DMP—without the ability to ingest all kinds of identifiers and understand the truth of user identity, one-to-one marketing, sequential messaging, and true attribution are impossible.
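As a toy illustration of that reconciliation step, here is a hedged Python sketch that folds “child identities” into a universal ID wherever a deterministic link (such as a shared login) ties them together. The IDs and links are invented, and real identity graphs handle probabilistic matches, conflicts and scale that this ignores.

    import itertools

    # Deterministic links observed between child identities (invented examples).
    deterministic_links = [
        ("cookie_chrome_1", "hashed_email_9"),   # user logged in on desktop
        ("idfa_phone_7",    "hashed_email_9"),   # same login seen in the mobile app
        ("cookie_safari_2", "idfa_phone_7"),     # two IDs observed on the same device
    ]

    _uid_counter = itertools.count(1)
    uid_of = {}   # child identity -> universal ID

    def resolve(child):
        """Return the child's universal ID, minting a fresh one if it is unknown."""
        if child not in uid_of:
            uid_of[child] = f"UID_{next(_uid_counter)}"
        return uid_of[child]

    for a, b in deterministic_links:
        uid_a, uid_b = resolve(a), resolve(b)
        if uid_a != uid_b:
            # A link proves these are the same person: fold one cluster into the other.
            for child, uid in list(uid_of.items()):
                if uid == uid_b:
                    uid_of[child] = uid_a

    print(uid_of)   # all four child identities end up under the same universal ID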

Data Out

While the “data in” story gets pretty technical, the “data out” story starts to really resonate with marketers because it ties three key aspects of data-driven marketing together. Here’s what a DMP should be able to do:

  • Reconcile Platform Identity: Just like I look like eight different “Chris O’Haras” based on my device, I also look like 8 different people across media channels. I am a cookie in DataXu, another cookie in Google’s DoubleClick, and yet another cookie on a site like the New York Times. The role of the DMP is to user match with all of these platforms, so that the DMP’s universal identifier (UID) maps to lots of different platform IDs (child identities). That means the DMP must have the ability to connect directly with each platform (a server-to-server integration being preferable), and also the chops to trade data quickly, and frequently.
  • Unify the Data Across Channels: To a marketer, every click, open, like, tweet, download, and view is another speck of gold to mine from a river of data. When aggregated at scale, these data turn into highly valuable nuggets of information we call “insights.” The problem for most marketers that operate across channels (display, video, mobile, site-direct, social, and search, just to name a few) is that the fantastic data points they receive all live separately. You can log into a DSP and get plenty of campaign information, but how do you relate a click in a DSP with a video view, an e-mail “open,” or someone who has watched a YouTube on an owned and operated channel? The answer is that even the most talented Excel jockey running twelve macros can’t aggregate enough ad reports to get decent insights. You need a “people layer” of data that spans across channels. To a certain extent, who cares what channel performed best, unless you can reconcile the data at the segment level? Maybe Minivan Moms convert at a higher percentage after seeing multiple video ads, but Suburban Dads are more easily converted on display? Without unifying the data across all addressable channels, you are shooting in the dark.
  • Global Delivery Management: The other thing that becomes possible when you tie both cross-device user identity and channel IDs together with a central platform is the ability to manage delivery globally. More on this below!

Global Delivery Management

If I look like a different user on each channel—and each channel’s platform or site only lets the marketer set its own frequency cap—it is likely that I am being over-served ads. If a marketer runs ads in five channels and frequency caps each one at 10 impressions a month per user, I am virtually guaranteed to receive 50 impressions over the course of a month—and probably more depending on my device graph. But what if the ideal frequency to drive conversion is only 10 impressions? The marketer just spent five times too much to make an impact. Controlling frequency at the global level means being able to reallocate ineffective long-tail impressions to the sweet spot of frequency where users are most likely to convert, and plug that money back into the short tail, where marketers get deduplicated reach.

In the above example, 40% of a marketer’s budget was being spent delivering between 1-3 impressions per user every month. Another 20% was spent delivering between 4-7 impressions, which conversion data revealed to be where the majority of conversions were occurring. The rest of the budget (40%) was spent on impressions with little to very little conversion impact.

In this scenario, there are two basic plays to run: First, the marketer wants to completely eliminate the long tail of impressions and reinvest that money into more reach. Second, the marketer wants to push more people from the short tail down into the “sweet spot” where conversions happen. Cutting off long-tail impressions is relatively easy: send suppression sets of users to the execution platforms.

“Sweet spot targeting” involves understanding when a user has seen her third impression, and knowing the 4th, 5th, and 6th impressions have a higher likelihood of producing an action. That means sending signals to biddable platforms (search and display) to bid higher to win a potentially more valuable user.

It’s Rocket Science, But It’s Not

If you really want to get deep, the nuts and bolts of data management are very complicated, involving real big data science and velocity at Internet speed. That said, applying DMP science to the common problems within addressable marketing is not only accessible—it’s making DMPs the must-have technology for the next ten years, and global delivery management is only one use case out of many. Marketers are starting to understand the importance of capturing the right data (data in), and applying it to addressable channels (data out), and using the insights they collect to optimize their approach to people (not devices).

It’s a great time to be a data-driven marketer!

A Brief History of Banner Time

It’s been a long time since publishers have truly been in control of their inventory, but new trends in procurement methodologies and technology are steadily giving premium publishers the upper hand.

The story of display inventory procurement started with the Publisher Direct Era, when publishers were firmly in control of their banners, and kept them safely hidden behind sales forces and rate cards. Then the Network Era crept in, and smart companies like Tacoda took all the unwanted banners and categorized them. Advertisers liked to buy based on behavior, and publishers liked the extra check at the end of the month for hard-to-sell inventory.

That was no fun for the demand side though. They started the Programmatic Era, building trading desks, and leveraging DSPs to make sure they were the ones scraping a few percentage points from media deals. Why let networks have all of the arbitrage fun? The poor publisher was left to try and fight back with SSPs and more technology to battle the technology that was disintermediating them, kind of like a robot fight on the Science Channel.

But all of a sudden, publishers realized how silly it was to let someone else determine the value of their inventory, and launched the DMP Era. They ingested first-party data from their registration and page activity and created real “auto intenders” and “cereal moms” and wonderful segments that they could use to effectively sell to marketers. Now, every smart publisher knows more about their inventory than 3rd parties, and they can also find their readers across the wider Web through exchanges. A win-win!

Then all of the marketers in the world started reading AdExchanger, and saw the publisher example, and thought, “Wow, good call!” They started to truly understand how much money Programmatic companies were taking out of the investment they earmarked for media (silly marketers, Y U no read Kawaja’s first IAB deck?), and decided to use their own technology and data to power audience targeting. If it were a baseball game, this DMP Era for Marketers would be in the first or second inning, but the pitcher is throwing at a fast pace.

The next thing that happened was the Programmatic Direct Era, which lasted about ten minutes and effectively jumped the shark when Rubicon bought two of the more prominent companies involved (ShinyAds and iSocket). Programmatic Direct marketplaces promised a flip of the yield curve for publishers to expose the “fat middle” of undervalued impressions. They attempted this by placing blocks of inventory in a marketplace, enabling the publisher to set rates and impression levels, and providing API access directly into their ad server. Alas, a tweak to Google’s API did not an industry make. Marketers loved the idea, but since they use audience as the primary mechanism to value inventory, PD marketplaces failed as stand-alone entities and were gobbled up. Under the steady hand of RTB-based technologies, they are slowly evolving based on buy-side methodologies. Again, the demand side foils a perfectly reasonable, publisher-derived procurement scheme!

So, what’s next?

The Programmatic Direct Era still lives, albeit within private marketplaces (PMPs) and Direct Deal functionality. The IAB’s Open Direct protocol remains stuck at 1.0, but there is hope—and this time it’s a change that is positive for both marketers and publishers. The latest Era in inventory procurement is what I call Total Automation. Let me explain.

Say a big auto manufacturer has a DMP and has identified, via purchase information, the exact profile of everyone who buys their minivan. Call them “Van Moms.” Then suppose the publisher, who licenses an instance of the same DMP, is a women-friendly publication chock full of those Van Moms—and women who just happen to look like Van Moms. It’s pretty easy to pipe those Moms from the marketer right to the publisher. That process, which you might call Programmatic Direct 2.0, is interesting.

It requires no exchanges, no 3rd party data, no DSPs, no “private marketplace,” no SSP, and potentially no agencies (spare the thought!). All it requires is some technology to map users and port them directly into an ad server.

What I just described is happening today, and moving quickly. Marketers are discovering that the change from demo-based buying to purchase-based buying through 1st party data is winning them more customers. Publishers are asking for—and commanding—high CPMs, and those CPMs are backing out for marketers. Thanks to all the crap in open exchanges, paying more for quality premium, “well lit” inventory actually works better than slogging through exchanges trying to find the audience needle in a haystack full of robots and “natural born clickers.”

The new Era of Total Automation will start putting publishers back on the map—but not all of them. The big distinction between the winners and losers will not only be the quality of their audience but, more importantly, the first-party data used to derive that audience. Not long ago, it was easy to apply a layer of 3rd party data and call someone an “auto intender” if they brushed past an article on the latest BMW. But compare that to the quality of an “auto intender” on a car site that has looked at 5 sedans over the last 2 weeks, and also used a loan calculator. There’s no comparison. The latter “intender,” collected from page- and user-level attributes directly by the publisher, is 10 times more valuable (or a $30 CPM rather than $3, if you like). The reason? That user volunteered real, deterministic information about herself that the publisher can validate. I am willing to bet that an auto manufacturer would pay a high CPM for access to an identified basket of those intenders on an ongoing, “always on” basis.

This is fantastic news for publishers that have great, quality inventory and have implemented a first-party data strategy. It’s even better news for the marketers that have embraced data management, and can extract and find their perfect audience on those sites. The Era of Total Automation will be over when every single marketer has a DMP. At that time, we will discover that there is no longer a glut of display inventory—all of the quality “Van Moms” and “Business Travelers” and the like will be completely spoken for. What will be left is a large pile of unreliable, long tail inventory available for the brave DR marketer and his DSP.

I think both marketers and publishers should welcome this new Era of data-driven one-to-one marketing. The crazy thing is that, once we get it right, it looks just like an anonymized version of direct mail—perhaps the oldest, greatest, most effective and measurable marketing tactic ever invented!

[This post originally appeared in AdExchanger on 7/2/15]

Is 2015 the Year of Programmatic Branding?

With companies like Kraft and Kellogg’s starting to leverage the programmatic pipes for equity advertising, we are starting to hear a lot of buzz about the potential for “programmatic branding,” or leveraging ad tech pipes to drive upper-funnel consumer engagement. It makes sense. Combine 20 years of online infrastructure investment with rapidly shifting consumer attention from linear to digital channels, and you have the perfect environment to test whether or not digital advertising can create “awareness” and “interest,” the first two pieces of the age-old “AIDA” funnel.

The answer, put simply, is yes.

Online reach is considerably less expensive than linear reach, and we are starting to have the ability to reliably measure how that brand engagement is generated. Marketers want an “always-on” stream of equity advertising that comes with measurement—but they also need it. With attention rapidly shifting from traditional channels, investments in linear television are starting to return fewer sales. But most marketers are just starting to gain the digital competency to make programmatic branding a reality.

That competency is called data management—the ability to segment, activate, and analyze consumer audiences in a reliable way at scale. Why is that so?

The most fundamental problem with digital branding is that it is truly a one-to-one marketing exercise. If we dream of the “right message, right person, right time,” then matching a user with her devices is table stakes for programmatic branding. How do I know that Sally Smith on desktop is the same as Sally Smith on tablet? Cross-device identity management (CDIM or, alternatively, CDUI) is the key. Device IDs must be mapped to cookies, other mobile identifiers, and Safari browser signals in order to get a sense of who’s who. Once you unlock user identity, many amazing things become possible.

Global Frequency Capping

One of the reasons programmatic branding has yet to gain serious ground with marketers is because of waste. This is both real (lots of wasted impressions due to invisible ads or robotic traffic) and perceived (how many impressions are ineffective due to frequency issues). The former problem is getting solved by smart technology, and already somewhat mitigated by market pricing. But the latter problem is solvable with data management. Assuming the marketer understands the ideal effective frequency of impressions per channel, or on a global basis, a DMP can manage how many impressions an individual sees by controlling segment membership in various platforms. Let’s say the ideal frequency for cereal advertising aimed at Moms is 30 per day across channels. The advertiser knows fewer than 30 impressions lessen effectiveness—and over 30 impressions have negligible impact. Advertisers using multiple channels (direct-to-publisher, plus mobile, video, and display DSPs) are likely over-serving impressions in each channel, and maybe under-serving in key channels like video. Connecting user identity helps control global frequency, and can save literally millions of dollars, while optimizing the effectiveness of cross-channel advertising.

Sequential Messaging

If “right person” technology is enabled as above, then it makes sense to try and get to “right place and right time.” Data management can enable this Holy Grail of branding, helping marketers create relevance for consumers as they embark on the customer journey. What brand marketers have dreamed of is now possible, and starting to happen. Dad, in the auto-intender bucket, gets exposed to a 15 second pre-roll ad before logging into his newspaper subscription on his tablet in the morning; gets the message reinforced by more equity display ads in the afternoon at work; and, while checking messages on his mobile phone on the way home, gets an offer for $500 off with a qualified test drive. After he hits the dealership and checks in via the CRM system, he receives an e-mail thanking him for his visit and reminding him of the $500 coupon he earned. These tactics are not possible without tying both user identity and systems together. Doing so not only enables sequential messaging, but also the ability to test and measure different approaches (A/B testing).

Cross Channel Attribution

How about attribution? It’s impossible to perform cross-channel attribution without knowing who saw what ad. At the end of the day, it’s really about the insights. Procter & Gamble is famous for spending millions of dollars every year to understand the “moment of truth,” or why people choose Tide over Surf detergent. Although they know consumer segmentation and behavior better than anyone, even the biggest brand marketers struggle to gain quality insights from digital channels.

Data management is starting to make a more reliable view possible. Brand advertising is just another form of investment. Money is the input, and the output is sales and—as important—data on what drove those sales. In the past, brand marketers were reliant upon panel-based measurement to judge campaign effectiveness. Now, data management helps brands understand which channels drove results—and how each contributed.

It is early days for truly reliable cross-channel attribution modeling, but we are finally starting to see the death of the “last-click” model. Smart marketers are using data to author their own flexible attribution models, making sure all channels involved receive variable credit for driving the final action. In the near future, machine learning will help drive dynamic models, which flex over time as new signals are acquired. We will then start to see just how effective (or not) tactics like standard display advertising are for driving upper-funnel engagement.
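
As an illustration of what a flexible model can look like, here is a toy fractional-credit attribution sketch in Python; the channel weights are assumptions chosen for the example, not recommended values.

# Toy fractional-credit model: every channel on the converting path gets a
# share of the sale, instead of the last click taking 100% of the credit.
CHANNEL_WEIGHTS = {"video": 0.4, "display": 0.3, "mobile": 0.2, "email": 0.1}

def attribute(conversion_value, touchpoints):
    """Split a conversion's value across the channels that touched the user,
    normalizing weights over the channels actually present."""
    weights = {ch: CHANNEL_WEIGHTS.get(ch, 0.0) for ch in set(touchpoints)}
    total = sum(weights.values()) or 1.0
    return {ch: round(conversion_value * w / total, 2) for ch, w in weights.items()}

# One $30,000 car sale touched by video, display and mobile ads:
print(attribute(30000, ["video", "display", "mobile"]))
# roughly {'video': 13333.33, 'display': 10000.0, 'mobile': 6666.67}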

So, is 2015 the year for programmatic branding? For a select group of marketers that are leveraging data management to enable the best practices outlined above, yes. The more accurately marketers can map online user identity and understand results, the more investment will flow from linear to addressable channels.

An Ad Tech Temperature Check

Clayton Christensen, the father of “disruptive innovation,” would love the ad technology industry.

With more than 2,500 Lumascape companies across various verticals chasing an exit, venture funding drying up for companies that haven’t made an aggressive SaaS revenue case, and the rapid convergence of marketing and ad technology, the next few years will see some dramatic shifts.

A tsunami of powerful megatrends is driving ad technology relentlessly forward at a time when data is king, and the companies that best package data and integrate it into multichannel inventory procurement will be the rulers.

In a world where scale matters most, the big are getting bigger and smaller players are getting forced out, which is not necessarily good for innovation.

Data: Powering The Next Decade Of Ad Tech

Data, especially as it relates to “people data,” is and will be the dominant theme for ad technology going forward.

Monolithic companies with access to a people-based identity graph are leaning heavily into identity management, trying to own the phone book of the connected device era. Facebook’s connection to Atlas leverages powerful and deeply personal deterministic data, volunteered daily by its users, to drive targeting. Google is attaching its massive PII data set garnered through Gmail, search and other platforms to its execution platforms with its new DMP, DoubleClick Audience Manager.

Both platforms prefer to keep information on audience reach safely within their domains, leaving marketers wondering how smart it really is to tie the keys of user identity to media execution inside a “walled garden.”

Will large marketers embrace these platforms for their consumer identity management needs, or will they continue to leverage them for media and keep their data eggs in another basket?

While some run into the arms of powerful cloud solutions that combine data management with media execution, many are choosing to take a “church and state” approach to data and media, keeping them separate. Marketers have to decide whether the risk of tying first-party data together with someone’s media business is worth having an all-in-one approach.

Agencies Must Adapt Or Die As Consultancies Edge Into Programmatic

Media agencies have also been challenged to provide more transparency around the way they procure inventory, the various incentive schemes they have with publishers and their overall methodology for finding audiences. With cross-device proliferation, agencies must be able to identify users to achieve one-to-one marketing programs, and they need novel ways to reach those users at scale.

That means a commitment to automation, albeit one that may come at the expense of revenue models derived through percentage of spend and arbitrage. Agencies will need new ways to add value in a world where demand-side players are finding closer connections to the supply side.

As media margins collapse, agencies need to act as data-driven marketing consultants to lift margins and stay relevant. They face increasing competition from large consultancies whose bread and butter has been technology integration. It’s a tough spot but opportunities abound for smart agencies that can differentiate themselves.

Zombie Companies Die Off But Edge-Case Innovation Continues

We’ve been talking about “zombie ad tech” for years now, but we are finally starting to see the end of the road for many point solution companies that have yet to be integrated into larger mar tech “stacks.”

Data-management platforms with native tag-management capabilities are displacing standalone tag-management companies. Retargeting is a tactic, not a standalone business; it is now a standard part of many execution platforms. Fraud detection systems are slowly being dragged into existing platforms as add-on functionality. Individual data providers are being sucked into distribution platforms and data exchanges that offer customer exposure at scale. The list goes on and on.

This is an incredibly positive thing for marketers and publishers, but it is also a challenge. Cutting-edge technologies that give a competitive advantage are rarely so advantageous after they’ve moved into a larger “cloud.” Smart tech buyers must strike a balance between finding the next shiny objects that confer differentiating value and building a stable “stack” that can scale as they grow.

That said, the big marketing technology “clouds” offered by Adobe, Oracle and Salesforce continue to grow, as they gobble up interesting pieces of the digital marketing “stack.”

Will marketers go all-in on someone’s cloud, build their own “cloud” or leverage services offerings that bring a unified capability together through outsourcing?

Right now, the jury is out, mostly because licensing your own cloud takes more than just money; it also takes the right personnel and company resources to make it work. Yet marketers are starting to understand that the capability to build automated efficiency is no longer just a function of marketing, but a way to leverage people data to drive value across the entire company.

Today’s media targeting will quickly give way to tomorrow’s data-driven enterprise strategy. It’s happening now, and quickly.

New Procurement Models Explode Exchanges, Drive Direct Deals

The most exciting things happening in ad technology right now are in inventory procurement.

Programmatic direct technologies are evolving, adding real audience enablement. Version 1.0 of programmatic direct was the ability to access a futures marketplace of premium blocks of inventory. Most buyers, used to transacting on audience, not inventory, rejected the idea.

Version 2.0 brings an audience layer to premium, well-lit inventory, while changing the procurement methodology. I think most private marketplaces within ad exchanges are placeholders for a while, as big marketers and publishers start connecting real people data pipes together and start to buy directly. It’s happening now – quickly.

I can also see really innovative companies leaning into creating a whole new API-driven way of media planning and buying across channels that makes sense. In the near future, the futures-driven approaches of companies like MassExchange will bring to cross-channel inventory procurement a methodology that is more regulated, transparent and reminiscent of financial markets. It’s a fun space to watch.

Who will begin adding algorithmic, data-science driven automation and proficiency to the planning process, not just execution and optimization in the programmatic space?

Many of those in the ad technology and media game are here for the challenge, the rapid pace of innovation and the opportunity to change the status quo. We are all getting way more than we imagined lately, in a fun, exciting and fast-moving environment that punishes failure harshly, but rewards true market innovation. Stay safe out there.

[This post was originally published in AdExchanger on 6.16.15]

Remarks at ICOM 2015 in San Sebastián

I-COM Global Summit: Panel Discussion on Leveraging Big Data to take Programmatic to the Next Level – Chris O’Hara, Krux Digital, Eric Picard, Mediamath & Tom Simpson, MediaQuark

Chris O’Hara, VP Strategic Accounts, Krux Digital, USA, Eric Picard, VP Strategic Partnerships, Mediamath, UK and Tom Simpson, CEO, MediaQuark, Singapore were speakers and David Smith, CEO & Founder, Mediasmith, USA was moderator in Leveraging Big Data to take Programmatic to the Next Level. This discussion had no presentation.

Remarks at INMA World Congress

Programmatic advertising, once a threat, is now considered an opportunity

11 May 2015 · By Western iMedia

The varied facets of programmatic advertising have become more mainstream in the media industry. Those heavily involved at their media companies see a promising future for this once niche revenue opportunity.

“Efficiency is the name of the game,” said Matt Prohaska, CEO and principal of Prohaska Consulting, speaking about programmatic buying at the World Congress in New York City on Monday. Prohaska turned the discussion over to a panel of experts that explored the future of programmatic.

There are two major aspects of programmatic advertising, said Jeremy Crandall, senior vice president of operations and client services at Adroit Digital. Automated programmatic, which includes real-time bidding (RTB), is the side heard more often.

“To me, programmatic is like a BLT sandwich,” she said. “RTB is the bacon.”

The other aspect is data-driven programmatic, in which buyers leverage data to make informed decisions about which ad impressions to buy.

Crandall explained the different modes of programmatic buying and selling. Open RTB leverages ads not sold by a direct salesperson. These ads are remnant impressions, and the transactions take less than 100 milliseconds.

“It really is a many-to-many marketplace,” she said.

Crandall described private marketplace (PMP) transactions as a walled garden, in which select buyers are invited to participate and which usually involves price floors. PMP is not as impersonal as RTB, Crandall said. Sharing data makes a significant difference in the efficacy of the buy.

“It is really those relationships that still matter a lot,” she said.

Chris O’Hara, vice president of strategic accounts at Krux Digital, gave insight from the data management platform (DMP) side of the programmatic equation.

O’Hara outlined the evolution of publisher ad sales as moving from publisher direct, to ad network 1.0, to the introduction of the DSP era. We are currently in the DMP era, one of “programmatic direct,” O’Hara said.

We are moving toward an era of total automation across channels. Efficient automation, where publishers retain more revenue and advertisers get increased reach for budget, is part of that future.

“This is the wave of the future; it’s happening and it’s super exciting,” O’Hara said.

Jon Usry, director of digital platforms at Dallas Morning News Media, shared his company’s strategic approach to the current and future states of programmatic.

When programmatic first entered the market, the reaction from publishers was one of fear, as buyers were perceived to have the advantage and the quality of these ads seemed low.

This is not the case now, Usry said. Programmatic is seen as an opportunity rather than a threat.

“Certainly, a lot of things have changed significantly,” he said.

Dallas Morning News made the strategic decision to build or purchase digital marketing solutions as programmatic grew in influence.

The current state of programmatic is a level playing field, Usry said. The company holds regular meetings about how to leverage programmatic. They explore all options, he said: “There might be certain situations where we want programmatic to be competing with direct sells.”

Developing a strong plan for programmatic is important to the company. Programmatic media spend is set to double in the U.S. alone, Usry said.

Usry shared where The Dallas Morning News is focusing as they develop their future programmatic strategy:

  1. Establishing programmatic as a core competency.
  2. Selecting the right technology partners.
  3. Embracing a culture of test and learn.
  4. Hiring the right talent.

As data continues to grow and marketers get better tools, Usry says programmatic has a positive future: “For the future of programmatic, I think it looks promising for all parties.”

Read more: http://www.inma.org/blogs/world-congress/post.cfm/programmatic-advertising-once-a-threat-is-now-considered-an-opportunity#ixzz3bGw6O7yI

The Five (New) Things to Expect from a DMP

In early 2012, when data management technology was somewhat nascent, I wrote about “the five things to expect from a DMP.” They were: to unlock the power of one’s first-party data; decrease reliance upon third-party data; generate unique audience insights; use audience data to power new channels; and create efficiency. A little over three years later, those things continue to drive interest in DMP technology—and great value for both publishers and marketers.

The “table-stakes” functionality of DMPs—segmentation, lookalike modeling, targeting, and analytics—continues to resonate. Even the least advanced DMPs have those abilities, and this is what people who buy DMP software should expect from any system. Unfortunately, there are now dozens of “platforms” that claim DMP technology. Some are legitimate players, born from the ground up to be “first-party” DMPs. Some have been created as “lightweight” DMPs to collect and distribute cookies for display advertising. And still others are legacy tag management or network platforms that have bolted on DMP functionality as they work towards a fuller “stack” solution that marketers say they want.

Writing this article again, three years later, I would still encourage software buyers to evaluate their DMP choice on a partner’s ability to meet the criteria listed above. But there has been so much nuance and development over the last several years that additional selection criteria are needed to make a reasonably informed DMP choice going forward.

Here’s what the modern DMP consumer should be looking out for:

  • Lookback: Three years ago I talked about “lookback windows” in the context of giving publishers the ability to attribute future conversion events to ads shown previously on their site. That is still a compelling publisher use case. What “lookback windows” really refer to is whether or not your DMP can capture 100% of the raw, log-level user event data—and store it. This necessitates an open taxonomy (because “you don’t know what you don’t know”), and also the ability to store tons of data and make it accessible quickly. That is what a complete data architecture looks like. Many DMPs operate with a rigid, defined taxonomy and only collect segment IDs—not the underlying data. That’s a problem for businesses that need to move fast and activate new segments opportunistically. Ask how—and for how long—your DMP stores data.
  • Onboarding: Lots of DMPs claim to have the ability to easily ingest CRM and other offline data and match it to cookies, but the truth is everyone depends on a limited set of “onboarding” vendors to provide the matches. That’s fine, but there are some nuances and subtleties involved in the process by which offline data enters the online identity space (hashing; see the sketch after this list). DMPs should enable seamless connection to all three major onboarding providers, offer the ability to select the methodology by which offline identity is matched to online, and also be able to automatically choose which onboarding partner is right for each identity. Ask how each DMP you evaluate works with each vendor, what kind of match rates you can expect, and how each stores persistent user identity to ensure better matches over time.
  • Measurement: Let’s face it, the ability to tweak programmatic audience delivery and nudge online video viewability numbers up a few percentage points is great, but nothing moves the needle like linear television. Marketers spend a ton of money there, and will continue to do so for the foreseeable future—all the while moving incremental percentages of their budget into the digital channels where folks are spending an increasing amount of time. But they are never really going to go full throttle with digital until they can reconcile reach and frequency across channels—and those channels must include linear! Your DMP should be able to handle overlap reporting, light attribution, and cross-channel media performance—but it should also start making some highly informed guesses about how linear audiences map to digital ones, in order to enable true attribution and media mix models. Ask how your DMP is positioned to tie the linear and digital strings together from a measurement perspective.
  • CDIM: Three years ago, we were still waiting for the “year of mobile” to occur, so “cross-device identity management” was still largely pre-funded slideware on some entrepreneur’s computer. Jump to today, and “CDIM” and “CDUI” are at the tip of every ad tech tongue! As more and more people move from device to device—almost none of which support the traditional cookie as an identifier—marketers and publishers desperately need to map devices to people. It’s the only way to deliver the fabled “360-degree view” of the user. Ask your DMP vendor how they are prepared to deliver deterministic matches and, more importantly, how they reconcile identity without seeing a user log in across devices. Doing great probabilistic matching necessitates not only strong algorithms but, more importantly, a scale of users that breeds precise models. What is the size of their “truth set” of user data with which to probabilistically determine user identity? The quality and scale of that data will determine your choice.
  • Data Governance: I think the biggest question to ask a potential DMP vendor concerns their philosophy on data ownership. For both marketers and publishers, audience data is likely one of their top three assets. Trusting such data to a technology vendor is not something to be considered lightly. How is that data stored? What are the policy controls available to help you share that data with trusted partners? What about privacy and governance? How can my platform help me activate data in different places, where different rules about PII and data collection and storage apply? Knowing the answers before you buy can save lots of heartache (and legal fees) later. More importantly, how independent is your data? Is your partner also in the business of selling media or data? That can create some conflicts of interest—especially if your data might be valuable to a competitor. Finally, what if you want your data back? You should have the right to get it out quickly, and in a usable format.
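
For the onboarding point above, here is a minimal sketch of the hashing step by which offline CRM records are commonly matched to online identity: the email is normalized and hashed so a match key, not the raw PII, moves between systems. This is illustrative only, not any specific onboarder’s specification.

import hashlib

def hashed_email(raw_email):
    """Normalize and hash an email so a CRM record can be matched to an
    online identity without exposing the address in the clear."""
    normalized = raw_email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same person produces the same match key on both sides of the onboard:
print(hashed_email("Sally.Smith@example.com"))
print(hashed_email(" sally.smith@example.com "))  # identical digest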

The bad news is that choosing a DMP isn’t any easier than it was three years ago. It’s a lot more complex, and you really need to dig in deeply to understand the very small nuances between platforms that appear, on the surface, to be very much the same. The good news is that there is a great deal of selection available, and some very high quality vendors to choose from. Take your time, put your vendors through a very rigorous process that includes asking the questions outlined above, and choose wisely!

[This post originally appeared in the EConsultancy blog on 5.11.15]

What Marketers want from AdTech

If you read AdExchanger regularly, you might think that nearly every global marketer has a programmatic trading strategy. They also seem to be leveraging data management technology to get the fabled “360-degree view” of their customers, to whom they are delivering concise one-to-one omnichannel experiences.

The reality is that most marketers are just starting to figure this out. Their experience ranges from asking, “What’s a DMP?” to “Tell me your current thinking on machine-derived segmentation.”

A small, but significant, number of major global marketers are aggressively leaning into data-driven omnichannel marketing, pioneering a trend that is not going anywhere anytime soon. Over the next five years, nearly every global marketer will have a data-management platform (DMP), programmatic strategy and “chief marketing technologist,” a hybrid chief marketing officer/chief information officer that marries marketing and technology. These are exciting times for people in data-driven marketing.

So, what are marketers looking for from technology today? Although these conversations ultimately become technical in nature, you soon discover that marketers want some pretty basic, “table stakes” type of stuff.

Better Segmentation Through First-Party Data 

Marketers spend a lot of time building customer personas. Once a customer is in their customer relationship management (CRM) database and generates some sales data, it’s pretty easy to understand who they are, what they like to buy and where they generally can be found. From a programmatic perspective, these are the equivalent of a car dealer’s “auto intenders,” neatly packaged up by ad networks and data providers to be targeted in exchanges.

That’s still available today, but the amazing amount of robotic traffic, click fraud and media arbitrage has made marketers realize just how loose some segment definitions may be. Data companies have a great deal of incentive to create and sell lots of auto intenders, so marketers are starting to look deeper at how such segments are actually created. It turns out that some auto intenders are people who brushed past a car picture on the web, which lumped them into a $12 cost per mille (CPM) audience segment.

Those days seem to be coming to an abrupt close as marketers increasingly use their own data to curate such segments and premium publishers, which do have auto intenders among their readerships, use data-management tools to make highly granular segments available directly to the demand side. Marketers are now willing to pay premium prices for premium audiences in a dynamic being driven by more transparency into how audiences are created in the first place. Audiences comprised of first- and second-party data will win every time in a transparent ecosystem. 

Less Waste, More Efficiency

Part and parcel of better audience segmentation is less waste and more media efficiency. The old saw, “I know half of my marketing works, I just don’t know which half,” goes away with good data and better attribution.

As an industry, we promised to eliminate waste 20 years ago. The banner ad was supposed to usher in a brave new world of media accountability, but we ended up creating a hell of a mess. Luckily, venture money backed “solutions” to the problems of click fraud, faulty measurement and endless complexity in digital marketing workflow.

Marketers don’t want to buy more technology problems they need to fix. And they don’t want to spend money chasing the same people around the web. They want to limit how much they spend trying to achieve reach. Data-management technology is starting to rein in wasteful spending, via tactics including global frequency management, more precise segmentation, overlap analysis and search suppression.

Marketers want to use data to be more precise. They are starting to leverage systems that help them understand viewability and get a better sense of attribution by moving away from stale last-click models. The days are numbered for marketers with black-box technology that creates a layer between their segmentation strategies and how performance is achieved against it.

One-To-One Communication Via Cross-Device Identity

Maybe the biggest trend and aspiration among marketers is the ability to truly achieve one-to-one marketing. A few years ago, that meant email, telemarketing and direct mail. Today, if you want to have a one-to-one customer relationship, you must be able to associate the “one” person with as many as five or six connected devices.

That is extremely difficult, mostly because we have been highly dependent on the browser-based “cookie” to determine identity. Cookie-based technologies evolved to ensure different cookies match up in different systems, but it’s a new world today.

Really understanding user identity means being able to reconcile different device signals with a universal ID. That means lots of cookies from different browsers, Safari’s unique browser signature, IDFAs, Android device IDs and even signals from devices like Roku, not to mention reliably “onboarding” anonymized offline data, such as CRM records.

Without device mapping, an individual looks like seven different devices to a marketer, making it impossible to deliver the “right message, right place, right time.” Frequency management is tougher, attribution models start to break and sequential messaging is hard to do. Marketers want a reliable way to reconcile user identity across devices so they can adapt their messages to each consumer’s situation.
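
A minimal sketch of the device mapping described above, assuming deterministic keys (a login or a hashed CRM match) let us link identifiers to one person; the IDs are made up, and a production identity graph is far richer than a dictionary.

# Hypothetical device graph: every observed identifier (cookie, IDFA,
# Android ID, hashed CRM key) points to a single person-level ID.
device_to_person = {}

def link(person_id, device_ids):
    """Map each identifier observed for this person to one universal ID."""
    for device_id in device_ids:
        device_to_person[device_id] = person_id

def resolve(device_id):
    """Return the person behind a device signal, or None if unmapped."""
    return device_to_person.get(device_id)

link("person_001", ["cookie_abc", "idfa_123", "androidid_456"])

# Without the map these look like three different people; with it, one.
print(resolve("idfa_123"))    # person_001
print(resolve("cookie_abc"))  # person_001
print(resolve("roku_789"))    # None: an unmapped signal still breaks frequency capping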

Data-Derived Insights 

Marketers inject tons of dollars into the advertising ecosystem and expect detailed performance reports. Each dollar spent is an investment. Some dollars create sales results, but all dollars spent in addressable channels create some kind of data.

Surprisingly, that data is still mostly siloed, with social data signals not connected to display results. Much of it is delivered in the form of weekly spreadsheets put together by an agency account manager. It seems crazy that marketers can’t fully take advantage of all the data produced by their digital marketing, but that is still very much the reality of 2015.

Thankfully, that dynamic is changing quickly. Data technology is rapidly offering a “people layer” of intelligence across all channels. Data coming into a central system can look at campaign performance across many dimensions, but the key is aggregating that data at the people level. How did a segment of “shopping cart abandoners” perform on display vs. video?

Marketers now operate under the new but valid assumption that they will be able to track performance in this way. They are starting to understand that every addressable media investment can create more than just sales – it can produce data that helps them get smarter about their media investments going forward.
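
A small sketch of the kind of people-level roll-up described above, grouping campaign results by segment and channel rather than by campaign; the segments, channels and numbers are invented for illustration.

from collections import defaultdict

# Hypothetical people-level event log joining campaign results to segments.
events = [
    {"segment": "cart_abandoners", "channel": "display", "impressions": 1000, "conversions": 12},
    {"segment": "cart_abandoners", "channel": "video",   "impressions": 800,  "conversions": 20},
    {"segment": "loyalists",       "channel": "display", "impressions": 500,  "conversions": 9},
]

def conversion_rate_by_segment_channel(rows):
    """Aggregate performance at the (segment, channel) level instead of per campaign."""
    totals = defaultdict(lambda: {"impressions": 0, "conversions": 0})
    for row in rows:
        key = (row["segment"], row["channel"])
        totals[key]["impressions"] += row["impressions"]
        totals[key]["conversions"] += row["conversions"]
    return {key: t["conversions"] / t["impressions"] for key, t in totals.items()}

for key, rate in conversion_rate_by_segment_channel(events).items():
    print(key, round(rate, 4))
# In this toy data, cart abandoners convert at 0.025 on video vs. 0.012 on display.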

It’s a great time to be a data-driven marketer.

[This post originally appeared in AdExchanger on 4.6.15]

CDIM is Table Stakes in the Data Management Wars

A recent analyst report made an astute observation that all marketers should consider: It’s not about “digital marketing” anymore – it’s about marketing in a digital world. The nuance there is subtle, but the underlying truth is huge. The world has changed for marketers, and it’s more complicated than ever.

Most consumers spend more time on web-connected devices than on television, creating a fragmented media landscape where attention is divided among multiple devices and thousands of addressable media outlets. For marketers, the old “AIDA” (attention, interest, desire and action) funnel persists, but fails in the face of the connected consumer.

When television, print and radio dominated, moving a consumer from product awareness to purchase had a fairly straightforward playbook. Today’s always-on, connected consumer is on a “customer journey,” interacting with social media, review sites, pricing guides and blogs, and chatting with friends to decide everything from small supermarket purchases to big investments like a new house or car.

Marketers want to be in the stream of the connected consumer and at key touch points on the customer journey. But, in order to understand the journey and be part of it, they must be able to map people across their devices. This is starting to be known as cross-device identity management (CDIM), and it is at the core of data-driven marketing.

In short, identity lies at the heart of successful people data activation.

Until very recently, managing online identity was largely about matching a customer’s online cookie with other cookies and CRM data, in order to ensure the desktop computer user was aligned with her digital footprint. Today, the identity landscape is highly varied, necessitating matching ID signals from several different browsers, device IDs from mobile phones and tablets, IDs from streaming devices and video game consoles and mobile app SDKs.

Matching a single user across their various connected devices is a challenge. Matching millions of users across multiple millions of devices is both a big data and data science challenge.

Real one-to-one marketing is only possible when the second party – the customer – is properly identified. This can be done using deterministic data – information people volunteer about themselves – or in a probabilistic manner, where the marketer infers who the person is based on certain behavioral patterns and signals. Most digital marketing companies that offer identity management solutions take what data they have and use a proprietary algorithm to try and map device signals to users.

The effectiveness of device identity algorithms depends on two factors: the quality of the underlying deterministic data – the “truth set” – and its scale.

Data Quality Matters

There is data, and then there is data. The old software axiom of “garbage in, garbage out” certainly applies to cross-device user identity. Truly valuable deterministic data include things like age, gender and income data. In order to get such data, web publishers must offer their visitors a great deal of value and be trusted to hold such information securely. Therefore, large, trusted publishers – often with subscription paywalls – are able to collect highly valuable first-party user data.

Part of the quality equation also relates to the data’s ability to unlock cross-device signals. Does the site have users that are logged in across desktop, mobile phone and tablet? If so, those signals can be aggregated to determine that Sally Smith is the same person using several different devices. Publishers like The Wall Street Journal and The New York Times meet these criteria.

Scale Is Critical

In order to drive the best probabilistic user matches, algorithms need huge sets of data to learn from. In large data sets, even small statistical variances can yield surprising insights when tested repeatedly. The larger the set of deterministic data – the “truth” of identity – the better the machine is able to establish probability. A platform seeing several million unique users and their behavioral and technographic signatures may find similarities, but seeing billions of users will yield the minuscule differences that unlock the identity puzzle. Scale breeds precision, and precision counts when it comes to user identity.
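
A toy sketch of how quality and scale come together: candidate device pairs are scored on overlapping signals, and the predictions are checked for precision against the deterministic truth set of users seen logging in on both devices. The weights, signals and threshold are assumptions for illustration, not a real vendor’s model.

def match_score(device_a, device_b):
    """Toy probabilistic score that two devices belong to the same person,
    based on overlapping signals; the weights are illustrative only."""
    score = 0.0
    if device_a["home_ip"] == device_b["home_ip"]:
        score += 0.5
    shared_sites = device_a["top_sites"] & device_b["top_sites"]
    score += min(0.5, 0.1 * len(shared_sites))
    return score

def precision_against_truth(scored_pairs, truth_pairs, threshold=0.7):
    """Share of predicted matches confirmed by the deterministic truth set."""
    predicted = {pair for pair, score in scored_pairs if score >= threshold}
    return len(predicted & truth_pairs) / len(predicted) if predicted else 0.0

laptop = {"home_ip": "203.0.113.7", "top_sites": {"news.com", "cars.com", "mail.com"}}
tablet = {"home_ip": "203.0.113.7", "top_sites": {"cars.com", "mail.com"}}
pairs = [(("laptop_cookie", "tablet_idfa"), match_score(laptop, tablet))]
print(precision_against_truth(pairs, truth_pairs={("laptop_cookie", "tablet_idfa")}))  # 1.0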

As digital lives evolve beyond a few devices into more connected “things,” having a connected view of an individual is a top priority for marketers that want to enable the one-to-one relationship with consumers. Reliably mapping identity across devices opens up several possibilities.

Global Frequency Management: Marketers that leverage multiple execution platforms, including search, email, display, video and mobile, have the ability to limit frequency in each platform. That same user, however, looks like five different people without centralized identity management.

Many marketers don’t understand what ideal message frequency looks like at the start of a campaign, and most are serving ads far above the optimal effective frequency, resulting in large scale waste. Data management platforms can control segment membership across many different execution platforms and effectively cap user views at a “global” level, ensuring the user isn’t over-served in one channel and underserved in another.

Sequential Messaging: Another benefit of cross-device identity is that a user can be targeted with different ads based on where they are in the consumer journey. Knowing where a consumer is in an established conversion path or funnel is a critical part of creative decisioning. Optimizing the delivery of cross-channel messages at scale is what separates tactical digital marketers from enterprise-class digital companies that put people data at the heart of everything they do.

Customer Journey Modeling: Without connecting user identity in a centralized platform, understanding how disparate channels drive purchase intent is impossible. Today’s models bear the legacy of desktop performance metrics, such as last click, or have been engineered to favor display tactics, including first view. The true view of performance must involve all addressable channels, and even consider linear media investment that lacks deterministic data. This is challenging, and all but impossible without cross-device identity management in place.

The ubiquity of personal technology has transformed today’s consumers into “digital natives” who seamlessly switch between devices, controlling the way they transmit and receive information. Marketers and publishers alike must adapt to a new reality that puts consumers in control of how editorial and advertising content is accessed. Delivering the right consumer experience is the new battleground for CMOs. Unlocking identity is the first step in winning the war.

[This post originally appeared in AdExchanger on 3.16.15]

The Agency’s Role in Data Management

Twenty years after the first banner ad, the programmatic media era has firmly taken hold. The Holy Grail for marketers is a map to the “consumer journey,” a circuitous route filled with multiple addressable customer touchpoints. With consumers spending more of their time on mobile devices – and interacting with brands like never before through social channels, review sites, pricing comparison sites and apps – how can marketers influence customers everywhere they encounter a brand?

It’s a tough nut to crack, but it is starting to become an achievable reality for companies dedicated to collecting, understanding and activating their data. Marketers are starting to turn towards data management platforms (DMPs), which help them connect people with their various devices, develop granular audience segments, gain valuable insights and integrate with various platforms where they can activate that data. In addition to technology, marketers also have to configure their entire enterprises to align with the new data-driven realities on the ground.

The question is: Where do marketers turn for help with this challenging, enterprise-level transition?

Many argue that agencies cannot support the type of deep domain expertise needed for the complicated integrations, data science and modeling that has become an everyday issue in modern marketing. But should data management software selection and integration be the sole province of the Accentures and IBMs of the world, or is there room for agencies to play?

For lots of software companies, having an agency in between an advertiser and their marketing platform sounds like a problem to overcome, rather than a solution. Many ad tech sellers out there have lamented the process of the dreaded agency “lunch and learn” to develop a software capability “point of view” for a big client.

Yet, there are highly compelling ways agencies add value to the software selection process. The best agencies insert themselves into the data conversation and use their media and creative expertise to influence what DMPs marketers choose, as well as their role within the managed stack.

From Digital To Enterprise

It makes perfect sense that agencies are involved with data management. The first intersection of data and media added the “targeting” column to the digital RFP. Agencies have started to evolve beyond the Excel-based media planning process to start their plans with an audience persona that is developed in conjunction with their clients. Today, plans begin with audience data applied to as many channels as are reachable. Audience data has moved beyond digital to become universal.

Agencies have also been at the tip of the spear, both from an audience research standpoint (understanding where the most relevant audiences can be found across channels) and an activation standpoint (applying huge media budgets to supply partners). Since they are on the front lines of where media dollars are expressed, they often get the first practical look at where data impacts consumer engagement. During campaigns and after they conclude, the agency also owns the analytics piece. How did this channel, partner and creative perform? Why?

Having formerly limited agencies to doing campaign development and execution, marketers are now turning to the collected expertise of their agency media and analytics teams and asking them to embed the culture of audience data into their larger organization. When it’s time to select the DMP—the internal machine that will drive the people-based marketing enterprise—the agency is naturally called upon.

Data Management Is About Ownership

Although a small portion of innovative marketers have begun leveraging DMP technology and taken media execution “in-house,” the vast majority still relies on agencies and ad tech platform partners to operate their stacks through a managed services approach. Whether a marketer should own the capability to manage its own ad technology stack is a matter of choice, but data ownership shouldn’t be. Brands may not want to own the process of applying audience data to cross-channel media, but they absolutely must own their data.

Where Agencies Play in Data Management

The Initial Approach: Most agencies have experience leveraging marketers’ first-party data through retargeting on display advertising. In an initial DMP engagement, marketers will rely on their agencies to build effective audience personas, map those to available attributes that exist within the marketer’s taxonomy and apply the segments to existing addressable channels. Marketers can and should rely on past campaign insights, attribution reports and other data insights from their agencies when test-driving DMPs.

Connect the Dots: For most marketers, agencies have been the de-facto connector of their diverse systems. Media teams operate display, video and mobile DSPs, ad serving platforms, and attribution tools. Helping a marketer and their DMP partner tie these execution platforms together, understand audience data, and the performance data generated from campaigns is a critical part of a successful DMP implementation.

Operator: Last, but not least, is the agency as operator of the DMP. Marketers want their data safely protected in their own DMP, with strong governance rules around how first-party data is shared. They also need a hub for utilizing third-party data and integrating it with various execution and analytics platforms. Marketers may not want to operate the DMP themselves, though. Agencies can win by helping marketers wring the most value from their platforms.

Marketers have strong expertise in their products, markets and customer base – and should focus on their core strengths to grow. Agencies are great at finding audiences, building compelling creative and applying marketing investment dollars across channels, but are not necessarily the right stewards of others’ data.

Future success for agencies will come from helping marketers implement their data management strategy, align their data with their existing technology stack and return insights that drive ongoing results.

[This post originally appeared in AdExchanger on 2.2.15]


2015 is the Year of Programmatic Branding

With companies like Kraft and Kellogg’s starting to leverage the programmatic pipes for equity advertising, we are starting to hear a lot of buzz about the potential for “programmatic branding,” or the use of ad tech pipes to drive upper-funnel consumer engagement.

It makes sense. Combine 20 years of online infrastructure investment with rapidly shifting consumer attention from linear to digital channels, and you have the perfect environment to test whether or not digital advertising can create “awareness” and “interest,” the first two pieces of the age-old “AIDA” funnel.

The answer, put simply, is yes.

Online reach is considerably less expensive than linear reach, and we are starting to have the ability to reliably measure how that brand engagement is generated. Marketers don’t just want an “always-on” stream of brand advertising that comes with measurement – they also need it. With attention rapidly shifting from traditional channels, investments in linear television are starting to return fewer sales.

But most marketers are just starting to gain the digital competency to make programmatic branding a reality. That competency is called data management – the ability to segment, activate and analyze consumer audiences in a reliable way at scale.

The most fundamental problem with digital branding is that it is truly a one-to-one marketing exercise. If we dream of the “right message, right person, right time,” then matching a user with her devices is table stakes for programmatic branding. How do I know that Sally Smith on desktop is the same as Sally Smith on tablet?

Cross-device identity management is the key. Device IDs must be mapped to cookies, other mobile identifiers and Safari browser signals to get a sense of who’s who. Once you unlock user identity, many amazing things become possible.

Global Frequency Capping

One of the reasons programmatic branding has yet to gain serious ground with marketers is because of waste. This is both real, including all those wasted impressions due to invisible ads or robotic traffic, and perceived, such as impressions that are ineffective due to frequency issues.

Smart technology and market pricing solves the first problem, while data management solves the second. Assuming the marketer understands the ideal effective frequency of impressions per channel, or on a global basis, a DMP can manage how many impressions an individual sees by controlling segment membership in various platforms. Let’s say, for example, the ideal frequency for cereal advertising aimed at moms is 30 per day across channels. The advertiser knows showing fewer than 30 impressions reduces effectiveness, while more than 30 impressions has a negligible impact. Advertisers using multiple channels, such as direct-to-publisher, plus mobile, video and display DSPs, are likely overserving impressions in each channel and possibly underserving in key channels like video. Connecting user identity helps control global frequency and can save literally millions of dollars, while optimizing the effectiveness of cross-channel advertising.

Sequential Messaging

If “right person” technology is enabled as above, the next logical step is to try and get to “right place and right time.” Data management can enable this holy grail of branding, helping marketers create relevance for consumers as they embark on the customer journey. What brand marketers have dreamed of is now possible and starting to happen.

Dad, in the auto-intender bucket, is exposed to a 15-second pre-roll ad before logging into his newspaper subscription on his tablet in the morning. The message is reinforced by more equity display ads he sees in the afternoon at work. And while checking messages on his mobile phone on the way home, he receives an offer for $500 off with a qualified test drive. After Dad hits the dealership and checks in through the CRM system, he receives an email thanking him for his visit and reminding him of the $500 coupon he earned.

These tactics are not possible without tying user identity and systems together. Doing so not only enables sequential messaging, but also the ability to test and measure different approaches through A/B testing.

Cross-Channel Attribution

How about attribution? It’s impossible to perform cross-channel attribution without knowing who saw what ad. At the end of the day, it’s really about the insights.

Procter & Gamble is famous for spending millions of dollars every year to understand the “moment of truth,” or why people choose Tide over another detergent. Although they know consumer segmentation and behavior better than anyone, even the biggest brand marketers struggle to gain quality insights from digital channels.

Data management is starting to make a more reliable view possible. Brand advertising is just another form of investment. Money is the input. The output is sales and, just as important, the data on what drove those sales. In the past, brand marketers relied on panel-based measurement to judge campaign effectiveness. Now, data management helps brands understand which channels drove results and how each contributed.

It is early days for truly reliable cross-channel attribution modeling, but we are finally starting to see the death of the “last-click” model. Smart marketers use data to author their own flexible attribution models, making sure all channels involved receive variable credit for driving the final action. In the near future, machine learning will help drive dynamic models, which flex over time as new signals are acquired. We will then start to see just how effective – or not – tactics like standard display advertising are for driving upper-funnel engagement.

Is 2015 the year for programmatic branding? For marketers that are leveraging data management to enable the best practices outlined above, the answer is yes. The more accurately marketers can map online user identity and understand results, the more investment will flow from linear to addressable channels.

[This post originally appeared on 1.4.2015 in AdExchanger]

How Can Advertisers Bypass The Industry’s Walled Gardens?

In this increasingly cross-device world, marketers have been steadily losing the ability to connect with consumers in meaningful ways. Being a marketer has gone from three-martini lunches where you commit to a year’s worth of advertising in November to a constant hunt for new and existing customers along a multifaceted “customer journey” where the message is no longer controlled.

Consumers’ attention migrates from device to device, where they spread their limited attention among multiple applications. It has become a technology game to try and track them down, and it is starting to become a big data game to serve them the “right message, at the right place, at the right time.”

Modern ad tech is supposed to be the marketer’s savior, helping him sort out how to migrate budgets from traditional media, such as TV, radio and print, to the addressable channels where people now spend all of their time. Marketers and their agencies need a technology “stack,” but they end up with a hot mess of different solutions, including various DSPs for multiple channels, content marketing software and ad servers.

Operating and managing all of them is possible, but laborious and difficult to do right. Worse still, these systems are nearly impossible to connect. Am I targeting the same consumer over and over through various channels? How do I manage the messaging, frequency and sequencing of ads?

Since all of these systems purport to connect marketers to customers on the audience level, the coin of the realm is data. It’s not just “audience data” but actual data on the individuals the marketer wants to target.

Marketing is now a people game.

Yet, in the cross-channel, evolving world of addressable media, connecting people to their various devices is difficult. You need to see a lot of user data, and you have to collect not only web-based event data, but also mobile data where cookies don’t exist. Deterministic data, such as a website’s registration data, can lay the foundation for identity. When that foundation is blended with probabilistic data modeled from user behavior and other signals, it becomes possible to find an individual.

Right now, the overlords of the people marketing game are platforms like Google, where people are happy to stay logged in to their email application on desktop, mobile and tablet, or Facebook, which knows everything because we are nice enough to tell them. Regular publishers may be lucky enough to have subscription users that log in to desktop and mobile devices, but most publishers don’t collect such data. Their ability to deliver true one-to-one marketing to their advertisers is limited to their ability to identify users.

This dynamic rapidly makes the big “walled gardens” of the Internet the only place big marketers can go to unlock the customer journey. That might work for Google and Facebook shareholders and employees, but it’s not good for anyone else. In our increasingly data-dependent world, not all marketers are comfortable borrowing the keys to user identity from platforms that sell their customers advertising. Soon, everyone will have to either pay a stiff toll to access such user data, or come up with innovation that enables a different way to unlock people-centric marketing.

What is needed is an independent “truth set” that advertisers can leverage to match their anonymous traffic with rich customer profiles, so they can actually start to unlock the coveted “360-degree view of the user.” Not only does a large truth set of users create better match rates with first-party data to improve targeting, but it also holds the key to making things like lookalike modeling and algorithmic optimization work. Put simply, the more data the machine has to work with, the more patterns it finds and the better it learns. In the case of user identity, the probabilistic models most DMPs deploy today are very similar. Their individual effectiveness depends on the underlying data they can leverage to do their jobs.

In the new cross-device reality: If you can’t leverage a huge data set to target users, it’s time to take your toys and go home. Little Johnny doesn’t use his desktop anymore.

Think about the three principal assets most companies have: their brand, their intellectual property and products, and their customer data. Why should a company make a third of its internal value dependent upon a third party, whether or not they pledge “no evil”? Those that offer a “triple play” of mobile, cable television and phone services are also among the few companies that can match a user across various devices. The problem? They all sell, or facilitate the sale of, lots of advertising. Marketers are not sure they want to depend on them for unlocking the puzzle of user identity.

Some of the greatest providers of audience data are independent publishers who, banded together, can create great scale and assemble a truth set as great as Facebook and Google. Maybe it’s time to create a data alliance that breaks the existing paradigm. The “give to get” proposition would be simple: Publishers contribute anonymized audience identity data to a central platform and get access to identity services as a participant. This syndicate could enable the deployment of a universal ID that helps marketers match consumers to their devices and create an alternative to the large walled gardens.

The real truth is that, without banding together, even great premium publishers will have a hard time unlocking the enigma of cross-device identity for marketers. Why not build a garden with your neighbors, rather than play in somebody else’s?

[This post was originally published in AdExchanger on 12.11.14]

The Future of Programmatic is Programmatic Futures

Back in 2007, a company called TRAFFIQ started one of the first programmatic futures exchanges. The idea was simple: publishers committed blocks of premium inventory into the exchange at a stated price (say, a block of 500,000 homepage rectangle units in November at $8 CPM), and advertisers could construct packages of premium inventory at discounted prices by making future commitments. Basically, a better, faster way to buy digital guaranteed.

The idea never really took off. Publishers didn’t really understand how to value their inventory in the future, real-time enablement was just starting to take off, and advertisers and their agencies were deeply stuck in manual inventory procurement run by spreadsheets and fax machines. (TRAFFIQ went on to build some highly innovative workflow automation software, and is now a successful technology-enabled digital agency.)

Almost eight years later, we are living in a fully programmatic world—but many of the benefits of programmatic futures have yet to come true. Today’s “programmatic” is still very focused on RTB, inventory pools are still murky, and technology’s ability to value publisher inventory still has a long way to go. What’s missing?

The problem with today’s programmatic RTB environment is that the “exchanges” aren’t really true exchanges like we have in the financial markets. Although you can liken online inventory to stocks, the comparison is tough to justify. Lacking agreed measurement, value continues to be in the eye of the beholder. More importantly, the procurement process is still driven more by the buyers than the sellers. Private exchanges are starting to make inroads in terms of creating valid counterparty transactions, but the RTB pipes have not been engineered to handle the key aspects of transactional workflow.

The biggest, fundamental problem with RTB is that it values inventory in a singular way. In the open market, a 30-year-old male car intender costs the same whether you find him on Cars.com or Hotmail. Although tweaks in RTB with private transactions can enable premium inventory procurement, it’s not scalable. The right exchange should be able to value audience separately from everything else.

Another issue is the problem of valuing inventory over time. As a publisher with 30 days to go in my quarter, my homepage inventory may be worth $10 CPM. But, the day before the quarter closes, that same inventory may be worth only $1 CPM if I haven’t sold it yet. Today’s networks and exchanges enable publishers to set a solid floor price, but have trouble managing value dynamically. That’s because future publisher pricing is not being matched with visible demand. Ironically, the real-time nature of today’s exchanges actually limits a publisher’s ability to manage yield, because every impression is always chasing an immediate bid. A real futures exchange would enable a publisher to value inventory dynamically, so it matches the value set not by bids—but by buy orders in the system (real, stated demand for future inventory).

Although the demand side has it pretty good right now, a true programmatic futures exchange could be truly game changing. Yes, today’s exchanges are serious arbitrage machines. Because the buyer has access to the entire market, they have more information available to them to manage their investment. The problem is, in programmatic RTB, they are stuck with a two-tiered system: secure “premium” inventory through private exchanges and/or DealID functionality for branding and demand creation, and drive lower-funnel activity through performance-driven bidded buying in the wilds of the exchange. Ask any trading desk manager—it’s still really hard to get exactly what you want without going to guaranteed buys, cross-channel buying still requires multiple systems, and communicating the value of the “media investments” you are making to clients is near impossible, because everything is bought and measured differently. So, going from “media buying” to true “media investment” necessitates a true programmatic future exchange, akin to NASDAQ or the NYSE.

In such an exchange, publishers would be able to value their inventory by utilizing a combination of their existing rate cards and product catalogs (for selling advance contracts), and data from buy orders in the market itself. Just like in the stock market, prices would fluctuate based on the spread between bid and ask pricing—and the contract date. Publishers could therefore execute any type of guaranteed buy in such a system (direct sales) as well as have the exchange handle direct deals (“programmatic direct” and “private market”). This is because such an exchange would manage matchmaking, not the execution piece. This is critical. Today, we are watching systems built from the ground up to deliver ads try and go in reverse to manage the process of buying them. As we have seen, the rise of “automated guaranteed” platforms suggests that RTB is not quite cut out for the job.
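
To show what that matchmaking might look like, here is a toy sketch in Python where a buy order for a future month fills against the cheapest qualifying publisher asks. The publishers, placements and prices are invented, and a real exchange would also handle clearing, settlement and resale of contracts.

# Toy matchmaking for future inventory contracts: a buy order fills against
# the cheapest asks for the same placement and delivery month, as long as
# the bid CPM meets or beats the ask CPM. All values are illustrative.
asks = [
    {"publisher": "NewsCo",   "placement": "homepage_300x250", "month": "2015-11", "units": 500_000, "ask_cpm": 8.0},
    {"publisher": "SportsCo", "placement": "homepage_300x250", "month": "2015-11", "units": 300_000, "ask_cpm": 7.5},
]

def match(bid):
    """Fill one buy order against qualifying asks, cheapest CPM first."""
    fills, remaining = [], bid["units"]
    candidates = [a for a in asks
                  if a["placement"] == bid["placement"]
                  and a["month"] == bid["month"]
                  and a["ask_cpm"] <= bid["bid_cpm"]]
    for ask in sorted(candidates, key=lambda a: a["ask_cpm"]):
        take = min(remaining, ask["units"])
        fills.append({"publisher": ask["publisher"], "units": take, "cpm": ask["ask_cpm"]})
        remaining -= take
        if remaining == 0:
            break
    return fills

order = {"placement": "homepage_300x250", "month": "2015-11", "units": 600_000, "bid_cpm": 8.0}
print(match(order))  # 300k units at $7.50 from SportsCo, then 300k at $8.00 from NewsCo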

Why would the demand side want a true programmatic futures exchange? First of all, such an exchange treats media as a genuine commodity and makes it tradable. The beauty of a commodities exchange is that, once you own a futures contract for pork bellies, you can sell it. In digital media, once you have bought a bunch of 300x250s in AppNexus, you are stuck with them. Arbitrage is not the same as futures trading in a regulated market. A true programmatic futures exchange for media would enable well-heeled buyers to leverage their scale to consolidate positions in media and resell them in bulk (or in chunks). Think about that. Imagine GroupM buying the entire Q4 consumer electronics inventory in February. What would that be worth to another agency representing Sony as the holidays approached?

The bottom line is that, despite the power of RTB pipes, we are a long way away from seeing the platforms where addressable media will be traded in the future. Eight years ago, my bet was on a programmatic futures exchange, and I am still long.

[This post originally appeared on 11/7/2014 in AdExchanger]

Mobile is the Beast

Mobile is truly the biggest opportunity in advertising right now. Sorry, but nothing else even comes close.

Not only are mobile devices nearing ubiquity – research shows they’re owned by more than nine out of 10 earthlings – but smartphones are nearing ubiquity in the developed world, too, with 56% penetration. People are on mobile all the time, and more than half of them use a mobile device as their primary way of accessing the Internet. At 1.8 hours a day, media consumption on mobile devices now surpasses both television (1.5 hours) and desktop (1.6 hours). If marketers matched their investment in mobile advertising, now at just 4% of media budgets, to the amount of time we spend there – 20% of our time – a lot of people would make a lot of money.

Not only is mobile the fastest-growing, most exciting place to be in advertising right now, it’s where the biggest opportunities are. Did you know that 44% of Fortune 100 companies don’t have a mobile-optimized website? That is insane.

Mobile is now “first among equals” when it comes to marketing channels, and every advertiser should think that way when they start putting their plans together for 2015.

Everything Has Already Changed Forever

Procter & Gamble loves to talk about “the moment of truth,” which is when a consumer stands in front of a store shelf and chooses between two products. Why did they buy Tide detergent instead of Surf? There are a lot of emotional connections between brands and people, whether you are buying soap or deciding on your next big-ticket item, like a dishwasher. Although brands still need to make an emotional connection, there is an entirely new dynamic driving the many different “moments of truth” we have every day.

Today, we also have what Google calls the “zero moment of truth”: every consumer with a smartphone, standing in front of that shelf, can find out every good and bad thing ever written about a product. So, as a marketer, how do you handle the fact that every one of your customers holds the accumulated knowledge of the universe in their hands at all times? They can get all the reviews, see all the coupons and deals, and ask their friends before making a decision. That’s going to keep us all busy for years to come in ad tech.

Stop Saying ‘Funnel’

Mobile killed the sales funnel. Somehow, over the last year or so, the AIDA funnel died a quiet death after 116 years. The idea of driving potential customers through a process of “attention, interest, desire and action” has been replaced with something we now call the “customer journey” – a circuitous route along which marketers must be in control of, or quickly able to react to, all kinds of touchpoints.

If that sounds confusing, you are not alone. Most marketers struggle with the sheer data expertise needed to create and build sequential messages that follow a consumer from television to tablet to smartphone as they learn more about brands or products. In 2014, the customer journey is mostly handled through retargeting on as many devices as possible, but the lack of a universal ID makes telling a good story across screens pretty tough.

If you want to be able to do that as a marketer, or, as a publisher, help marketers do that with your audience, then it’s all about the data.

The Tom Cruise Thing

At every mobile conference, someone usually shows a slide with Tom Cruise from “Minority Report.” In the 2002 movie – released more than a decade ago! – we saw future Tom walking by interactive DOOH billboards for Lexus and the Gap, receiving all kinds of personalized offers after being retina-scanned. Everything in that movie now exists, including facial recognition, in-store beacons, real-time creative delivery, geolocation, RFID and personalization.

We are living in a “Minority Report” world, and sooner or later, we are going to figure out how to put all of the pieces together at scale. Was that a mobile ad that Tom Cruise saw, or will we be calling it something else? Does it matter?

[This post originally appeared on 9/18/2014 in AdExchanger]

The Top Data Trends in Marketing

Everything seems to be generating data nowadays: mobile devices, e-commerce transactions, Web browsing.

Savvy marketers use data mining, data visualization, text analytics, and forecasting to make more effective decisions and reach customers. But the savviest among them are innovating with fresh types of data—and attracting new business as a result.

Sensing Opportunity for an Upsell
“The data that devices collect are going to add all kinds of context to advertising,” says Chris O’Hara, cofounder of Bionic Advertising Systems, a digital advertising service. Marketers can know exactly where potential consumers are, the current time and temperature, and which of the consumer’s friends are nearby.

When might such factors come into play? O’Hara gives the example of sensors in grocery stores that can detect the items shoppers take off the shelves. That data, run through huge databases, enables marketers to instantly suggest, via tech such as smartphones or electronic shelf displays, other products for shoppers to add to their carts.
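Here is a hypothetical sketch of how that shelf-level signal could drive an instant cross-sell suggestion; the product associations below are invented, and in practice they would come from large-scale basket analysis.

```python
# Hypothetical co-purchase table; in practice this would be mined from transaction data.
FREQUENTLY_BOUGHT_WITH = {
    "pasta": ["tomato sauce", "parmesan"],
    "tortilla chips": ["salsa", "guacamole"],
}

def suggest_add_ons(item_taken_off_shelf: str, max_items: int = 2) -> list[str]:
    """Return cross-sell suggestions to push to a shopper's phone or a shelf display."""
    return FREQUENTLY_BOUGHT_WITH.get(item_taken_off_shelf, [])[:max_items]

print(suggest_add_ons("tortilla chips"))  # ['salsa', 'guacamole']
```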

Adding Location
More and more, geography will help marketers zero in on demographics, says Kevin Lee, CEO of online advertising and marketing firm Didit. “Geotargeting is a great way to market not only at the hyper-local business level but also for national marketers looking to target specific demographic and psychographic groups,” he says.

Marketers have experimented for years with mobile geolocation-centered campaigns, primarily using couponing. However, since research shows that a whopping 72% of consumers say they’ll respond to sales calls-to-action when they’re within sight of the retailer, there are plenty more location-based opportunities to encourage customer loyalty, such as special gifts, flash-sale alerts and early access.
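At a technical level, the “within sight of the retailer” trigger reduces to a geofence check against the store’s coordinates. A minimal sketch follows, with a made-up store location, radius and threshold.

```python
from math import asin, cos, radians, sin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def within_sight_of_store(user_lat, user_lon, store_lat, store_lon, radius_km=0.2):
    """Trigger the offer only when the shopper is inside a ~200-meter geofence."""
    return distance_km(user_lat, user_lon, store_lat, store_lon) <= radius_km

# Hypothetical store in midtown Manhattan; the shopper is about a block away.
print(within_sight_of_store(40.7590, -73.9845, 40.7580, -73.9855))  # True -> send the offer
```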

Cooking a Data Stew
With the evolution in data analytics, marketers can now mix different types of data to glean new insights. David L. Smith, CEO of media agency Mediasmith, sees this as the coming of age of the data management platform: tools that integrate data from several sources, including customer information, website data and digital advertising input. All of it serves to improve messaging.

“Messages that come from ad campaigns, direct mail and other communications to the consumer can be coordinated,” Smith says, “so that the consumer is always getting relevant information—not just standard communications.”
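In data management platform terms, the coordination Smith describes starts with joining records from each system on a shared customer key. Below is a minimal, hypothetical sketch of that merge step; the source names and fields are invented for illustration.

```python
from collections import defaultdict

# Hypothetical exports from three systems, each keyed on the same customer ID.
crm = [{"customer_id": "c123", "segment": "frequent traveler"}]
site_analytics = [{"customer_id": "c123", "last_page": "/suitcases"}]
ad_exposures = [{"customer_id": "c123", "campaign": "spring-luggage", "impressions": 4}]

def build_profiles(*sources):
    """Merge rows from each source into one profile per customer ID."""
    profiles = defaultdict(dict)
    for source in sources:
        for row in source:
            profiles[row["customer_id"]].update(row)
    return dict(profiles)

profiles = build_profiles(crm, site_analytics, ad_exposures)
print(profiles["c123"])  # one unified view to drive relevant, coordinated messaging
```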

Collecting Data While Respecting Customer Privacy and Security
All these data-driven trends can bring benefits to the consumer and improve marketing efficiency. But they also raise privacy and security issues—to which marketers are giving serious attention. “Privacy is going to remain a constant fear in the consumer’s heart,” says Michael Hardin, dean at the University of Alabama’s Culverhouse College of Commerce. “A lot of companies are going to be struggling mightily to deal with that.”

Smart marketers will learn how to walk this fine line and mine significant value from relatively little personal information, says O’Hara. One company strikes this balance with an activity-tracker wristband: with just a little personal data from its user, the wristband delivers athletic performance feedback.

These new technologies are changing the world of marketing—especially given the speed at which data are arriving, says Hardin. Shrewd marketers are contemplating how best to react in a way that benefits their companies.

[This post originally appeared on 6/16 in WSJ]