Talking Integration at DMEXCO


COLOGNE – At Salesforce, the acquisitions keep on coming, most recently that of AI-powered marketing intelligence and analytics platform Datorama. The company’s ongoing mantra is “integration” and it seems to have no shortage of assets to leverage in that quest.

It all stems from what Chris O’Hara, VP, Product Marketing, calls the “fourth industrial revolution” led by things like data, AI and the internet of things.

“It’s harder for marketers to deliver personalization at scale to consumers and that’s the goal. So everything we’re doing at Salesforce is really about integration,” O’Hara says in this interview with Beet.TV at the recent DMEXCO conference.

By way of examples, he cites the acquisition of ExactTarget about four years ago with the intention of making email “a very sustainable part of marketing, such that it’s not just batch and blast email marketing but it’s also your single source of segmentation for the known consumer.” The end result was the ExactTarget Marketing Cloud Salesforce Integration.

In late 2016, Salesforce bought a company called Krux and within six months had morphed it into Salesforce DMP. It was a way to help marketers make sense of households “comprised of hundreds of cookies and dozens of different devices” and aggregate them to a single person or household “so [you] can get to the person who makes the decision about who buys a car or what family vacation to take,” O’Hara says.

Salesforce DMP benefits from machine-learned segmentation, now known as Einstein Segmentation, to make sense out of the thousands of attributes that can be associated with any given individual and determine what makes them valuable. Developing segments by machine replaces “you as a marketer using your gut instinct to try to figure out who’s the perfect car buyer. Einstein can actually tell you that.”
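Einstein Segmentation’s internals aren’t public, but the underlying idea the article describes, letting the machine surface which of thousands of attributes over-index among valuable customers instead of relying on gut instinct, can be sketched in a few lines. Everything below (the attribute names, lift as the scoring rule) is illustrative, not Salesforce’s actual method:

```python
from collections import Counter

def attribute_lift(valuable_users, all_users):
    """Score each attribute by how much more often it appears among
    valuable users than in the population at large (lift > 1 = over-index)."""
    val_counts = Counter(a for u in valuable_users for a in u)
    pop_counts = Counter(a for u in all_users for a in u)
    n_val, n_pop = len(valuable_users), len(all_users)
    return {
        attr: (val_counts[attr] / n_val) / (pop_counts[attr] / n_pop)
        for attr in val_counts
    }

# Toy data: each user is just a set of attribute labels.
population = [
    {"suv_intender", "sports"},
    {"coupon_clipper", "sports"},
    {"suv_intender", "luxury"},
    {"coupon_clipper"},
]
buyers = [{"suv_intender", "sports"}, {"suv_intender", "luxury"}]

lift = attribute_lift(buyers, population)
# "suv_intender" over-indexes among buyers; "sports" is merely average.
```

A production system would rank far noisier attributes with a real model, but the output is the same in spirit: a machine-generated answer to “who’s the perfect car buyer?”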

In March of 2018, MuleSoft, one of the world’s leading platforms for building application networks, joined the Salesforce stable to power the new Salesforce Integration Cloud. It enables companies with “tons of legacy data sitting in all kinds of databases” to develop a suite of APIs that let developers look into that data and “make it useful and aggregate it and unify it so it can become a really cool, consumer-facing application, as an example.”

Datorama now represents what O’Hara describes as a “single source of truth for marketing data, a set of APIs that look into campaign performance and tie them together with real marketing KPIs and use artificial intelligence to suggest optimization.”

In addition to driving continual integration, Salesforce sees itself as “democratizing” artificial intelligence, according to O’Hara. “There’s just too much data for humans to be able to make sense of on their own. You don’t have to be a data statistician to be able to use a platform like ours to get better at marketing.”

This interview is part of a series titled Advertising Reimagined: The View from DMEXCO 2018, presented by Criteo. Please find more videos from the series here.

Data Science is the New Measurement

It’s a hoary old chestnut, but “understanding the customer journey” in a world of fragmented consumer attention and multiple devices is not just an AdExchanger meme. Attribution is a big problem, and one that marketers pay dearly for. Getting away from last-touch models is hard to begin with. Add in the fact that many of the largest marketers have no actual relationship with the customer (such as CPG, where the customer is actually a wholesaler or retailer), and it gets even harder. Big companies are selling big-money solutions to marketers for multi-touch attribution (MTA) and media-mix modeling (MMM), but some marketers feel light years away from a true understanding of what actually moves the sales needle.

As marketers take more direct ownership of their own customer relationships via data management platforms, “customer data platforms” and the like, they are starting to obtain the missing piece of the measurement puzzle: highly granular, user-level data. Marketers are now pulling in not just media exposure data but also offline data such as beacon pings, point-of-sale data (where they can get it), modeled purchase data from vendors like Datalogix and IRI, weather data and more to build a true picture. When that data can be associated with a person through a cross-device graph, it’s like going from a blunt 8-pack of Crayolas to a full set of Faber-Castells.

Piercing the Retail Veil

Think about the company that makes single-serve coffee machines. Some make their money on the coffee they sell rather than the machine—but they have absolutely no idea what their consumers like to drink. They sell coffee yet don’t have a complete picture of who buys it or why. The same problem faces the beer or soda company, where the sale (and the customer data relationship) resides with the retailer. The default is to go to panel-based solutions that sample a tiny percentage of consumers for insights, or to wait for complicated and expensive media-mix models to reveal what drove sales lift. But what if a company could partner with a retailer and a beacon company to understand how in-store visitation, and even an offline visit to a particular store shelf, compares with online media exposure? The marketer could use geofencing to understand where else consumers shopped, offer a mobile coupon so the user could authenticate upon redemption, get access to POS data from the retailer to confirm purchase and understand basket contents—and ultimately tie that data back to media exposure. That sounds a lot like closed-loop attribution to me.

Overcoming Walled Gardens

Why do specialty health sites charge so much for media? Like any other walled garden, they are taking advantage of a unique set of data—and their own data science capabilities—to better understand user intent. (There’s nothing wrong with that, by the way.) If I’m a maker of allergy medicine, the most common trigger for purchase is probably the onset of an allergy attack, but how am I supposed to know when someone is about to sneeze? It’s an incredibly tough problem, but one that a large health site can solve, largely thanks to people who have searched for “hay fever” online. Combine that with a 7-day weather forecast, pollen indices, and past search intent behavior, and you have a pretty good model for finding allergy sufferers. However, almost all of that data—plus past purchase data—can be ingested and modeled inside a marketer’s DMP, enabling the allergy medicine manufacturer to segment those users in a similar way—and then use an overlap analysis to find them on sites with $5 CPMs rather than $20. That’s the power of user modeling. Why don’t sites like Facebook give marketers user-level media exposure data? The question answers itself.
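The overlap analysis described above reduces to a simple question: what share of each site’s audience falls inside your modeled segment? A rough sketch, with invented site names, CPMs, and user IDs (a real DMP would work from matched cookie pools, not raw ID sets):

```python
def overlap_index(segment, site_audiences):
    """For each site, the fraction of its audience inside the modeled segment."""
    return {site: len(segment & audience) / len(audience)
            for site, audience in site_audiences.items()}

# Hypothetical: users the DMP has modeled as likely allergy sufferers.
allergy_segment = {"u1", "u2", "u3", "u4"}
sites = {
    "premium_health_site": {"u1", "u2", "u9", "u10"},    # ~$20 CPM
    "general_interest_site": {"u1", "u2", "u3", "u11"},  # ~$5 CPM
}
idx = overlap_index(allergy_segment, sites)
```

If the $5 site over-indexes for the modeled segment, the buy shifts there—the walled garden’s data advantage, replicated on cheaper inventory.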

Understanding the Full Journey

Building journeys always falls down due to one missing piece of the puzzle or another. Panel-based models continually overemphasize the power of print and linear television. CRM-based models always look at the journey from the email perspective and value declared user data above all else. Digital journeys can get pretty granular with media exposure data, but miss big pieces of data from social networks, website interactions, and things that are hard to measure (like location data from beacon exposure). What we are starting to see today is that, through the ability to ingest highly differentiated signals, marketers can combine granular attribute data to complete the picture. Think about the data a marketer can ingest: all addressable media exposure (ad logs), all mobile app data (SDKs), location data (beacon or third party), modeled sales data (IRI or DLX), actual sales data (POS systems), website visitation data (JavaScript on the site), media performance data (through click and impression trackers), real people data through a CRM (hashed and anonymized), survey data that has been mapped to a user (pixel-enabled online surveys), and even addressable TV exposure (think Comscore’s Rentrak data set). Wow.

Why is data science “the new measurement”? Because, when a marketer has all of that data at their fingertips, something close to true attribution becomes possible. Now that marketers have the right tools to draw with, the winners are going to be the ones with the most artists (data scientists).

It’s a really interesting space to watch. More and more data is becoming available to marketers, who are increasingly owning the data and technology to manage it, and the models are growing more powerful and accurate with every byte of data that enters their systems.

It’s a great time to be a data-driven marketer!

[This post originally appeared in AdExchanger on 8/12/16]

Trends in Programmatic Buying

 


The digital marketing future we were promised years ago looks pretty lame in retrospect. This is an image of a trading desk supervisor at Razorfish, circa 2013.

2015 has been one of the most exciting years in digitally driven marketing to date. Although publishers have led the way in building programmatic “stacks” to enable more efficient selling of digital media, marketers are now catching up. Wide adoption of data management platforms has given rise to a shift in buying behaviors, where data-driven tactics for achieving effectiveness and efficiency rule. Here are some interesting trends that have arisen.

 

Purchase-Based Targeting

Remember when finding the “household CEO” was as easy as picking a demographic target? Marketers still use demographic targeting (women aged 25-44) to some extent, but we have seen them shift rapidly to behavioral and contextually based segments (“Active Moms”), and now to Purchase-Based Targeting (PBT). This trend has long existed in categories like Automotive and Travel, but is now being seen in CPG. Today, marketers are taking small segments of people who have actually purchased the product they are marketing (“Special K Moms”) and using lookalike modeling to drive scale and find more of them. These purchase-defined segments are a more precise starting point in digital segmentation—and can be augmented by behavioral and contextual data attributes to achieve scale. The big winners here are the folks who actually have the in-store purchase information, such as Oracle’s Datalogix, 84.51, Nielsen Catalina Solutions, Inmar, and News Corp’s News America Marketing.
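The lookalike step above can be caricatured as scoring prospects by how closely their attributes resemble the pooled attributes of the purchase-verified seed. The segment names and attributes are hypothetical, and Jaccard similarity here is a deliberately crude stand-in for a real model’s scoring:

```python
def lookalike_scores(seed_users, prospects):
    """Rank prospects by Jaccard similarity between their attribute set
    and the pooled attributes of the seed (purchase-verified) segment."""
    seed_attrs = set().union(*seed_users)
    scores = {
        uid: len(attrs & seed_attrs) / len(attrs | seed_attrs)
        for uid, attrs in prospects.items()
    }
    return dict(sorted(scores.items(), key=lambda kv: -kv[1]))

# Hypothetical "Special K Moms" seed: people who actually bought the product.
seed = [{"cereal_buyer", "has_kids"}, {"cereal_buyer", "fitness"}]
prospects = {
    "p1": {"cereal_buyer", "has_kids"},  # resembles the seed
    "p2": {"console_gamer"},             # does not
}
ranked = lookalike_scores(seed, prospects)
```

Scale then comes from relaxing the similarity threshold: the lower the cut-off, the bigger (and fuzzier) the lookalike audience.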

Programmatic Direct

For years we have been talking about disintermediating the space between advertisers and publishers (essentially, the entire Lumascape of technology vendors) and finding scalable, direct connections between them. It doesn’t make sense that a marketer has to go through an agency, a trading desk, a DSP, an exchange, an SSP, and other assorted technologies to get to space on a publisher’s website. Marketers have seen $10 CPMs turn into just $2 of working media. Early efforts with “private marketplaces” inside of exchanges created more automation, but ultimately kept much of the cost structure. A nascent, but quickly emerging, movement toward “automated guaranteed” procurement is finally starting to take hold. Advertisers can create audiences inside their DMP and push them directly to a publisher’s ad server where they have user matching. This is especially effective where marketers seek an “always on” insertion order with a favored, premium publisher. This trend will grow in line with marketers’ adoption of people-based data technology.

Global Frequency Management

The rise of DMPs has also led to another fast-growing trend: global frequency management. Before marketers could effectively map users to all of their various devices (cross-device identity management, or CDIM) and also match users across various execution platforms (hosting a “match table” that assures user #123 in my DMP is the same person as user #456 in DataXu, as an example), they were helpless to control frequency to an individual. Recent studies have revealed that, when marketers cap frequency only at the campaign or platform level rather than at the individual level, they serve as many as 100+ ads to individual users every month, and sometimes much, much more. What if the user’s ideal point of effective frequency is only 10 impressions a month? As you can see, there are tremendous opportunities to reduce waste and gain efficiency in communication. This means big money for marketers, who can finally start to control their messaging—putting recovered dollars back into finding more reach, and starting to influence their bidding strategies to get users into their “sweet spot” of frequency, where conversions happen. It’s bad news for publishers, who have inadvertently benefited from this “frequency blindness.” Now, marketers understand when to shut off the spigot.
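The match-table mechanics above can be sketched as a counter keyed on the global ID rather than the platform-local one. Platform names, IDs, and the cap value are invented, and a production system resolves identity server-side rather than in a dictionary:

```python
from collections import defaultdict

# Hypothetical match table: (platform, local user ID) -> one global DMP ID.
MATCH_TABLE = {
    ("dsp_a", "123"): "global_1",
    ("dsp_b", "456"): "global_1",  # same person on a different platform
    ("dsp_a", "789"): "global_2",
}

class GlobalFrequencyCap:
    """Count impressions per *person* across platforms and refuse to
    serve once the effective-frequency ceiling has been reached."""
    def __init__(self, cap):
        self.cap = cap
        self.counts = defaultdict(int)

    def should_serve(self, platform, local_id):
        global_id = MATCH_TABLE.get((platform, local_id))
        if global_id is None:
            return True   # unmatched user: no global view, serve as-is
        if self.counts[global_id] >= self.cap:
            return False  # past the sweet spot: stop paying for waste
        self.counts[global_id] += 1
        return True

capper = GlobalFrequencyCap(cap=10)
# Impressions on dsp_a and dsp_b both draw down the same person's budget.
```

The key point is that the 11th impression is refused no matter which platform requests it, because both local IDs resolve to the same global person.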

Taking it in-House

More and more, we are seeing big marketers decide to take programmatic in-house. That means hiring former agency and vendor traders, licensing their own technologies, and (most importantly) owning their own data. This trend isn’t as explosive as the industry trades might suggest—but it is real and happening steadily. What brought about this shift in sentiment? Certainly concerns about transparency; there is still a great deal of inventory arbitrage going on at popular trading desks. Also, the notion of control. Marketers want and deserve a more direct connection to one of their biggest marketing costs, and now the technology is readily available. Even the oldest-school marketer can license their way into a technology stack any agency would be proud of. The only thing really holding back the trend is the difficulty of staffing such an effort. Programmatic experts are expensive, and that’s just the traders! When the inevitable call for data-science-driven analytics comes, things can really start to get pricey. But this trend will continue for the next several years nonetheless.

Closing the Loop with Data

One of the biggest gaps in digital media, especially programmatic, is attribution. We still have the Wanamaker problem, where “50% of my marketing works, I just don’t know which 50%.” Attitudinal “brand lift” studies and latent post-campaign sales attribution modeling have been the de facto standard for the last 15 years, but marketers are increasingly insisting on real “closed loop” proof. Did my Facebook ad move any items off the shelf? We are living in a world where technology is starting to shed light on actual in-store purchases, such that we will soon be able to get eCommerce-like attribution for corn flakes. In one real-world example, a CPG company has partnered with 7-11 and placed beacon technology in stores. Consumers can receive a “get 20% off” offer on their mobile device, via notification, as they approach the store; the beacon can verify whether or not they arrive at the relevant shelf or display; and an integration with the point-of-sale (POS) system can tell (immediately) whether the purchase was made. These marketing fantasies are becoming more real every day.

Letting the Machines Decide

What’s next? The adoption of advanced data technology is starting to change the way media is actually planned and bought. In the past, planners would guess at which online audience segments to target, then test-and-learn their way to more precision. Marketers basically had to guess the data attributes that comprised the ideal converter. Soon, algorithms will start doing the heavy lifting. What if, instead of guessing at the type of person who buys something, you could start with the exact composition of that buyer? Today’s machine learning algorithms start at the end point in order to give marketers a huge edge in execution. In other words, now we can look at a small group of 1,000 people who have purchased something and understand the commonalities, or clusters of data attributes, they all share. Maybe all buyers of a certain car share 20 distinct data attributes. Marketers can have a segment automatically generated from that data and expand it from there. This brand-new approach to segmentation is a small harbinger of things to come, as algorithms take over the processes and assumptions of the past 15 years and truly transform marketing.
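The “start from the buyers and find what they share” idea can be illustrated with a plain set intersection. Real systems cluster thousands of noisy attributes statistically rather than demanding an exact match across every buyer, so treat this as the simplest possible caricature, with invented attributes:

```python
def shared_attributes(converters):
    """Attributes every converter has in common -- a machine-generated
    seed for a segment, no gut instinct required."""
    common = set(converters[0])
    for attrs in converters[1:]:
        common &= attrs  # keep only attributes present in every buyer
    return common

# Hypothetical attribute sets for three buyers of the same car model.
car_buyers = [
    {"suburban", "two_kids", "golf", "luxury_intender"},
    {"suburban", "two_kids", "tennis", "luxury_intender"},
    {"suburban", "two_kids", "luxury_intender", "boating"},
]
seed_segment = shared_attributes(car_buyers)
```

The resulting seed (here: suburban, two kids, luxury intender) is then what a lookalike model expands into an addressable audience.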

It’s a great time to be a data-driven marketer!

 

Creating the Fabled 360 View of the Consumer

Despite years of online targeting, the idea of having a complete, holistic “360-degree view” of the consumer has been something of a unicorn. Today’s new DMP landscape and cross-device identification technologies are starting to come close, but they are missing a key piece of the puzzle: the ability to incorporate key social affinities.

In the nearby chart, you can see that online consumers tell us all about themselves in a number of ways:

Viewing Affinities: Where they go online and what they like to look at provides strong signals of what they are interested in. Nielsen, comScore, Arbitron and others have great viewership/listenership data that is strong on demographics, so we can get a good sense of the type of folks a certain website or show attracts. This is great, but brands still struggle to align demographic qualities perfectly with brand engagement. 34-year-old men should like ESPN, but they could easily love Cooking.com more.

Buying Affinities: What about a person’s buying habits? Kantar Retail, OwnerIQ, and Claritas data all tell us in great detail what people shop for and own—but they lack information on why people buy the stuff they do. What gets folks staring at a shelf to “The Moment of Truth” (in P&G parlance) when they decide to make a purchase? The buying data alone cannot tell us.

Conversational Affinity: What about what people talk about online? Radian6 (Salesforce), Crimson Hexagon, and others really dig into social conversations and can provide tons of data that brands can use to get a general sense of sentiment. But this data, alone, lacks the lens of behavior to give it actionable context.

Social Behavioral Affinity: Finally, what about the actions people take in social environments? What if we could measure not just what people “like” or “follow” online, but what they actually do (like post a video, tweet a hashtag, or engage with a fan page)? That data not only covers multiple facets of consumer affinity, but also gives a more holistic view of what the consumer is engaged with.

Adding social affinity data to the mix to understand a consumer can be a powerful way to understand how brands relate to the many things people spend their time with (celebrities, teams, books, websites, musicians, etc.). Aligning this data with viewing, buying, and conversational data gets you as close as possible to that holistic view.

Let’s take an example of actionable social affinity in play. Say Whole Foods is looking for a new celebrity to use in television and online video ads. Conventional practice would be to engage a research firm that would employ the “Q Score” model to measure which celebrity had the most consumer appeal and recognition. This attitudinal data is derived from surveys, some with large enough sample sizes to offer validity, but it is still “soft data.”

Looking through the lens of social data, you might also measure forward affinity: how many social fans of Whole Foods expressed a Facebook “like” for Beyonce, or followed her account on Twitter? This measurement has some value, but fails at delivering relevance because of the scale effect. In other words, I like Beyonce, so does my wife, and so does my daughter . . . along with many millions of other fans—so many that it’s hard to differentiate them. The more popular something is, the broader its appeal and the less targetable that attribute becomes.

So, how do you make social affinity data relevant to get a broader, more holistic, understanding of the consumer?

Obviously, both Q Score and forward affinity can be highly valuable. But when mixing viewing, buying, and listening with real social affinity data, much more becomes possible. The real power of this data comes out when you measure two things against one another. Sree Nagarajan, CEO of Affinity Answers, explained this mutual affinity concept to me recently:

“In order for the engagement to be truly effective, it needs to be measured from both sides (mutual engagement). The parallel is a real-world relationship. It’s not enough for me to like you, but you have to like me for us to have a relationship. Mapped to the brand affinity world, it’s not enough for Whole Foods fans to engage with Beyonce; enough Beyonce fans have to engage with Whole Foods (more than the population average on both sides) to make this relationship truly meaningful and thus actionable. When true engagement is married with such mutual engagement, the result is intelligence that filters out the noise in social networks to surface meaningful relationships.”

As an example, this approach was recently employed by Pepsi to choose Nicki Minaj as their spokesperson over several other well-known celebrities.
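The mutual-engagement test in the quote above can be sketched as an index that must exceed the population average in both directions, with the weaker direction governing. The fan sets and population size below are invented for illustration:

```python
def mutual_affinity(brand_fans, other_fans, population_size):
    """Over-index in BOTH directions versus the population average: the
    pairing only counts if each fan base engages with the other side
    more than a random sample of the population would."""
    overlap = len(brand_fans & other_fans)
    base_brand = len(brand_fans) / population_size   # P(random person is a brand fan)
    base_other = len(other_fans) / population_size   # P(random person is an other-fan)
    brand_to_other = (overlap / len(brand_fans)) / base_other
    other_to_brand = (overlap / len(other_fans)) / base_brand
    return min(brand_to_other, other_to_brand)       # weakest direction governs

# Toy population of 100 people (IDs u0..u99).
brand_fans = {f"u{i}" for i in range(10)}         # 10 engage with the brand
celebrity_fans = {f"u{i}" for i in range(5, 25)}  # 20 engage with the celebrity
score = mutual_affinity(brand_fans, celebrity_fans, population_size=100)
# score > 1 means the relationship over-indexes mutually
```

A score above 1 in both directions is what separates a meaningful pairing from the “everybody likes Beyonce” noise described earlier.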

What else can social affinity data do?

  • Brands can use social affinity data to decide what content or sponsorships to produce for their users. Looking at their users’ mutual affinity between the brand and music, for example, might suggest which bands to sponsor and blog about.
  • A publisher’s ad sales team can use such data to understand the mutual affinity between itself and different brands. A highly correlated affinity between activated social visitors to GourmetAds’ Facebook page and those who post on Capital One’s Facebook page may suggest a previously unknown sales opportunity. The publisher can now prove that his audience has a positive predisposition towards the brand, which can yield higher conversions in an acquisition campaign.
  • What about media buying? Understanding the social affinity of fans for a television show can produce powerful actionable insights. As an example, understanding that fans of “Teen Wolf” spend more time on Twitter than Facebook will instruct the show’s marketing team to increase tweets—and post more questions that lead to increased retweets and replies. Conversely, an Adult Swim show may have more Facebook commenters, leading the marketer to amplify the effect of existing “likes” by purchasing sponsored posts.
  • Keyword buying is also interesting. Probing the mutual affinities between brands and celebrities, shows, music acts, and more can yield long tail suggested keyword targets for Google, Bing/Yahoo, and Facebook that are less expensive and provide more reach than those that are automatically suggested. As an example, when “Beavis and Butthead” re-launched on MTV, Google suggested keywords for an SEM campaign such as “Mike Judge” (the show’s creator) and “animated show.” Social affinity data suggested that socially activated Beavis fans also loved “Breaking Bad.” Guess what? Nobody else was bidding on that keyword, and that meant more reach, relevance, and results.

I believe that understanding social affinity data is the missing piece of the “360 degree view” puzzle. Adding this powerful data to online viewing, buying, and social listening data can open up new ways to understand consumer behavior. Ultimately, this type of data can be used to generate results (and measure them) in online branding campaigns that have thus far been elusive.

Want a full view of the people who are predisposed to love your brand? Understand what you both mutually care about through social affinities—and measure it.

[This post originally appeared in AdExchanger on 4.14.14]

 

Watching you Watch


A few years ago, I had coffee with Nick Langeveld, who left Nielsen to run business development for an interesting company called Affectiva. He told me how the company, an MIT Media Lab spin-off, was going to take measurement in a new direction by reading people’s facial expressions.

Like Intel, which plans to ship set-top boxes that know who is watching television, Affectiva can watch consumers through their webcams as they consume video and measure their emotions in real time.

Now, marketers could see the exact moment when they captured surprise, delight, or revulsion in a consumer—and scale that effort to anyone with a webcam who opted into their panel. This sounded great, but I wondered if and when large marketers would adopt such technology.

The question of adoption was answered early this year, when the company announced that both Unilever and Coca-Cola would use the technology to measure all of their marketing efforts this year. In the consumer products wars, perhaps tweaking your video assets to get an extra smile or “wow moment” will give Coke what it needs to pick up market share from Pepsi. Even if that is not the case, the measurement will give their agency creatives a very real—and real-time—indication of how impactful their content is.

I think this is a great sign.

The relentless automation of digital media seems to have taken a lot of emphasis off the creative. Now that digital marketers can buy audiences so precisely, delivering the right message doesn’t seem to matter as much. For agency people like R/GA’s Michael Lowenstern, who recently spoke on this topic at the Digiday Agency Summit, the right equation is right audience plus right creative equals more performance and lower costs. His recent Verizon campaign used hundreds of individual banner creatives, matched to audiences, and raised sales of FiOS 187%. That’s a sales increase—not just an increase in campaign performance.

Creative matters

When it comes to higher-funnel branding activities using rich media and video, creative matters even more. Building brands means connecting emotionally with consumers, which makes technologies like Affectiva’s, which apply “big data” approaches to branding, all the more relevant for this day and age. Research presented by The Intelligence Group’s Allison Arling-Giorgi at the recent 4As conference showed that Generation Y consumers find humor the most effective form of advertising, so Unilever leveraging technology to squeeze one more smile out of a 30-second spot starts to make a lot of sense.

In the case of Intel, adding a “webcam” to a set top box sounds really scary from a privacy perspective, but starts to make a lot of sense when you think about the migration of television ad dollars to display.

Theoretically, the shifting dollars reflect a desire on the advertisers’ part to achieve more granular targeting, and put the right ads in front of the right people at scale. The digital display channel offers lots of targeting, but the low-quality inventory, commoditized and bland creative units, and tremendous amount of fraud have kept a lot of TV money on the sidelines.

In 2009, Morgan Stanley analyst Mary Meeker estimated the gap between where people spend their time (increasingly online) and where advertisers spend their money (TV) represented a $50 billion global opportunity. While that has shrunk considerably, the gap is still measured in the tens of billions, despite the enormity of Google, and the embrace of ubiquitous social platforms such as Facebook and Twitter.

Connected set-top boxes may be the biggest online advertising disruptor

Though television is a mass-reach vehicle today, plans like Intel’s threaten to disrupt online advertising in a far more fundamental way than digital has disrupted traditional ads. Connected set-top boxes are tied to households—the composition of which is easy to access, both from a demographic and a financial perspective. They also tend to deliver a much more engaging video experience for the brand advertiser than ignored “pre-roll” inside a tiny 300×250-pixel video player.

Now, add the element of being able to actually see who is on the couch watching—and tailor ad messaging to those viewers based on their age and the contextual relevance of the content they are consuming. That’s powerful in a way digital display can never match, despite the rise of “social TV” watchers who tweet during “appointment viewing” shows like The Walking Dead.

Delivering ads to TVs, depending on who is watching them? That’s something to really watch out for. 

[This post originally appeared on the EConsultancy Blog on 4/8]

Social Affinity

Is your media measurement as dated as this 1970s den?


The New Panel-Based Audience Measurement for Brands

With the prevalence of social data, yesterday’s panel-based measurement for digital campaigns is starting to look like the wood paneling in your grandmother’s den: a bit out of fashion. Marketers have been trained to buy media based on demographics, and it is natural to want your ads to be where you think your customers are. For BMW’s new entry-level sedan, that might mean finding the media that males aged 26-34, earning $75,000 or more a year, consume. That makes a lot of sense, but it also means that your paid media will always compete alongside ads for your competitors. That is a big win for websites with premium inventory that fits your demographic, because it means scarcity and high prices for marketers.

What if there was another way to measure which audiences are right for brands? And what if that data were based on a panel of a few hundred million people, rather than a few thousand? Well, thanks to Facebook and Twitter, we have just such a web-based panel of consumers, and they are always eager to share their opinions in the form of “likes,” “follows,” and (more importantly) engagement. Social listening platforms have been able to tell brands directionally what people think about them, and measure how certain marketing efforts move the social needle. Listening is great, but how do you turn what you hear into what to buy?

A company called Colligent has been going beyond listening by measuring what people actually do on Twitter, Facebook, and other social sites. “Liking” is not enough (when I, my 10-year-old daughter, and my mom all “like” Lady Gaga, the audience I am a part of gets too broad to target against). What matters is when people express true affinity by sharing videos, tweeting, and commenting. When people who are nuts about a certain celebrity are also nuts about a certain brand—and that relationship over-indexes against normal affinity—you have struck real social gold: data that can make a difference. Pepsi recently used such data to choose Nicki Minaj as a spokesperson over dozens of other choices.

What about other media? Nielsen defines television, Arbitron measures radio, and MRI defines magazine audiences by demographics. Now, for the first time, marketers can use social data—gathered from panels nearly as large as the buying population—to define audiences by their own brand and category terms. That’s a world in which Pepsi can purchase “Pepsi GRPs” across all media, rather than GRPs in a specific media.

This is the way brands will buy in the future.

[This post originally appeared on 2/21/13 in The CMO Site, a United Business Media publication]

Thoughts on Data-Driven Audience Measurement

A Conversation with Scott Portugal of PulsePoint

What are some best practices for the modern digital marketer? Cookie-based data makes knowing your audience easier than ever. Developing accurate audience profiles, optimizing campaigns based on audience composition, and validating audience reach are all critical components for marketers running targeted digital campaigns. I recently spoke with Scott Portugal, a long-time digital media veteran and currently VP of Business Development for PulsePoint, who has been working with PulsePoint’s Aperture audience measurement offering, about what marketers should be thinking about when it comes to measurement.

Scott Portugal: First and foremost, marketers must really understand the goals of the campaign. “Branding” vs. “performance” aren’t goals; they are notional indicators of goals. “Increase brand awareness amongst men passionate about health and fitness by 50%” is a goal. The more specific, the better. It eliminates the guesswork that agencies have to do around media tactics, and most importantly, specificity in KPIs means everyone knows which data sets to use along the way. Also, a modern marketer knows that buying digital media isn’t an on/off switch. Once the buy starts, the work starts. Prepare to optimize everything you can: look at performance across targets, media partners, creative (the most important and often least optimized variable), and more. Good digital marketers are like good scientists: they ask plenty of questions, account for all variables, and constantly test to find success.

What new tools are out there to assist in audience measurement and supplement the standard offerings from comScore and Nielsen?

SP: Data is ubiquitous – some might say commoditized. But there are a few platforms out there taking novel approaches to audience measurement. Certainly our PulseAudience platform is among that group. We’re able to build audience profiles at the domain level, meaning that at a very granular level we can infer the audience composition of a page even without a cookie. Another new player is Korrelate, founded by the team that ran TACODA. Korrelate is in the business of helping marketers understand how different data sets perform across different platforms – essentially helping a buyer know which data segment to buy, when, and where. At a broader level, audience measurement platforms are starting to look cross-media, bringing together disparate data sets that show the impact of a campaign on ALL digital activity, not just clicks.

What about social data? How are technologies like Facebook and Twitter enabling a more concise view of audiences, and helping marketers validate their choices?  

SP: If you think about Facebook and Twitter NOT as destinations but as communication tools, then you can start to see where a more holistic audience view can be created. Social media is more than updates – it’s sharing news, communicating about brands, raising hands about interests, and more. Social data, when done right, is true first-party data that goes above and beyond standard behavioral data. Marketers can understand not just when a user engages, but how, where, and how valuable that engagement actually was (likes, shares, tweets, etc.). It should validate a marketer’s choices around creative and placement, but only if the creative and placements actually include social elements. Social data is powerful, but only if it’s part and parcel of the other data sets and targeting mechanisms used in conjunction with social media. Nothing happens in a vacuum, and nothing happens ONLY in one channel.

Your company owns Aperture. Can you provide some examples of how progressive media organizations are using audience measurement data? Is it about audience validation? Optimization? Upselling clients?  

SP: It’s about delivering value via insights up and down the funnel. It sounds like ad jargon but it’s what we strive to do with every single engagement. Cookie targeting works, but we believe that there is real value in modeling at other points of content interaction – insights that help guide and inform at all points of the campaign. Our RTB partners can leverage some of this data in real time; our non-programmatic partners work with our data and insights group to go even deeper via custom reporting and deeper dives on how to get consumers to engage. Data availability and normalization—what we do—is what makes the tide rise to lift all boats.

How can (the right) measurement data influence brand advertising? Is this the key to bringing more brand dollars online?

SP: Brands will feel safe moving dollars over from television to digital when they can do two things: ensure the environment is safe and ensure that they are reaching the right audience with minimal waste. Does television have massive amounts of waste in it? Of course – but as an industry we promised the world that we would eliminate much of that problem via targeting and optimization, so we have to lie in the bed we made. So measuring not just reach and frequency, but the impact of that reach and frequency, is critical. Did search queries go up relative to competitors? Did social commentary increase? Are there more tweets about the campaign on other platforms (did you create awareness that drives awareness in other channels as well)? Like I said earlier, understanding the specific goals of that branding campaign, and ensuring that the right creative is matched with the right tactics, will allow for the right measurement data to be used.

What’s next in measurement?  

SP: To me it comes down to cross-platform impact. Devices and screens aren’t truly linked yet, but the audience at the other end of that ad campaign is the same person. They tweet, they promote, they like, they friend, they blog, they comment, they shop… but they do it across multiple screens in the home, at the office, and on the street. The best measurement companies will be those that can build an impact assessment across ALL platforms and show the points of interconnection. It’s a big task – but the ones who get it right will be the ones working directly with marketers, becoming embedded into everything they do. The next big push will be to show marketers that social, search, display, video, and mobile are all tactics inside the same strategy… and then show them how each tactic impacts the others.

This interview, among many others, appears in Econsultancy’s recently published Best Practices in Data Management by Chris O’Hara. Chris is an ad technology executive, the author of Best Practices in Digital Display Media, a frequent contributor to a number of trade publications, and a blogger.

Can You Buy “Brand”?

Understanding Social Affinity Data

Marketers are increasingly turning to social platform data to understand their customers, and tapping into their social graphs to reach more of them. Facebook “likes” and Twitter “follows” are religiously captured and analyzed, and audience models are created—all in the service of trying to scale the most powerful type of marketing of all: word-of-mouth. With CRM players (like Salesforce, which recently acquired Buddy Media and Radian6) jumping into the game, digitally derived social data is now an established part of traditional marketing.

But are marketers actually finding real signals amid the noise of social data? In other words, if I “like” Lady Gaga, and you “like” Lady Gaga, and my ten-year-old daughter also “likes” Lady Gaga, then what is the value of knowing that? If I want to leverage social data to enrich my audience profiles and try to get the fabled “360-degree” view of my customer, “likes” and “follows” may contribute more noise than insight. I recently sat down with Colligent’s Sree Nagarajan to discuss how brand marketers can go beyond the like and get more value out of the sea of social data.

Colligent (“collectively intelligent,” if you like) goes beyond “likes” and actually measures what people do on social sites. In other words, if you merely “like” Lady Gaga, you are not measured, but if you post a Lady Gaga music video, you are. By scraping several hundred million Facebook profiles and accessing the Twitter firehose of data, Nagarajan’s company looks at what people are socially passionate about—and matches it against other interests. For example, the data may reveal that 5% of Ford’s socially active fanbase is also wild about NASCAR. That’s great to know. The twist is that Colligent also measures whether the folks who are nuts about NASCAR like Ford back. That’s called mutual engagement and, arguably, it is a more powerful signal.

Nagarajan’s focus on this type of data has many questioning the inherent value of targeting based on social media membership. “In any social network’s lifecycle, likes (or ‘follows,’ or friends) start out as genuine signals of brand affinity. However, as more and more people like the page, the audience gets increasingly diluted, making likes less of an indicator of a brand’s true audience. True engagement, as measured by comments, photo posts, retweets, hashtags, etc., becomes a much better indicator of brand affinity and engagement.”

Colligent data recently convinced Pepsi to choose Nicki Minaj as its spokesperson, since the data revealed a strong correlation between socially activated Pepsi and Minaj fans. Think about that for a second. For years, major brands have used softer, panel-based data (think “Q Score”) to decide which celebrities are most recognizable and capture the right brand attributes. Now, hard metrics about the type of people who adore your brand are just a query away. Digital marketers have been talking about “engagement” for years, and have developed a lexicon around measurement that includes “time spent” and “bounce rate.” Social affinity data goes deeper, measuring true engagement. For Nagarajan, “In order for the engagement to be truly effective, it needs to be measured from both sides (mutual engagement). The parallel is a real-world relationship: it’s not enough for me to like you; you have to like me back for us to have a relationship. Mapped to the brand affinity world, it’s not enough for Pepsi fans to engage with Nicki Minaj; enough Nicki fans have to engage with Pepsi (more than the population average on both sides) to make this relationship truly meaningful and thus actionable. When true engagement is married with such mutual engagement, the result is intelligence that filters the noise in social networks to surface meaningful relationships.”
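Nagarajan’s “more than the population average on both sides” test can be expressed as a simple over-index calculation. The sketch below is purely illustrative and is not Colligent’s actual methodology; the panel sizes, fan counts, and the `over_index` and `mutual_engagement` helpers are all invented for this example.

```python
def over_index(engaged_overlap: int, group_size: int,
               target_fans: int, population: int) -> float:
    """How much more often a group engages with a target than the population does.

    engaged_overlap - members of the group who actively engage with the target
                      (posting videos, commenting, retweeting -- not just "liking")
    group_size      - total actively engaged members of the group
    target_fans     - people in the whole panel who actively engage with the target
    population      - total measured panel

    A value of 1.0 means the group engages at exactly the population average.
    """
    group_rate = engaged_overlap / group_size      # e.g., share of Minaj fans engaging with Pepsi
    baseline_rate = target_fans / population       # share of the whole panel engaging with Pepsi
    return group_rate / baseline_rate


def mutual_engagement(celeb_fans_engaging_brand: int, celeb_fans: int,
                      brand_fans_engaging_celeb: int, brand_fans: int,
                      population: int) -> bool:
    """True only when each side over-indexes against the population average."""
    celeb_to_brand = over_index(celeb_fans_engaging_brand, celeb_fans,
                                brand_fans, population)
    brand_to_celeb = over_index(brand_fans_engaging_celeb, brand_fans,
                                celeb_fans, population)
    return celeb_to_brand > 1.0 and brand_to_celeb > 1.0


# Hypothetical numbers: a 200M-profile panel, 4M active Minaj engagers,
# 2M active Pepsi engagers, with 400k Minaj fans engaging with Pepsi
# and 300k Pepsi fans engaging with Minaj.
print(over_index(400_000, 4_000_000, 2_000_000, 200_000_000))                  # 10.0x baseline
print(mutual_engagement(400_000, 4_000_000, 300_000, 2_000_000, 200_000_000))  # True
```

The same arithmetic generalizes from celebrity-and-brand pairs to TV shows, radio stations, and websites: any group that over-indexes against a brand well above 1.0 on both sides is a candidate for a brand-defined media buy.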

So, what else can you learn from social affinity data? With so many actively engaged fans and followers, throwing off petabytes of daily data, these networks offer a virtual looking glass for measuring real world affinities. If you think about the typical Facebook profile, you can see that many of the page memberships are driven by factors that exist outside the social network itself. That makes the data applicable beyond digital:

  • Television: Media planners can buy the shows, networks, and radio stations that a brand’s fans are highly engaged with.
  • Public Relations: Flacks can direct coverage towards the media outlets a brand’s fans are engaged with.
  • Sponsorships: Marketers can leverage affinity data to determine which celebrity should be a brand’s spokesperson.
  • Search: SEM directors can expand keyword lists for Google and Facebook buys using social affinity-suggested keywords.
  • Display: Discover what sites Ford’s socially activated consumers like, and buy those sites at the domain level to get performance lift on premium guaranteed inventory buys.

Are we entering a world in which marketers will use this type of data to fundamentally change the way they approach media buying? What does it mean to “buy brand”? Sree Nagarajan sees this type of data potentially transforming the way offline and online media planners begin their process. “Much of the audience selection available in the market today is media-based. Nielsen defines TV audiences, Arbitron radio, comScore digital sites, MRI magazines, etc. Brand marketers are forced to define their audiences the way media measures audiences: by demographics (e.g., males 18–49),” remarks Sree. “Now, for the first time, social data allows marketers to define audiences based on their own brand and category terms. They can say, ‘I want to buy the TV shows watched by the Pepsi and, more generally, the carbonated soft drinks audience.’ This will truly make marketing brand-centric instead of media-centric. Imagine a world where brand and category GRPs can be purchased across media, rather than GRPs in a specific medium.”

Look for this trend to continue, especially as companies become more aggressive about aligning their CRM databases with social data.

[This article originally appeared in ClickZ on 12/11/12]

Discover more on this topic and others by downloading my new whitepaper, Best Practices in Data Management