NextMark Spawns Bionic Advertising Systems

NextMark today announced the creation of Bionic Advertising Systems, a new division focused on delivering technology that streamlines digital advertising workflows for digital marketers, their advertising agencies, and publishers.

“The new Bionic brand represents our philosophy of delivering advertising technology that combines the strengths of humans and machines,” remarked Joe Pych, CEO of NextMark and co-founder of Bionic. “Over the past few years, there’s been a battle of man versus machine in digital media. Neither side is winning. Instead of man or machine, the best ‘systems’ of the future will be a combination of both. The recent announcements by AOL, Yahoo!, and Microsoft around Programmatic Direct validate this belief and herald a new age in digital advertising: the Bionic Age. As the name implies, our new Bionic unit is 100% dedicated to delivering solutions for this new era in digital advertising.”

Launched today, Bionic Advertising Systems will encompass NextMark’s solutions for digital advertising, including the latest Programmatic Direct technologies. Bionic’s software automates the mundane processes of digital media planning, buying, and ad operations. It frees media planners, buyers, and sellers to spend their time on higher-value tasks. It enables digital media planners to find advertising opportunities, gather information, create and send requests for proposals, negotiate with publishers, build media plans, execute orders, and implement their campaigns with the click of a button. With its modern API-driven architecture, it integrates with popular agency tools such as DoubleClick, MediaMind, and comScore. It’s currently integrating with leading sell-side Programmatic Direct technology providers Adslot, iSocket, and Yieldex. Bionic’s Digital Media Planner aims to tie together the many disparate systems used in digital advertising, giving planners a single interface that simplifies the way they develop and deliver media plans.

“’Bionic’ is such a great concept for the digital media industry,” added Chris O’Hara, the business unit’s co-founder and Chief Revenue Officer. “A lot of companies in the space think that algorithms and robots are the answer. We know human creativity can be unleashed by automation, and that digital advertising works best when people are empowered by technology.”

Currently, more than forty advertising agencies are using the Bionic Digital Media Planner to create and execute their media plans. More than 900 publishers and networks are using the Bionic Digital Ad Sales System to promote more than 9,000 premium digital advertising programs—the largest directory of its kind, which also powers the IAB’s Digital Advertising Directory.

To learn more, visit the Bionic website: http://www.bionic-ads.com/

When Cost-Plus is a Minus

Is the Hourly Pricing Model Right for Today’s Digital Media Agency?

It’s funny how people deride Microsoft for not being successful in advertising technology when 80% of digital media dollars are transacted using their media planning software. Despite the fact that we live in a world where computers can evaluate hundreds of individual bid requests on a single impression and render an ad serving decision in under 50 milliseconds, the overwhelming majority of display inventory is bought using e-mail and fax machines. Those media plans are manually created in Excel.

Terence Kawaja of the famous LUMAscape maps, which depict the 300-plus companies that enable the 20% of display buying that happens programmatically, once said that “inertia is the agency’s best friend” when asked why holding companies were not doing more to bring innovation to advertising. I imagine that part of what he meant was that their common business model (billable hours plus a negotiated margin) does not create an incentive for efficiency. On the contrary, complexity in media planning means more billable hours—as well as a built-in need for agencies’ existence. After all, if media buying were easy, then marketers would do it themselves.

One result of this inertia is that Microsoft’s business products (Outlook, PowerPoint, and Excel) power the majority of digital media buying today. After research is done in platforms like comScore and Nielsen, media planners output a spreadsheet, create an RFP, and begin the long process of gathering other spreadsheets from publishers. After a few weeks and $40,000 in hours spent cutting and pasting, a media plan is born. This grueling process has the average media planner spending more time on manual, repetitive processes than on strategy and high-value, client-facing activities. You would imagine that agencies would work quickly to adopt technologies that streamline the transactional side of media planning.

It seems like agencies don’t care, as long as they are getting paid for their work, but there are real problems with the Excel model. Here are a few to consider:

  • Employee Happiness: One of the biggest problems facing agencies is the constant turnover in media planning departments. Agencies often hire junior planners directly out of college and work them long hours performing many of the manual processes that go into digital media plan execution. After a while, those planners take their training and insights and ascend the ladder into the next position, or take their newfound expertise to the next agency, where they can expect more of the same for a slight raise. Wouldn’t it be better to deploy technology that takes the grunt work out of campaign planning and enables planners to focus on higher-value activities, such as strategy? The costs of employee turnover are high, as are the hidden costs of employee dissatisfaction.
  • More Bandwidth Equals More Clients: Although agencies get paid for their hours, there is a point at which an agency can only take on so many clients. After all, adding employees (even low-cost ones) means adding more desk space, furniture, computers, and financial overhead in general. Eventually, an agency needs increases in productivity at the employee level in order to scale, and to add more clients (and revenue) without overly expanding its physical footprint. Leveraging technology that streamlines the manual part of media planning means being able to do more planning with fewer planners, enabling shops to grow their market share without adding as many junior personnel.
  • You Don’t Get Paid for Pitching: Digital media shops don’t always get paid for all of their hours. Pitching new clients means creating sample plans and putting company resources to work on speculative business, which is all the more reason to find efficiencies in media planning technology.
  • Spreadsheets Don’t Learn: One of the biggest problems with digital media planning using manual, spreadsheet-driven processes is that it becomes hard to maintain a centralized knowledge base. Planners leave, plans get stored on disparate hard drives, information on pricing and vendors is fragmented, and it is hard to measure performance over time. Despite the fact that they are getting remunerated for their work, agencies must consider whether the method of using man hours to perform repetitive tasks could be more expensive in the long run. As David Kenny once remarked, “If you are using people to do the work of machines, you are already irrelevant.”

At the macro level, the cost-plus pricing model’s principal disadvantage is that it creates what economists call a perverse incentive or, put more simply, an incentive for inefficiency. When that cost model is applied to digital media planning—already fraught with inefficiency—you have an environment ripe for disruption. The advent of new, platform-driven media planning and buying technologies is spawning a new era of “systematic guaranteed” buying, which promises to streamline and centralize the way banner ads are bought today. Agencies will be able to dedicate more hours to client-facing tasks and strategy, and publishers will be able to manage their transactional RFP business more seamlessly and focus their sales teams on super premium, high-CPM sales.

By eliminating much of the human cost of media planning and buying, technology can help add more value to the media itself—the real “plus” that we have been looking for.

[This post originally appeared in ClickZ on 2/7/13]

Programmatic Premium is not about Bidding

“A market is never saturated with a good product, but it is quickly saturated with a bad one” – Henry Ford

When it comes to digital publishing sales, it seems like many publishers are questioning whether the product they have—the standard banner ad—is what they should be selling. Last month, I wrote that 2013 would be the year of “premium programmatic,” where LUMA map companies that make their living in real-time bidding turn towards the guaranteed space, where 80% of digital marketing dollars are being spent. My recent experience at the Digiday Exchange Summit convinced me that this meme continues—with an important distinction: “premium programmatic” is not about bidding on quality inventory through exchanges. Rather, it is about using technology to enable premium guaranteed buys at scale. Here is what I heard:

The Era of the Transactional RFP is Over

Forbes’ Meredith Levien currently gets 10% of her display revenue from programmatic buying, up from 2% in 2011. The rest of her revenue consists of 45% premium programs and 45% from what she calls the “transactional RFP” business. The latter is the type that comes from continually responding to agency RFPs for standard IAB banner programs, with little customization. Levien questioned whether that type of transactional business was on its way to being driven exclusively by technology.

Are publishers really going to be able to abandon the relentless RFP treadmill, where countless hours are spent reacting to agency RFPs—many of which are sent to over 100 publishers, even though an average of only five end up on the campaign? In order for that to happen, Levien said, the very language we are using must change. The language of the transactional RFP (“GRP,” “CPM,” “Impressions”) must give way to the language of premium (“Social Shares,” “Influence,” and “Engagement”). Ultimately, Levien sees a world where there are fewer people managing RFP response and more multi-disciplinary teams that create super premium tentpole programs for large brands. Forbes’ teams feature copywriters, developers, and creatives who don’t talk about the “buy details” of a campaign, but rather about the social and cultural implications a great advertising program can create.

To paraphrase Federated’s CEO Deanna Brown, publishers “really have to question whether sending 100 e-mails to win a $50,000 RFP is worth it.”

Create Scarcity

Gannett’s Steve Ahlberg was even more forceful in his rejection of programmatic buying and of the transactional nature of guaranteed buying. After living in a world of its own creation (pages festooned, NASCAR-like, with low-CPM banners), USAToday.com made the draconian move of removing all below-the-fold ads from its site, stripped every network and exchange tag from its pages, and decided to have one large ad placement per page. The experiment—revenue neutral in the fourth quarter of 2012—has thus far proven that publishers can get off “set it and forget it” SSP revenue by creating the type of scarcity that drives up both rates and demand. According to Ahlberg, the publication is talking to quality brand clients that were not on the radar just months before.

Part of the equation is getting away from standardized IAB units and trying to create a “television-like” experience for brand advertisers. As at Forbes, that means getting RFP response teams away from transactional duties and leveraging cross-disciplinary teams that think like wealth managers rather than salespeople. Instead of asking about reach and frequency, USAToday.com asks brands what they want to accomplish, and works with them to craft campaigns that work towards a different set of KPIs.

“If the Premium Publishers’ Product is the Banner Ad, then they are in Trouble”

For Walker Jacobs, who oversees Turner’s digital ad sales, the recent leaps and bounds in programmatic technology have done nothing but “accelerate the bifurcation of the ecosystem,” which is divided between the good inventory and the bad. Like Gannett, Turner takes a jaundiced view of the programmatic ad economy. “Our RTB strategy is ‘no’,” as Jacobs concisely put it.

Going further, Jacobs suggested that “there is no such thing as programmatic premium” in a world saturated with banner ad units, many of which go unseen. Standard banners, therefore, are a “flawed currency.” It is hard to argue with Jacobs in a world that sees 5 trillion impressions every month.

It is clear that, despite the massive strides being made in programmatic buying technology, there is a very large gap between publishers who control super premium inventory and those who do not. Publishers in the former group want to find more streamlined and efficient ways to respond to RFPs, and ultimately turn more of their efforts toward selling creative, multi-tiered, tentpole solutions to major brands. It doesn’t sound like many premium publishers are implementing private exchanges either, despite all of the hype in 2012. As Dan Mosher of the BrightRoll Exchange remarked, a private exchange “is just a blocklist feature of a larger platform.”

In 2013, it seems premium publishers are shunning private exchanges not because of the technology, but because they are rejecting the commoditization of their inventory in general. For most premium publishers, there are three types of sales: super-premium programs, which will continue to be handled primarily by their direct sales force; “transactional RFP” business for standard IAB display units, which most see being streamlined by “systematic” reserved platform technologies; and programmatic RTB sales of lower-class inventory.

So, is 2013 the year of “programmatic premium?” Yes—but only if that means that publishers embrace technologies that help them streamline the way they hand-sell their top-tier inventory.

[This post originally appeared in the EConsultancy blog on 2/5/13]

What I Learned Writing Best Practices in Data Management

Today data is like water: free-flowing, highly available, and pervasive. As the cost of storing and collecting data decreases, more of it becomes available to marketers looking to optimize the way they acquire new customers and activate existing ones. In the right hands, data can be the key to understanding audiences, developing the right marketing messages, optimizing campaigns, and creating long-term customers. In the wrong hands, data can contribute to distraction, poor decision-making, and customer alienation. Over the past several weeks, I asked over thirty of the world’s leading digital data practitioners what marketers should be thinking about when it comes to developing a data management strategy. The result was the newly available Best Practices in Data Management report. A few big themes emerged from my research, which I thought I would share:

Welcome to the First Party

Digital marketing evolves quickly, but those of us who have worked as digital marketers or publishers for the past 10 years have seen distinct waves of transformation impact the way we use data for audience targeting. Early on, audience data was owned by publishers, who leveraged that data to control pricing for premium audiences. The Network Era quickly supplanted this paradigm, as networks leveraged tag data to understand publishers’ audiences better than the sites themselves. Buying targeted remnant inventory at scale created new efficiencies and easy paychecks for publishers, who found themselves completely disintermediated. The DSP Era (which we are still in) continued that trend by completely separating audiences from media, giving even more control to the demand side. Today, the “DMP Era” promises a new world where publishers and advertisers can activate their first party data, and use it for remarketing, lookalike modeling, and analytics.

The ubiquity of third party data (available to all, and often applied to the same exact inventory) makes activating first party data more valuable than ever. Doing so effectively means regaining a level of control over audience targeting for publishers, and being able to leverage CRM data for retargeting and lookalike modeling for the demand side, as well as a deeper level of analytics for both sides. If there has been one huge takeaway from my conversations with all of the stakeholders in the data-driven marketing game, it is that getting control and flexibility around the use of your own first-party data is the key to success. As a marketer, if you are buying more segments than you are creating, you are losing.

The New Computing Paradigm

Successfully activating all of the data your company can leverage takes a lot of work, and a lot of advanced technology. Whether you are a publisher trying to score audiences in milliseconds in order to increase advertising yield, or an advertiser attempting to deliver a customized banner ad to a prospect in real time, you need to store an incredible amount of data and (more importantly) be able to access it at blazing speeds. In the past, having that capability meant building and maintaining your own enormous technology “stack.” Today, the availability of cloud-based computing and distributed computing solutions like Hadoop has created a brand new paradigm, or what former Microsoft executive and current RareCrowds CEO Eric Picard likes to call the “4th Wave.”

“Being a Wave 4 company implicitly means that you are able to leverage the existing sunk cost of these companies’ investment,” says Picard. That means building apps on top of AppNexus’ extensible platform, leveraging Hadoop to process 10 billion daily transactions without owning a server (as Bizo does), or simply hosting portions of your data in Amazon’s cloud to gain speed and efficiency. As digital marketing becomes more data intensive, knowing how to leverage existing systems to get to scale will become a necessity. If you are not taking advantage of this new technology paradigm, you are spending resources on IT rather than IP. These days, winning means applying your intellectual property to available technology, not owning the biggest internal stack.

Social Data is Ascendant

One of the most interesting aspects of data management is how it is impacting traditional notions of CRM. In the past, digital marketing seemed to end at the bottom of the funnel. Once the customer was driven through the marketing funnel and purchased, she went into the CRM database, to be targeted later by more traditional marketing channels (e-mail, direct mail). Now, the emergence of data-rich social platforms has created a dynamic in which the funnel continues.

Once in the customer database (CRM), the post-purchase journey starts with a commitment beyond the sale, when a consumer joins an e-mail list or signs up for special offers on the company’s site. The next step is an expression of social interest, when the consumer agrees to make public his affinity for a company or brand by “friending” a company’s page or following its Twitter account. Beyond the “like” is true social activation, wherein the consumer actively (not passively) recommends the product or service through commenting, sharing, or other active social behaviors. The final step is having the consumer sell on your behalf (directly via affiliate programs or, in the softer sense, as a “brand ambassador”). This dynamic is why Salesforce acquired Radian6 and Buddy Media.

For digital marketers, going beyond the funnel and activating consumers through social platforms means understanding their stated preferences and affinities, and those of their social graphs. Most companies already do this with existing platforms. The real key is tying this data back into your other data inputs to create a 360-degree user view. That’s where data science and management platforms come in. If you are not ingesting rich social data and using it to continually segment, target, expand, and understand your customers, you are behind the curve.
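At its simplest, tying social data back into your other data inputs is a join on a shared customer key. The sketch below is purely illustrative (the field names, IDs, and matching key are invented for the example; real systems typically match on hashed e-mail addresses or device IDs):

```python
# Hypothetical CRM and social records keyed on a shared customer ID.
crm = {
    "u123": {"email": "jane@example.com", "lifetime_value": 420.0},
    "u456": {"email": "sam@example.com", "lifetime_value": 75.5},
}
social = {
    "u123": {"shares": 14, "affinities": ["NASCAR", "soft drinks"]},
}

def unified_profile(user_id: str) -> dict:
    """Merge CRM and social data into one 360-degree view of the customer."""
    profile = {"user_id": user_id}
    profile.update(crm.get(user_id, {}))      # transactional / CRM inputs
    profile.update(social.get(user_id, {}))   # social signals, if any exist
    return profile

print(unified_profile("u123")["affinities"])  # ['NASCAR', 'soft drinks']
```

Note that the merge degrades gracefully: a customer with no social footprint (like "u456" above) still yields a valid, CRM-only profile.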

[This post originally appeared on the EConsultancy blog. Get the paper here.]

Can you Buy “Brand?”

Understanding Social Affinity Data

Marketers are increasingly turning to social platform data to understand their customers, and tapping into their social graphs to reach more of them. Facebook “likes” and Twitter “follows” are religiously captured and analyzed, and audience models are created—all in the service of trying to scale the most powerful type of marketing of all: word-of-mouth. With CRM players (like Salesforce, which recently acquired Buddy Media and Radian6) jumping into the game, digitally derived social data is now an established part of traditional marketing.

But, are marketers actually finding real signals amid the noise of social data? In other words, if I “like” Lady Gaga, and you “like” Lady Gaga, and my ten-year-old daughter also “likes” Lady Gaga, then what is the value of knowing that? If I want to leverage social data to enrich my audience profiles and try to get the fabled “360 degree” view of my customer, “likes” and “follows” may contribute more noise than insight. I recently sat down with Colligent’s Sree Nagarajan to discuss how brand marketers can go beyond the like, and get more value out of the sea of social data.

Colligent (“collectively intelligent,” if you like) goes beyond “likes” and actually measures what people do on social sites. In other words, if you merely “like” Lady Gaga, you are not measured, but if you post a Lady Gaga music video, you are. By scraping several hundred million Facebook profiles, and accessing the Twitter firehose of data, Nagarajan’s company looks at what people are socially passionate about—and matches it against other interests. For example, the data may reveal that 5% of Ford’s socially active fanbase is also wild about NASCAR. That’s great to know. The twist is that Colligent focuses on the folks who are nuts about NASCAR—and like Ford back. That’s called mutual engagement and, arguably, a more powerful signal.

Nagarajan’s focus on this type of data has many questioning the inherent value of targeting based on social media membership. “In any social network’s lifecycle, likes (or ‘follows’ or friends) start out as genuine signals of brand affinity. However, as more and more people like the page, the audience gets increasingly diluted, making likes less of an indicator of a brand’s true audience. True engagement as measured by comments, photo posts, re-tweets, hashtags, etc. becomes a much better indicator of brand affinity and engagement.”

Colligent data recently convinced Pepsi to choose Nicki Minaj as their spokesperson, since the data revealed a strong correlation between socially activated Pepsi and Minaj fans. Think about that for a second. For years, major brands have used softer, panel-based data (think “Q Score”) to decide which celebrities are most recognizable and capture the right brand attributes. Now, getting hard metrics around the type of people who adore your brand is just a query away. Digital marketers have been talking about “engagement” for years, and have developed a lexicon around measurement including “time spent” and “bounce rate.” Social affinity data goes deeper, measuring true engagement. For Nagarajan, “In order for the engagement to be truly effective, it needs to be measured from both sides (mutual engagement). The parallel is a real-world relationship. It’s not enough for me to like you, but you have to like me for us to have a relationship. Mapped to the brand affinity world, it’s not enough for Pepsi fans to engage with Nicki Minaj; enough Nicki fans have to engage with Pepsi (more than the population average on both sides) to make this relationship truly meaningful and thus actionable. When true engagement is married with such mutual engagement, the result is intelligence that filters the noise in social networks to surface meaningful relationships.”
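Nagarajan’s “mutual engagement” test can be thought of as a two-sided lift check: each fanbase must engage with the other entity at a rate above the population baseline. The following is a minimal sketch of that idea, not Colligent’s actual methodology; all the counts, rates, and the baseline are invented for illustration:

```python
def engagement_rate(engaged: int, audience: int) -> float:
    """Share of an audience that actively engages (posts, shares, comments)."""
    return engaged / audience

def is_mutual_engagement(a_fans_engaging_b: int, a_fans: int,
                         b_fans_engaging_a: int, b_fans: int,
                         baseline_rate: float) -> bool:
    """Both fanbases must engage with the other entity at a rate above
    the population average for the affinity to be actionable."""
    rate_a_to_b = engagement_rate(a_fans_engaging_b, a_fans)
    rate_b_to_a = engagement_rate(b_fans_engaging_a, b_fans)
    return rate_a_to_b > baseline_rate and rate_b_to_a > baseline_rate

# Hypothetical numbers: 5% of brand A's active fans engage with celebrity B,
# 4% of B's fans engage with brand A, against a 1% population baseline.
print(is_mutual_engagement(50_000, 1_000_000, 80_000, 2_000_000, 0.01))  # True
```

A one-sided affinity (A’s fans love B, but B’s fans ignore A) fails the check, which is exactly the noise-filtering property Nagarajan describes.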

So, what else can you learn from social affinity data? With so many actively engaged fans and followers, throwing off petabytes of daily data, these networks offer a virtual looking glass for measuring real world affinities. If you think about the typical Facebook profile, you can see that many of the page memberships are driven by factors that exist outside the social network itself. That makes the data applicable beyond digital:

  • Television: Media planners can buy the shows, networks, and radio stations that a brand’s fans are highly engaged with.
  • Public Relations: Flacks can direct coverage towards the media outlets a brand’s fans are engaged with.
  • Sponsorships: Marketers can leverage affinity data to determine which celebrity should be a brand’s spokesperson.
  • Search: SEM directors can expand keyword lists for Google and Facebook buys using social affinity-suggested keywords.
  • Display: Discover what sites Ford’s socially activated consumers like, and buy those sites at the domain level to get performance lift on premium guaranteed inventory buys.

Are we entering into a world in which marketers are going to use this type of data to fundamentally change the way they approach media buying? What does it mean to “buy brand?” Sree Nagarajan sees this type of data potentially transforming the way offline and online media planners begin their process. “Much of the audience selection available in the market today is media based. Nielsen defines TV audiences, Arbitron radio, comScore digital sites, MRI magazines, etc. Brand marketers are forced to define their audiences the way media measures audiences: by demographics (e.g., 18-49 male),” remarks Nagarajan. “Now, for the first time, social data allows marketers to define audiences based on their own brand and category terms. Now they can say, ‘I want to buy TV shows watched by the Pepsi and, more generally, the Carbonated Soft Drinks audience.’ This will truly make marketing brand-centric instead of media-centric. Imagine a world where brand and category GRPs can be purchased across media, rather than GRPs in a specific medium.”

Look for this trend to continue, especially as companies become more aggressive about aligning their CRM databases with social data.

[This article originally appeared in ClickZ on 12/11/12]

Discover more on this topic and others by downloading my new whitepaper, Best Practices in Data Management

Digital Advertising Veteran Chris O’Hara Appointed Chief Revenue Officer at LookSmart

SAN FRANCISCO, May 30, 2012 (GlobeNewswire via COMTEX) — LookSmart, Ltd. (LOOK), an online advertising network solutions company, today announced a key addition to its management team with the hire of industry veteran Chris O’Hara as Chief Revenue Officer. In this role, O’Hara will lead the LookSmart sales and delivery organization and leverage his years of experience with direct advertisers and agencies to increase LookSmart’s market share.

“With the addition of Chris to the LookSmart team, we are continuing to build an outstanding sales and delivery organization with a history of proven success in digital advertising,” said Dr. Jean-Yves Dexmier, Chairman and CEO of LookSmart. “We are very excited to have someone with Chris’s vast experience to lead our revenue generating organization.”

Chris O’Hara is a domain expert on platform technology, with an emphasis on digital advertising workflow, data management, and real-time bidding. He has led successful sales efforts at TRAFFIQ, Reviewed.com, and Mediabistro.com. Chris is a member of American Business Media’s speaker’s bureau and the IAB’s Networks and Exchanges and Sales Executive Committees. Chris is an accomplished author and a contributor to industry publications including Business Insider, eMarketing & Commerce, eConsultancy, AdMonsters, MediaPost, The Agency Post, Adotas, ClickZ, iMediaConnection, DigiDay, and AdWeek. His latest work, Best Practices in Digital Display Media (eConsultancy) is aimed at helping marketers understand the digital display technology landscape. He is currently working on a comprehensive whitepaper on Data Management that covers marketing approaches to Big Data.

“I am very excited about joining LookSmart,” said Chris. “To me, the three things you need to create differentiated ad technology are platform technology, valuable data, and great people. LookSmart happens to have all three in abundance. I am looking forward to joining the team and leading the effort to tell the advertising community what our plans are for leveraging our assets to create a unique cross-channel digital advertising platform.”

About LookSmart, Ltd.

LookSmart is an online advertising network solutions company that provides performance solutions for online advertisers and online publishers. LookSmart offers advertisers targeted, performance based advertising via its Advertiser Networks; and an Ad Center platform for customizable private-label advertiser solutions for online publishers. LookSmart is based in San Francisco, California. For more information, visit http://www.looksmart.com or call 415-348-7500.

The LookSmart, Ltd. logo is available at http://www.globenewswire.com/newsroom/prs/?pkgid=8717

This news release was distributed by GlobeNewswire, http://www.globenewswire.com

SOURCE: LookSmart, Ltd.

        CONTACT: Bill O'Kelly, Senior Vice President Operations and
        Chief Financial Officer
        (415) 348-7208
        bo'kelly@looksmart.net
        ICR, Inc.
        John Mills, Senior Managing Director
        (310) 954-1100
        john.mills@icrinc.com

Choosing a Data Management Platform

“Big Data” is all the rage right now, and for good reason. Storing tons and tons of data has gotten very inexpensive, while the accessibility of that data has increased substantially in parallel. For the modern marketer, that means having access to literally dozens of disparate data sources, each of which cranks out large volumes of data every day. Collecting, understanding, and taking action against those data sets is going to make or break companies from now on. Luckily, an almost endless variety of companies have sprung up to assist agencies and advertisers with the challenge. When it comes to the largest volumes of data, however, there are some highly specific attributes you should consider when selecting a data management platform (DMP).

Collection and Storage: Scale, Cost, and Ownership
First of all, before you can do anything with large amounts of data, you need a place to keep it. That place is increasingly becoming “the cloud” (i.e., someone else’s servers), but it can also be your own servers. If you think you have a large amount of data now, you will be surprised at how much it will grow. As devices like the iPad proliferate, changing the way we find content, even more data will be generated. Companies that have data solutions with the proven ability to scale at low cost will be best able to extract real value out of this data. Make sure to understand how your DMP scales and what kinds of hardware it uses for storage and retrieval.

Speaking of hardware, be on the lookout for companies that formerly sold hardware (servers) getting into the data business so they can sell you more machines. When the data is the “razor,” the servers necessarily become the “blades.” You want a data solution whose architecture enables the easy ingestion of large, new data sets, and one that takes advantage of dynamic cloud provisioning to keep ongoing costs low, not necessarily a hardware partner.

Additionally, your platform should be able to manage extremely high volumes of data quickly, have an architecture that enables other systems to plug in seamlessly, and offer core functionality that enables multi-dimensional analysis of the stored data at a highly granular level. Your data are going to grow exponentially, so the first rule of data management is making sure that, as your data grow, your ability to query them scales as well. Look for a partner that can deliver on those core attributes, and be wary of partners whose experience is limited to storing narrow data sets.
There are a lot of former ad networks out there with a great deal of experience managing common third party data sets from vendors like Nielsen, IXI, and Datalogix. When it comes to basic audience segmentation, there is a need to manage access to those streams. But if you are planning to capture and analyze data that includes CRM and transactional data, social signals, and other large data sets, you should look for a DMP that has experience working with first party data as well as third party data sets.
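The granular, multi-dimensional analysis described above boils down to being able to roll up raw event data along any combination of dimensions. A minimal sketch in Python (the event fields and segment names here are invented for illustration):

```python
from collections import defaultdict

# Hypothetical raw event log: one row per ad impression, with an outcome flag.
events = [
    {"segment": "auto intenders", "channel": "display", "converted": True},
    {"segment": "auto intenders", "channel": "search",  "converted": False},
    {"segment": "travel",         "channel": "display", "converted": False},
    {"segment": "auto intenders", "channel": "display", "converted": True},
]

def rollup(rows, dims):
    """Aggregate event and conversion counts along any combination of dimensions."""
    table = defaultdict(lambda: {"events": 0, "conversions": 0})
    for row in rows:
        key = tuple(row[d] for d in dims)
        table[key]["events"] += 1
        table[key]["conversions"] += int(row["converted"])
    return dict(table)

by_segment_channel = rollup(events, ["segment", "channel"])
by_channel = rollup(events, ["channel"])
```

A real DMP runs this kind of rollup in a distributed columnar store rather than in memory; the point is that the query shape, not the storage, is what has to scale with your data.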

The concept of ownership is also becoming increasingly important in the world of audience data. While the source of data will continue to be distributed, make sure that whether you choose a hosted or a self-hosted model, your data ultimately belongs to you. This allows you to control the policies around historical storage and enables you to use the data across multiple channels.

Consolidation and Insights: Welcome to the (Second and Third) Party
Third party data (in this context, available audience segments for online targeting and measurement) is the stuff the famous Kawaja logo vomit map was born from. Look at the map, and you are looking at over 250 companies dedicated to using third party data to define and target audiences. A growing number of platforms help marketers analyze, purchase, and deploy that data for targeting (BlueKai, Exelate, and Legolas being great examples). Other networks (Lotame, Collective, Turn) have leveraged their proprietary data along with their clients’ data to offer audience management tools that combine their data and third party data to optimize campaigns. Still others (PulsePoint’s Aperture tool being a great example) leverage all kinds of third party data to measure online audiences, so they can be modeled and targeted against.

Having the most third party data is not the key, however. Your DMP should marry highly validated first party data against third party data for the purposes of identifying, anonymizing, and matching users. DMPs must be able to consolidate and create as complete a view of your audience as possible, and your DMP solution must be able to enrich that view using second and third party data. Second party data is data about your audience generated outside your own properties (for example, an ad viewed on a publisher site or search engine). While you must choose the set of third party providers that offer the best data about your audience, your DMP must also be able to increase reach, ensuring you can collect information about as many relevant users as possible, including through lookalike modeling.
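To make the enrichment step concrete, here is a rough sketch of how first party CRM records might be matched against licensed third party segments through an anonymized identifier, so raw PII never crosses systems. All field names, emails, and the segment label are hypothetical:

```python
import hashlib

def anon_id(email):
    """Hash PII into an anonymous match key, so raw emails never leave your systems."""
    return hashlib.sha256(email.lower().encode()).hexdigest()

# Hypothetical first party CRM records.
first_party = [
    {"email": "jane@example.com", "registered_test_drive": True},
    {"email": "sam@example.com",  "registered_test_drive": False},
]

# Hypothetical third party segment feed, keyed by the anonymized identifier.
third_party = {anon_id("jane@example.com"): ["Country Squires"]}

def enrich(records):
    """Join first party records to third party segments on the anonymous key."""
    enriched = []
    for r in records:
        profile = {k: v for k, v in r.items() if k != "email"}
        profile["segments"] = third_party.get(anon_id(r["email"]), [])
        enriched.append(profile)
    return enriched
```

In practice the match key is negotiated with each provider (cookie syncs, hashed emails, and the like), but the principle is the same: consolidate on an identifier, not on the raw PII.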

First Party Data

  • CRM data, such as user registrations
  • Site-side data, including visitor history
  • Self-declared user data (income, interest in a product)

Second Party Data

  • Ad serving data (clicks, views)
  • Social signals from a hosted solution
  • Keyword search data through an analytics platform or campaign

Third Party Data

  • Audience segments acquired through a data provider

For example, if you are selling cars and you discover that your on-site users who register for a test drive most closely match PRIZM’s “Country Squires” audience, it is not enough to buy that Nielsen segment. A good DMP enables you to create your own lookalike segment by leveraging that insight—and the tons of data you already have. In other words, the right DMP partner can help you leverage third party data to activate your own (first party) data.
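A lookalike segment of this sort can be sketched as scoring prospects by how strongly they share the attributes that characterize your seed audience (the test-drive registrants). The attributes, weights, and threshold below are invented for illustration; production systems typically fit regression or nearest-neighbor models over far more dimensions:

```python
# Hypothetical seed profile: how strongly each attribute characterizes the
# audience that already converted (e.g., test-drive registrants).
seed_profile = {"suburban": 0.9, "high_income": 0.8, "has_children": 0.7}

def lookalike_score(user_attrs, profile):
    """Sum the weights of the seed attributes this prospect shares."""
    return sum(weight for attr, weight in profile.items() if attr in user_attrs)

prospects = {
    "u1": {"suburban", "high_income", "has_children"},
    "u2": {"urban"},
    "u3": {"suburban", "has_children"},
}

THRESHOLD = 1.5
lookalikes = {uid for uid, attrs in prospects.items()
              if lookalike_score(attrs, seed_profile) >= THRESHOLD}
# u1 (2.4) and u3 (1.6) qualify; u2 (0.0) does not
```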

Make sure your provider leads with management of first party data, has experience mining both types of data to produce the insights you need for your campaigns, and can deliver that data quickly. Data management platforms aren’t just about managing gigantic spreadsheets. They are about finding out who your customers are, and building an audience DNA that you can replicate.

Making it Work
At the end of the day, it’s not just about getting all kinds of nifty insights from the data. It’s valuable to know that your visitors who were exposed to search and display ads converted at a 16% higher rate, or that your customers have an average of two females in the household. But it’s making those insights meaningful that really matters.

So, what to look for in a data management platform in terms of actionability? For the large agency or advertiser, the basic functionality has to be creating an audience segment. In other words, when the blend of data in the platform reveals that showing five display ads and two SEM ads to a household with two women in it creates sales, the platform should be able to seamlessly produce that segment and prepare it for ingestion into a DSP or advertising platform. That means having an extensible architecture that enables the platform to integrate easily with other systems.
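As a rough sketch, that “seamless” segment production amounts to evaluating a rule against household-level data and emitting an ID list a DSP can ingest. The field names and the `likely_buyers` label are hypothetical, and a real DSP integration would use the vendor’s own ingestion schema:

```python
import json

# Hypothetical insight: households with two or more women that saw at least
# five display ads and two SEM ads convert. Express it as a segment rule.
def in_segment(household):
    return (household["women"] >= 2
            and household["display_ads"] >= 5
            and household["sem_ads"] >= 2)

households = [
    {"id": "h1", "women": 2, "display_ads": 6, "sem_ads": 2},
    {"id": "h2", "women": 1, "display_ads": 9, "sem_ads": 4},
    {"id": "h3", "women": 3, "display_ads": 5, "sem_ads": 1},
]

segment_ids = [h["id"] for h in households if in_segment(h)]

# The payload an extensible DMP might hand to a DSP's ingestion endpoint.
payload = json.dumps({"segment": "likely_buyers", "ids": segment_ids})
```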

Moreover, your DMP should enable you to do a wide range of experimentation with your insights. Marketers often wonder which levers they should pull to create specific results (i.e., if I change my display creative and increase the frequency cap to X for a given audience segment, how much will conversions increase?). Great DMPs can help build those attribution scenarios, and help marketers visualize results. Deploying specific optimizations in a test environment first means less waste and more performance. Optimizing in the cloud first is going to become the new standard in marketing.
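The arithmetic behind such a test scenario is straightforward: compare conversion rates between a test cell and a held-out control cell. A minimal sketch (the counts are invented, chosen to reproduce the 16% lift figure mentioned earlier):

```python
def lift(test_conversions, test_size, control_conversions, control_size):
    """Relative conversion lift of a test cell over its control cell."""
    test_rate = test_conversions / test_size
    control_rate = control_conversions / control_size
    return (test_rate - control_rate) / control_rate

# Hypothetical experiment: users exposed to both search and display ads (test)
# versus display only (control).
scenario = lift(test_conversions=58, test_size=1000,
                control_conversions=50, control_size=1000)
# scenario is approximately 0.16, i.e., a 16% higher conversion rate in the test cell
```

Real attribution scenarios layer significance testing and multi-touch credit on top of this, but every one of them reduces to comparisons of this shape.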

Final Thoughts
There are a lot of great data management companies out there, some better suited than others when it comes to specific needs. If you are in the market for one, and you have a lot of first party data to manage, following these three rules will lead to success:

  • Go beyond third party data by choosing a platform that enables you to develop deep audience profiles that leverage first and third party data insights. With ubiquitous access to third party data, using your proprietary data stream for differentiation is key.
  • Choose a platform that makes acting on the data easy and effective. “Shiny, sexy” reports are great, but the right DMP should help you take the beautifully presented insights in your UI and make them work for you.
  • Make sure your platform has an applications layer. DMPs must not only provide the ability to profile your segments, but also assist you with experimentation and attribution–and provide you with the ability to easily perform complicated analyses (churn and closed-loop analysis being two great examples). If your platform can’t make the data dance, find another partner.

Available DMPs, by Type
There are a wide variety of DMPs out there to choose from, depending on your need. Since the space is relatively new, it helps to think about them in terms of their legacy business model:

  • Third Party Data Exchanges / Platforms: Among the most popular DMPs are data aggregators like BlueKai and Exelate, who have made third party data accessible from a single user interface. BlueKai’s exchange approach enables data buyers to bid for cookies (or “stamps”) in a real-time environment, and offers a wide variety of providers to choose from. Exelate also enables access to multiple third party sources, albeit not in a bidded model. Lotame offers a platform called “Crowd Control,” which evolved from social data but now enables management of a broader range of third party data sets.
  • Legacy Networks: Certain networks with experience in audience segmentation have evolved to provide data management capabilities, including Collective, Audience Science, and Turn. Collective is actively acquiring assets (such as creative optimization provider Tumri) to broaden its “technology stack” in order to offer a complete digital media solution for demand-side customers. Turn is, in fact, a fully featured demand-side platform with advanced data management capabilities, albeit lacking the backend chops to aggregate and handle “Big Data” solutions (although that may rapidly change, considering its deep engagement with Experian). Audience Science boasts the most advanced native categorical audience segmentation capabilities, having created dozens of specific, readily accessible audience segments, and continues to migrate its core capabilities from media sales to data management.
  • Pure Play DMPs: Demdex (Adobe), Red Aril, Krux, and nPario are all pure-play data management platforms, created from the ground up to ingest, aggregate, and analyze large data sets. Unlike legacy networks, or DMPs that specialize in aggregating third party data sets, these DMPs provide three core offerings: a core platform for storage and retrieval of data; analytics technology for getting insights from the data with a reporting interface; and applications that enable marketers to take action against that data, such as audience segment creation or lookalike modeling functionality. Marketers with extremely large sets of structured and unstructured data that go beyond ad serving and audience data (and may include CRM and transactional data, for example) will want to work with a pure-play DMP.

This post is an excerpt of Best Practices in Digital Display Advertising: How to make a complex ecosystem work efficiently for your organization. All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording or any information storage and retrieval system, without prior permission in writing from the publisher.

Copyright © Econsultancy.com Ltd 2012

The Five Things to Expect in a DMP

Getting back control over their inventory is giving publishers a lot to think about.

“We want to make sure that we’re controlling what happens with data . . . we want to make sure we control pricing. Control’s a very important message. We don’t want there to be a cottage industry built on our backs” – Nick Johnson, SVP, NBC Universal

What do publishers really want? It’s simple, really: Power and control. In order to survive the ad technology era, publishers need the power to monetize their audiences without relying on third parties, and complete control over how they sell their inventory. In this era of “Big Data,” there is a fire hose stream of tremendously valuable information for publishers to take advantage of, such as keyword-based search data, attitudinal survey data, customer declared data from forms, page-level semantic data, and all the 3rd party audience data you can shake a stick at.

All of this data (cheap to produce, and ever-cheaper to store) has given rise to companies who can help publishers bring that data together, make sense of it, and use it to their advantage. To date, however, it is ad technology companies that have used the era of data to their advantage, utilizing it to create vertical ad networks, ad exchanges, data exchanges, DSPs, and a variety of other smart-sounding acronyms that ultimately purport to help publishers monetize their audiences, but end up monetizing themselves.

Rather than power the ad tech ecosystem, what if data could actually help publishers take back their audiences? If “data is the new gold” as the pundits are saying, then smart publishers should mine it to increase margins, and take control of their audiences back from networks and exchanges. Here are the five things a good data management platform should enable them to do:

  • Unlock the Value of 1st Party Data: Publishers collect a ton of great data, but a lot of them (and a LOT of big publishers) don’t leverage it like they should. Consider this stat: according to a recent MediaPost article, news sites use on-site audience targeting on only 47% of their impressions, as opposed to almost 70% for Yahoo News. By leveraging site-side behavioral data, combined with CRM data and other sources, it is possible to layer targeting on almost every impression a publisher has. Why serve a “blind” run-of-site (ROS) ad when you can charge a premium CPM for audience-targeted inventory?
  • Decrease Reliance on 3rd Parties: The real reason to leverage a DMP is to get your organization off the 3rd party crack pipe. Yes, the networks and SSPs are a great “plug and play” solution (and can help monetize some “undiscoverable” impressions), but why are publishers selling raw inventory at $0.35 and letting the people with the data resell those impressions for $3.50? It’s time to turn away those monthly checks, and start writing some to data management companies that can help you layer your own data on top of your impressions, and charge (and keep) the $3.50 yourself. Today’s solutions don’t have to rely on pre-packaged 3rd party segments to work, either, meaning you can really reduce your data costs. With the right data infrastructure, and today’s smart algorithm-derived models, a small amount of seed data can be utilized to create discrete, marketable audience segments that publishers can own, rather than license.
  • Generate Unique Audience Insights: Every publisher reports on clicks and impressions, but what advertisers are hungry for (especially brand advertisers) are audience details. What segments are most likely to engage with certain ad content? Which segments convert after seeing the fewest impressions? More importantly, how do people feel about an ad campaign, and who are they exactly? Data management technology is able to meld audience and campaign performance data to provide unique insights in near real-time, without complicated database queries or long waits for results. Additionally, with the cost of storing data getting lower all the time, “lookback windows” are increasing, enabling publishers to give credit for conversion path activity going back several months. Before publishers embraced data management, all the insights were in the hands of the agency, who leveraged the data to their own advantage. Now, publishers can start to leverage truly powerful data points to create differentiated insights for clients directly, and provide consultative services around them, or offer them as a value-added benefit.
  • Create New Sales Channels: Before publisher-side data management, when a publisher ran out of the Travel section impressions, he had to turn away the next airline or hotel advertiser, or offer them cheap ROS inventory. Now, data management technology can enable sales and ops personnel to mine their audience in real time and find “travel intenders” across their property—and extend that type of audience through lookalike modeling, ensuring additional audience reach. By enabling publishers to build custom audience segments for marketers on the fly, a DMP solution ensures that no RFP will go unanswered, and ROS inventory gets monetized at premium prices. 
  • Create Efficiency: How many account managers does it take to generate your weekly ad activity reports? How much highly paid account management time are publishers burning by manually putting together performance reports? Why not provide an application that advertisers can log into, set report parameters, and export reports into a friendly format? Or, better yet, a system that pre-populates frequent reports into a user interface, and pushes them out to clients via an e-mail link? You would think this technology was ubiquitous today, but you would be wrong. Ninety-nine percent of publishers still do this the hard (expensive) way, and they don’t have to anymore.

It’s time for publishers to dig into their data, and start mining it like the valuable commodity it is. Data used to be the handcuffs that kept publishers chained to the ad technology ecosystem, where they grew and hosted a cottage industry of ad tech remoras. The future being written now is one of publishers leveraging ad technologies to take back control, so they can understand and manage their own data and have the freedom to sell their inventory for what it is truly worth.

That’s a future worth fighting for.

[This post originally appeared in ClickZ on 2/29/12]

Same Turkey, New Knife

The way the ad tech world looked pre-DSP...and pre-DMP

Technology may still capture the most advertising value, but what if publishers own it?

A few years ago, ad technology banker Terence Kawaja gave a groundbreaking IAB presentation entitled, “Parsing the Mayhem: Developments in the Advertising Technology Landscape.” Ever since then, his famed logo vomit slide featuring (then) 290 different tech companies has been passed around more than a Derek Jeter rookie card.

While the eye chart continues to change, the really important slide in that deck essentially remains the same. The “Carving up the stack” slide (see above), which depicts how little revenue publishers see at the end of the ad technology chain, has changed little since May 2010. In fact, you could argue that it has gotten worse. The original slide described the path of an advertiser’s $5 as it made its way past the agency, through ad networks and exchanges, and finally into the publisher’s pocket.

The agency took about $0.50 (10%), the ad networks grabbed the biggest portion at $2.00 (40%), the data provider took two bits (5%), the ad exchange sucked out $0.35 (7%), and the ad server grabbed a small sliver worth $0.10 (2%), for a grand total of $3.20 (64%). The publisher was left with a measly $1.80. The story hasn’t changed, and neither have the players, but the amounts have altered slightly.
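The slide’s arithmetic can be checked in a few lines; this snippet just restates the figures from the paragraph above:

```python
# Where the advertiser's $5.00 goes, per the "carving up the stack" slide.
advertiser_spend = 5.00
cuts = {
    "agency":        0.50,  # 10%
    "ad networks":   2.00,  # 40%
    "data provider": 0.25,  # 5% ("two bits")
    "ad exchange":   0.35,  # 7%
    "ad server":     0.10,  # 2%
}

intermediaries = sum(cuts.values())                   # $3.20
publisher_share = advertiser_spend - intermediaries   # $1.80
intermediary_pct = intermediaries / advertiser_spend  # 0.64, the slide's 64%
```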

While Kawaja correctly argued that DSPs provided some value back to both advertisers and publishers through efficiency, let’s look ahead through the lens of the original slide. Here’s what has happened to the players over the last 2 years:

  • Advertiser: The advertiser continues to be in the catbird seat, enjoying the fact that more and more technology is coming to his aid to make buying directly a fact of life. Yes, the agency is still a necessary (and welcomed) evil, but with Facebook, Google, Pandora, and all of the big publishers willing to provide robust customer service for the biggest spenders, he’s not giving up much. Plus, agency margins continue to shrink, meaning more of his $5.00 ends up as display, video, and rich media units on popular sites.
  • Agency: It’s been a tough ride for agencies lately. Let’s face it: more and more spending is going to social networks, and you don’t need to pay 10%-15% to find audiences with Facebook. You simply plug in audience attributes and buy. With average CPMs in the $0.50 range (as opposed to $2.50 for the Web as a whole), advertisers have more and more reason to find targeted reach by themselves, or with Facebook’s help. Google’s nascent search-keyword-powered display network isn’t exactly helping matters. Agencies are trying to adapt and become technology enablers, but that’s a long putt for an industry that has long depended on underpaying 22-year-olds to manage multi-million dollar ad budgets, rather than overpaying 22-year-old engineers to build products.
  • Networks: Everyone’s talking about the demise of the ad network, but they really haven’t disappeared. Yesterday’s ad networks (Turn, Lotame) are today’s “data management platforms.” Instead of packaging the inventory, they are letting publishers do it themselves. This is the right instinct, but legacy networks may well be overestimating the extent to which the bulk of publishers are willing (and able) to do this work. Networks (and especially vertical networks) thrived because they were convenient—and they worked. Horizontal networks are dying, and the money is simply leaking into the data-powered exchange space…
  • Data Providers: There’s data, and then there’s data. With ubiquitous access to Experian, IXI, and other popular data types through 3rd party providers, the value of 3rd party segments has declined dramatically. Great exchanges like eXelate give marketers a one-stop shop for almost every off-the-shelf segment worth purchasing, so you don’t need to strike 20 different license deals. Yet, data is still the lifeblood of the ecosystem. Unfortunately for pure-play segment providers, the real value is in helping advertisers unlock the value of their first party data. The value of 3rd party data will continue to decline, especially as more and more marketers use less of it to create “seeds” from which lookalike models are created.
  • Exchanges: Exchanges have been the biggest beneficiary of the move away from ad networks. Data + Exchange = Ad Network. Now that there are so many plug-and-play technologies giving advertisers access to the world of exchanges, the money has flowed away from the networks and into the pockets of Google AdX, Microsoft, Rubicon, PubMatic, and RMX.
  • Ad Serving: Ad serving will always be a tax on digital advertising but, as providers in the video and rich media space provide more value, their chunk of the advertiser pie has increased. Yes, serving is a $0.03 commodity, but there is still money to be made in dynamic allocation technology, reporting, and tag management. As an industry, we like to solve the problems we create, and make our solutions expensive. As the technology moves away from standardized display, new “ad enablement” technologies will add value, and be able to capture more share.
  • Publisher: Agencies, networks, and technologists have bamboozled big publishers for years, but now smart publishers are starting to strike back. With smart data management, they are now able to realize the value of their own audiences—without the networks and exchanges getting the lion’s share of the budget. This has everything to do with leveraging today’s new data management technology to unlock the value of first party data—and more quickly aggregate all available data types to do rapid audience discovery and segmentation.

The slide we are going to be seeing in 2012, 2013, and beyond will show publishers with a much larger share, as they take control of their own data. Data management technology is no longer the sole province of the “Big Five” publishers. Now, even mid-sized publishers can leverage data management technology to discover their audiences, segment them, and create reach extension through lookalike modeling. Instead of going to a network and getting $0.65 for “in-market auto intenders,” they are creating their own—and getting $15.00.

Now, that’s a much bigger slice of the advertising pie.

[This post originally appeared in ClickZ on 2/1/12]

Signal to Noise

What Data Should Inform Media Investment Decisions?

The other day, I was updating my Spotify app on my Android device. When it finally loaded, I was asked to log in again. I immediately loaded up a new playlist that I had been building—a real deep dive into the 1980s hardcore music I loved back in my early youth. I’m not sure if you are familiar with the type of music that was happening on New York City’s Lower East Side between 1977 and 1986, but it was some pretty raw stuff…bands like the Beastie Boys (before they went rap), False Prophets, the Dead Boys, Minor Threat, the Bad Brains, etc. They had some very aggressive songs, with the lyrics and titles to match.

Well, I put my headphones in, and started walking from my office on 6th Avenue and 36th Street across to Penn Station to catch the 6:30 train home to Long Island…all the while broadcasting every single song I was listening to on Facebook. Among the least offensive tunes that showed up within my Facebook stream was a Dead Kennedys song with the F-word featured prominently in the song title. A classic, to be sure, but probably not something all of my wife’s friends wanted to know about.

As you can imagine, my wife (online at the time), was frantically e-mailing me, trying to tell me to stop the offensive social media madness that was seemingly putting a lie to my carefully cultivated, clean, preppy, suburban image.

So why, as a digital marketer, would you care about my Spotify Facebook horror story?

Because my listening habits (and everything else you and I do online, for that matter) are considered invaluable social data “signals” that you are mining to discover my demographic profile, buying habits, shoe size, and (ultimately) what banner ad to serve me in real time. The only problem is that, although I love hardcore music, it doesn’t really define who I am, what I buy, or anything else about me. It is just a sliver of time, captured digitally, sitting alongside billions of pieces of atomic level data, captured somewhere in a massive columnar database.

Here are some other examples of data that are commonly available to marketers, and why they may not offer the insights we think they might:

– Zip Code: Generally, zip codes are considered a decent proxy for income, especially in areas like Alpine, New Jersey, which is small and exclusive. But how about Huntington, Long Island, with an average home value of $516,000? That zip code contains the village of Lloyd Harbor (average home value of $1,300,000) and waterside areas in Huntington Bay like Wincoma, where people with lots of disposable income live.

– Income: In the same vein, income is certainly important and there are a variety of reliable sources that can get close to a consumer’s income profile, but isn’t disposable income a better metric? If you earn $250,000 per year, and your expenses are $260,000, then you are not exactly Nordstrom’s choicest customer. In fact, you are what we call “broke.” Maybe that was okay back in the good old days of government-style deficit spending but, these days, luxury marketers need a sharper scalpel to separate the truly wealthy from the paper tigers.

– Self-Declared Data: We all like to put a lot of emphasis on the answers real consumers give us on surveys, but who hasn’t told a little fib from time to time? If I am “considering a new car,” is my price range “$19,000 – $25,500” or “$35,000 – $50,000”? This type of social desirability bias is so common that researchers have sought other ways of inferring income and purchase behavior. When people lie about themselves, to themselves (in private, no less), you must take a good deal of self-declared data with a hearty grain of salt.

– Automobile Ownership: Want to know how much dough a person has? Don’t bother looking at his home or zip code. Look at his car. A person who has $1,800 a month to burn on a Land Rover is probably the same person liable to blow $120 on mail order steaks, or book that Easter condo at Steamboat. Auto ownership, among other things, is a great proxy for disposable income.

It would be overly didactic to rehearse all of the possible iterations of false data signals being used by marketers right now to make real-time bidding decisions in digital media. There are literally thousands—and social “listening” is starting to make traditional segmentation errors look tame. Take a recent Wall Street Journal article reporting that the three most widely socially-touted television shows fared worse than shows that received far less social media attention.

Sorry, but maybe that hot social “meme” you are trying to connect with just isn’t that valuable as a “signal.” We all hear the fire truck going by on 7th Avenue. The problem is that the only people who turn to look at it are the tourists. So what is the savvy marketer to do?

Remember that all data signals are just that: Signals. Small pieces of a very complicated data puzzle that you must weave together to create a profile. Unless you are leveraging reliable first-party data, second-party data, and third party data, and stitching that data together, you cannot get a true view of the consumer.

In my next column, we’ll look at how stitching together disparate data sources can reveal new, more reliable, “signals” of consumer interest and intent.

[This article was originally published in ClickZ on 12/2/2011]