Sometimes quality and beauty come in small packages: diamonds, chocolates, poodles, bonsai trees, the Olsen twins. Okay, scrub the last one. But you get the idea: quality usually beats quantity.
Yet, in advertising we seem to ignore this premise. We treat reach and frequency as the two guiding principles – as many people as possible, as often as possible. That makes sense, because advertisers have products to launch and a critical mass they need to hit. But we don’t want to reach people who have no interest in, or inclination towards, the product we are pushing.
With digital, of course, we are seeing a move away from the scattergun approach of old. We can target more effectively. We still reach big audiences, but we cut the wastage. And we are using more data to help define precisely who we reach.
The introduction of richer data sets throws another dimension into the mix. Now, we want lots of data. The more the better. It is as if you can never have too much of the stuff. But, again, quality counts.
Online data is frequently gathered from just a handful of sources. Major publishers are favoured because of, again, the false assumption that bigger is best. Sure, they have lots of traffic, and cookies can tell us a bit about their visitors, but often the depth of this information is quite limited. A publisher might tell you that a visitor viewed particular pages on its website, but on a mainstream site that information is of limited use. Everybody has an interest in the news of the day, for example. That is not going to help you identify where to target a campaign aimed at selling the latest from Lego Friends, or a Jenny Craig weight loss plan.
You need to know more about your audience than the bigger sites can tell you and, in reality, the richer data comes from smaller properties. For example, if someone visits a car sales site, you know they are likely to be in the market to buy or sell a vehicle. The type of vehicle might indicate their income or spending pattern. If it is a Prius, you know they are greenies.
Some sites ask their users to register and log in through a social media channel, such as Facebook. This gives the site access to more information about the user, potentially gathered from their feed. Again, smaller sites are more likely to gather and make available these extra sources of rich data.
Such sites might not collect information at scale, but gather data from enough of them and you can build a comprehensive database – one far more powerful than anything the big sites can provide. The key is to ensure that the data from each site is verified, defined and aggregated into meaningful segments.
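To make the aggregation idea concrete, here is a minimal Python sketch. Every site name, user ID, signal label and segment rule below is invented for illustration – real ad-tech platforms do this at vastly larger scale, with identity resolution and verification steps this toy version skips.

```python
from collections import defaultdict

# Hypothetical behavioural signals collected from several small sites,
# keyed by an (anonymised) user ID. All names are made up for illustration.
site_data = [
    {"site": "carsales-example", "user": "u1", "signals": {"in_market_auto"}},
    {"site": "hybrid-forum-example", "user": "u1", "signals": {"eco_interest"}},
    {"site": "parenting-blog-example", "user": "u2", "signals": {"young_family"}},
]

def aggregate_profiles(records):
    """Merge signals observed across many small sites into one profile per user."""
    profiles = defaultdict(set)
    for record in records:
        profiles[record["user"]] |= record["signals"]
    return profiles

def build_segments(profiles, rules):
    """Assign users to a named segment when their profile contains all required signals."""
    segments = defaultdict(set)
    for user, signals in profiles.items():
        for name, required in rules.items():
            if required <= signals:  # every required signal is present
                segments[name].add(user)
    return segments

# A segment no single small site could build alone: the two signals for u1
# came from two different properties.
rules = {"eco_car_buyers": {"in_market_auto", "eco_interest"}}
segments = build_segments(aggregate_profiles(site_data), rules)
print(segments["eco_car_buyers"])
```

The point of the sketch is the combination step: "u1" only qualifies for the hypothetical eco_car_buyers segment because signals from two separate small sites were merged into one profile.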
This does not mean advertisers have to run campaigns on these smaller sites. Reaching scale across lots of them could easily be cost-prohibitive. Instead, the profile information can point to campaigns that run across certain sections of mainstream sites, which meet the need for scale. The small sites might simply have been used to provide the behavioural data that the big sites have not been able to pass on.
Suppose, for example, the data on car buyers gathered from specialist sites shows there are larger sites with a similar audience profile. They might not have pages about selling cars, but the profile tells us they attract the same people, and you can reach more of them, more quickly, than by running campaigns exclusively on car sales sites.
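One simple way to test whether a big site's audience "looks like" the specialist car-buyer profile is to compare how often each behavioural signal appears in each audience, for instance with cosine similarity. This is a toy sketch – the signal names and frequencies are invented, not real measurements:

```python
import math

# Hypothetical share of each site's audience exhibiting each signal,
# in the order: in_market_auto, eco_interest, sports_fan.
specialist_car_site = [0.80, 0.40, 0.10]  # niche car sales site
big_site_a = [0.35, 0.20, 0.05]           # mainstream site, similar shape
big_site_b = [0.02, 0.05, 0.60]           # mainstream site, different shape

def cosine_similarity(a, b):
    """Similarity of two signal-frequency profiles: 1.0 means identical shape."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine_similarity(specialist_car_site, big_site_a))  # high: same people
print(cosine_similarity(specialist_car_site, big_site_b))  # low: different crowd
```

Cosine similarity compares the shape of the profiles rather than their absolute size, which is the point here: a mainstream site can have a far bigger audience overall and still match the specialist profile closely.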
It is likely, though, that the smaller sites will be used as part of the media mix, helping to reach smaller niches and provide additional frequency against the bigger publishers. That means these sites will continue to offer richer data as a means of differentiation. That helps nurture a more diversified range of publishers, which is good for everyone.
The secret lies in how smaller sets of data are aggregated. Ultimately, we all want as much data as we can get, because it helps us define our audiences more specifically and reduces waste. But quality counts. And in this instance, as in all things except the Olsen twins, small can mean more depth.