At the end of January, Ron Sellers of Grey Matter Research published a second report entitled 'More Dirty Little Secrets of Online Panel Research' and briefly caused that old elephant in the room, panel quality, to re-surface in the various MR industry debating arenas.
Sellers' report was based on the findings of a mystery shopping exercise in which individuals signed up to various online access panels* and reported back on key aspects of the experience: the frequency of invitations, the proportion of surveys from which they were disqualified or which had already closed, the incentives offered, the length of surveys, the quality of survey authorship, and so on.
The results were far from glowing. In Ron's words, if we were to treat respondents like this in a focus group setting, "They'll be tired, surly, disgruntled, biased, and just looking to get the whole thing over with as quickly as they can - assuming they don't just walk out."
So, the question is: why do panel owners treat respondents like this? The obvious answer is that they want to extract as much data from respondents as possible, at the lowest possible cost, to achieve the largest possible profit. They are not interested in sending fewer surveys, paying better incentives, or compensating for disqualifications or closed surveys. This is in large part because there is little incentive to do so (other than a genuine regard for treating people fairly) when researchers who purchase sample do not demand it, and in fact demand ever lower sample prices.
However, there is also another, little discussed reason relating to panel member recruitment.
I first began recruiting for panels in 2002. At this time, answering questions was one of the more 'interactive' and novel things you could do online, which certainly made life easier. More importantly, the world of online marketing was very, very different. Instead of ads on specific sites being brokered by large advertising networks, you could go directly to websites to buy space on their web properties, and negotiate with them on price and placement. Similarly, many sites or organisations that collected lists of email addresses were not yet aware of the value of such lists. I recall being able to email an invitation to join a panel to a good list of small business owners, in return for an omni question, and recruiting freely off the back of an ad hoc survey deployed to senior health professionals.
We also used Google and other search engine contextual adverts for recruitment, and not even by bidding on keywords like 'surveys', but by bidding on other terms like 'diet', 'recycling', or 'hockey'. The ads would pose a brief topic-specific question like 'Diet or exercise - which matters more?', or 'Who will win the Stanley Cup: Canucks or Bruins?', and when clicked would allow the person to respond to a quick poll before being asked to join the panel to answer more questions of a similar nature.
Using these methods, we were able to reach a wide range of people online, with varied interests, via diverse types of sources. We could also do so affordably.
In the intervening decade, the world of advertising has seen one of the biggest, most rapid changes it perhaps ever has. Billions of dollars in advertising have moved from TV and print to online. In tandem, the cost of online advertising has rocketed. Panel companies doing 'lead generation' marketing now compete with huge brands with big budgets, who might pay more per page impression to increase brand awareness than a panel company could reasonably pay for a double opted-in member. The idea of bidding on a keyword like 'diet' against pharmaceuticals, nutraceuticals, diet product and scheme producers and so on is now out of the realm of possibility.
To recruit using Google now might cost upwards of $15 per recruit, or more if bidding on terms less directly related than 'survey'. In fact, most panel companies do not risk the expense of pay-per-click online advertising, instead working only with companies that will agree to a cost-per-acquisition model that gives a fixed cost per recruit, or sometimes via commercial email lists that are more predictable in terms of conversion rate. Indeed, most website advertising is now controlled and brokered through advertising or affiliate networks. The networks receive ads from clients (such as panel companies) and offer them to their affiliate sites at a particular fee per click. If the affiliate feels that the ad will be a success on their site and generate enough clicks to warrant placement, they will run it - if not, they will not.
Unfortunately for panel owners, the vast majority of sites will not choose to take up their ads, because the per-click offer is relatively paltry, and because our ads are less likely to draw in the crowds compared to something more topical, or a more compelling offer from a recognised brand. That is, of course, unless the affiliate site is itself on a topic related to online research or online rewards opportunities. And there's the rub: typically the only affiliates that will take up advertising for panels are those sites aimed at 'professional survey takers' or that promote 'making money at home', because their audiences will click through in sufficient numbers to make the ad worth carrying.
The fact is that for most access panels, recruitment comes predominantly from these sources, and now also from advertising in social media games. Unfortunately, recruits from the latter tend to have sub-5% response rates, which, we suspect, is because the sole goal of the game player is to access the one-off in-game benefit.
With these rising costs of online lead generation and a narrowing range of feasible online recruitment sources, it is perhaps less surprising that panel management good practices are being eroded: there is greater imperative to invite a recruited member to participate in as many surveys as possible, to be sure of recouping their recruitment cost. Moreover, panel owners are under no illusions that members are taking only their surveys, making the application of limits on invitations sent to a member seem somewhat futile. Finally, there is a continuing assumption amongst sample buyers that 'bigger is better' when it comes to panels (somewhat fallacious where response rates are very low, since the available sample pool may be no larger than that of a smaller panel with a high response rate). The volume of recruitment required to sustain huge panel sizes can only come from sources that attract high traffic composed of people who are highly likely to join.
At Vision Critical, we have put greater emphasis on prevention than cure as an approach to panel quality. When it comes to recruiting our National Panels, this has informed our recruitment strategy. For example, when we use advertising and affiliate networks, we ask them to restrict our ads from sites aimed at professional survey takers. We likely pay more per click than the average access panel as a result. We have also been fortunate to find partnerships through which to access non-commercially available sources of new members. This approach is borne out in the results Grey Matter published: our panel was one of those tested and received a more favourable report. That said, we are not immune to the changing market for online panel recruitment, and no doubt, like other online sample providers, we are exploring new ways to recruit.
What is most surprising is the very limited awareness of recruitment challenges beyond the small niche of those charged with the task. In fact, in talking to industry professionals I have more often come across the opposite assumption: that recruitment should be easier, cheaper, and have greater reach due to increased internet penetration, online activity and new channels like social media.
Let's take Sellers' second report as an opportunity to talk about the recruitment issue more widely: what do you think are the solutions? How else can we seek out willing volunteers to spend their time and effort answering our questions?
*Note: the report only examined online panels of the type generally called 'access panels'. These are typically very large, national or international panels, created in order to supply sample to multiple end clients' surveys and research projects. Vision Critical's National Panels, Springboard America, Springboard UK and the Angus Reid Forum (ARF) fall into this category. This type of online panel is quite distinct from the 'community panels' we build and run on the Sparq platform for our clients. Neither the report nor, I would argue, many of the issues it exposed pertain to community panels.