The end of “don’t ask, don’t tell” in online survey research?
March 05, 2012
My colleague, Gregg Peterson, attended last week's CASRO Online Research Conference and has sent me this post.
The era of "don't ask, don't tell" in the world of commercial market research may well have ended last week in Las Vegas at the annual CASRO Online Research Conference. The purveyors of online surveys came face-to-face with the survey-taking monsters they created. Back in our rooms, if we looked up at the big mirrored walls of our swanky conference hotel suites, we noticed the real culprits staring straight back at us.
Hands down, the talk of last week's conference was "the panel of panelists". These were eight people recruited at the back end of a large quantitative study of web survey takers, there to help us understand their world and to tell us what they liked and disliked about the surveys we served up to them. They were selected from among a few hundred qualified Las Vegas respondents (the Las Vegas respondents being a sub-quota of the large national study), first on the quality and thoughtfulness of their open-ended responses in the quantitative survey, and then, in a follow-up interview with the panel's eventual moderator, on how well they could articulate their survey-taking experiences. (Let's just say the recruiter was unconcerned with demographic diversity: 7 of the 8 were not employed full time, none were below the age of 30, and all were Caucasian.)
And articulate they did - all of the scary sins of our industry. Here's the least surprising of what we learned: each of them was a member of multiple panels – the least ambitious belonged to a mere four panels, while the majority seemed to be members of 8-10. When asked who had received multiple invitations to the same survey, all hands shot up immediately. A few happily admitted to taking the same survey multiple times. Some seemed aware that it might not be completely kosher to do so, while one claimed proudly to have responded to all 10 invitations. ("It's not up to me to police you guys.") One very articulate elderly woman suggested that panel companies should allow panelists to indicate when they will be on vacation, because it was "so hard to come home to 600 or 800 survey invitations." All were very clear about the importance of incentives, and one panelist admitted to taking surveys and participating in focus groups simply to pay off the mortgage on his vacation home. He was less than happy about the new speeder-detection tools sometimes embedded in our surveys. "It's easy to go fast when you've seen the same thing over and over again."
As always, there are caveats. This was, after all, a qualitative study – not even pretending to be representative like most of the surveys we report on. That said, there was some good news here. These folks take their survey-taking responsibilities seriously. They did surveys at least in part because they really like giving their opinions. They seemed sincere in their pledge of honesty and good intentions about providing truthful answers to all questions, including screeners. They only wish we were better at our jobs. They spoke of broken links, screeners that take forever, highly repetitive surveys, progress bars that clearly lie, 50-minute surveys with too small an incentive, incentives that never show up, questions without a "Don't Know" category, missing response categories in our employment questions (retired people do not want to mark themselves as unemployed), and the dread of being confronted with dense text or "a page full of tiny little boxes". They don't think their detergent has a personality, and they were certain that there aren't 60 different ways to describe a soft drink. A few really preferred surveys that gave them a chance to provide detailed open-ended responses, and many articulated the pleasure of taking a visually interesting and well-constructed survey.
What was perhaps most surprising was the reaction of researchers following this session. I heard a senior sales executive of a very large panel company express shock and amazement at the volume of surveys being taken and the professionalism of the respondents on the panel. Everyone was abuzz. Perhaps it was just the relief of getting it out in the open once and for all. We could suddenly talk more freely about what we feared – the lifeblood of our industry may be a small army of highly determined professional survey takers.
And guess what other topic was prominently featured on this year's agenda? Routers and maximizing panel "efficiency". In other words, let's figure out if we can get these poor souls to do a few more poorly constructed surveys. We have work to do.