I’m sure you’ve noticed that three weeks have elapsed and I’m
only on day 2. Well, a few things intervened. Still, I’d sum up the rest of the
conference – which ran the gamut from strong presentations to a fascinating but
somewhat tangential soliloquy on response rates – in very simple terms. What
did I learn about what you should and shouldn’t do in mobile research?
Do use mobile surveys for immediate, emotive, recall-based, and
location-specific topics. Examples of ideal mobile survey questions: How
attractive is that package you just picked up at Tesco? Which advertisement do
you remember from the commercial break that just ended, and did you like or
dislike it? Ipsos found that ad recall was more accurate in mobile than in web
surveys, though verbatims were much less rich.
Do expect to get nearly all of your responses very
quickly; if you don’t want that, stagger the release of the sample.
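Staggering a release is straightforward to script. As a minimal sketch (in Python, with made-up parameter names – not any survey platform’s actual API), invitations can be split into timed batches:

```python
from datetime import datetime, timedelta

def stagger_release(sample, batch_size, interval_hours):
    """Split a respondent sample into timed batches so responses
    arrive over several hours instead of all at once."""
    start = datetime.now()
    schedule = []
    for i in range(0, len(sample), batch_size):
        batch = sample[i:i + batch_size]
        send_at = start + timedelta(hours=(i // batch_size) * interval_hours)
        schedule.append((send_at, batch))
    return schedule

# e.g. 10 respondents in batches of 4, released every 2 hours
schedule = stagger_release(list(range(10)), batch_size=4, interval_hours=2)
```

Each entry pairs a send time with a batch of respondent IDs; the actual sending would be handled by whatever invitation mechanism you already use.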
Do recognize that, unlike other modes, it costs
respondents to participate (not just their time) – even a per-response
cost if you use SMS.
Do consider an incentive structure that
compensates for the cost of participating (SMS or data-plan charges) over and above
the typical remuneration for time and trouble.
Do determine, if you’re working with a panel provider,
whether their mobile panel is purpose-built or simply regular panelists who
opted in to mobile surveys.
Do use an enhanced invitation (see the Ipsos paper for an
example): personalized, emphasizing that participation is free, and stating the
purpose of the study (classic Total Design elements). This produced a higher response
rate and more meaningful verbatims (though it had no effect in an email invitation
to a web survey run in parallel, and a subsequent GlobalPark study contradicted the
findings on rich verbatims).
Do think about how a mobile survey could integrate
or converge with a social network, so that you can combine a conversation or
co-creation exercise (network) with brief, point-in-time measures (mobile survey).
Don’t use mobile gratuitously. There is no firm evidence that
mobile respondents are more engaged than those in other modes or panel types, or
that overall data quality or time-to-insight improves. Examples of really
successful implementations are few and far between.
Don’t expect to get older people or broadly
representative consumer samples out of mobile panels. That said, every
mobile panel vendor said his or her panel is underutilized at this point, so
there is room for experimentation.
Don’t offer a mobile survey as just one among many
survey mode options. It is fundamentally different from other modes, and having
too many options confuses people and lowers response rates.
Don’t field SMS surveys (as opposed to web-based
mobile surveys) if you can avoid them, though the invitation trigger may be an
SMS. They are easier to deploy and reach a broader base of respondents, but they
raise major concerns about cost and are tough to integrate on the back end, since
each response-by-respondent combination is a separate record.
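To illustrate that back-end integration problem, here is a minimal Python sketch (the field names are assumptions, not any SMS gateway’s actual schema) that collapses one-record-per-response SMS data into one row per respondent:

```python
from collections import defaultdict

def pivot_sms_records(records):
    """Collapse one-record-per-(respondent, question) SMS data into
    one row per respondent, keyed by question id."""
    rows = defaultdict(dict)
    for rec in records:
        rows[rec["respondent_id"]][rec["question_id"]] = rec["answer"]
    return dict(rows)

# raw gateway output: one record per inbound message
records = [
    {"respondent_id": 1, "question_id": "q1", "answer": "yes"},
    {"respondent_id": 1, "question_id": "q2", "answer": "no"},
    {"respondent_id": 2, "question_id": "q1", "answer": "yes"},
]
rows = pivot_sms_records(records)  # one dict of answers per respondent
```

A web-based mobile survey delivers the respondent-level row directly; with SMS, every analysis pipeline needs a pivot step like this one first.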
Don’t tell people just once that it won’t cost them to participate
and expect they’ll remember. In the experiments and field results shown, the
cost of participating was a major concern no matter how often respondents were
reminded that it was free. In the Ipsos study, even though the SMS was free, half
of those dissatisfied with the experience cited cost.
Don’t expect response rates from mobile panels to be
significantly higher than from “normal” internet access panels. Cited response
rates were mostly in the 10% to 15% range.