
Posts from June 2013

Ending Day 1 at 3D by predicting the future

The last session of the first day focused on social media. Edward Malthouse presented on the link between social media and purchase behavior. He showed some cool examples of brands creating competitions as a way to engage (How does Superman shave? Design a new McDonald's sandwich, etc.). His data come from the Air Canada frequent flyer program. The program wanted to learn how to get members to redeem their miles, so it asked members to go to the site and suggest rewards that would encourage them to redeem. His approach is grounded in System 1 and System 2 thinking. Promotions that use incentives typically rely on System 1 (habitual responding); promotions that try to engage are directed at System 2 (effortful processing). He measured engagement by the number of words and the degree of elaboration in the suggestions members made at the site. He offered a whole bunch of hypotheses, but the key one is that the more a promotion engages System 2, the greater the odds of purchase. His data generally supported that view.
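Malthouse didn't walk through his model, but the core test (regressing purchase on engagement measured by word count and elaboration) is easy to sketch. Here is a minimal, hypothetical version in Python; the data are synthetic and the variable names are mine, not his:

```python
# Hypothetical sketch: does System 2 engagement (length and elaboration of a
# member's suggestion) predict redemption? Synthetic data, not Malthouse's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000
word_count = rng.poisson(lam=25, size=n)    # length of each suggestion
elaboration = rng.integers(1, 6, size=n)    # coded 1 (low) to 5 (high)

# Simulate purchases so that engagement raises the log-odds of redeeming.
log_odds = -2.0 + 0.03 * word_count + 0.25 * elaboration
purchased = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

X = sm.add_constant(np.column_stack([word_count, elaboration]))
model = sm.Logit(purchased, X).fit(disp=0)
print(model.summary(xname=["const", "word_count", "elaboration"]))
```

On real program data you would of course control for member characteristics; the point of the sketch is only the shape of the test: engagement measures in, purchase odds out.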

We ended with an interactive session designed to have the audience answer three questions about the MR industry in 2023. Tables in the room were divided into three groups, with each group asked to answer one question. So the unit of analysis below is the table.

  • Sixty-six percent said that the basic source data for insight will be 50% digital and 50% non-digital. The remainder opted for 80% digital and 20% non-digital. Then again, this was a 3D conference after a day of presentations on Big Data.
  • Sixty percent said that 30% of insights will come from clients and 70% from agencies. The remainder said 70% from clients and 30% from agencies. Blogging comrade Jeffrey Henning sees this as sample bias.
  • Seventy percent said that 80% of decisions will be made by people and 20% by machines.

A good day. Now for some drinks!


The continued evolution of MROCs at 3D

Back at ESOMAR 3D, where the topic switched to online qual, mostly. Frederick Gennart and Tom de Ruyck began a discussion of a project to redesign the IKEA catalogue by describing the advantages of MROCs over standard focus groups. Basically, it comes down to a single conversation versus an ongoing dialogue. There are other issues as well (time, money, size, etc.) but the principal advantage is the range and richness of insights. They then discussed the details of the methodology, which they made multi-dimensional by approaching different topics from different perspectives using a range of tools, and which they fine-tuned by country. At one point, they revived the MROC to do some additional work to provide guidance on cover design, using a survey and then in-depth discussion to make the cover choice. Their presentation did an excellent job of showing how the MROC methodology is becoming increasingly sophisticated as it evolves, at least in the hands of a prime practitioner like Tom.

Then we heard from Steve August and his client at P&G. They talked about studying product transitions: when consumers decide to replace a product they have with another product. There are multiple reasons for that (technology, life stage, use) and it's tough to see those transitions coming. Steve reminded us that transitions are a process, not an event, and the key to understanding is in someone watching that process. Like a researcher. It turned out that the study topic was diapers. P&G was taking a beating with its diaper products and did not know why. Through creative use of an MROC they discovered the problem was in the transition from one size to another as kids grow. It provided some really useful intelligence, which P&G took to heart, and the changes were reflected in improved performance in the marketplace.

Both of these presentations were good examples of how online qual is becoming increasingly sophisticated in a very short period of time. Really great stuff!

This was followed by a panel on "Making an Impact," which I missed because I forgot to bring my iPhone charger to Boston and had to run out and get another one for my already substantial collection at home.


Big Data at 3D

The topic has shifted to Big Data (BD from here on out), moving from general talk to some tangible applications.

The first speaker was Jeff Hunter, who showed us some specific uses of BD at General Mills. Well, maybe not so much BD as creative use of a whole range of different data sources, each fit to a specific problem related to company growth. One of his key points was a reminder that we live in a very rich data environment, much of which has not been mined and, sometimes, is free or very low cost. One of his examples was the use of social media and sentiment coding to evaluate a new product trial. It worked well, but there are caveats: the sentiment coding method makes a difference, and the approach works best with high involvement products that generate lots of buzz. In another example he described what we might think of as "desk research": gathering a number of data series that were essentially free in order to evaluate a possible acquisition in an Eastern European country. Good, practical stuff from a major buyer.
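Jeff didn't share his toolchain, and his caveat that the sentiment coding method makes a difference is worth taking seriously. For illustration only, here is the crudest lexicon-based approach, with an invented word list:

```python
# Minimal lexicon-based sentiment scorer. Real work would use a validated
# lexicon or a trained classifier; this tiny word list is invented.
import re

POSITIVE = {"love", "great", "tasty", "awesome", "delicious"}
NEGATIVE = {"hate", "awful", "bland", "gross", "disappointing"}

def sentiment_score(post: str) -> int:
    """Positive-minus-negative word count for one social media post."""
    words = re.findall(r"[a-z']+", post.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "I love the new cereal, so tasty!",
    "Tried it once. Bland and disappointing.",
]
scores = [sentiment_score(p) for p in posts]
print(scores)                        # [2, -2]
print(sum(scores) / len(scores))     # mean sentiment over the trial period
```

Even this toy version shows why the method matters: a different word list, or a classifier that handles negation and sarcasm, can flip the trend line a brand team is staring at.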

David Krajicek was up next and talked about whether MR has been doing enough to take advantage of Big Data. He used terms like "datification" and "digital exhaust." (I hope these don't stick.) He showed us a nice chart depicting the differences between BD and MR: Census/Sample, Flow/Fixed Point, Atheoretical/Hypothesis-Based, Unstructured/Structured, etc. The opportunity, as he sees it, is putting the two together and translating BD into "smart data." His argument is the current MR argument in this realm: that the skills of the market researcher and our classic concerns (representativeness, correlation versus causality, complexity, etc.) are essential here, and that the value of BD is too great to leave it to the mathematicians. He showed some specific examples that merge survey data with BD-like data to enrich both and provide insights not possible from either source alone. The thing is, we have been doing this for years. The presentation was a really good summary of the current storyline about BD in MR. Whether it is right remains to be seen.
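None of the merging examples came with mechanics, but the enrichment step itself is the kind of join we have been doing for years. A hypothetical pandas sketch, with invented respondent ids and column names:

```python
# Hypothetical enrichment: join survey attitudes to behavioral ("digital
# exhaust") records on a shared respondent id. All data here are invented.
import pandas as pd

survey = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "brand_attitude": [4, 2, 5],      # 1-5 scale from the questionnaire
})
behavior = pd.DataFrame({
    "respondent_id": [101, 101, 102, 103, 103, 103],
    "site_visit_minutes": [3, 12, 1, 8, 4, 6],
})

# Summarize the behavioral stream, then enrich the survey file with it.
usage = behavior.groupby("respondent_id", as_index=False).agg(
    total_minutes=("site_visit_minutes", "sum"),
    visits=("site_visit_minutes", "count"),
)
enriched = survey.merge(usage, on="respondent_id", how="left")
print(enriched)   # attitude and behavior side by side, one row per respondent
```

The join is trivial; the hard parts, as David's chart implies, are matching keys across a sample and a census, and deciding what the flow data actually measure.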

Finally, we had Greg Mishkin and his client, Don Hodson from AT&T, who talked about a research program that cycles through interviewing, big data integration, and qualitative work as part of a continuous cycle of improvement. One neat thing about this is that it includes a survey to assess whether the original model built by merging survey data with behavioral data is right, as well as to assess the impact of any actions the client may take to affect attitudes and even behavior. One key advantage of the BD approach that they highlighted is solving the recall problem inherent in surveys. Most importantly, the presentation showed how surveys and big data can be used to great effect by a company that has a huge database of interactions with its customers.


Kickoff at ESOMAR 3D

I am at the ESOMAR 3D Conference in Boston. Mike Cooke opened by reminding us that this conference has charted the evolution of new methods in MR since the Panels Conference in Budapest in 2005. It is THE place to be to catch up on what's happening now. I've attended all but one and chaired three, so I am really pleased to be here.

The first speaker and keynoter was Tony Chapman from Capital C in Canada. He talked about the critical nature of engagement in marketing. Marketing has always been about breaking through the clutter, but the clutter in the modern world keeps growing exponentially, and breaking through is harder than ever. The challenge for MR is to help clients understand how to do it, and that comes down to figuring out what stories to tell. He argued that there are seven basic stories and the objective is to figure out how to insert the brand into one of them. In the process, MR companies need to migrate from the role of provider to enabler, which comes down to finding ways to change the conversation via a story that rings true to individual consumers. My reaction: the talk was very well done and great fun to listen to (almost inspiring), but I was not sure how to put it to work.

Next up was Richard Nicholls from Future Foundation. He began by showing some global data to indicate that while "the smartphone revolution" and mobile commerce still have a very long way to go in terms of penetration, social networking is a true global phenomenon -- over 60% in many countries. His talk focused on three trends he sees in their ongoing global data collection:

The first trend is the hyperactive self: people who use technology to maximize their behavior (e.g. finding the right product at the right price). There is a danger, however, that we over-invest our time in this sort of thing and end up less efficient rather than more. He calls this overmaximizing.

The second trend is what he calls the performative self -- using social media to tell stories about our everyday lives as a way to enhance our personal status. Self-validation. (The folks at the blogger table are rejecting this point of view out of hand.)

Finally, there is the smart networker. Some of us are finally realizing that too much sharing is a bad thing and we are adjusting to manage our online presence more effectively. There are several manifestations of this such as untagging photos, restricting who can see your online posts, and thinking more about what kind of image we want to present to potential suppliers.

Next up was Illya Lichtenstein from MixRank. They have been crunching data from major websites to get a sense of what technologies brands and advertisers are using and how effective they are. They have found that the best indicator of conversion (buying) is the social media button on an online ad (the Like button or the Twitter icon). But conversion is preceded by engagement, and linking the two is extremely difficult. There is lots of data for marketers to look at, but making sense of what works and what doesn't is hard. His survey data show that two-thirds of marketers say they don't understand how to make the data at their disposal actionable. They know that social media plays a role in persuading people to buy, but they can't measure or really understand that link well enough to act on it. To state the obvious, that's a real opportunity for MR.
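At bottom, the MixRank finding is a comparison of conversion rates across ad features. A hypothetical sketch of that comparison (the numbers and field names are invented, not theirs):

```python
# Hypothetical: compare conversion rates for ads with and without a social
# media button (Like/Twitter). Data and field names are invented.
import pandas as pd

ads = pd.DataFrame({
    "has_social_button": [True, True, True, False, False, False],
    "impressions":       [900, 1200, 800, 1000, 1100, 950],
    "conversions":       [27, 42, 20, 11, 9, 12],
})

rates = ads.groupby("has_social_button")[["impressions", "conversions"]].sum()
rates["conversion_rate"] = rates["conversions"] / rates["impressions"]
print(rates)
# With these invented numbers, social-button ads convert at nearly three
# times the rate. That's the indicator; whether the button causes the
# conversion is exactly the engagement question marketers can't answer.
```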

Break time.


ESOMAR tackles Big Data

On Friday I attended a small gathering of around 30 MR suppliers, clients, privacy experts, and Big Data practitioners in Boston. The goal of the event was to stimulate a conversation about the practice and impact of Big Data on MR firms and the industry, to understand where we are headed, and to identify the problems we are going to have to solve to make it all work. The event kicked off with a presentation by John Deighton of the Harvard Business School, who gave an excellent overview of the issues, current and future. His themes were not terribly different from those of the piece in Thursday's NYT, but he offered a good deal more detail, with specific emphasis on the implications for MR and marketing. The attendees then broke up into groups to develop positions on a set of pre-specified questions that were reported back to the group and discussed. In the afternoon we used a similar format, first with a panel of MR practitioners doing Big Data projects and then three privacy experts.

I found this to be a really stimulating day. As the day wore on, the overall sentiment in the room was not terribly different from the viewpoint shared by Larry Freidman in this blog post. The basic theme is that we now have a wide variety of sources and tools and we can serve clients best by bringing them together to tell a larger story. Some refer to this as "data diversity," although the term means different things to different people.

The revelation for me was John Deighton's description of how the data scientists who mine and analyze these huge datasets approach their task. They are pure empiricists who eschew much of what we hold dear (accuracy, representativeness, association versus causality, etc.), all the skills that are the foundation of our profession. To quote Chris Anderson (editor of Wired), "[Big Data] makes the scientific method obsolete." We no longer need hypotheses or models; Big Data is self-modeling. We don't need to understand context or culture. We just need mathematics and the ability to listen to what the data are telling us. It is what it is. If it works, then the properties of the data don't matter. To my ears this sounds like atheoretical bullshit, but to a data scientist it is a first principle.

The Big Data future may be more challenging than we think, although it's not likely to be upon us as quickly as some fear. These kinds of transitions always take longer than we think. Or so some of us hope.