My email this morning included a message from Hertz describing their new fleet of BMWs. I don’t rent from Hertz anymore, and the emails they continue to send are mostly reminders that my driver’s license has expired, which was some time ago. But why the BMW pitch? Perhaps because in 2009 I treated myself to a BMW 3 series rental in Ireland, which was great fun on the winding roads of the Dingle Peninsula and Ring of Kerry? Maybe they have a long memory? Or maybe the fact that I currently drive a BMW somehow found its way into my profile?
In any case, it reminded me of Pew’s recently released study, “Americans and Privacy.” A few relevant findings:
- 72% of US adults believe that all or most of what they do online is tracked by companies.
- 79% are either very or somewhat concerned about how their personal data is being used by those companies.
- 59% say they understand very little or nothing about “what companies do with their data,” and only 18% say they have a great deal or some control over their data.
- 28% say they benefit a great deal or somewhat from the data companies collect on them, and 81% say “the potential risks outweigh the potential benefits.”
- 75% say there should be more “regulation of what companies can do with their customers’ personal information.”
I could go on, but I think these few examples make the point: the US public is beyond fed up with daily and routine violations of their privacy. They are especially concerned about the amount of personal information about them collected by social media companies (85%), advertisers (84%), and the companies they buy things from (80%).
The old saying, “On the Internet nobody knows you're a dog,” is no longer a thing.
The sad reality is that most, although not all, of this data collection and reuse is legal, at least in the US, and that’s not likely to change anytime soon. One frequently cited reason for not taking privacy and personal data protection more seriously is that it just costs too much.
Earlier this year the Information Technology and Innovation Foundation (ITIF) released a study, “The Costs of an Unnecessarily Stringent Federal Data Privacy Law.” By way of definition, “unnecessarily stringent” means something similar to the GDPR or CCPA. The report estimates that such a privacy regime would cost the US economy about $122 billion (sometimes they say “billion” but the tables say “million”) per year, or $483 per US adult. (By way of comparison, that’s more than 50% of what we spend on electricity every year.) So what are those costs?
Around 10% would go to Data Protection Officers and upgraded data infrastructure, two major areas of complaint about the GDPR. But the lion’s share, 85% of the total, would go to two areas: Reduced Access to Data and Lower Ad Effectiveness.
In the case of the former, privacy requirements such as express consent, data minimization, and purpose specification would reduce data sharing. In one of my favorite sentences in the report the authors write, “Unfortunately, opt-in requirements frame consumer choices in a way that leads to suboptimal data sharing because most users select the default option of not giving consent—for a number of irrational reasons.” So best we stop asking.
As for Lower Ad Effectiveness, the report tells us that “Targeted advertising is beneficial to consumers and businesses.” Such advertising allows businesses to be more efficient and increase sales. “Consumers benefit by gaining more utility from relevant ads.” More utility?
Sadly, I hear similar arguments from within market research in the form of complaining about the cost of compliance as GDPR goes global.
One of my favorite lines in the old CASRO Code of Conduct is this one: “Since research participants are the lifeblood of the research industry, it is essential that those approached to participate be treated with respect and sensitivity.” I worry that we’ve lost that sense of respect for those whose data we rely on, whether we are collecting from them directly or harvesting their data from the cloud. Online panels have led us to think of respondents as a commodity, and our increasing reliance on big data sources has caused us to stop thinking of them as people at all. In the privacy debate they are an abstraction in a one-sided cost-benefit exercise.
There are recent surveys that show those people who are our lifeblood don’t think very highly of us these days. They don’t trust us with their data any more than they trust social media networks or advertisers and, whether rational or irrational, they are less and less inclined to cooperate with research requests. This is not a good thing, to say the least. It’s important that we figure out sooner rather than later whose side we are on.