Create Trust Online by Pairing User Control and Data Security
Have you been seeing a lot more cookie notices lately? In a new study, Yale SOM’s K. Sudhir and his co-author examine the impact on business of the EU’s General Data Protection Regulation (GDPR)—a sweeping set of rules controlling how firms can use their customers’ data, and the reason for those omnipresent cookie policies. They find that strict privacy rights paired with strong data security mandates create an atmosphere of trust that makes data sharing more beneficial for both firms and their customers.
Sharing a story on Twitter or picking a movie out of your Netflix queue might seem inconsequential. But each keystroke on the internet is a transaction between consumers and businesses on the lookout for data.
It’s a complex ecosystem of information—and currently one with few clear rules, at least in the United States. “It’s ‘finders, keepers’, and firms can do whatever they want with the data they collect for the rest of our lives,” says Yale SOM professor of marketing K. Sudhir. “As a result, your own data can be used by firms to produce outcomes that aren’t necessarily good for you.”
Sometimes, offering firms personal data is an advantage to consumers—it might result in lower costs or more personalized and relevant information—say, movie suggestions during a pandemic. But losing control of your personal data can feel creepy and invasive—and a breach in security that results in that data being stolen can be disastrous.
“For the average person, data privacy is all about keeping one’s data secret and from the prying eyes of others,” Sudhir says. “But the concept of privacy is much richer than that.”
In our non-digital lives, Sudhir points out, we share private information with people we trust because doing so benefits us. “We share when we believe the gains from sharing exceed any loss from sharing,” he says. “We share more with people we trust and have relationships with because we expect them not to abuse our trust. We share information with family members because we expect them to hold our trust and with our doctors because there are rules of confidentiality.”
Privacy doesn’t mean simply shutting yourself off from the world, Sudhir emphasizes. “It is really a boundary control mechanism which helps each of us decide with whom to share and when.”
The European Union’s General Data Protection Regulation (GDPR), familiar to many of us through omnipresent cookie warnings on websites, is widely viewed as the gold standard for the protection of consumers’ personal data, a contrast to the unfettered approach in the U.S. But according to Sudhir, what’s effective about the EU approach is how it balances various priorities.
For a new analysis, Sudhir and Tony Ke of the MIT Sloan School of Management developed a model to examine how the EU regulations balance the costs and benefits of data sharing—and what they mean for both firms and their customers. The GDPR couples rules on a user’s right to privacy with an emphasis on the security of shared data. That creates an atmosphere of trust that facilitates information exchanges, Sudhir explains. “The right to privacy and data security can feel like independent concepts, but they’re connected,” he says. “Our work explicitly draws the implications of these connections to see how they work together.”
Knowing which regulations work and why is critical to both consumers and companies, because when done well, data sharing benefits everyone. Without sufficient regulation, individual users might worry about too much surveillance or a misuse of their personal information. But placing too many limits may stymie business models that benefit customers as well as firms. “The challenge is to balance the gains and the losses,” Sudhir explains. “There’s no general solution that will work for everyone, but the GDPR is as good as it gets at this point in balancing consumers’ privacy and economic interests and aggregate economic outcomes.”
While there are many elements to the GDPR, the paper focuses on three of the critical privacy rights it confers on users. First, a user alone decides whether they want to share their data by explicitly opting in—and providing personal data isn’t necessary to make a transaction. In the U.S., by contrast, the default is to share data, and users must explicitly opt out; as a result, many consumers participate in data collection programs by default rather than by deliberate choice.
Second, users are able to opt out of data sharing later if they want to—they have the right to be forgotten. “If a consumer feels like she made a mistake by providing consent, she can not only withdraw that consent,” Sudhir notes, “but the data collector has to delete the data that they had collected earlier. This allows consumers to be more confident in sharing data because they can always take back their consent later.”
Third, companies that gather personal data must provide users the ability to transfer their data from one firm to another. “This means that a data collector can’t hold a customer hostage by withholding information about their usage after initially enticing them with a good offer,” Sudhir says. For instance, if a customer creates playlists on Spotify, and wants to switch to Pandora, they won’t be discouraged from doing so by the prospect of having to recreate those playlists.
In addition, the GDPR requires firms to ensure the security of data—and to pay a heavy price if users’ data is lost in a breach. So data collectors are not only incentivized to ensure significant protections for data collected, but also to build privacy into their IT system structure—what is often called “privacy by design”—so that even if data is lost or stolen, there will be minimal harm to consumers.
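To make these requirements concrete, here is a minimal sketch (hypothetical, and not drawn from the paper or any real system) of how a firm's data store might honor opt-in consent, the right to be forgotten, and data portability, with a privacy-by-design touch: records are keyed to a pseudonymous ID rather than a name or email. The class and method names are illustrative assumptions.

```python
# Illustrative sketch only: a toy in-memory store showing how a firm's system
# might honor the GDPR rights described above. All names are hypothetical.

import json
import uuid


class CustomerDataStore:
    def __init__(self):
        self._records = {}  # pseudonymous id -> record

    def opt_in(self, profile: dict) -> str:
        """Store data only after explicit consent; return a pseudonymous ID
        so records are not keyed by name or email (privacy by design)."""
        pid = uuid.uuid4().hex
        self._records[pid] = {"consent": True, "profile": dict(profile)}
        return pid

    def forget(self, pid: str) -> None:
        """Right to be forgotten: withdraw consent and delete the stored data."""
        self._records.pop(pid, None)

    def export(self, pid: str) -> str:
        """Data portability: hand the user their data in a machine-readable form."""
        record = self._records.get(pid)
        return json.dumps(record["profile"]) if record else "{}"


# Usage: a consumer opts in, exports their data to move elsewhere, then is forgotten.
if __name__ == "__main__":
    store = CustomerDataStore()
    pid = store.opt_in({"playlists": ["road trip", "focus"]})
    print(store.export(pid))   # portable copy of the user's data
    store.forget(pid)
    print(store.export(pid))   # "{}" -- nothing retained after deletion
```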
When it came into effect in 2018, GDPR imposed substantial compliance costs on firms, as they had to build in new safeguards and procedures. However, Sudhir argues that this work produced benefits. Collectively, these rules help facilitate data exchange on the user’s terms, Sudhir explains. “You get to decide whether to start sharing your data, and knowing you can retract consent later, you might be more willing to share,” he says. “These rules also protect users by preventing potential price gouging. And in combination, they help create a better environment for the long-term trust needed to ensure consumer willingness to share data.”
Sudhir and Ke used their model to understand how the GDPR regulations play out in various scenarios. Using game theory, the researchers modeled scenarios in which consumers could purchase goods from a company in two time periods. In the first period, firms could gather demographic and behavioral data based on people’s purchases, then use that information to personalize products and prices in the second period. The researchers examined how different aspects of the GDPR affected individuals’ willingness to share data, as well as the costs and benefits to both firms and consumers over the course of the game.
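The sketch below gives a rough flavor of this kind of two-period analysis. It is not the authors' model: the functional forms, parameter values, and the simple opt-in rule (a consumer shares only when the expected gain from sharing exceeds the expected loss, including breach harm) are all assumptions chosen for illustration. In this toy setup, a stronger security mandate shrinks expected breach harm, which raises both opt-in rates and firm profit.

```python
# Hypothetical two-period sketch (not the Sudhir & Ke model): a consumer with
# private willingness to pay decides whether to opt in to data sharing.
# Sharing yields a personalization benefit in period 2 but exposes the
# consumer to personalized pricing and to expected breach harm, which a
# data-security mandate reduces.

import random


def simulate(security_level, n_consumers=10_000, seed=0):
    """Return (opt_in_rate, avg_consumer_surplus, avg_firm_profit).

    security_level in [0, 1]: 1 means breaches cause no consumer harm (strong
    security mandate), 0 means the full expected breach harm falls on the
    consumer. All parameter values below are illustrative assumptions.
    """
    rng = random.Random(seed)
    uniform_price = 0.5          # period-1 (and anonymous period-2) price
    personalization_gain = 0.3   # extra value a consumer gets from tailored offers
    breach_harm = 0.4            # harm if data is breached and unprotected
    breach_prob = 0.2            # chance of a breach over the horizon

    opt_ins = surplus = profit = 0.0
    for _ in range(n_consumers):
        v = rng.random()  # consumer's willingness to pay, uniform on [0, 1]

        # Surplus if the consumer does NOT share: buy at the uniform price
        # in both periods whenever v exceeds it.
        no_share = 2 * max(v - uniform_price, 0.0)

        # Surplus if the consumer DOES share: same period-1 purchase, then a
        # personalized period-2 offer that extracts most of v but adds the
        # personalization gain; minus expected (unprotected) breach harm.
        personalized_price = 0.8 * v
        share = (max(v - uniform_price, 0.0)
                 + (v + personalization_gain - personalized_price)
                 - breach_prob * breach_harm * (1.0 - security_level))

        if share >= no_share:          # consumer opts in only if sharing pays off
            opt_ins += 1
            surplus += share
            profit += (uniform_price if v >= uniform_price else 0.0) + personalized_price
        else:
            surplus += no_share
            profit += 2 * (uniform_price if v >= uniform_price else 0.0)

    n = float(n_consumers)
    return opt_ins / n, surplus / n, profit / n


if __name__ == "__main__":
    for s in (0.0, 0.5, 1.0):
        print(f"security={s:.1f} -> opt-in rate, surplus, profit:", simulate(s))
```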
They found that, overall, privacy rights benefited both firms and consumers—but data security was integral to that benefit. Although privacy rights alone could have meant that more users opted out of sharing their information, the promise of better data security reduced those opt-outs. Protecting users from the harms of a data breach increased the odds that they would share information—and it also increased firm profits.
There will be settings, Sudhir notes, where the dangers of any data loss or sharing are so great that consumers will not be willing to share data at all. But, he says, “the good news with GDPR is that the need to obtain explicit consent to collect and share data means that consumers can confidently engage in trade of goods and services, assured that no data is exchanged at all. This is good for consumers, companies, and the economy as a whole.”
The model and the results can help policymakers think through and predict the impact of alternative regulations in real-world circumstances, according to Sudhir. “Overall, our findings suggest that the privacy rights under GDPR are likely to prove more useful to companies in different industries the more they are willing to invest in different aspects of data security,” he says. “The GDPR essentially helps to create more trust so that more consumers will choose to share data. Their choice is not just a result of privacy rights, but the complementary regulations promising data security.”