The Best of Bizcast 2020
A look back at some of our most popular podcasts from 2020.
On November 3, California voters approved Proposition 24, which expanded the state’s data privacy laws, making it possible for consumers to instruct businesses not to share their personal data.
The passage of the measure is a milestone in the ongoing ethical and legal debate about who owns personal information and what the relationship among individuals, digital platforms, and tech companies will be in the coming years.
Following introductory remarks by Dean Costis Maglaras, Andrea Prat, the Richard Paul Richman Professor of Business, moderated a discussion held virtually on November 17 with industry experts and academic researchers who are defining the future of that relationship.
Speakers included Marc Rotenberg, the director of the Center for AI and Digital Policy; Steve Satterfield, director of Privacy and Public Policy at Facebook; Steve Tadelis, professor of economics at Haas School of Business at UC-Berkeley; and Catherine Tucker, professor of management at MIT Sloan.
Rotenberg described the development of privacy laws, including the General Data Protection Regulation (GDPR), adopted by the European Union in 2016, which protects the personal data of EU citizens and regulates companies that collect and use that data.
But no comparable law exists nationwide in the United States.
“The new law in California brings things more in line with the GDPR,” Rotenberg said.
With the 117th Congress set to convene on January 3, Rotenberg said there is an interest among lawmakers in updating federal data privacy laws, and perhaps establishing a new agency for data protection.
“I would not be surprised if the next Congress enacts a new law,” he said. “It may not go quite as far as the GDPR, though it may address issues such as artificial intelligence.”
Legally speaking, Rotenberg said privacy laws focus on the details of the rights and responsibilities associated with the collection of personal data.
“If a company chooses to collect data,” he said, “it’s necessarily going to take on those responsibilities, and consumers will gain certain rights.”
Tucker discussed what’s known as the “privacy paradox,” illustrated by an experiment conducted at MIT in which one group of students was asked to hand over their friends’ email addresses and another group was offered free pizza in exchange for the addresses. The latter group was more willing to reveal them.
“People say they care about privacy,” Tucker said. “But when we observe how they behave, it often does not match what they say.”
When Tucker presented her research to Congress, she said it “made a lot of people happy,” as it could be used to support a variety of conflicting policies.
One side, she said, argued that the experiment showed privacy laws are unnecessary because people pay them no attention, while the other said it supported increased legislation.
“They said, ‘Even if MIT students need help with privacy concerns, that means everyone does,’” Tucker said.
Giving an overview of the latest research areas in the data privacy field, Tucker said there is currently work being done on pop-up web ads and whether laws such as GDPR advantage larger firms or smaller companies.
Tadelis also noted the unintended consequences of laws such as GDPR, such as the decreased effectiveness of targeted advertising and the cost of providing data protection, which can be onerous for smaller firms.
Such laws also make it difficult for new firms to compete with existing companies, many of which still maintain troves of data gathered before the more stringent regulations became EU law.
“This hampers innovation,” Tadelis said.
Turning to an actual example of privacy concerns, Satterfield said that the 2018 Cambridge Analytica incident, in which millions of Facebook users’ personal data was acquired by a third-party developer without the individuals’ knowledge so it could be used in political advertising, marked a new era for the social media giant.
“The company has more vigorously pursued third-party developers who violate the Facebook platform policies and limited the amount of data that all developers may access,” Satterfield said.
Satterfield said the aftermath of Cambridge Analytica has led to a “reckoning” at the company and in the tech industry at large.
“For Facebook that means a greater investment in more rigorous technology and product engineering, and not just on the legal team,” Satterfield said. “We’re putting privacy at the core of our work.”
The scandal also led to increased accountability for the company, including a new consent order from the Federal Trade Commission, which requires Facebook to show it is maintaining a robust privacy program with appropriate risk assessments and safeguards.
“Not only are we complying, but senior executives, including Mark Zuckerberg himself, are now accountable for that compliance,” Satterfield said. “You’re seeing this at other companies as well.”
In the coming years, Rotenberg said, there should be a focus on transparency, or the ability to audit and inspect a company’s data practices, which he called one of the most effective mechanisms of strong privacy laws.
He also noted that a user’s consent is an incomplete solution to privacy protection and would need to be readdressed.
“Companies can possess data by virtue of consent, but they are still subject to data breaches,” Rotenberg said.
Rotenberg said he supports the establishment of new regulatory agencies to monitor the use of data, both for the benefit of consumers and to bolster democratic institutions.
“With the use of AI and the tension around cyberwar, there is a risk that if we don’t safeguard the data of people in democratic societies there could be problems greater than Cambridge Analytica.”
Tadelis cautioned against casting the digital economy as a uniform collection of companies, drawing distinctions, for example, between search engines, e-commerce, and “sharing” companies such as Uber.
Tadelis said that future discussions of data privacy will concern a more nuanced account of what is personal property.
“For example, if I do a search on Google, I generated the keyword, and then their algorithm generated what was listed on the page,” Tadelis said. “If I then clicked on something, it is a joint venture between me and Google.”
Taken all together as one dataset, Tadelis said the question becomes: who should own what?
“This is a very thorny area,” he said.
Tucker noted that future policy becomes more complicated once you include, for example, genetic data, which she said has enormous consequences for individuals.
“I’m going to argue that some data is permanent, something you can’t change about yourself,” Tucker said. “And there’s evidence to suggest that people’s privacy protections change over time, so they might want more protection on permanent data.”
Satterfield said there should be increased attention to international laws that both provide privacy protections and ensure vital data flows to firms.
Domestically, Satterfield said Facebook welcomes new and comprehensive federal privacy laws.
“This is a good thing from our perspective,” he said.
Satterfield expects new legislation to address user data rights and data portability.
“And there should be an accountability mechanism,” he said.