Cambridge Analytica couldn’t have misused the data of Facebook users had it instead been stored on a permission-based, decentralized blockchain, three CEOs agree.
At the core of the recent controversy that saw Facebook CEO Mark Zuckerberg apologizing in a live interview with CNN on Wednesday night is an alleged abuse of user data. Christopher Wylie, a Canadian and former Cambridge Analytica employee, sparked the fire now embroiling Facebook in yet another privacy uproar. He explained that data collected for academic purposes was actually used to target political ads by the Donald Trump campaign during the 2016 U.S. presidential election.
The resulting social media campaign to #DeleteFacebook and demands that Facebook explain itself before U.S. Congress are fueling the arguments of those who say Facebook’s entire approach to data privacy is flawed. Rather than have one company that’s able to collect personally identifiable information from more than 2 billion users worldwide and manage how it’s accessed by third parties, using a decentralized technology such as blockchain would put control back in the hands of individuals.
“Facebook just gave a gift to everybody that believes in decentralization,” says Jeremy Epstein, the CEO of Never Stop Marketing, a company that provides marketing for blockchain startups. “This is what happens when you give up control of your data.”
Also a faculty member of the Blockchain Research Institute, Epstein says a decentralized identity paradigm would allow users to renegotiate how their data is used and to prevent unwanted uses of it. Instead of a corporate entity like Facebook generating billions of dollars in ad revenues from that user data, members of a decentralized network could share in the value equally.
A permissions-based model could ensure that the network knows not only who is in control of data, but also for what purposes it can be used. The data owner could set limits on how detailed the information is, how many times it can be accessed, or whether there’s a price to be paid for doing so.
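The kind of owner-set permission record described above might look like the following minimal sketch. All field names and the access-check logic are hypothetical illustrations, not drawn from any real platform:

```python
# A hypothetical permission record an owner might attach to one piece
# of shared data on a decentralized network. Field names are illustrative.
permission = {
    "owner": "user-123",
    "allowed_purpose": "academic-research",
    "max_accesses": 3,          # owner-set cap on reads
    "price_per_access": 0.0,    # the owner could charge instead
    "accesses_used": 0,
}

def request_access(record, purpose):
    """Grant access only if the stated purpose matches the owner's
    terms and the access quota has not been exhausted."""
    if record["allowed_purpose"] != purpose:
        return False
    if record["accesses_used"] >= record["max_accesses"]:
        return False
    record["accesses_used"] += 1
    return True
```

A request tagged `"political-ads"` would simply be refused, while one tagged `"academic-research"` would be counted against the owner’s quota.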
“The data could only be presented in encrypted form and only decrypted by a smart contract that certifies that it’s only being used for academic purposes,” Epstein says.
For example, if academic research were conducted on a decentralized network, users could choose to opt in while stripping all personally identifiable information from the data. So if the research needed to know only whether participants were older than 21, the shared data would be a binary “true” or “false” value instead of a specific age.
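The age example amounts to disclosing a yes/no predicate rather than the raw value. A minimal sketch, assuming a simple profile record (the field names are hypothetical):

```python
def build_research_record(profile):
    """Strip personally identifiable fields, keeping only the
    binary fact the study needs: is the participant over 21?"""
    return {"older_than_21": profile["age"] > 21}

# Hypothetical participant profile held by the data owner.
participant = {"name": "Alice", "email": "alice@example.com", "age": 34}

print(build_research_record(participant))  # {'older_than_21': True}
```

The researcher learns whether the threshold is met; the name, email, and exact age never leave the owner’s side.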
SecureKey, a Toronto-based firm collaborating with banks and telecom carriers to develop a nationwide digital identity network, is taking a different approach to managing user data. According to SecureKey CEO Greg Wolfond, a distributed ledger will be used to allow individuals to grant permission for their data, held by trusted service providers, to be shared. For example, a home buyer could show a mortgage broker they qualify to make a down payment by sharing bank account details.
“Only I know where that’s going,” Wolfond says. “We make sure that it’s immutable and that it’s coming from a trusted source.”
It’s all about collecting specific consent from users any time data sharing is even considered, he says. Clear language would be used when asking for permission, and when proof is collected, it’s hashed in a way that proves its authenticity.
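Hashing a consent record is a standard way to make it tamper-evident: any later change to the record produces a different digest. A sketch using SHA-256 (SecureKey’s actual scheme is not described in detail here, so this is illustrative only):

```python
import hashlib
import json

def hash_consent(consent):
    """Produce a tamper-evident digest of a consent record.
    Serializing with sorted keys keeps the digest deterministic."""
    payload = json.dumps(consent, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

# Hypothetical consent record for the mortgage-broker example.
consent = {"user": "user-123",
           "shared_with": "mortgage-broker",
           "data": "bank-balance-proof"}
digest = hash_consent(consent)

# Any alteration to the record changes the digest, exposing tampering.
tampered = dict(consent, shared_with="advertiser")
assert hash_consent(tampered) != digest
```

The original digest can be recorded on the ledger; anyone can later recompute the hash of a presented record and confirm it matches what the user actually consented to.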
“This is absolutely not the Facebook approach.”
Delvinia is a Toronto-based firm that operates data-collection firm AskingCanadians. Its research community of 1 million Canadians is routinely incentivized to participate in market research surveys by earning loyalty program points. It’s not using any distributed ledger technology today, says CEO Adam Froman, but it still collects clear consent from community members before sharing their data and also strips that data of any personally identifiable information.
The difference between Delvinia and Facebook is that Delvinia operates only in market research, while Facebook’s purpose is to enable personalized advertising, Froman says. While Facebook’s business model requires that its advertising engine can apply data to target ads, what happened in the case of Cambridge Analytica should never have been allowed.
“Maybe there was a bit of arrogance because of their scale,” he says. “The fact that there was the ability to do it is where Facebook has to accept responsibility. In a digital world, if you can, you will.”
The feature of the data-sharing API that allowed an academic researcher to collect information about 50 million Facebook users was deprecated in 2013. Previously, a user could grant a third-party app permission to see information about people on their friends’ list.
With notes from Alex Coop