Not to defend Facebook, but we should be able to trust that companies have deleted data if they say they have, especially if they have signed documents to that effect.
CNET reported that:
When Facebook learned about Cambridge Analytica buying 87 million people’s profile information without user permission in 2015, the social network’s executives said it requested the data be deleted. Facebook said that it received “certifications” that had happened, but now says it’s learned the U.K.-based data profiling firm lied.
Later, the article calls these actions a “light level of scrutiny.” Writer Alfred Ng does not suggest what the correct level of scrutiny would have been, except to note that the request did not require a notary or any sort of legal procedure.
This will not be the last time data issues arise. In fact, it has happened again this month with Facebook, the University of Cambridge and Alexandr Kogan, whose myPersonality quiz harvested data from 3 million people. The original app, the one that affected 87 million people, was called “thisisyourdigitallife.” Facebook has been investigating other apps for misuse of data and has so far suspended 200 of them. If this is happening so routinely, we should expect a routine response from social platforms.
Are we going to expect everyone to claw through everyone else’s files looking for traces of their data? There are also concerns about derived models: even if the raw data is deleted, whatever was built from it, such as the profiling models themselves, remains.
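To make that concern concrete, here is a minimal sketch. The data and model are hypothetical stand-ins (random numbers and a generic scikit-learn classifier, not anything the companies involved actually used), but the point holds: a model trained on harvested profiles keeps working after the raw records are deleted.

```python
# Minimal sketch: deleting the raw data does not delete what was learned from it.
# The data and model here are hypothetical stand-ins, not the real systems involved.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
profiles = rng.normal(size=(1000, 10))      # stand-in for harvested profile features
traits = (profiles[:, 0] > 0).astype(int)   # stand-in for an inferred personality trait

model = LogisticRegression().fit(profiles, traits)

del profiles, traits                        # the "deletion" a certification attests to

new_person = rng.normal(size=(1, 10))
print(model.predict(new_person))            # the derived model still makes predictions
```

Certifying that the files are gone says nothing about the models, scores or audience lists derived from them.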
Do we now need to follow all the trails showing what they used the data for, the conclusions they came to and the actions they took from it? This kind of audit work is usually done by third parties.
In this case, it was not a third party. CNET also says:
During a Senate Judiciary committee hearing on Wednesday, Wylie, a former Cambridge Analytica employee and whistleblower who detailed the extent of the company’s data collection in March, told senators what that certification entailed.
Wylie said he received a letter in 2016 from Facebook, writing that they knew he still had data from Cambridge Analytica’s harvesting program, and asked him to delete it.
“It requested that if I still had the data to delete it and sign a certification that I no longer had the data,” Wylie said. “It did not require a notary or any sort of legal procedure. So I signed the certification and sent it back, and they accepted it.”
In a March interview with CBC, Casey Fiesler, an assistant professor in the Department of Information Science at the University of Colorado Boulder, argued that Facebook could prevent the next Cambridge Analytica scandal by giving researchers more access to data, which would open it up to academic review.
Do we want the standard response to a data breach to be sharing even more data? There does seem to be an argument for a watchdog or an audit process. The people involved would have to understand the nuances of the IT platforms and know what appropriate data usage looks like.
Fiesler says “there have been a lot of calls for more ethics education, for computer scientists, for data scientists, for more ethical thinking” and she promotes a code of conduct for the tech sector. The U.S. does not have the kind of legislation that most Canadian provinces have, which recognizes CIPS, with its Code of Ethics, as the professional association that can certify IT professionals in Canada.
When a company’s signature can’t be trusted, a certified IT professional should be required to do the review.