‘Utterly horrifying’: ex-Facebook insider says covert data harvesting was routine

March 20, 2018

Cameras are trained on the building housing the offices of Cambridge Analytica in central London – REUTERS/Henry Nicholls

Hundreds of millions of Facebook users are likely to have had their private information harvested by companies that exploited the same terms as the firm that collected data and passed it on to Cambridge Analytica, according to a new whistleblower. 

Sandy Parakilas, the platform operations manager at Facebook responsible for policing data breaches by third-party software developers between 2011 and 2012, told the Guardian he warned senior executives at the company that its lax approach to data protection risked a major breach.


“My concerns were that all of the data that left Facebook servers to developers could not be monitored by Facebook, so we had no idea what developers were doing with the data,” he said.

Parakilas said Facebook had terms of service and settings that “people didn’t read or understand” and the company did not use its enforcement mechanisms, including audits of external developers, to ensure data wasn’t being misused. 


Parakilas, whose job it was to investigate data breaches by developers similar to the one later attributed to Global Science Research, which harvested tens of millions of Facebook profiles and provided the data to Cambridge Analytica, said the slew of recent disclosures has left him disappointed in his superiors for not heeding his warnings.

“It has been painful watching,” he said. “Because I know that they could have prevented it.”

Asked what kind of control Facebook had of the data that was handed over to outside developers, he replied: “Zero. Absolutely none. Once the data left Facebook servers there was not any control, and there was no insight into what was going on.”

Parakilas said he “always assumed there was something of a black market” for Facebook data passed to external developers. However, he said that when he told other executives the company should proactively “audit developers directly and see what’s going on with the data” he was discouraged from the approach.

He said one Facebook executive advised him against looking too deeply at how the data was being used, warning him: “Do you really want to see what you’ll find?” Parakilas said he interpreted the comment to mean that “Facebook was in a stronger legal position if it didn’t know about the abuse that was happening”.

He added: “They felt that it was better not to know. I found that utterly shocking and horrifying.”

Parakilas first went public with his concerns about privacy at Facebook four months ago, but his direct experience policing Facebook data given to third parties throws new light on revelations over how such data was obtained by Cambridge Analytica.

Facebook did not respond to a request for comment on the information supplied by Parakilas, but directed the Guardian to a November 2017 blogpost in which the company defended its data sharing practices, which it said had “significantly improved” over the last five years.

“While it’s fair to criticise how we enforced our developer policies more than five years ago, it’s untrue to suggest we didn’t or don’t care about privacy,” that statement said. “The facts tell a different story.”

‘A majority of Facebook users’

Parakilas, 38, who now works as a product manager for Uber, is particularly critical of Facebook’s previous policy of allowing developers to access the personal data of friends of people who used apps on the platform, without the knowledge or express consent of those friends.

That feature, called Friends Permission, was a boon to outside software developers who, from 2007 onwards, were given permission by Facebook to build games and quizzes – like the wildly popular FarmVille – that were hosted on the platform.

The apps proliferated on Facebook in the years leading up to the company’s 2012 initial public offering, an era when most users were still accessing the platform via laptops and computers rather than smartphones.

Facebook took a 30% cut of payments made through apps, but in return gave their creators access to Facebook user data.

Parakilas does not know how many companies sought Friends Permission data before the feature was terminated around mid-2014. However, he believes tens or maybe even hundreds of thousands of developers may have done so.


By that point, Parakilas estimates that “a majority of Facebook users” could have had their data harvested by app developers without their knowledge. The company now has stricter protocols governing what data third parties can access.

Parakilas said that when he worked at Facebook the company failed to take full advantage of its enforcement mechanisms, such as a clause that enables the social media giant to audit external developers that misuse its data.

Legal action against rogue developers or moves to ban them from Facebook were “extremely rare”, he said, adding: “In the time I was there, I didn’t see them conduct a single audit of a developer’s systems.”

Facebook announced on Monday that it had hired a digital forensics firm to conduct an audit of Cambridge Analytica. The decision comes more than two years after Facebook was made aware of the reported data breach.

During the time he was at Facebook, Parakilas said the company was keen to encourage more developers to build apps for its platform and “one of the main ways to get developers interested in building apps was through offering them access to this data”. Shortly after arriving at the company’s Silicon Valley headquarters he was told that any decision to ban an app required the personal approval of the chief executive Mark Zuckerberg, although the policy was later relaxed to make it easier to deal with rogue developers.


While the previous policy of giving developers access to Facebook users’ friends data was sanctioned in the small print of Facebook’s terms and conditions, and users could block such data sharing by changing their settings, Parakilas believed the policy was problematic.

“It was well understood in the company that that presented a risk,” he said of the Friends Permission feature. “Facebook was giving data of people who had not authorised the app themselves and was relying on terms of service and settings that people didn’t read or understand.”

It was this feature that was exploited by Global Science Research (GSR), which provided the harvested data to Cambridge Analytica in 2014. GSR was a company run by a Cambridge University psychologist, Aleksandr Kogan, who built an app that was a personality test for Facebook users.

The personality test automatically downloaded the data of friends of people who took the quiz, ostensibly for academic purposes. Cambridge Analytica has denied knowing that the data was obtained improperly and Kogan maintains that he did nothing illegal and had a “close working relationship” with Facebook.

While Kogan’s app only attracted around 270,000 users (most of whom were paid to take his quiz), the company was then able to exploit the Friends Permission feature to quickly amass data pertaining to more than 50 million Facebook users.
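To make the mechanics concrete, the sketch below (not from the article) illustrates how an app of that era could turn one consenting quiz-taker into data on all of that person’s friends. It assumes the pre-2015 Facebook Graph API v1.0 surface; the endpoint paths, permission names and profile fields are simplified reconstructions from that period, not a record of what GSR or any other developer actually ran, and none of it works against today’s API.

```python
# Illustrative sketch only: simplified reconstruction of the pre-2015
# Graph API v1.0 pattern described in the article, not a working example
# against the current Facebook API.
import requests

GRAPH = "https://graph.facebook.com"           # Graph API base URL of that era
USER_TOKEN = "TOKEN_FROM_OAUTH_DIALOG"         # token granted by ONE consenting user;
                                               # its scope could include friends_*
                                               # permissions such as friends_likes

def fetch_friends_data(token):
    """Return profile fields for every friend of the one authorising user."""
    # The single authorising user exposed the list of all of their friends...
    friends = requests.get(
        f"{GRAPH}/me/friends",
        params={"access_token": token},
    ).json().get("data", [])

    profiles = []
    # ...and the friends_* permissions then let the app read fields from each
    # friend's profile, without those friends ever installing or seeing the app.
    for friend in friends:
        profile = requests.get(
            f"{GRAPH}/{friend['id']}",
            params={"access_token": token,
                    "fields": "id,name,likes,birthday,location"},
        ).json()
        profiles.append(profile)
    return profiles

if __name__ == "__main__":
    harvested = fetch_friends_data(USER_TOKEN)
    print(f"One consenting user yielded data on {len(harvested)} friends")
```

The multiplier effect the article describes follows directly from this pattern: each paid quiz-taker authorised the app once, but the friends list typically contained hundreds of people, which is how roughly 270,000 participants could yield data on more than 50 million users.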

“Kogan’s app was one of the very last to have access to friend permissions,” Parakilas said, adding that many other similar apps had for years been harvesting similar quantities of data for commercial purposes. Academic research from 2010, based on an analysis of some 1,800 Facebook apps, concluded that around 11% of third-party developers requested data belonging to friends of users.

If those figures are extrapolated, tens of thousands of apps, if not more, are likely to have systematically culled “private and personally identifiable” data belonging to hundreds of millions of users, Parakilas said.

The ease with which it was possible for anyone with relatively basic coding skills to create apps and start trawling for data was a particular concern, he added.

Parakilas said he is unsure why Facebook stopped allowing developers to access friends’ data around mid-2014, roughly two years after he left the company. However, he believes one reason may be that Facebook executives were becoming aware that some of the largest apps were acquiring enormous troves of valuable data.

He recalled conversations with Facebook executives who were nervous about the commercial value of data being passed to other companies.

“They were worried that the large app developers were building their own social graphs, meaning they could see all the connections between these people,” he said. “They were worried that they were going to build their own social networks.”

‘They treated it like a PR exercise’

Parakilas said he lobbied internally at Facebook for “a more rigorous approach” to enforcing data protection, but was offered little support. His warnings included a PowerPoint presentation he said he delivered to senior executives in mid-2012 “that included a map of the vulnerabilities for user data on Facebook’s platform”.

“I included the protective measures that we had tried to put in place, where we were exposed, and the kinds of bad actors who might do malicious things with the data,” he said. “On the list of bad actors I included foreign state actors and data brokers.”

Frustrated at the lack of action, Parakilas left Facebook in late 2012. “I didn’t feel that the company treated my concerns seriously. I didn’t speak out publicly for years out of self-interest, to be frank.”

That changed, Parakilas said, when he heard the congressional testimony Facebook lawyers gave to Senate and House investigators in late 2017 about Russia’s attempt to sway the presidential election. “They treated it like a PR exercise,” he said. “They seemed to be entirely focused on limiting their liability and exposure rather than helping the country address a national security issue.”

It was at that point that Parakilas decided to go public with his concerns, writing an opinion article in the New York Times that warned Facebook could not be trusted to regulate itself. Since then, Parakilas has become an adviser to the Center for Humane Technology, which is run by Tristan Harris, a former Google employee turned whistleblower on the industry.


Courtesy/Source: The Guardian