A year ago, The Intercept published a story about a Trump campaign affiliate that was circulating personality tests to collect Americans’ personal information. The company, called Cambridge Analytica, had already been exposed by The Guardian in a chilling report that detailed its voter-targeting operation. There was every reason to be concerned. These revelations arrived in the midst of a year in which aggressive political campaigning, concerns over fake news, and the rise of bots that spread propaganda gave us reason to question the kinds of information we were handing over to platforms like Facebook and the third-party applications built on them, and how this freewheeling data deluge might come back to bite us in the ass.
But this awareness of Cambridge Analytica and its covert manipulation of our data didn’t coalesce into rage until late Friday night, when the words of a pink-haired, gay, vegan Canadian hit a cultural nerve. At 28, Christopher Wylie agreed to talk, he told The Guardian, out of a sense of guilt. Four years earlier, Wylie says, he came up with the idea of harnessing big data and social media to fuel a form of information warfare: an idea that led to the creation of Cambridge Analytica. Coming forward meant breaking a nondisclosure agreement, yet Wylie did it, he explained, because he felt morally conflicted. “I assumed it was entirely legal and above board,” he told The Guardian. But he had helped to create a weapon, and he was ready, as best he could, to participate in its dismantling.
Judging by content alone, Wylie’s reckoning doesn’t make for a huge news moment; the details he reveals about the inner workings of Cambridge Analytica have, for the most part, already been disclosed by investigative reporters. But Wylie triggered something that countless news stories weren’t able to: a latent rage that may lay the groundwork for a movement that demands accountability from Facebook.
The unchecked power of companies that harvest our data is a grave problem, but it’s hard to get angry about an idea that’s so nebulous. Like climate change, the reaping of our data is a problem of psychology as much as business. We know that the accumulation of massive power in so few hands is bad, but it’s difficult to anticipate what terrible results might come of it. And even when we can envision them, those consequences remain abstract and in the future. The problem feels so oppressively intractable that it’s hard to summon the will to act.
Even if we could act, the options aren’t great. Except for the very rich or the extraordinarily poor, participating in the economy requires leaving a digital footprint. Most of us scroll through the privacy terms on the sites we use without reading them and accept updates without noticing or understanding the consequences. We all know we’ve already been compromised.
In a flash, Wylie’s story made the idea of misused big data concrete, and urgent. Unlike, say, Philip Morris, which sold a product that directly caused people to get cancer, big tech creates problems abstract enough that they require people to illustrate their impact. Wylie is just one of a small but growing cadre of digital whistleblowers who came of age in the early decades of the internet, played a hand in helping tech companies and government institutions harness the power of all that data, and now regret their roles. Former CIA employee and government contractor Edward Snowden leaked classified information from the National Security Agency in 2013 because, he said, he was concerned about global surveillance techniques. Tristan Harris rose to become a design ethicist at Google before leaving in 2016; concerned that technology companies design addictive software applications, he began a campaign to produce technology that is good for people. Former Facebook product manager (and current Wired columnist) Antonio Garcia Martinez helped develop advertising at Facebook; now, after writing a book about his experience, he speaks out. Guillaume Chaslot, a former YouTube engineer, detailed his concerns about the platform’s recommendation algorithm to The Guardian earlier this year.
Wylie, like a lot of these whistleblowers, doesn’t come across great in the Guardian piece. He’s young. He’s silly. He used his new Twitter account, which he started only on Friday, to complain that he’s now been booted off Instagram. Like a lot of engineers, he didn’t care much about ethics when he was creating programs that would redefine ethical boundaries. But that only enhances his case: it provides a window into how little oversight goes into making the tools that have influenced our political system and, by extension, shaken our democracy.
It’s difficult for any of us to understand where our information goes and how it’s used. Companies and governments are rarely transparent about collecting personal information. Even when they are, their data privacy measures can be lax. While Facebook told The Intercept last year that it had asked Cambridge Analytica to delete its data, Wylie said he’d received exactly one email from Facebook asking him to delete the data. “All I had to do was tick a box and sign it and send it back, and that was it,” Wylie told The Guardian. “Facebook made zero effort to get the data back.”
Wylie may follow in the footsteps of the Cassandras who’ve come before him, parlaying his moment of public attention into a book deal or speaking platform that raises his own profile more than it helps force a reckoning. But Wylie will not be the last of these digital whistleblowers. Indeed, his story will likely galvanize a group waiting in the wings. The challenge, however, is to use this moment to summon the will to lean on governments and companies to better protect individuals before it passes completely, and we must wait for the next whistleblower to give us reason to pay attention.