
YouTube Automates a War Over Truth

The world’s largest video hosting platform employs a new strategy for combating the rise in conspiracy thinking

First, A Primer on YouTube’s Conspiracy Problem

As of 2019, YouTube is bigger than commercial television. It amasses over a billion views a day and is now the second largest search engine in the world. YouTube is TV, if also something else entirely: a user-generated ecosystem of media, an open network, a global diary of images, the largest experiment in collective sensemaking to date. After the cat videos and memes, we started to ask a lot more of YouTube.

To ask a question today is to “query” the search bar. Assaulted daily by events that defy simple explanation, users of the video platform shake the algorithmic eight-ball for answers. What’s with all the mass shootings? Why are we in Iraq, and do banks have something to do with it? How about the Snowden thing? The YouTube algorithm responds with an invitation to a sprawling universe of conspiracies. Data journalist Jonathan Albright uncovered a network of over 9,000 videos related to the term “crisis actor”: the idea that school shootings are, in fact, theatrical spectacles staged by the government so it can take away your guns. The top five videos in this network have amassed over 50 million views. It’s the tip of something much larger. The full scope of YouTube’s conspiracy landscape stretches to encompass any conceivable political topic. Your neighbor’s anti-vax fascination has its own little cluster.
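Albright’s exact methodology isn’t spelled out here, but a crawl in that spirit can be sketched against the public YouTube Data API v3. The snippet below is a minimal, hypothetical illustration: the seed term, the one-hop edge list, and the placeholder API key are assumptions, and a real network map would iterate over many related queries before graphing the results.

```python
# A minimal sketch of the kind of crawl behind network maps like Albright's,
# using the public YouTube Data API v3 search endpoint. Seed term, depth,
# and the API key are illustrative assumptions, not his actual methodology.
import requests

API_KEY = "YOUR_API_KEY"  # hypothetical credential
SEARCH_URL = "https://www.googleapis.com/youtube/v3/search"

def search_videos(query, max_results=50):
    """Return (video_id, title, channel) tuples for a search term."""
    params = {
        "part": "snippet",
        "q": query,
        "type": "video",
        "maxResults": max_results,
        "key": API_KEY,
    }
    items = requests.get(SEARCH_URL, params=params).json().get("items", [])
    return [
        (it["id"]["videoId"], it["snippet"]["title"], it["snippet"]["channelTitle"])
        for it in items
    ]

# Seed with the conspiracy keyword and link each surfaced video back to the
# term; repeated over many related queries, these edges accumulate into a
# graph of the surrounding content network.
edges = [("crisis actor", vid) for vid, title, channel in search_videos("crisis actor")]
print(f"{len(edges)} videos surfaced for the seed term")
```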

Americans are more anxious than ever; we feel something swirling in the zeitgeist, and it’s not good. But the feeling of generalized discomfort also seems impossibly complex, impossible to put a face to. We look to other people, other users, who have developed coherent stories about what scares us, a process Ivan Krastev described as “the rise of the Paranoid Citizen.” With enough anxious searches and an open mind, the grammar of everyday life begins a steady metamorphosis. Banks become Jewish cabals, school shootings become “false flags,” and 9/11 becomes a grotesque assemblage of anti-Semitism and elaborate “deep state” fictions. Some people even send pipe bombs.

So what’s YouTube been doing about it?

Above Image: Encyclopedia Britannica entry under a video posted by the channel ABOUD, a bodybuilder and school shooting skeptic

The standard industry approach has been a combination of user-flagged content and human moderators contracted to make judgment calls on what to remove. Alphabet, YouTube’s parent company, recently hired upwards of 10,000 new moderators for the job. In reality, nobody has found an easy solution to the problem. Many of the human moderators are based in the Philippines, tasked with filtering visual toxins from the Western internet like an alcoholic’s liver. The concentration of depraved content is so severe that YouTube’s contractors have been limited to four-hour stints in order to reduce rates of PTSD from too much exposure. Videos of beheadings and hate speech join American trash and electronic waste on a journey back to the Global South for reprocessing. It’s hardly a panacea.

YouTube and other platforms have begun to lean heavily on machine learning and other automated systems to keep up with the deluge of content. But for every straightforward breach of community guidelines (torture, ISIS propaganda, death threats, etc.), thousands of other videos resist easy classification, raising familiar questions about free speech. In the early months of 2018, the video sharing platform resolved to address its conspiracy problem after a boom in Parkland “crisis actor” content. Within weeks, YouTube was embroiled in disputes after it unknowingly censored, demonetized, and removed gun advocacy channels which, however misguided, constitute a legitimate discourse in public policy. As journalist Sasha Lekach described the dilemma,

The solution feels like it should be robot moderators [but] AI doesn't yet grasp context and gray areas. The struggle between free speech and censorship keeps humans necessary in the undesirable role. The iconic photo of a girl running naked during the Vietnam War technically falls within nudity guidelines. "Delete!" says a moderator [...] when shown the photo.
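One common way to wire humans into that loop is confidence-thresholded triage: let the model act only when it is sure, and route the gray zone to moderators. The toy sketch below illustrates the idea; the training examples and the 0.9 threshold are invented for illustration, since YouTube’s actual models are not public.

```python
# A toy sketch of confidence-thresholded triage: the classifier acts only
# on high-confidence predictions and escalates the gray zone to humans.
# Training data and threshold are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny stand-in corpus: 1 = clear guideline breach, 0 = acceptable.
texts = [
    "graphic beheading footage",             # breach
    "explicit death threat against a user",  # breach
    "cooking tutorial for beginners",        # fine
    "debate over gun policy legislation",    # fine: legitimate discourse
]
labels = [1, 1, 0, 0]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

AUTO_THRESHOLD = 0.9  # assumed operating point

def triage(description):
    """Auto-act only when the model is confident; otherwise escalate."""
    p_breach = clf.predict_proba([description])[0][1]
    if p_breach >= AUTO_THRESHOLD:
        return "auto-remove"
    if p_breach <= 1 - AUTO_THRESHOLD:
        return "auto-allow"
    return "human review"  # the gray area the quote describes

print(triage("second amendment advocacy channel"))  # likely 'human review'
```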

Even for human moderators, it can be fiendishly difficult to distinguish the conspiratorial from the conspiracy-adjacent, the ultra-nationalist from the call for ethnic cleansing. Getting it wrong can mean reinforcing the very distrust of institutions and platforms that nourishes conspiracy thinking in the first place; and yet getting it “right,” removing the bad guys, induces a hydra-like effect instead. When the crazies are kicked off the most visible parts of a network, they don’t cease to exist; they congregate in pockets that are harder to police, finding themselves in exclusive online enclaves of like-minded sympathizers. A study by the Brookings Institution focusing on Twitter’s censorship of English-language ISIS propaganda found that removing these accounts paradoxically produced more intense, insulated communities of Arabic-language ISIS accounts.

InfoWars, once YouTube’s nexus of conspiracy content, was removed from the platform this past August to a chorus of free speech eulogies. Alex Jones and his InfoWars brand now reside in the international waters of the internet. Jones has a proprietary streaming platform as well as a strong presence on Gab.ai, a sleek, well-designed social media platform that serves as the community of last resort for far-right users kicked off of YouTube, Twitter, and Facebook. It should come as no surprise that Robert Bowers, the shooter at the Tree of Life synagogue, was radicalized on Gab.

"A network graph of interactions among members of the February 2015 collection set, reflecting the impact of months of suspensions. […] As suspensions contract the network, members increasingly talk to each other rather than to outsiders" (Berger & Morgan 57)

In July, YouTube added a new tool to its arsenal. When automated keyword detection is triggered by terms like Holocaust, Oklahoma City bombing, chemtrails, or Sandy Hook, a small window linked to Encyclopedia Britannica or Wikipedia is superimposed at the bottom of the video. It’s an interesting approach, one that exchanges the blunt instrument of the ban for the nudge of an alternative truth-claim. Whether the strategy proves ineffectual or moderately effective remains to be seen. The bitter truth is there will be no silver bullets, only salves and stop-gaps.
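In miniature, the mechanism might look something like the sketch below: scan a video’s metadata for flagged topics and attach a reference link. The topic table and the substring matching are simplified assumptions; the internals of YouTube’s actual system are not public.

```python
# A minimal sketch of a keyword-triggered information panel: scan a video's
# title and description for flagged topics and return a reference link.
# Topic table and matching logic are simplified assumptions.
from typing import Optional

TOPIC_LINKS = {
    "holocaust": "https://www.britannica.com/event/Holocaust",
    "oklahoma city bombing": "https://www.britannica.com/event/Oklahoma-City-bombing",
    "chemtrails": "https://en.wikipedia.org/wiki/Chemtrail_conspiracy_theory",
    "sandy hook": "https://en.wikipedia.org/wiki/Sandy_Hook_Elementary_School_shooting",
}

def info_panel(title: str, description: str) -> Optional[str]:
    """Return a reference URL if any flagged topic appears in the metadata."""
    text = f"{title} {description}".lower()
    for topic, url in TOPIC_LINKS.items():
        if topic in text:
            return url
    return None

print(info_panel("SANDY HOOK: what they aren't telling you", "full breakdown"))
# -> https://en.wikipedia.org/wiki/Sandy_Hook_Elementary_School_shooting
```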

We would be naive to limit our critique, or the horizon of our response for that matter, to YouTube and its counterparts. Conspiracy thinking manifests on YouTube and yet originates far before it. Far beyond it. Robert Putnam would point out that the anomie and confusion at the heart of our contemporary social environment has as much to do with the legacy of TV as it does with the internet. Researcher Evgeny Morozov locates the root of epistemic breakdown in digital advertising models and counsels against what he calls “tech solutionism,” the tendency to prefer technological solutions for social maladies with political or economic origins. For their part, the technicians of our new town squares can’t help but see these challenges in terms of engineering. They don’t have much of a choice. If only conspiracies were bugs in the software, amenable to a fresh update.





