Vox has an explainer today about how YouTube automatically steers users toward deranged and conspiratorial videos, even if the users have started out with perfectly ordinary interests. This has been explained before, but it keeps needing to be explained because it’s so incomprehensible in normal human terms: YouTube’s algorithms, which are built to keep people watching as many videos and video ads as possible, have apparently followed that instruction to the conclusion, as Zeynep Tufekci wrote in the New York Times, “that people are drawn to content that is more extreme than what they started with—or to incendiary content in general.”
The humans who run YouTube (and run its algorithms) aren’t exactly proud of the fact that their product showcases misogynist rants or pseudoscientific nonsense or apocalyptic conspiracy theories. But their position is that what happens inside their black box is extremely hard to correct or regulate, and on the scale at which YouTube operates, it’s impossible to apply human judgment to every case. They wish there were a way to serve up video recommendations without poisoning people’s minds until someone believes it’s necessary to invade a pizza parlor with an assault rifle, but that’s a real tough computational challenge.
What this line of defense leaves out is a very basic, obvious fact: YouTube already has access to an algorithm that can sort through videos without promoting unhinged fringe material. It’s called Google. YouTube is part of Google. When and if Google’s search algorithms start giving Google users fringe results, Google treats that as a failure and tries to fix the algorithms.
In the Vox piece, Jane Coaston writes about what happened when she searched “Trump” on a site that tracks YouTube’s video recommendations:
The first recommended video was from MSNBC, detailing James Comey’s testimony before the House Judiciary and Oversight committees. The second recommended video was a QAnon-themed video — relating to the conspiracy theory alleging President Donald Trump and Robert Mueller are working together to uncover a vast pedophile network including many prominent Democrats (and actor Tom Hanks). (“D5” refers to December 5, which QAnon believers argued would be the day when thousands of their political enemies would be arrested.)
Here is what came up when I tried a search for “Trump” on the “Videos” tab of Google.com:
Searching for YouTube’s “Hillary Clinton” recommendations led Coaston straight to conspiracy theories, including ones about murder. Here’s “Hillary Clinton” on a Google video search:
Somehow, where YouTube declares itself helpless before the enthusiasms of the public, Google is perfectly capable of serving up Hillary Clinton content without going off the deep end.
It’s true that Google and YouTube are different services, with different architecture. Google was built to index the Web and sort through existing material; YouTube hosts video content itself. That distinction, though, isn’t as big as it might seem—Google video search points toward video on the websites of various news organizations, such as the Washington Post or AP News, and YouTube has to point to YouTube, but the Washington Post and AP News are also YouTube content providers. Pretty much everyone is.
And so YouTube doesn’t have to pick out Pizzagaters or MRAs or neo-phrenologists. It has the power to send viewers in the opposite direction. The people who run YouTube made the choice to teach their algorithms to value trash—even if they thought they were teaching the system to value something more neutral, like viewing time. There was a time, in living memory, when the YouTube recommendation system was less aggressive and acted like Google: stacking up more and more songs by the same band you were listening to, say, or more clips on the same subject you were watching, until you’d had all you wanted and were done.
There is no way to be done in the fever swamps. What distinguishes the people in charge of YouTube from the people in charge of Google search is that the goal of Google search is to settle on satisfying results. The goal of YouTube is to keep people unsettled and unhappy, so they keep watching and keep seeing more ads. A less poisonous index would encourage people to leave the site after they got what they’d come there for. The algorithm the company really doesn’t want to tinker with is the one that tells it to make the most money it possibly can.