After a widely shared YouTube video showed how to access "a wormhole into a soft-core pedophilia ring" on the platform, Disney, McDonald's and other companies have pulled their advertising from the site. Now YouTube, which is owned by Google's parent company Alphabet, is being accused of taking steps against harmful content on its platform only after big public controversies force it to act.
The latest row was ignited after a video posted by YouTuber Matt Watson, showing how the "wormhole" could be accessed, climbed to the front page of Reddit and rapidly gained more than a million views. In the video, Watson demonstrates how clicking on sidebar videos, promoted next to ordinary monetised videos, led users to YouTube pages populated entirely with inappropriate and eroticised clips of young girls.
In his demonstration, Watson searched for "bikini haul" and clicked on a recommended video in YouTube's "Up Next" section. Within two more clicks he reached pages whose sidebars were full of videos featuring young girls, many with comments linking to timestamps marking "the points in the video where little girls are in compromising positions," Watson explained.
"YouTube's recommended algorithm is facilitating paedophiles’ ability to connect with each other, trade contact info, and link to actual CP [child pornography] in the comments," Watson said. "I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes."
Not the first time
Companies cutting their advertising deals with YouTube did so after learning that some of the inappropriate videos were being monetised. According to Wired UK, many of the videos included "pre-roll adverts from Alfa Romeo, Fiat, Fortnite, Grammarly, L'Oreal, Maybelline, Metro: Exodus, Peloton and SingleMuslims.com. Banner advertising for Google and the World Business Forum also appeared alongside some of the videos."
The latest controversy follows a public backlash in November 2017, after ads were found to appear on YouTube videos that attracted child predators. Earlier that year, reports showed that YouTube was placing ads on videos with violent and extremist content, prompting a broad boycott by hundreds of advertisers, including Procter & Gamble, AT&T, Dish Network and PepsiCo.
At the time, YouTube said it would tighten controls over how ads are served and add more restrictions on creators who earn revenue on the site. Those new controls had recently persuaded US telecoms company AT&T that YouTube was “brand safe”; the company announced last month that it would resume advertising after a two-year hiatus. Now YouTube is scrambling to reassure advertisers of its commitment to protecting minors on its site.
“Any content, including comments, that endangers minors is abhorrent, and we have clear policies prohibiting this on YouTube,” a YouTube spokesperson said in a statement responding to Watson’s video. “There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
According to YouTube, after Watson’s video was published, the site shut down more than 400 accounts and channels that violated its policies, turned off comments on “tens of millions” of videos, and reported illegal activity to law-enforcement authorities.
Big Tech and self-regulation
Yet the revelations have cast doubt on YouTube’s willingness to weed out inappropriate content on its platform in the absence of a public outcry.
For Counter Extremism Project (CEP) Executive Director David Ibsen, the platform’s after-the-fact actions are part of an ineffective cycle. “YouTube’s response to this latest incident involving the trade of harmful and predatory content follows a familiar playbook,” Ibsen wrote on February 23rd. “Only in the face of widespread public controversy, and after a real threat to their profit margins, will they take any action to eliminate this exploitative material from their platform.”
“For more than a decade, YouTube’s response to platform misuse has been reactionary policies after the damage has already been done,” added Ibsen. “YouTube’s promises to improve and pronouncements of policy changes are meaningless when the company fails to consistently and systematically enforce them.”
Haley Halverson, Vice President of the National Center on Sexual Exploitation (NCOSE), made a similar point in an interview with Fatherly. “[YouTube] need[s] to take a more proactive approach in how their algorithms work, using better AI. They need to actually make dealing with sexual exploitation a priority, because right now it’s not. Right now, they’re happy that people are commenting and that videos have high views and they are looking at the corporate profit bottom-line instead of, actually, the health and safety of their users.”
According to Ibsen, this reactive approach is a trend that afflicts the whole industry. “YouTube’s behavior shows that the so-called self-regulation of the tech industry is no longer an option. Google and other tech firms have made it clear that they will only act when there is outside pressure – namely from advertisers and governments.”