YouTube is disabling comments on millions of videos featuring minors, responding to accusations that pedophiles use comments to network and share links. The move comes a week after Disney, Fortnite maker Epic Games and other companies pulled their ads from YouTube.
YouTube says it has already disabled comments on "tens of millions of videos that could be subject to predatory behavior," and that it will broaden that effort in coming months to include more videos that feature young minors.
The company also plans to bar comments on videos featuring older minors if they're deemed to be "at risk" of attracting pedophiles' attention.
Reflecting the delicate complexities of policing a firehose of both content and comments, reaction to YouTube's announcement ranges from those who welcome it as a responsible change to those who criticize it as heavy-handed, and possibly damaging to innocent video creators.
YouTube's new comment policy is the latest response to concerns that made headlines last week, when Matt Watson, a content creator on YouTube, posted a video accusing the service of "facilitating the sexual exploitation of children." Some of the videos drawing predatory comments were also monetized through ads, Watson said. He also accused YouTube's algorithm of serving up new child-related videos to predators.
Watson's video touched off a range of reactions, from anger and disgust at predators to questions about how the company should stop the misuse of its service. Within the community of YouTube creators, there were also warnings that if advertisers flee and YouTube responds with sweeping changes, it could bring a new "Adpocalypse" similar to that of 2016 and 2017, when advertisers fled the service and it cracked down on content that included hate speech and extremism.
The firestorm began last week, after Watson described what he called "a wormhole into a soft-core pedophilia ring," in which predators hijack comment sections to highlight video frames or footage that could be seen as sexualizing children in otherwise innocuous videos.
In some cases, the pedophiles have shared links to child pornography. In others, they simply pointed one another to videos that showed children performing gymnastics, eating or playing with makeup. As they formed a shadow social media network, Watson said, the predators even shared contact info with each other.
After Watson posted his video about the problem, YouTube advertisers such as Disney, Nestle, AT&T and Hasbro hit pause on their ad spending, saying they needed to act to protect their brands.
That has led some creators to label the incident "Adpocalypse 2," and they have vented their anger at both YouTube for its approach and Watson for the way he raised the issue.
"Your platform is very sophisticated at predicting what will keep an individual clicking and watching more videos," Twitter user Tay Zonday wrote in response to YouTube's statement. "It tracks more than 1,000 data points on each person. That same AI can be deployed to expose, purge and report bad actors. Blaming content is knee-jerk and imperfect."
Others called for YouTube to track offenders and report them to the police. Critics have also said the issues Watson raised are ones that YouTube and its community have been working for years to address. And after Watson called out others for their impropriety, his own character has also been questioned.
Watson published his attack on YouTube from a relatively new account, but his critics soon found an older account where he had posted videos about picking up women. In one segment, he yells out of a car window at a backpack-wearing girl or young woman on the sidewalk: "Hi. Interested in shooting an adult video?"
The videos were highlighted by Daniel Keem, also known as Keemstar, the host of YouTube channel DramaAlert, who called them "very, very creepy." He also disagreed with Watson's call for big marketers to pull ads from YouTube.
"If advertisers leave YouTube, this isn't going to stop the pedos in the comments section," Keem said in a video posted last week. "This is just going to hurt the livelihood of YouTubers big and small."
In response, Watson said the videos were from a "comedy channel" in which he filmed pranks in public. And he accused Keem of trying to distract people from issues around YouTube's comment and advertising policies.
"I'm cool with what I did," Watson said of his video about predators on YouTube, adding that he had accomplished more than he expected.
When it announced its policy update Thursday, YouTube said it has already removed millions of objectionable comments and has built a new "classifier" to tag and remove predatory comments.
"This classifier does not affect the monetization of your video," the service added in its note to YouTube video creators. And it said the new tag will help it "detect and remove 2X more individual comments."