For years, YouTube, the world’s most popular video network, has been battling issues with “bad actors” wreaking havoc with the system.
The Google-owned property wants to be a safe haven where advertisers can reach viewers, primarily young ones, with its mix of original videos and a library of virtually anything ever recorded on video.
Yet once again, YouTube found itself under scrutiny this week for more abuses. Seemingly innocent videos of young girls doing gymnastics were hijacked by adult viewers commenting with time stamps and links to child pornography videos elsewhere on the web.
So after YouTuber Matt Watson exposed the problem in an angry viral video, and top advertisers like Disney, AT&T, Epic Games and others pulled their ads in response, YouTube said it would change its ways and disable commenting on videos involving children.
Is that enough to turn the ship around and stop the abuse? This is just the latest scandal, following years of conspiracy theorists posting videos suggesting, for instance, that a Florida high school shooting never really happened — a video that made it to the top of YouTube's Trending recommendations. The last big snafu with kids' videos was in 2017, when another blogger noticed that people were re-splicing videos aimed at children and inserting sexual and violent content.
YouTube's response then was that it would hire up to 10,000 human monitors to flag videos that had no place on YouTube.
But a network that receives 500 hours of new content every minute seems to be playing a non-stop game of whack-a-mole.
YouTube this week said it had disabled comments from “tens of millions” of videos “that could be subject to predatory behavior.”
But it still has a long way to go. A simple YouTube search Friday for "Girls Gymnastics" automatically appended the word "in" in the suggested-search dropdown, which led to "Girls Gymnastics in Skirts" and a link to a video called "Innocent girl in Short Skirt" from the Erotic Kloud channel. In it, a pre-teen walks around a park, egged on suggestively by the filmmaker. The video has 52,000 views and 10 comments, including this one from someone referred to as Mr. Robinson: "This girl is from which Country. Please give me her contact." The comments are not disabled.
Beyond clearing out videos like this one (and there are many more, just as offensive, in the category), what else does YouTube need to do to clean up?
—Monitor and classify kids content. “YouTube needs to prioritize kids’ programming and put on delays before they are allowed to be posted,” says Jamie Cohen, an assistant professor at Molloy College in New York.
Note that kids' programming is by far the most popular genre on YouTube, even though YouTube's policy, as with Google and Facebook, is that users must be at least 13 or watch with a parent. (YouTube has a separate, kid-focused smartphone app, YouTube Kids, which has also come under scrutiny for letting sexual content slip through the cracks. Unlike the main YouTube app, the kids' app doesn't allow comments.)
According to market researcher Tubular Labs, 7 of the top 10 most viewed channels on YouTube are kid related, headed by “Cocomelon Nursery Rhymes,” and “Ryan Toys Reviews,” in which a 7-year-old boy rates the latest toys.
"If kids are such massive viewers, look at what they watch," Cohen says. "Even a few-second delay before the video posts would give YouTube time to do better work."
—Look at traffic spikes and act accordingly. Haley Halverson, a vice president with the National Center on Sexual Exploitation, says that instead of waiting for users or the media to report offending videos before pulling them down, the company should look for warning signs and act.
“If a young girl makes a video in a swimsuit and it has 20 views, then suddenly jumps to 2 million, that’s a big warning about online predators and should send out alarm bells at YouTube headquarters,” she says. “YouTube needs to take more responsibility to keep the site safe.”
—Stop monetizing these types of videos. YouTube has a rich profit-sharing system in place with users, giving creators who generate traffic a portion of the ad revenue, which encourages them to post videos that will attract eyeballs. YouTube needs to "make sure that none of these videos of minors being sexualized are being monetized," says Halverson.
The pattern in the past: a blogger finds these videos, advertisers respond by dropping out, YouTube says it will clean up the mess, and all is forgotten until another blogger discovers something else. The cycle continues.
“I’m very skeptical that YouTube will really clean up the platform,” says Halverson. “They usually just wait for us to forget. I’m hopeful people won’t forget and let YouTube ignore the issue.”
YouTube had no comment beyond its blog post.
"No form of content that endangers minors is acceptable on YouTube, which is why we have terminated certain channels that attempt to endanger children in any way," it said. "Videos encouraging harmful and dangerous challenges targeting any audience are also clearly against our policies. We will continue to take action when creators violate our policies in ways that blatantly harm the broader user and creator community. Please continue to flag these to us."