YouTube Moderators: Company Allowed Thousands Of Child Predator Accounts To Flourish
While social media outlets like YouTube, along with Google (its parent company), Facebook and Twitter, think it’s their job to sanitize, censor and state-control your news, whistleblowers at YouTube are coming forward to express concern that the company has allowed as many as 100,000 child predator accounts to flourish even as it censors and demonetizes political commentary it disagrees with.
In a report by Elizabeth Cassin at the BBC, she writes:
Part of YouTube’s system for reporting sexualised comments left on children’s videos has not been functioning correctly for more than a year, say volunteer moderators.
They say there could be up to 100,000 predatory accounts leaving indecent comments on videos.
A BBC Trending investigation has discovered a flaw in a tool that enables the public to report abuse.
But YouTube says it reviews the “vast majority” of reports within 24 hours.
“Although the videos themselves are completely innocent, there are attempts from adults to collect personal information from children, and requests for them to remove clothing,” Cassin said.
“These are a clear violation of YouTube’s child endangerment policies,” she added. “So you might expect that comments like these would be removed immediately once reported—but no.”
“It’s claimed that one key part of YouTube’s mechanism for reporting comments like these hasn’t been working properly for over a year, so some obscene comments directed at children have remained on the site,” Cassin concluded.
Cassin and her team reported 28 accounts to YouTube that they found were in violation of the terms of service.
“Two weeks later, 23 of these accounts still remained on the site,” Cassin said.
YouTube responded in a statement: “We take child safety extremely seriously and have clear policies against child endangerment. We have systems in place to take swift action on this content with dedicated policy specialists reviewing and removing flagged material around the clock.”
Swift action you say? How is taking more than two weeks to deal with the vast majority of the violations “swift”?
Ask conservatives like Bobby Powell just how “swift” YouTube can be when demonetizing his videos. They’re practically demonetized before he can even post them! But make no mistake: while YouTube claims such videos are not appropriate for advertisers, click on one of those demonetized videos and you’ll see… advertisements.
Rachel Blevins adds:
As The Free Thought Project reported in June, there are also dozens of videos on YouTube advertised as harmless videos for children that contain everything from violent, bloody scenes with Mickey and Minnie Mouse, to sexually suggestive scenes with Spiderman and Elsa.
There are also a number of videos that feature young girls suggestively eating cream pies, or being taped to their beds—while these may be advertised as harmless videos for children, they also act as “free candy” for sick pedophiles. These accounts have millions of subscribers and these videos have hundreds of millions of views, indicating that the creators are likely making an impressive profit.
All the while, the videos YouTube is actually removing and the creators who are facing consequences are the ones who attempt to document war crimes committed by the United States. YouTube has begun labeling videos that show the U.S. launching drone strikes that have killed innocent civilians as “violent or graphic content,” all the while turning a blind eye to the thousands of videos that show violent and graphic content—and are marketed specifically for children.
It seems YouTube is just as hypocritical as we all thought. The platform was at its best when it simply allowed people to post videos, make money and inform others, rather than acting as internet Nazis. Now, it can’t even live up to its own standards.