YouTube’s automated video recommendation system has been allowing otherwise innocent videos of children to be grouped with those preferred by pedophiles, according to a report in the New York Times.
“YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families,” wrote columnists Max Fisher and Amanda Taub.
The writers spoke with researchers who discovered that YouTube’s algorithm often recommended innocent videos alongside sexually themed content.
“The result was a catalog of videos that experts say sexualizes children,” they observed.
Jonas Kaiser, a researcher at Harvard’s Berkman Klein Center for Internet and Society, identified YouTube’s algorithm as the means by which the channels were connected, the report stated.
“That’s the scary thing,” he said, adding that while YouTube never intended to connect family videos of young children to pedophiles, the reality of the situation is “disturbingly on point.”
The Google-owned platform is essentially leading users with pedophile interests to videos of partially clothed children – possibly in swimsuits in their backyard pools – through its progression of recommended videos.
Fisher and Taub wrote:
So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.
On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable.
The writers interviewed Christiane C., a mother from the Rio de Janeiro area, whose 10-year-old daughter and a friend uploaded an innocuous video of themselves while swimming in a backyard pool.
When Christiane’s daughter excitedly told her several days later that the video had 400,000 views, the mother fearfully watched it again.
YouTube’s recommendation system showed the video to users with pedophile interests, researchers said.
According to Fisher and Taub, when the NYT alerted YouTube to its discovery, the company removed several of the videos, “but left up many others, including some apparently uploaded by fake accounts.”
Additionally, its recommendation system changed and no longer connected some of the innocent videos with sexually themed content, though YouTube said this was only the result of routine tweaks to its system, not part of an effort to stop exploitation of children.
“Protecting kids is at the top of our list,” said Jennifer O’Connor, YouTube product director for trust and safety, about the company’s commitment to end exploitation of children on its platform.
The NYT columnists, however, noted that YouTube has not switched off its recommendation system on videos of children, leaving the underlying risk in place.
In February, Wired observed that major companies including McDonald’s, Nestlé, and Epic Games pulled ads from YouTube over reports that many of the platform’s “videos with tens of millions of views are being inundated with comments by pedophiles, with adverts from major brands running alongside the disturbing content.”
The Wall Street Journal reported that, during a conference call with ad buyers, Google executives “sought to assuage concerns by explaining the steps the company has taken to address brand safety problems that have plagued the platform.”
“The executives also told ad buyers the company will deliver a timeline in 24 hours outlining new restrictions and product changes, one of the people said,” the report noted.
However, as Breitbart News reported, “Google has struggled with pedophilia on YouTube for years, and in December 2017, the company claimed it would hire ‘thousands’ of human moderators to combat the problem.”
According to Fisher and Taub, YouTube described its recommendation system as artificial intelligence that continuously learns which suggestions will keep users watching its videos.
Marcus Rogers, a psychologist at Purdue who has conducted research on child pornography, told the writers that such a gradual progression from innocent videos of children to increasingly sexualized recommendations is fairly easy to “normalize.”
“A lot of people that are actively involved in chatting with kids are very, very adept at grooming these kids into posting more sexualized pictures or engaging in sexual activity and having it videotaped,” Rogers explained.
Similarly, Jenny Coleman, director of Stop It Now, a group that fights sexual exploitation of children, warned, “Even the most careful of families can get swept into something that is harmful or criminal.”