How YouTube and social media algorithms drive people to child pornography

Algorithms often know people better than they know themselves

When two ten-year-old girls in Rio de Janeiro started using their phones to record themselves splashing around in their backyard pool, their mother thought nothing of it. Parents post photos and videos of their toddlers and young children all the time, after all. What does it matter if they’re in the pool?

But a few days later that indifference turned to concern when her daughter excitedly shared that the video had reached 400,000 views on YouTube. For such a dull video, that many views in so short a time seemed absurd.

In its pursuit of increasing the time people spend watching videos, and thus ads, a metric reported internally as “engagement,” YouTube had identified the children and their setting and begun showing the video to a specific, niche audience.

The video, along with hundreds like it plucked from family archives, birthday parties, camps, and more, was just one stepping stone in a chain of ever more extreme, salacious, and “interesting” videos.

YouTube’s algorithm had quietly and perniciously featured the video in dozens of archives, recommendation lists, “people also watched” suggestions, and sidebars on other videos of partially dressed prepubescent children. To a specific kind of audience with specific tastes, the videos were an entrée into the private lives of families and children, most of them doing seemingly tame things.

Children posting material online can be re-targeted to a select audience, with dangerous and illicit consequences.

Videos of young girls performing cheers or splits, boys wrestling on their beds, and young friends diving into a pool all attracted a very specific kind of viewer at a global scale. Sometimes algorithms know users better than they know themselves and drive them down a digital rabbit hole.

Researchers in Brazil, the U.S., and elsewhere had uncovered how medical misinformation, conspiracy theories, and child pornography spread through YouTube’s recommendation engine, using the same approach they had applied to Facebook Groups and other social media channels. Each subsequent video dug deeper into a topic or subgenre.

Algorithms guide users, usually in pursuit of longer views and more time on-site

In Max Fisher’s Pulitzer Prize-nominated book, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World, the New York Times journalist described some of the research findings:

“Some of the recommendation chains followed an unmistakable progression: [each] video led to another, where the woman [on-screen] and the video put greater emphasis on youth and grew more erotic.”

A video of a middle-aged woman discussing sex, perhaps innocuously, might be followed by a recommendation for a video of a thirty-something woman breastfeeding. That video shares a sidebar with another breastfeeding video in which the woman mentions she is 18 or 19. The next video features more breastfeeding, but the young women in it solicit donations. Further still, another shows a woman claiming to be 18 who solicits donations and has realized she can make more money by speaking in baby talk or posing seductively in children’s clothing.


“From there, YouTube would suddenly shift to recommending clips of very young children caught in moments of unintended nudity,” writes Fisher, citing a girl doing splits, cheering, or changing outfits.

These videos were often posted alongside other videos from the same user or under family accounts, making them easy for a sufficiently motivated person to trace to a home city or address.

“The ruthless specificity…[suggests] its systems could correctly identify a video of a partially nude child and determine that this characteristic was the video’s appeal.”

Max Fisher

Because American law prohibits viewing child pornography, with few exceptions for researchers, broad analyses of this and other social media platforms’ policies are hard to conduct. Most studies are inherently small, and child predators rarely want to speak about their desires.

Internally, most social media platforms are slow or unresponsive to requests to hide, delete, or hinder the spread of any video or message in any context short of outright murder. Some of this stems from fear of regulation and a desire to maintain their “neutrality” under Section 230 of the Communications Decency Act, which protects companies like YouTube and Facebook as platforms for speech rather than publishers of editorial or journalistic content, a status that would expose them to libel and slander claims. And in some cases, Fisher writes, their reluctance comes down to revenue: the more extreme the content, the more people watch, read, or engage.

Fisher interviewed two Purdue University researchers, Kathryn Seigfried-Spellar and Marcus Rogers, who, 

“…found that child-pornography consumers often developed that interest, rather than being born with it. People who undergo this process begin with adult pornography, then move to incrementally more extreme material, following an addiction-like compulsion to chase increasingly deviant sexual content, pornography a degree more taboo than what they’d seen before.” 

Their compulsions, Fisher writes, were shaped by whatever they happened to encounter. The algorithms powering YouTube, Reddit, Facebook, and other social media sites lack a moral compass and have quietly trained tens of thousands, perhaps millions, of people to prefer extreme or salacious videos. In many cases, users didn’t seek the content out; YouTube simply presented it to them before they knew they wanted it. For some, that might just mean more videos about faster cars, more expensive designer jewelry, or cutting-edge technology. For others, it’s conspiracy theories, child pornography, and other illicit material.

After the Purdue researchers notified YouTube of their findings in 2010, the company took down many of the recommendation features that had been in place for years and refused to comment on the timing. In the intervening months, many of them returned. Today, the YouTube homepage, sidebar, and up-next queue serve largely similar recommendations.

For parents like the mother of the ten-year-old swimming with a friend, the only thing to do, she told Fisher, “is forbid her to publish anything on YouTube.”
