Pornhub Verified Child Sex Trafficking.

This is not a site deserving of Section 230 protections.

03/02/2023


Pornhub must be held accountable for hosting and verifying a 15-year-old victim of sex trafficking as an over-18 content producer. The most visited pornographic website in the world hosted numerous videos depicting the rape and sexual abuse of an underage girl.


The videos were found on a number of platforms, including Periscope, Modelhub, and Snapchat, but Pornhub was the only one to give the victim a verification checkmark, which gave unsuspecting viewers the impression that she was a consenting performer over the age of 18. The victim's mother was alerted to the videos in 2019, a year after reporting her daughter missing in South Florida.

Christopher Johnson, 30, who has been charged with trafficking and rape of a minor, allegedly kidnapped the victim from a convenience store and forced her to have sex with him at his apartment. Speaking to police detectives, the victim said that she was even forced to have an abortion during her captivity. According to the Sun-Sentinel, police discovered paperwork from an abortion clinic to back up her allegations.

Of course, platforms should not be liable for the actions of rogue operators. Twitter, YouTube, Facebook, and even alternative social media sites like Gab and Minds wouldn't be able to exist if the courts shut them down the instant a bad actor uploaded a piece of illegal content. That said, it is the responsibility of social media platforms to moderate user-made content to the best of their ability, and they're legally bound to do so. Section 230 of the Communications Decency Act was designed to encourage platforms to remove objectionable content, promising that they wouldn't be held liable if they screwed up in the process.

Pornhub, however, has clearly made no such good faith effort.

Porn websites have long enjoyed the same kinds of protections as platforms like Twitter and Facebook, protections that let them jettison responsibility for the content their contributors upload. This has allowed them to scale rapidly, both in distribution and in profit.

It’s high time, however, that we start to distinguish between the kinds of content that platforms host, and find legal avenues to hold Pornhub accountable.

Image: Pornhub "verified" a victim of sexual abuse as a consensual performer.

MONETIZING SEXUAL ASSAULT

A quick search on the platform for “GirlsDoPorn” yields over 400 results with potentially illegal footage of women who were sexually exploited by the owner of GirlsDoPorn, Michael Pratt. Pratt and his associates are currently facing a federal indictment over the production of child pornography and trafficking of a minor. He has since fled the United States and is the subject of a federal warrant.

"I sent Pornhub begging emails. I pleaded with them. I wrote, 'Please, I'm a minor, this was assault, please take it down.'" She received no reply, and the videos remained live.

GirlsDoPorn originally presented itself as a legal enterprise with consensually produced content, so Pornhub and other platforms can’t be blamed for hosting these videos in the first place. The problem is that Pornhub hasn’t removed the content. It may have banned GirlsDoPorn’s original channel, but re-uploads of the illicit material remain online and continue to appear.

The platform, which earns millions of dollars from its daily traffic, has a “verified user” system, like the checkmarks used on Twitter, Facebook, and Instagram, to confirm that someone is who they say they are. But, as detailed by Laila Mickelwait for the Washington Examiner, the site’s verification system is rudimentary.

On mainstream platforms, users requesting to be verified need to submit proper legal documentation to prove their identity. This isn’t the case with Pornhub. “It took me under 10 minutes,” Mickelwait writes, “to create a user account and upload blank test content to the site, which went live instantly. I could have then gone on to become Pornhub-verified, and all I would need to do is send a photo of myself holding a paper with my username. That’s it.”

According to Mickelwait, it wasn’t difficult for the 15-year-old’s alleged rapist to upload videos of the rapes and quickly monetize them without ever running afoul of the site’s negligent moderators.

In a similar case, reported earlier this month, a 14-year-old girl named Rose was violently raped at knifepoint for twelve hours. The BBC reports that the footage ended up on Pornhub, and that it took the victim’s mother months of pleading to convince Pornhub to remove the graphic footage of the rape, which was spread across several videos.

“The titles of the videos were 'teen crying and getting slapped around', 'teen getting destroyed', 'passed out teen.' One had over 400,000 views,” the victim recounted. "The worst videos were the ones where I was passed out. Seeing myself being attacked where I wasn't even conscious was the worst."

Pornhub relies on user reports to handle illegal material, but no one would ever suspect a verified account to be anything but above board. Furthermore, even when the material is reported, as was the case with the videos of Rose, the platform can take months to remove the footage, if it bothers to do so at all.

Rose said she emailed Pornhub several times over a period of six months in 2009 to ask for the videos to be taken down. "I sent Pornhub begging emails. I pleaded with them. I wrote, 'Please, I'm a minor, this was assault, please take it down.'" She received no reply, and the videos remained live.

As with the victim from South Florida and the countless women involved in the GirlsDoPorn case, Rose’s story is not extraordinary. Given Pornhub’s careless moderation policies, and how easy it is to monetize even rape on the platform, it wouldn’t be a shock to learn that far more of this goes on than anyone realizes or wants to think about.

Image: Section 230 of the Communications Decency Act.

QUALITATIVELY DIFFERENT CONTENT

Running a porn site carries very few legal risks at the moment, even when its users upload illegal material. According to Pornhub’s own statistics, the site saw 42 billion visits in 2019, an average of 115 million visits per day, with 39 billion searches performed site-wide and 6.83 million new videos uploaded.


Porn websites must be subject to content audits or shut down if they fail to meet new legal standards. They should not be afforded the same Section 230 CDA protections that platforms like Twitter or Facebook are; the content they traffic in is qualitatively different. (Query, too, whether Twitter and Facebook continue to deserve those protections.) Porn sites need to vet everything that gets uploaded. Yes, that will severely curtail their business, but the consequences of the status quo outweigh that cost. They’re already making money hand over fist. They can afford to moderate what goes on their site and take responsibility for the content they host.

Twenty years ago, when the internet was new and needed delicate handling, Section 230 was a useful tool to incentivize growth and investment. It made sense. Today, these platforms have swallowed up entire economies; it no longer makes sense to treat their business model with such caution. The kinds of content these sites facilitate (pro-eating disorder posts, content about self-harm and suicide, and of course, fake news and political disinformation) are having devastating effects on society. Going after porn sites is the best way to start that conversation: giving them broad liability protections has clear consequences that most people can agree are intolerable.

There is no moral ambiguity here. Sites like Pornhub need to verify the age and identity of every performer who appears on the site, and to stop publishing anonymous content from producers.
