Elon Musk has stated that tackling child sex abuse content on Twitter is his number one priority, a refreshing change after the popular social media platform spent years dawdling on the issue.
Twitter before Musk expended much of its content moderation energy on banning people for misgendering or stating basic facts of biology, while turning a blind eye to the child sexual exploitation (CSE) content being shared widely on the site.
But all that is about to change, and indeed, positive steps have already been taken. In less than a month, Musk has managed to virtually eliminate the three biggest hashtags used by abusers to share child sex abuse material on Twitter, and has made such material simpler to report by adding a direct reporting option for child sexual exploitation content.
[RELATED: Major Companies Are Pulling Twitter Ads After Learning They Appear Alongside Posts Peddling Child Porn]
A 2021 report by The Verge revealed Twitter's child pornography problem, and stated that Twitter's executives were aware of the problem but failed to commit adequate resources to detect, remove, and prevent such content from being shared on the platform.
Eliza Bleu, a human trafficking survivor and advocate, has been attempting to get Twitter to take the issue of CSE material seriously for the past two years. There was a glimmer of hope in February of this year when the platform added a feature to make reporting such content easier, but just a few months later, the feature was removed without explanation.
In an op-ed published by Blaze, Bleu explained the importance of having an easy way to report child sexual abuse material.
“When a platform thinks about creating ways to report child sexual exploitation, it needs to think of a very young child in a panic trying to figure out how to report their own exploitation,” said Bleu. “It should be very clear and easy to report, not a vague labyrinth for the minor to stumble through.”
While the Twitter of the past claimed to have a zero-tolerance policy when it came to child sexual abuse material, that wasn’t evident in its handling of reported violations. The most widely known case is that of two minors who were groomed by sex traffickers and blackmailed into providing sexually graphic photos and videos, which were shared by their abusers on the platform. Even after multiple reports, including by the minors themselves, Twitter refused to remove the content, stating that it “didn’t find a violation.” The video had over 167,000 views and 2,223 retweets, according to the New York Post. One of the victims and his mother have since launched a federal lawsuit against the company.
Back in 2020, an investigation by the Post Millennial revealed that the distribution of child pornography was happening in plain sight all over the website, with child sexual abuse materials being shared via coded hashtags, and even Twitter's own guidelines permitting “artistic representations” of child pornographic images.
Lisa Haba, one of the lawyers representing the minor suing Twitter, explained in a video how Twitter would previously help child abusers hone their searches for illegal child sexual exploitation material by suggesting words and phrases in the hashtag search. She pointed out that Twitter was bringing in advertising revenue from ads strategically placed throughout threads of child sexual exploitation material.
It is rather telling that advertisers are reportedly leaving the platform due to Musk’s commitment to freedom of speech and his reinstatement of Donald Trump, when they were perfectly happy to advertise on a platform that promoted child pornography.

Twitter is not the only social media platform failing to protect children from predators. Earlier this month, it was revealed that TikTok has a rampant child sexual exploitation issue, with the platform allowing predators to groom minors and share illegal images through private accounts despite the popular social media app claiming to have a "zero tolerance" policy for child sexual abuse material.