Major Companies Are Pulling Twitter Ads After Learning They Appear Alongside Posts Peddling Child Porn

Major advertisers are suspending campaigns and pulling promotions from Twitter because their advertisements are appearing alongside tweets soliciting child sexual abuse material, according to a new report highlighted by Reuters.

The report reveals that advertisers including Dyson, Mazda, and Ecolab have already suspended or pulled ads or campaigns, and that at least 30 other brands, including Disney, Comcast, NBCUniversal, and Coca-Cola, have had ads featured on Twitter profiles that push links to material that is sexually exploitative of children.

The report was conducted by cybersecurity group Ghost Data, and examples of the disturbing advertising juxtapositions were reviewed by Reuters.

Ghost Data’s report also identified more than 500 Twitter accounts that it says openly solicited or shared child sexual abuse material on the platform over just a 20-day period in September, over 70 percent of which Twitter failed to remove.

“Some of the tweets include keywords related to ‘rape’ and ‘teens,’ and appeared alongside promoted tweets from corporate advertisers,” Reuters reports, adding, “In one example, a promoted tweet for shoe and accessories brand Cole Haan appeared next to a tweet in which a user said they were ‘trading teen/child’ content.”

Cole Haan brand president David Maddocks told Reuters that the company is “horrified” by the revelation, adding, “Either Twitter is going to fix this, or we’ll fix it by any means we can, which includes not buying Twitter ads.”

In a Monday tweet, Ghost Data founder Andrea Stroppa revealed that his research found that Twitter “has a severe problem with child pornography,” but that the public release of the report would be delayed because it is “a sensitive topic.”

Stroppa, an Italian data scientist who has previously battled online bots for the World Economic Forum’s annual Davos meeting, has more recently made headlines for his apparent involvement in Elon Musk’s Twitter bot battles.

Now, Stroppa says he personally funded research into Twitter’s child sexual abuse problem after a tipster turned his attention to the issue.

Eliza Bleu, a Human Events contributor and human trafficking survivor advocate, has drawn public attention to Twitter’s repeated, severe mishandling of child sexual abuse material posted on the platform. She says that if advertisers have any compassion for children or other vulnerable populations, they should pull their ads from Twitter immediately.

“Twitter could have dealt with their child sexual abuse material problem at scale long ago. Instead, they chose to put the censorship of words over removing human rights violations committed against children. The initial child sexual abuse is not Twitter’s fault. The failure to remove it at scale and in a timely fashion is,” said Bleu.

“If advertisers truly care about vulnerable populations they will remove their advertisements from Twitter at once. It’s time to hit Twitter where it hurts. This is the only language abusers understand.”

“I will continue to stand by the two minor survivors suing Twitter. I applaud them for their bravery,” Bleu added.

The cases Bleu references are not new. In one, Homeland Security had to step in to force the removal of images and videos of a teenage sex trafficking victim. Twitter had refused to remove the material, claiming that it did not “violate policies.”

A federal lawsuit filed by the victim and his mother alleges that the minor male was lured into sharing sexually explicit images of himself with traffickers who posed as a teenage girl and then used the images to blackmail the boy into sending explicit videos of himself and another child.

After requesting ID from the child, known only as John Doe, to prove his age, Twitter eventually responded, notifying the family that the images would not be removed.

At that point, the material had already been viewed 167,000 times.

“What do you mean you don’t see a problem?” Doe responded upon receiving Twitter’s decision. “We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down.”

Twitter employees themselves admit that the platform is ill-equipped to moderate content for “child sexual exploitation and non-consensual nudity at scale.” An internal report from a team tasked with assessing the viability of positioning Twitter as an OnlyFans competitor determined that the company needed to shelve its plan to monetize adult content. Why? Because the social media company was already failing to monitor sexual content in a way that ensured the safety, consent, and adulthood of those pictured.

Employees who spoke to The Verge “reiterated that despite executives knowing about the company’s CSE problems, Twitter has not committed sufficient resources to detect, remove, and prevent harmful content from the platform.”

These known infrastructure failures have resulted in numerous horror stories of child sex abusers being allowed to use the social media platform as an active abuse tool, sometimes for years without repercussions.

Stroppa promised in a Wednesday tweet that he would reveal more information about “child pornography on Twitter” later that day, adding “I have a lot to say.”


© 2022 Human Events, Privacy Policy