BLEU: The EU Is Using Child Sexual Abuse to Excuse Privacy Rights Violations

by Eliza Bleu | 03/02/2023

There is a long-standing debate between privacy rights groups and those who advocate for the removal of child sexual abuse material online about how best to accomplish that task. On its face it may seem callous that anyone would question taking any and all actions available.

Who wouldn’t want to help as many exploited children as possible, especially given the ever-rising number of victims? In 2021, the National Center for Missing and Exploited Children (NCMEC) CyberTipline received 29.3 million reports of suspected child sexual exploitation, an increase of 35% from 2020. Over 29.1 million of those 29.3 million reports came from electronic service providers like Facebook/Meta.

Obviously, there is a massive problem of online exploitation, one which often destroys the lives of victims and survivors. We can’t just let tech companies off the hook: they must address this problem at scale. But at what cost?

The European Commission has proposed a new regulation that would require tech companies, including chat apps and social media platforms, to scan users’ private messages for child sexual abuse material (CSAM) and “grooming” behavior.

A knee-jerk reaction to this news for the average citizen may be, “Yes, please get all of the predators!” I’m with you if that’s your first thought, but I must think of the unintended consequences and draw the line somewhere.

Unfortunately, there is not a simple answer, because many of the solutions offered can cause more problems than they solve. I often think about the Patriot Act and how it was sold to the American people. Who wouldn’t want to protect America against terrorism, right?

As we now know, the Patriot Act was used far outside the scope originally intended. Innocent American citizens had their privacy violated, because the government knew how to pull our heartstrings while simultaneously playing on our worst fears. We should learn from that mistake and not repeat it. Mass government surveillance is terrifying, to say the least. There is a track record of misuse and abuse, and government and third-party storage centers have proven unable to sufficiently protect against data breaches.

The FBI has openly targeted parents, recently calling some a domestic terror threat for attending school board meetings. This all started with a Sept. 29 letter from the National School Boards Association that asked President Biden for federal resources to help monitor “threats of violence and acts of intimidation” against public school board members and other school officials. “As these acts of malice, violence, and threats against public school officials have increased, the classification of these heinous actions could be the equivalent to a form of domestic terrorism and hate crimes,” the six-page letter asserted.

Increased government surveillance, including the scanning or collection of private social media messages, could be used to target political opponents or protest groups communicating legally and well within their rights.

Whatever your feelings on January 6th may be, the government took steps to violate the privacy rights of citizens involved in peaceful protests, including those who may have just been in the area of the Capitol and did not attempt to enter any buildings. This shouldn’t sit well with anyone.

Whenever governments want to ramp up surveillance on private citizens, you can expect them to invoke one of two things: terrorism or child sexual exploitation. These are topics about which most people are under-informed, wrapped in narratives they are often unwilling to question. Politicians claim to care about these issues because doing so gives them more control. Edward Snowden covered some examples of these tactics in an article about child sexual abuse material.

There are other ways that the EU and the rest of the world could address these issues without violating privacy rights and potentially eroding end-to-end encryption. But instead, they opted for mass surveillance first. That makes me extremely skeptical that they actually care about abused children.

The EU could have supported internet safety courses for minors and parents to combat self-generated content, which makes up a large and growing part of the problem. Consider the following report from The Guardian:

“Children as young as between three and six years old are becoming the latest victims in a growing trend of self-generated child sexual abuse, a report from an internet safety watchdog has said.”

“The Internet Watch Foundation said over a one-month period it saw 51 examples of self-generated abuse imagery – where children are manipulated into recording their own abuse before it is shared online – including a child aged between three and six. More than half of the cases involved a sibling or friend of the child.”

“The findings were included in IWF’s annual report, which underlines the predominance of self-generated images in CSAM cases. It said 2021 was its worst year on record for child sexual abuse online as it reported 252,000 URLs containing images or videos of children – people under the age of 18 – being sexually abused, compared with 153,000 in the previous year. Of that total, there were 182,000 instances of self-generated material, with the biggest age group within that segment being 11 to 13 year olds, where there were 148,000 reports.”

This type of education would be preventative and put the responsibility on parents and caregivers.

The EU could also have helped by giving aid to Malaysia, a country that is begging for support in tackling these issues.

“The Royal Malaysia Police (PDRM) have received tens of thousands of internet protocol (IP) addresses suspected of sharing child pornography on the internet from international authorities over the past six years. However, they are unable to act on the information received due to manpower shortage,” reports Malaymail.

The internet is a global community, and there needs to be global cooperation to solve online crimes.

Governments could incentivize platforms to remove child sexual exploitation material from the public portions of their platforms before it makes its way into private messages. If the technology being used can’t remove CSAM and grooming behavior from main social media feeds, it has little hope of catching it in private messages. I’ve suggested a tax credit in the United States for platforms that innovate, review, and report content in a transparent and time-sensitive way.

I’m particularly distressed by the bold notion that the EU can effectively track down grooming behavior behind the scenes, given the vast quantity of private messages on social media apps. I don’t want any child to be groomed, but once again, it is the responsibility of parents and caregivers to educate children about the dangers online. It is not a tech company’s or a government’s responsibility to take on this role.

There is no reason for governments or tech companies to have open access to private conversations. This type of access endangers the lives of journalists, activists, survivors, political dissidents, and whistleblowers around the world. For instance, a survivor talking with a trusted friend privately about being abused by an individual in the government could be in danger. We need to make sure that global citizens keep their right to privacy.

In an open letter, 50 organizations, including many based in the US, supported the EU’s plans. I believe their hearts are in the right place, but they have failed to consider the downside. How long are these organizations expected to sit around while tech companies do the bare minimum to protect children, and while the world’s most vulnerable are being sexually exploited? I get it. I’m sure those who voted for the Patriot Act had a similar feeling that they were doing something great to protect all Americans.

Governments and those seeking power have no problem using a child’s very real trauma as a shield to shoehorn in their agendas. A portion of the letter reads:

“Focusing on the online dimension of child sexual abuse, EU Commissioner for Home Affairs Ylva Johansson emphasizes that there has been a 6000% increase in reports of child sexual abuse online in the EU in the last ten years alone. Most of the images and victims remain hidden, their abuse unseen and unreported. But even the tip of the iceberg is enormous: the National Center for Missing and Exploited Children received close to 85 million files containing child sexual abuse material in 2021. In the previous year, that number was 65 million. Over 62% of online child sexual abuse material (CSAM) worldwide is hosted on servers based in the EU. It will take collaboration between citizens, institutions, policymakers, tech companies, and nonprofit organizations to tackle a problem at this scale.”

As an advocate for survivors, I care deeply about solving this problem, and I want to make sure it’s done right. One issue that stands out is how platforms process this content: by sending it to workers in impoverished countries for review and removal. This is one step forward and two steps back.

A lawsuit was recently filed in Kenya by one of the individuals paid $2.20 an hour by Facebook/Meta to remove child sexual abuse material, among other horrific content. They are suing Facebook/Meta for human trafficking. Are we pushing platforms to create more victims of human trafficking in order to remove child exploitation content? These are the types of issues that need to be addressed before we discuss anything else.

I am advocating for platforms to remove this content in the most humane and trauma-informed way possible. I am asking that governments look for ways to support parents, communities, and companies in solving these issues in ways that don’t create more vulnerabilities or trample the privacy rights of the people. I am hoping that people will take the time to get informed about these crimes and help find solutions instead of passing that task off to their governments, who could use the opportunity to take away the liberties of individuals.

Eliza Bleu is a human trafficking survivor advocate and a survivor herself. On The Eliza Bleu Podcast, she dives into difficult, but rewarding conversations about the trials and traumas people experience and the triumphs and healing possible.
