Twitter Sued for Alleged Refusal to Remove Child Sexual Exploitation Material


A story broke on Thursday with the New York Post reporting on a lawsuit filed against Twitter over the company’s alleged unwillingness to remove child pornography from its platform. The lawsuit claims that when a victim of online child sexual exploitation requested the removal of the content, Twitter responded that no violation of its policies had been found. This would not be out of line with Twitter’s long-standing and problematic history on the issue of exploitative content.

The lawsuit has been filed by the National Center on Sexual Exploitation and alleges that Twitter has allowed child-exploitative material to circulate on the platform, possibly due to negligence. It also claims that Twitter has profited from this material being shared and viewed, through the increased reach of advertising hosted on the platform. The complaint states:

“This lawsuit seeks to shine a light on how Twitter has enabled and profited from [child sexual abuse material] on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity.”

“Twitter is not a passive, inactive, intermediary in the distribution of this harmful material; rather, Twitter has adopted an active role in the dissemination and knowing promotion and distribution of this harmful material. Twitter’s own policies, practices, business model, and technology architecture encourage and profit from the distribution of sexual exploitation material.”

The recent New York Post article recounts the story, included in the lawsuit, of a now-17-year-old who was harassed and threatened into recording and sharing explicit material showing himself and another then-13-year-old. While the exchanges with sex traffickers posing as an underage girl began and took place on Snapchat, the videos started spreading on Twitter in 2019. Despite the material being reported multiple times during that year, Twitter did not act to remove it. Twitter eventually took action against the exploitative content, but only after US federal law enforcement became involved.

Before Twitter removed the material in response to pressure from the US Department of Homeland Security, the sexual imagery of the victims represented in the lawsuit had amassed a staggering number of views and retweets.

Twitter’s ‘Child sexual exploitation policy’ overview, last updated in October 2020, claims that Twitter has ‘a zero-tolerance child sexual exploitation policy’ in place, stipulating that such content constitutes ‘one of the most serious violations of the Twitter Rules.’

If the lawsuit proves successful, it will show that, in contrast with these claims, Twitter in fact profits from the proliferation of child-exploitative material on its platform, and is therefore less inclined to cooperate in removing it when called upon to do so.

The recent stories and allegations follow a pattern in Twitter’s history. In 2012, reporting on this issue pointed out the company’s inability or unwillingness to combat exploitative content. The Guardian documented how Twitter was swift to take action on a potential doxxing of a public figure but did not react to numerous legitimate reports of child pornography. Only when the hacker group Anonymous got involved and initiated a mass-flagging campaign was the offending account taken down. It is not clear whether this was done manually or whether, at the time, a certain number of reports was enough for any account to be taken down automatically.

In response to this scandal, the Mirror quoted child protection expert Mark Williams-Thomas as saying that the use of Twitter by paedophiles was “out of control.” The newspaper’s own description of the platform was:

“Twitter is a paedophiles’ playground.”

Several years later, a 2016 probe into Twitter’s practices by independent journalists and groups showed that Twitter did little to stop the spread of child-pornographic material on its platform. The report also showed how ISIS, the terrorist organization then expanding, was using Twitter to share explicit images of killings and torture, as well as to organize.

Twitter’s alleged refusals to remove child pornography from the site coincide with large-scale censorship efforts. Campaigns are ongoing to remove material and accounts from the platform for political reasons, with an alleged link between speech on Twitter and real-world violence being cited as justification.

Famously, Twitter recently removed the account of US ex-President Donald Trump, along with the accounts of many of his supporters. The concerns raised about Twitter’s handling of child exploitation also come amid the company’s moves to remove all content and accounts tied to the QAnon movement, whose stated central aim is to expose and fight child sexual exploitation, which it claims proliferates in society’s elite circles.
