
Twitter faces suit for not removing pictures of teen sex trafficking victim

The lawsuit alleges that Twitter knowingly hosts accounts that use the site to engage in the exchange of child pornographic material and make money from it by way of advertisements.

Published: January 21, 2021 2:41pm

Updated: January 21, 2021 5:06pm

Twitter refused to remove widely shared pornographic material depicting underage sex trafficking victims from its site because an investigation "didn't find a violation" of the company's policies, according to a federal lawsuit.

The victim, a 17-year-old boy, and his mother filed the lawsuit in the Northern District of California, alleging that Twitter made a profit off of the videos, according to the New York Post.

The video showed the victim, who was 13 at the time, engaging in sexual acts, which is a form of child pornography, the lawsuit states. 

The victim, whose identity is kept private, is now living in Florida and was between the ages of 13 and 14 when sex traffickers posing as a 16-year-old female classmate messaged him on Snapchat, the lawsuit alleges.

The victim and the traffickers exchanged nude photos before blackmail ensued, according to the suit. If the victim refused to share more sexually explicit material, the traffickers would reportedly share the previous content with his "parents, coach, pastor" and others.

The victim initially gave in and, at the traffickers' demand, even included another minor in the videos, the suit alleges.

Eventually, he blocked the traffickers and the harassment stopped. However, in 2019 the videos were posted on Twitter by two accounts known for sharing child pornographic material, according to court documents.

The videos were reported to Twitter at least three times over the course of a month, the first report coming on Dec. 25, 2019. Twitter did not act until a federal law enforcement officer got involved, the suit said.

The victim did not become aware of the tweets until January 2020, when classmates who had viewed the videos began teasing and harassing him. The harassment left him suicidal, according to court documents.

His parents spoke with his school and reported the situation to police, while he filed a complaint with Twitter stating that two tweets on the site contained child pornography of him and should be removed because they were illegal, harmful and in violation of the site's policies.

Twitter replied that it would not take down the material, which by then had over 167,000 views and 2,223 retweets, the suit states.

"Thanks for reaching out," the response reads, according to the lawsuit. "We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time. If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities."

Shocked by Twitter's response and refusal to take action, the victim wrote back in disbelief.

"What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken," he wrote back to Twitter. "We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down." 

The victim's mother connected with a Department of Homeland Security agent who had the videos removed on Jan. 30, 2020, the suit states. 

"Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children," according to the lawsuit, filed by the National Center on Sexual Exploitation and two law firms. "This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children."

The lawsuit alleges that Twitter knowingly hosts accounts that use the site to engage in the exchange of child pornographic material and make money from it by way of advertisements.

Twitter has not commented on the lawsuit.
