Instagram 'most important platform' for connecting child pornography distributors to buyers: report
"Instagram appears to have a particularly severe problem" with "Self-Generated Child Sexual Abuse Material" being sold, the report states.
Child pornography distributors are using Instagram to commission and sell illegal material, exploiting the social media network's recommendation algorithms and hashtags to promote their content, according to a new report from the Stanford Internet Observatory.
"Instagram is currently the most important platform" for networks of child pornography distributors, as the platform has "features that help connect buyers and sellers," the report, which was released Tuesday, states in one of its key takeaways.
"Instagram appears to have a particularly severe problem" with "Self-Generated Child Sexual Abuse Material" being sold, the report also states. This material, which is illegal to possess and distribute, is produced by the minors themselves, often teenagers, who voluntarily share the images with each other.
Rather than posting the illegal images directly, distributors published "menus" from which buyers could choose depictions of various sexual acts. Some menus, however, offered things such as in-person sexual encounters or images of self-harm.
Twitter has also been used to distribute and promote child pornography, the Stanford report found, but that platform removed inappropriate content "more aggressively" than Instagram, and most of the pornographic accounts identified by researchers were taken down within a week.
Meta, which owns Instagram, said it was creating an internal task force to investigate child pornography networks.
"Child exploitation is a horrific crime," the social media company said, according to The Wall Street Journal. "We're continuously investigating ways to actively defend against this behavior."