Elon Musk is cracking down on child pornography on Twitter. The task is proving daunting, and not everyone is happy about his strategy.
This December marks the latest development in a scandal that has been growing for at least a decade and that gained momentum in January 2021, when the platform was widely exposed as hosting child pornography. Before Musk’s purchase of the social media company, ‘John Doe’ sued Twitter to have pornographic images of himself taken down. According to the suit, filed in the Northern District Court of California, John Doe, then aged 13 or 14, was approached on Snapchat by a sex trafficker posing as a 16-year-old girl. They exchanged indecent images and videos and engaged in sex acts. He was then blackmailed with threats that the earlier images would be shared with his “parents, coach, pastor” and others. He acquiesced, sending more explicit and graphic images, which were monetised on Twitter. The clips went viral in 2021 and were shared hundreds of thousands of times.
The parents’ petitions to Twitter were met with indifference: the company said it “didn’t find a violation” of its “policies” and refused to take the tweets down.
Before Musk took over, there were 87,000 complaints of child abuse in one year alone.
Musk made the protection of children one of his top priorities when he took the helm. Hashtags and accounts that could be identified as affiliated with child abuse were suspended; in the United States, 44,000 accounts were suspended on December 5th alone. Given the scale of the problem, one might expect Musk’s efforts to meet with universal approval.
Nevertheless, on December 8th, three members of Twitter’s Trust and Safety Council promptly resigned, citing worries about “hate speech,” “safety,” and “wellbeing.” Yet according to the numbers, the council, in place since 2016, appeared to have done little to curb child pornography.
Elon Musk called it a “crime that they refused to take action on child exploitation for years!”
Former Twitter CEO Jack Dorsey responded, “this is false.” But Musk was having none of it and shot back, “No, it is not. When Ella Irwin, who now runs Trust & Safety, joined Twitter earlier this year, almost no one was working on child safety. She raised this with Ned & Parag [Agrawal, former Twitter CEO], but they rejected her staffing request. I made it top priority immediately.”
Yoel Roth, who recently resigned as head of Trust and Safety at Twitter, has also been called out for holding permissive views toward the sexualization of children. Musk surfaced Roth’s Ph.D. thesis, in which Roth suggests that hookup sites like Grindr should adjust their services to accommodate “queer youth [under 18] culture.”
Other online searches have led to accusations of pedophilia, such as a resurfaced (but now-deleted) 2010 tweet in which Roth solicited debate over the question of sex between “consenting” schoolchildren and their teachers.
Musk’s intervention against child pornography and the priority he has given the matter have garnered much media attention. However, critics wonder whether Musk’s strategy, which resulted in the disbanding of Twitter’s Trust and Safety Council on December 12th, will prove effective. According to Liz Wheeler at Head Topics USA, child pornography is still very easy to find, and threats against the well-being of Yoel Roth are circulating online, forcing him into hiding.
The record shows that prior to Musk’s takeover of Twitter, the people in charge either were negligent towards or condoned child pornography. However, despite the goodwill generated by Musk’s efforts to shut down Twitter’s facilitation of child pornography, the problem is still very much in need of further attention. According to the National Center for Missing and Exploited Children (NCMEC), “not much has changed since Musk took over in late October, despite his recent statements.”
Critics speculate that unless Musk repopulates the Trust and Safety Council with an ample and responsible team, eliminating access to abusive content via search tools and hashtags will not solve the problem. According to Eirliani Abdul Rahman, co-founder of YAKIN (Youth, Adult survivors and Kin in Need):

“It’s always good to have people on the ground, who are trusted partners who can tell you locally, OK, this is what’s needed and this is what they’re using, and it’s trending. You can’t just say, oh here are the known hashtags in English and it’s done. It’s not. It’s not that simple.”
Meanwhile, despite the dissolution of the questionable Trust and Safety Council, Musk’s promised independent content moderation council has yet to be put in place.