‘Priority #1’: Elon Musk To Address Twitter’s Child Sexual Exploitation Problem

 Twitter CEO Elon Musk said Sunday the social media platform would make addressing its alleged child sexual exploitation problem his number one priority.

Musk, who has made sweeping changes within his newly purchased $44 billion company, responded to a report shared by Twitter user EvaFoxU featuring a human trafficking survivor, which described how the company “has begun addressing the issue of posting child sexual exploitation content on Twitter after years of the platform’s inactivity on the subject under past management.”

“Priority #1,” Musk said in response to the EvaFoxU tweet.

Twitter’s alleged child pornography problem has been an issue for over a decade. Advocates for removing the material have pleaded with previous management to create immediate solutions for the sake of children who have been exploited for profit through material depicting graphic acts of abuse.

Twitter spokeswoman Katie Rosborough told The Verge last summer that the platform has “zero tolerance for child sexual exploitation.”

“We aggressively fight online child sexual abuse and have invested significantly in technology and tools to enforce our policy,” Rosborough said. “Our dedicated teams work to stay ahead of bad-faith actors and to help ensure we’re protecting minors from harm — both on and offline.”

Yet, while the amount of child sexual abuse material online has increased, “Twitter’s investment in technologies to detect and manage the growth has not,” according to an internal Twitter Red Team report on a project called Adult Content Monetization.

“Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale,” the Red Team concluded in April 2022, according to The Verge.

Employees told The Verge that despite executives knowing about the company’s child sexual exploitation problems, Twitter did not commit sufficient resources to detect, remove, and prevent harmful content on the platform.

A human trafficking victim identified as “John Doe” filed a federal lawsuit last year against Twitter, alleging the company profited off widely shared pornographic images and videos of him taken when he was 13 years old. Despite his attempts to have the material removed, Twitter allegedly refused to take it down because its investigation did not find a violation of the company’s policies.

Andrea Stroppa, founder of the cybersecurity group Ghost Data, released a research report in September about online child sex abuse that allegedly identified links to exploitative material containing child pornography. The report said tweets soliciting child sex abuse material appeared alongside or on the profile pages of at least 30 major advertisers’ Twitter accounts, which led some companies, including Dyson, Ecolab, and Mazda, to pause or pull their ads or campaigns from the platform.

Stroppa, who personally funded research into the platform’s child sexual abuse problem after receiving a tip about the severity of the issue, said in a tweet following the report that Twitter “has a severe problem with child pornography.”

According to the report, more than 500 accounts openly shared or requested child sex abuse material over a 20-day period in September 2022. More than 70% of those accounts remained active during the study.

Eliza Bleu, a human trafficking survivor and advocate for those affected by modern-day slavery, posted a series of tweets earlier this month calling on Twitter to remove child sexual exploitation material amid a groundswell of changes to the platform following the official acquisition by Musk.

Bleu has been speaking out against Twitter’s alleged child pornography problem since joining the platform in 2019. Her voice grew exponentially on the platform throughout the COVID-19 pandemic, opening the door to meetings with Twitter officials set up by former CEO Jack Dorsey.

“I tell people, look, when you go into this world — you’ll never be the same,” Bleu told The Daily Wire. “It’s a dark underbelly. If you’re not fully looking for it — unless you’re looking at the other adult content — you might not stumble on it. But I’m telling you, it’s there in droves.”

Bleu said time is the main difficulty in addressing Twitter’s alleged child pornography problem.

“You don’t have a lot of time,” Bleu said. “I don’t want more victims to be exploited on a massive scale — but I also don’t want the issue to be that Twitter is sued by more survivors as a result. I want them to be a success.”

Bleu laid out a series of steps to create a two-click reporting system, simple enough for a child to report their own abuse material, which she said would serve survivors without violating digital privacy rights.

Under Bleu’s proposed system, Twitter would prioritize the removal of reported child exploitation content, remove hashtags known to be used to sell child sexual exploitation material, and provide resources for minor survivors during and after the reporting process.

Less than two weeks later, Twitter added an option to report child sexual exploitation under its sensitive or disturbing content category, which also covers consensual nudity and sexual acts, non-consensual nudity, and content intended to spread hate based on someone’s identity.

Bleu applauded Twitter’s management for taking the problem seriously, adding that three major hashtags used to sell or trade child sexual abuse material on Twitter have been removed since Musk took over.

Since going public with her personal story as a victim of human trafficking two years ago, Bleu has felt immense support from her community on and off social media. Her voice has become a source of information many say they can trust.

“It really means a lot to me that now I’ve become a woman that people can trust,” she said. “It’s like pretty much the biggest honor only second to the survivors on Twitter getting justice.”

“If I can see the survivors at Twitter, getting justice — that would make it a bigger honor,” she added.
