Twitter’s Ban on ‘Groomer’ Term Reveals Platform’s Warped Priorities
New reporting suggests that Twitter, influenced by recent calls from left-wing groups, will ban use of the word “groomer” as a criticism when it appears “in context of discussion of gender identity.”
“We are committed to combating abuse motivated by hatred, prejudice, or intolerance, particularly abuse that seeks to silence the voices of those who have been historically marginalized,” Twitter staffer Lauren Alexander told the Daily Dot. “For this reason, we prohibit behavior that targets individuals or groups with abuse based on their perceived membership in a protected category.”
“Use of this term is prohibited under our Hateful Conduct policy when it is used as a descriptor, in context of discussion of gender identity,” Alexander said.
I understand that there are individuals who object to the term “groomer” or to how it’s now being used. But we should be asking why child abusers seem to have more protection on Twitter than minor victims of grooming and sexual abuse. While the platform is prioritizing reports about specific words and policing how words are used, some of the most egregious reports of crimes against children on the platform are going unaddressed.
There is actual grooming and child sexual exploitation happening on Twitter. If the platform is going to focus on anything, it should be the complete removal of child sexual exploitation at scale, not policing the language of those calling out problematic behavior.
“Sextortion” Is Happening on Twitter
On July 22, according to the Department of Justice (DOJ), “Matthew K. Walsh, age 24, of Baltimore, Maryland, pleaded guilty to sexual exploitation of a minor in order to produce child pornography. Walsh admitted that he created fictitious online profiles purporting to be a minor female to contact and induce minor males between the ages of 12 and 17 to send sexually explicit images and videos to the individual they believed to be a minor female, but who was, in fact, Walsh.”
Here’s more disturbing information from the DOJ:
“To date over 40 minor males have been positively identified as victims of Walsh’s conduct. At least 30 victims’ pictures and videos were sold and/or distributed to others by Walsh.
Walsh also uploaded the minor males’ files to various Twitter accounts and sold the sexually explicit files of the minors to others, obtaining approximately $8,000 from the sale of the files. Specifically, Walsh communicated with at least 50 different Twitter users interested in purchasing either individual files of child sex abuse material (CSAM), or Walsh’s ‘collections’ of CSAM. The ‘collection’ contained over 100 different victims’ files.
In several messages, the Twitter users were aware that some of the individuals in the sexually explicit files were as young as 14-years-old. Several Twitter users exchanged ‘tips’ with Walsh on how to evade law enforcement and discussed methods for enticing and extorting victims’ nude images and videos. Walsh was also a member of online groups which included other offenders who would post, sell, and trade CSAM.
In some of the communications, by text, email, and video, the minor victims are crying and begging Walsh not to send the images and videos to their families and classmates, to leave them alone, and not to make them do more, but Walsh persisted with his threats and demands. Walsh admitted that he harassed some of the victims for years and obtained hundreds of files depicting sexually explicit conduct from some of the victims. In total, Walsh obtained approximately 2,000 images and videos depicting sexually explicit conduct of the various minor males.”

Walsh reportedly had 22 Twitter accounts. His conduct violated both Twitter’s terms of service and the law, as did that of the 50 Twitter users interested in purchasing child sexual abuse material.
Walsh is just one example of the grooming predators running rampant on Twitter.
Another recent case involved a child abuser who sometimes groomed children on Twitter, sexually abused them, documented the abuse, and posted the imagery on the main feed of his Twitter account. He used the platform for four years and sold child sexual abuse material to his 290,000 followers.
Unfortunately, this situation is not the first of its kind. These types of operations keep popping up on Twitter globally.
Twitter Has Mishandled Reports of Child Sexual Abuse Material
When a report of child sexual exploitation is made on Twitter, it should be handled quickly and correctly.
Another case gave us a clear example of what a platform should not do. A lawsuit against Twitter alleges that child sexual exploitation material depicting two minor males was posted to the platform. The video racked up over 167,000 views and 2,223 retweets.
Both minors were 13 years old in the video. When Twitter finally responded to reports of the video, the platform reportedly said, “Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.”
The Department of Homeland Security reportedly had to step in to have the video removed.
What Should We Be Doing?
Twitter is one of many platforms facing this issue, but instead of asking these companies to ban conversations about grooming, I’d like to ask how we can expand that conversation.
Parents and caregivers need to speak to their children about internet safety, especially the risk of sextortion. The FBI has worked hard to get the word out that this crime is escalating, but the message isn’t reaching parents quickly enough.
I’d like to see anyone with a platform educate people about these issues, including faith-based communities, schools, and the corporate press. For example, if Tucker Carlson did a segment on sextortion, it would reach millions with the information they need to protect their children and communities.
We are at crisis levels of grooming and exploitation of children, especially online. In one heartbreaking story, a victim of sextortion recently died by suicide.
Instead of urging platforms to remove non-violent speech, we should be encouraging them to stand by their own terms of service and actually remove child sexual exploitation at scale.
I believe a completely free-speech version of Twitter would allow the platform’s reporting system to prioritize the most egregious reports.
What About the Government Getting Involved?
It’s common for folks to see this conversation and request that the government step in. I see that as one of the most dangerous solutions, because then we run the risk of government overreach and a loss of digital privacy rights. Mass surveillance is never a great solution.
Banning the word “groomer” isn’t the correct place to start this conversation either, as it doesn’t solve the problem and risks punishing advocates and survivors. The best solutions are to educate your families, raise awareness, and apply pressure on the platforms to get their priorities straight and make meaningful changes to remove child sexual exploitation at scale.
24-Hour Call Center: To report information about a missing or exploited child, call the 24-Hour Call Center: 1-800-THE-LOST (1-800-843-5678). Report child sexual exploitation online at CyberTipline.org.
Eliza Bleu (@elizableu) is a survivor advocate for those affected by human trafficking. She is also a survivor of human trafficking.