As Twitter was crumbling under Elon Musk’s new leadership in 2023, various online circles found themselves flocking to alternative platforms. While some may have kept using Twitter (now known as… X), a non-negligible number of communities migrated over to Mastodon and other smaller platforms. Meanwhile, Meta shipped its own textual social media platform, Threads. The service initially launched in most parts of the world except for the European Union, but it’s been available in Europe for over six months now and has seen its usage soar.
For many, Threads understandably felt like a breath of fresh air following the chaos that engulfed Twitter. Unlike the latter, Threads is not run by someone that I and many others find to be an exceptionally despicable human. Its algorithmic timeline contrasts with Mastodon’s exclusively chronological feeds, and its integration with Instagram has attracted a number of big names and stars.
I’m an activist. In my daily life, I work and advocate for the advancement of trans people’s rights in France. As a result, my expanded online social circle mostly consists of LGBT people, and most of them are activists, too. However, in the span of a few months, almost everyone in that circle who was excited about Threads launching in Europe has now stopped using it and migrated back to Twitter, Mastodon, or elsewhere. When I ask around about why those people left Threads behind, their responses vary, but a trend persists: most felt like they were being shadow-banned by the platform.
Without hard data, it is difficult to investigate this feeling, to understand if it is truly widespread or specific to some online bubbles. But one thing is certain: Threads hasn’t felt like a breath of fresh air for all who tried to use it. In my experience as a trans woman, at its best, it has felt like Jack Dorsey’s old Twitter: a social platform overrun by an opaque moderation system, free-roaming hate speech, and a frustrating algorithm that too often promotes harmful content.
As the months go by, incidents in which Threads failed to uphold its implicit promise of a better-moderated, Twitter-like platform have added up. Today, for many non-white, non-straight, non-male users, it is a repulsive social media experience, where their voices are silenced and where hate speech offenders who target them go unpunished.
Let’s talk about this.
This prevalent sentiment among LGBT people has grown over time. You need only rewind to January 2024, soon after Threads launched in Europe, to find something pretty disgusting happening in people’s For You feeds. Homophobic and transphobic posts kept plaguing the app, despite people’s efforts to report them and hide them from their timelines. This wave of hateful content also included anti-abortion posts, and reports of the problem were widely shared. Worse, these posts were often pushed into people’s feeds on Instagram as well, prompting some trans people to steer clear of Threads altogether.
This widespread incident is the only one so far to have prompted an official response from Threads, delivered by Instagram head Adam Mosseri. However, that response is very telling of Meta’s approach to addressing these moderation issues. In his 23-second video, Mosseri acknowledges that there have been “low-quality recommendations” in users’ feeds but makes no explicit mention of what those recommendations actually consisted of: anti-abortion comments, sexism, and violent homophobia and transphobia.
In March, this spike of violent hate speech targeted at LGBT people was followed by the release of a damning report from GLAAD, a renowned non-profit organization focused on LGBT advocacy. This report, entitled Unsafe: Meta Fails to Moderate Extreme Anti-trans Hate Across Facebook, Instagram, and Threads, paints a picture that has come as no surprise to any trans or marginalized person who has ever used any of Meta’s platforms for a significant amount of time. It puts forward a sample collection of viral harmful posts that, despite being in clear violation of Meta’s policies, have remained in circulation across Facebook, Instagram, and Threads.
For anyone doubting that there is a consistent, reproducible, and wide-ranging problem of Meta failing to moderate harmful content targeted at marginalized people on its platforms, I highly encourage that you set aside some time and read through the report.
In its introduction, GLAAD writes,
Characterized by fear-mongering, lies, conspiracy theories, dehumanizing tropes, and violent rhetoric, these posts — many by high-follower accounts — aim to boost engagement, generate revenue, and seed hateful narratives about trans, nonbinary, and gender non-conforming people. These accounts profit from such hate, and so does Meta and its shareholders. Meanwhile, LGBTQ people and other targeted groups experience an increasing number of well-documented real-world harms stemming from these long-term anti-LGBTQ propaganda campaigns, driven by the anti-LGBTQ extremists that Meta allows to flourish on its platforms.
Since the release of the GLAAD report, Meta has not offered any further acknowledgment of this long-standing issue. Instead, in February 2024 – a couple of weeks before the report was released – the company introduced a worrying and seemingly unrelated policy change: Meta would now opt all of its users out of “political content” in its platforms’ algorithmic timelines. Along with that change, a new toggle was added in Instagram’s settings panel. If you want to see political content in your feed, you need to flip the switch.
Since it was announced, the nature of what would be filtered by this new, on-by-default ‘Political Content Control’ on Instagram and Threads has never been clarified. On an Instagram support page, the company states: “Political content includes content that mentions: Governments, Elections, Social topics.”
What are “social topics”? Does that include black people speaking out against systemic racism? Does that include LGBT people speaking out against homophobia and transphobia? Does that include anyone identified as a part of a socially marginalized group of people? I am worried that it does – especially thinking back to my online circle of LGBT activists who have all left Meta’s platforms behind in favor of Twitter, Mastodon, or Bluesky, because they were all under the impression that they were being shadow-banned. I feel certain that opting everyone on the platform out of seeing political content has only contributed to this impression.
But I’m not alone. In the wake of this change, hundreds of political and news content creators, LGBT activists, and journalists have signed an open letter to Meta asking the company to reverse its decision.
They write:
…Meta’s vague definition of political content as “likely to mention governments, elections, or social topics that affect a group of people and/or society at large” endangers the reach of individuals and organizations whose identities and/or advocacy have been rendered a ‘social topic’ in this country. This undermines the reach of marginalized folks speaking to their own lived experience on Meta’s platforms and undermines the reach of advocacy work in important areas that have become ‘social topics’ including climate change, gun violence prevention, racial justice, transgender rights, and reproductive freedom to name just a few.
Meanwhile in France, during that same month of February, Le Coin des LGBT+ (@lecoindeslgbt) was inexplicably suspended from Threads and Instagram. The account is extremely popular in the country, as it focuses on relaying LGBT news and events and plays a key role in online advocacy for LGBT people’s rights. Today, the account is still up and running on Instagram, but if you try looking for it on Threads, all you will find are countless messages of outrage from people asking Meta to reinstate the account.
While Le Coin des LGBT+ was restored a few days after its suspension without any explanation, the account’s holders chose not to stick around on Threads, limiting their activity to Instagram and Twitter. I would have made the same choice. For the few weeks that it was around on Threads, every single one of the account’s posts was bombarded with insults, death threats, and literal Nazi imagery. Day after day, I spent hours trying to report every single one of them. Not only did the swastikas stay up and reappear faster than I could report them, but every single one of my reports was also met with a cold, disturbing, automated message from Instagram in my email inbox telling me that “no violation was found.”
The extremely worrying thing is that we don’t know how often this happens. Le Coin des LGBT+ has a huge following, and as a result, the issue was widely shared on both Instagram and Threads. But what about all of the victims who have fewer than a few hundred followers? Is light ever shed on the threats, insults, and hate that they have received?
This is all starting to add up. If you’re an activist, a journalist reporting on issues affecting LGBT people, or an LGBT content creator, Threads is now both silencing your voice and exposing you to death threats.
Quite recently, I once more experienced this frustrating and harmful opacity around Meta’s moderation. Snap legislative elections took place in France this month. I will spare you the summary of this chaotic political situation at home, but long story short: we were at risk of an extremist far-right party rising to power. For all minorities and LGBT people in France, sharing our voices and our concerns online about this imminent threat was of crucial importance.
On the opposite end of the French political spectrum, an alliance of all the left and green parties was formed, and it became the only real hope for all marginalized people to elect a government that would further their rights instead of repressing them. This alliance is called the New Popular Front. One of its main ways of raising awareness online during the campaign was sharing its political manifesto, accessible via the following URL: nouveaufrontpopulaire.fr.
However, a mere week before the first round of the legislative elections, people started to realize that this URL was blocked by Meta. It suddenly became impossible to post a link to the New Popular Front’s website on all of its platforms: Facebook, Instagram, and Threads. I documented this finding myself on Mastodon and Threads. The URL was disallowed in posts, in Instagram stories, and even inside conversations in Messenger and Instagram DMs with no explanation.
The New Popular Front’s website worked and could be shared on Twitter, Mastodon, and other social platforms without any issues. While many of us thought this ban was perhaps applied to all the running parties in the election, it turned out that sharing a link to the far-right party’s website on Threads, Instagram, and Facebook was still possible. To add insult to injury, Meta started removing all posts on Threads that contained a link to the New Popular Front’s website; one of mine, posted one week prior, was affected. My Instagram and Threads accounts were even suspended for a few hours after that post was removed.
There we were, seven days before a crucial election that could upend our entire lives, unable to share links to one of the few political platforms that could save us. The block lasted for more than 24 hours, which felt like an eternity in an incredibly fast-paced campaign where every second counted. When Meta lifted the ban on links to the New Popular Front’s website, it once again didn’t issue any statement. Posts that had previously been removed were quietly restored, and the moderation appeals filed by hundreds of French users have seemingly been lost in Meta’s automated moderation void.
This was probably a mistake, a glitch in Meta’s automated process that perhaps erroneously flagged the URL as dangerous. But even if that was the case, why did it take more than 24 hours for the company to resolve the problem? Why would they refuse to even acknowledge that something happened? Like I said, it starts to add up. Now, as a black or LGBT person, in addition to having your voice silenced and exposing yourself to death threats on Meta’s platforms, you are also at risk of having your account suspended for sharing a link to a political manifesto just days before a crucial election.
As Threads slowly opens up to the fediverse, perhaps federation can feel like an escape hatch. It could be reassuring to think, “If I can’t stay active on Threads, I’ll be active on Mastodon, and thanks to federation, people on Threads can still read and follow me.” But I’m not hopeful. Last month, Threads published a new page that lists all of the fediverse servers that it blocks and does not federate with. Any account hosted on one of the Mastodon servers listed on this page cannot follow or interact with Threads users. And when Threads starts allowing its users to follow Mastodon accounts, they will be barred from following any account hosted on the listed servers.
Of course, this is not surprising by any means. All Mastodon servers maintain such a list on their About pages. (You can take a look at my own server’s list of moderated servers here.) These lists often contain the same set of long-known offenders: servers harboring and promoting hate speech, CSAM, and a wide range of gruesome things. Motivated by the resurgence of Meta’s moderation failings, a number of Mastodon servers have also chosen to list Threads as a blocked server.
However, if you take a look at Meta’s list of moderated servers, you are going to notice a pattern. Alongside the well-established set of known offenders, Meta is also blocking a number of well-known Mastodon servers that host LGBT people and marginalized communities. Here is just a handful of them:
- tech.lgbt (listed reason: “Violated our Community Guidelines or Terms of Use”)
- eldritch.cafe (listed reason: “Violated our Community Guidelines or Terms of Use”)
- octodon.social (listed reason: “No publicly accessible feed, violated our Community Guidelines or Terms of Use”)
- queer.party (listed reason: “Violated our Community Guidelines or Terms of Use”)
- disabled.social (listed reason: “Violated our Community Guidelines or Terms of Use”)
There are even more with less obvious domain names. While some of these servers’ owners don’t mind being blocked by Threads, having already blocked Meta’s platform themselves, this still raises several questions. Why were these servers blocked? Was it simply reprisal for blocking Threads? Which community guidelines did they breach? Is it nudity, even when it’s only allowed behind content warnings? If they ever reach compliance with those guidelines, will the suspension be lifted? How long does that take? Meta has a form for appealing blocked-server decisions, but how are those appeals reviewed? Is that review just as automated as the majority of the moderation decisions made on the platform?
Mastodon and the fediverse are often described as a web of interconnected servers that enforce their own moderation policies towards one another, sometimes at the cost of user experience and clarity when selecting a new server to join. From what I can tell, this now applies to Threads, too, and it adds another layer to the ongoing moderation issues that make Threads and Meta’s other platforms a dangerous and repulsive environment for marginalized people. The escape hatch that the fediverse seemed to represent can now also act as a banishment system. With Meta already unilaterally blocking LGBT accounts and letting the authors of hate speech go unpunished, who’s to say it won’t arbitrarily block other LGBT servers on the fediverse in the future as well?
It adds up.
You may ask, “Why write this on MacStories now? Do you expect Meta to change anything?” I realize that I’ve asked a lot of unanswered questions here, but no, I don’t. I don’t expect Meta to ever really address the fact that, brick by brick, it has rebuilt the same sort of problematic foundation on which Twitter was built under Jack Dorsey’s leadership. I don’t expect it to move away from its dangerous automated moderation systems that silence activists trying to speak out for their lives and the lives of their peers. I don’t expect the company to acknowledge that it may have played a part in artificially boosting far-right discourse on its platforms by banning all links to their main opponent during the last stretch of the French legislative elections.
However, I do have some sincere expectations for you all who are reading this. I expect tech journalists to more systematically report on everything that fits into this pattern. I expect tech podcasters to acknowledge that Threads is only a great alternative to Twitter if you’re a straight, white male. I expect that we all start understanding why some marginalized communities are staying on Twitter despite all of its horrific flaws. I expect that awareness of Meta’s consistent pattern of silencing marginalized voices can help direct funds, donations, and efforts to Mastodon to make it a durable, more wide-reaching alternative.
More than anything, I just expect to be heard. And for Threads, that seems too much to ask for.