Twitter officials say they tracked QAnon for years, as adherents of the conspiracy theory spread misinformation and vitriol on the platform.
While Twitter monitored, collected data and tried to suppress the reach of QAnon accounts, it had stopped short of outright banning them. That changed after the Capitol riot. On January 12, six days after the insurrection, Twitter publicly disclosed it had suspended 70,000 accounts. A Twitter spokesperson now tells CBS News the number has more than doubled, with additional accounts suspended for "sharing harmful QAnon-associated content at scale."
The accounts didn’t come from overseas and they weren’t “bots” spewing automated falsehoods. The vast majority belonged to Americans and were “real people,” though some held multiple accounts, according to senior Twitter officials.
In interviews with CBS News, senior Twitter officials described a yearslong effort to fight domestic conspiracy theories. They said they adapted similar strategies previously used to combat international terrorism and child sexual exploitation. They recounted how for months before January 6 they limited the visibility of QAnon-associated accounts — hoping to encourage users to modify their behavior — and ultimately took more decisive action after the riot that left five dead and dozens injured.
“That was a moment of reckoning where we realized that the approach that we had put in place the previous fall, of attempting to reduce the influence of this movement, wasn’t sufficient,” one Twitter official said.
The Twitter officials spoke with CBS News on the condition that their identities not be revealed, citing security concerns. They said some who work for the company on sensitive issues have been threatened or doxxed, a term referring to the publication of private or personal identifying information on the internet by strangers in an attempt to encourage harassment.
“It really just kind of snowballed”: Twitter’s evolving approach
The QAnon conspiracy theory began in October 2017, spreading from obscure posts on an image board called 4chan to more popular social media sites like Twitter and Facebook. The conspiracy theory accuses prominent liberals of involvement in a Satanic cabal that orchestrates crimes ranging from cannibalism to human trafficking, and plotting against former President Donald Trump.
But the Twitter officials say the history of their efforts predates the theory itself. In late 2016, another conspiracy theory, called Pizzagate, proliferated on the platform. The movement hit an inflection point when a man named Edgar Welch drove from North Carolina to Washington, D.C., and opened fire in a pizza restaurant he wrongly believed was associated with child sex trafficking.
After the shooting, Twitter began removing tweets. It was a decision that “really backfired,” according to one of the senior Twitter officials.
“What we saw was when we removed tweets, people were like, ‘Oh, Twitter is removing these. So we must really be onto something. We’ve hit a button, we’ve hit a nerve. We are right about this and Twitter’s in on it too.’ And it really just kind of snowballed,” the official said.
That experience led Twitter to take a different approach with QAnon.
Alexandra Reeve Givens, the president and CEO of the nonprofit Center for Democracy and Technology, said some believe Twitter should have acted more firmly with QAnon earlier.
“There are a lot of disinformation experts who think that the signs were on the wall and action should have been taken sooner. I think you can imagine that Twitter was struggling with the gravity of the situation,” Givens said.
Twitter began a major effort in July 2020, four months before the election, announcing that QAnon and accounts that promoted it were in violation of the site's coordinated harmful activity policies.
Twitter used “a combination of human review and then very high confidence machine learning to help us identify not just who is sort of QAnon adjacent, but really what the core community of accounts is,” according to a senior Twitter official. The company had for years used similar processes “to address everything from child sexual exploitation, to networks of terrorists, to spam.”
Asked if Twitter used the same strategy last summer against antifa supporters, as riots erupted in Seattle and Portland, Twitter said its approach was different. The company pointed to FBI Director Christopher Wray's congressional testimony that antifa is an ideology, not an organization.
Ultimately, Twitter announced in July 2020 that it would lower the visibility of QAnon-associated profiles on the network.
QAnon topics were no longer recommended among Twitter's "trends," and accounts related to QAnon were no longer suggested when users searched the platform.
Speaking with CBS News, senior Twitter officials described the effort as part of their focus on “deradicalization” and “rehabilitation” of the people spreading QAnon.
“We really want to create opportunities to come back from the fringes of this conspiracy and be sort of turned into healthy participants in the conversation on Twitter,” a Twitter official said.
Givens said the suppression of accounts is a thorny issue for tech companies.
“It’s one of the tools that platforms are increasingly thinking about as they try to balance this really hard trade-off between the fear of silencing speech, but also mitigating some of the concerns about it. … It’s a hard balance to strike,” Givens said.
The company keeps what it calls “dynamic records,” evolving profiles of users based on their recent activity and changing behaviors.
After the election, QAnon supporters on Twitter contributed to the “Stop the Steal” movement, spreading the false theory that Mr. Trump had won the election, according to the officials.
“Many of the same QAnon accounts that we suspended after the sixth had previously, already been deamplified. … The calculus on and after January 6th was, ‘Was deamplification sufficient?'” the Twitter official said.
The decision to remove so many accounts after January 6 has opened Twitter up to criticism across the political spectrum, according to Givens.
“There are folks on one side who say that the platforms are censoring and taking down free speech, and there are others who say that the platforms aren’t doing enough,” said Givens.
Twitter has an appeal process for people whose accounts were suspended, but one Twitter official said that among those suspended in the January 6 purge, "the number of granted appeals is near zero." The senior Twitter officials did acknowledge that a podcast's account was wrongly removed and later reinstated.
The company’s handling of QAnon, and attempts to thwart disinformation, are likely to be a focus when Twitter CEO Jack Dorsey appears alongside other tech CEOs before the House Energy and Commerce Committee on March 25.
Dorsey acknowledged during Twitter's 2021 Virtual Analyst Day that some users are increasingly skeptical of the platform.
“We agree many people don’t trust us. Never has this been more pronounced than the last few years. This isn’t just about our actions to promote healthy conversation, it goes broader and deeper, down to how we use technology like machine learning algorithms,” Dorsey said.