
Twitter banished the worst QAnon accounts. But more than 93,000 remain on the site, research shows

Discussion of QAnon has proven resilient on Twitter despite its takedown of 7,000 accounts in July, with tens of thousands of accounts continuing to tout the notorious conspiracy theory.

David Reinert, holding a Q sign, waits in line with others to enter a campaign rally with President Donald Trump and Republican U.S. Senate candidate Rep. Lou Barletta, R-Pa., Thursday, August 2, 2018, in Wilkes-Barre, Pa. (Matt Rourke / AP)

WASHINGTON - Discussion of QAnon has proven resilient on Twitter despite its takedown of 7,000 accounts in July, with tens of thousands of accounts continuing to tout the notorious conspiracy theory and, in some cases, quote directly from its purported leader, the shadowy figure “Q.”

More than 93,000 active accounts have references to QAnon in their Twitter profiles, and their overall rate of posting has increased, according to new research from Advance Democracy, a nonpartisan research group based in Washington. It also found that while the use of traditional QAnon hashtags has declined in the 10 weeks since Twitter's enforcement action, there has been a surge in alternatives with spelling and other variations that leave their meaning clear to followers.

In broad strokes, these findings do not deviate significantly from Twitter's public portrayals of the effects of its move against QAnon, which came after more than 2 1/2 years of mounting evidence about the hateful, violent nature of the conspiracy theory and its penchant for sparking real-world crimes. The House of Representatives voted Friday to condemn QAnon.

Twitter has said it sought to eliminate accounts that violated its rules on harassment, hate speech and incitement to violence but also wanted to allow QAnon supporters to continue operating on the platform - albeit with new restrictions - so long as they followed platform policies. Overall, the company says its action caused discussion of the conspiracy theory to fall by more than half.

The researchers, however, found troubling evidence that Twitter has not yet done enough and that the conspiracy theory continues to "persist and expand" on the site, said Daniel J. Jones, a former FBI analyst and Senate investigator who led the review of the CIA's torture program and is now president of Advance Democracy.

Some of the surviving accounts have more than 100,000 followers each and have worked to co-opt hashtags not previously affiliated with the movement, including #savethechildren and #inittogether, which started as a call for unity in facing the covid-19 pandemic before being adopted by QAnon supporters, the report found. Followers of the conspiracy theory have consistently downplayed the public health crisis and spread disinformation about its origins, potential remedies and the likely safety risks of a future vaccine against it.

"The QAnon ideology undermines trust in public institutions and sows societal divisions through hate speech and the spread of unfounded conspiracy theories," said Jones. "Addressing this threat is going to require more robust action by the social media platforms, but more importantly, it's going to require that those elected officials sitting silently on the sidelines stand up and address this threat to our democracy."

Twitter announced "a strong enforcement action" on July 21, citing "behavior that has the potential to lead to offline harm." Sanctions included the removal of the 7,000 accounts and efforts to restrict the amplification of 150,000 others active in spreading QAnon content, by preventing the accounts from appearing in recommendations or the "trending" module that can bring such content to wider audiences. Facebook, YouTube and Reddit also have acted against QAnon.

"We aim to be iterative and transparent in our approach, and we recognize we cannot review every Tweet containing a falsehood," said Twitter spokeswoman Lauren Alexander. "That's why, under this expansive framework, we noted that we will prioritize the removal of content that has the greatest potential for harm."

Independent researchers have said the enforcement actions by the mainstream platforms came late but at least curbed the spread of QAnon content to new users.

"They used a scalpel not a chain saw," said Darren Linvill, a Clemson University associate professor of communication who studies Twitter. "If they'd used a chainsaw, they would have lost a lot of users" while also further inflaming allegations that the company was biased against conservative voices.

QAnon, which started spreading on social media in 2017, claims falsely that Democrats and Hollywood celebrities rape and eat children and that they also are working to subvert the Constitution to take over the country. The supposed leader, Q, is described as an anonymous Trump administration official with a top-secret clearance, and Trump himself is portrayed as secretly battling a nefarious "deep-state cabal" trying to protect the pedophiles.

The posts, called "Q drops," and related discussions have seethed with violence from the beginning, prompting calls to kill prominent Democrats and inspiring numerous violent acts and attempts at violence. Among them was a woman in New York, armed with more than a dozen knives, who announced on Facebook in April that former vice president Joe Biden and other top Democrats "need to be taken out."

The Advance Democracy report, based on research using a social-media analytics tool from Zignal Labs, found a steep decline in the use of several traditional hashtags for QAnon, such as #qanon, #deepstate, #qarmy and #wwg1wga, which stands for "Where We Go One, We Go All," a common oath of allegiance to the conspiracy theory's tenets. These hashtags collectively fell from about 80,000 uses per day before Twitter's enforcement action to about 20,000 per day after.

Alternative hashtags, such as #Q17, #17Anon and #CueAnon, meanwhile, surged in the same time frame but still reached much lower numbers overall, topping out at less than 2,000 a day. The researchers also found, among QAnon supporters, frequent invitations to "follow me on" accounts on other platforms, such as Parler and Gab, which are popular with conservatives and feature minimal content moderation.

The report also listed numerous accounts that have QAnon content in their profiles and yet have followings larger than 100,000 users.

One such account, @QAnon76, which has 554,000 followers, made a series of changes to its profile, dropping the term WWG1WGA, while still posting content related to the conspiracy theory, according to archived versions of the account. A few weeks later, the account updated its profile again, adding a new signifier that had the same meaning but may have been harder for automatic enforcement systems to detect: "W|W|G|1|W|G|A." Soon the account got a new name too: "Midnight Rider" followed by three stars. The account does not list contact information and did not reply to a tweet seeking comment.