Facebook has taken down or restricted more than 10,000 groups, pages and Instagram accounts associated with QAnon, in the latest effort by a social media platform to combat the growth of the baseless rightwing conspiracy theory.
The moves are the result of a shift in the company’s policy toward movements that have “demonstrated significant risks to public safety” but do not meet the criteria for an outright ban, as terrorist and hate groups do.
Facebook has already removed more than 790 groups and 100 pages linked to QAnon, the company said Wednesday. It also blocked 300 QAnon hashtags and took down 1,500 advertisements.
The company has identified an additional 1,950 groups, 440 pages and more than 10,000 Instagram accounts linked to QAnon, which will be restricted and may be removed pending review. “Just because a page or group hasn’t been removed doesn’t mean it’s not subject to removal soon,” said a company spokesperson who asked not to be identified for safety reasons.
Facebook will still allow people to post content that supports QAnon, but “will restrict their ability to organize” on the platform by removing them from recommendation algorithms, reducing their ranking in news feed and search results, and prohibiting them from using features such as fundraising and advertising, the company said.
The move against QAnon comes one month after Twitter cracked down on content and accounts dedicated to the conspiracy theory. QAnon’s followers believe that Donald Trump is waging a secret battle against a satanic “deep state” cabal of Democrats, celebrities and powerful figures such as Bill Gates and George Soros, who run the world while engaging in pedophilia, human trafficking and the harvesting of a supposedly life-extending chemical from the blood of abused children.
QAnon was identified as a potential domestic terrorism threat by the FBI and has been linked to numerous attempted acts of violence.
“We already remove content calling for or advocating violence and we ban organizations and individuals that proclaim a violent mission,” the company said in a blogpost. “However, we have seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior.”
The policy shift also applies to US-based militia groups and what Facebook is calling “offline anarchist groups that support violent acts amidst protests”, some of which “identify as Antifa”. Facebook removed 980 groups and 520 pages linked to those movements and restricted 1,400 related hashtags.
In announcing the policy shift, Facebook provided a glimpse into the size of the QAnon community on the site, though a spokesperson said the company does not have confirmed data on the number of people who participate in the movement. The Guardian had previously documented more than 170 QAnon groups, pages and Instagram accounts with more than 4.5 million aggregate followers, though there is likely significant overlap in group membership.
As of Wednesday, the largest QAnon group on Facebook had just under 230,000 members, up from just 20,000 at the start of the year – growth of more than 1,000%, according to data from CrowdTangle. By midday Wednesday the group had been taken down.
Many QAnon Facebook groups have experienced explosive growth this year and become hotbeds for coronavirus misinformation. Facebook’s recommendation algorithms helped drive new members toward QAnon groups and content, the Guardian reported in June.
The QAnon narrative is based on cryptic messages published by an anonymous person or entity – “Q” – who claims to have inside knowledge of a secret battle waged by Donald Trump against the supposed cabal. While Q emerged on the anarchic image board 4chan and currently posts on a successor site to 8chan, the movement coalesced and grew on mainstream social media sites, including Reddit, YouTube, Twitter, Discord and Facebook.
Facebook groups became crucial organizing hubs for QAnon after Reddit banned the movement in 2018 for violating its policies against incitement to violence, harassment and doxxing.
“It’s important to understand that these groups don’t get this big from infrastructure that they built; they get this big by leveraging the infrastructure of the platforms,” said Joan Donovan, the research director of Harvard’s Shorenstein Center on Media, Politics and Public Policy. “Had Facebook taken action back at the time when Reddit took action, we wouldn’t be in this same position.”
Though the QAnon movement is small, it has gained outsize influence with effective media manipulation tactics, such as attention-grabbing harassment campaigns. Dozens of QAnon supporters have run for elected office, and one, Marjorie Taylor Greene, is likely to be elected to Congress in November after she won the Republican primary in a deeply conservative district in Georgia.
Donovan said the movement will probably adapt to Facebook’s new restrictions and continue to operate. “Limiting features isn’t going to stop this group from changing their tactics and continuing to stay on the platform,” she said.
She also said that the media and social media platforms need to recognize and combat the classic antisemitic tropes that undergird the QAnon narrative.
“People should realize that QAnon isn’t just this outlandish conspiracy theory about child-trafficking and satanism,” she said. “It is incredibly antisemitic, and not taking action on it for that reason alone contravenes the purpose of [Facebook’s] terms of service to begin with.”