Facebook and Instagram Aim to Remove All QAnon Content

Facebook says it will scrub content related to the unfounded QAnon conspiracy theory from its platform and its sister app, Instagram, removing all pages, groups and accounts representing QAnon, regardless of whether they share content related to potential violence.

Since beginning its effort to remove QAnon content this summer, Facebook has taken down more than 1,500 pages and groups representing QAnon and over 6,500 tied to more than 300 militarized social movements.

“While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real-world harm, including recent claims that the West Coast wildfires were started by certain groups, which diverted the attention of local officials from fighting the fires and protecting the public,” the social network said. “Additionally, QAnon messaging changes very quickly, and we see networks of supporters build an audience with one message and then quickly pivot to another.”

The company cautioned that the effort will take time and will be enforced through internal detection and by its Dangerous Organizations Operations team, which Facebook described as specialists who study and respond to new evolutions in violating content from the movement.

Twitter made similar moves in July, banning over 7,000 accounts and saying it would no longer include QAnon-related content or affiliated accounts in Trends, would work to limit such results in search and would block URLs associated with QAnon.

When it began its efforts against QAnon in August, Facebook said it had banned 790 groups, 100 pages and 1,500 ads, blocked 300 hashtags across Facebook and Instagram, and imposed restrictions on 1,950 groups, 440 pages and over 10,000 Instagram accounts.

The company said at the time that people could still post content supporting offline anarchist groups and militia organizations based in the U.S. as long as the content didn’t violate its terms of service, adding that its focus was on restricting “their ability to organize.”

On that note, in mid-September, Facebook began down-ranking content from pages and groups that had been restricted but not removed.

Last week, ads promoting QAnon conspiracy theories were formally banned on Facebook and Instagram.

The company also began directing people who search for certain child safety-related hashtags to credible child safety resources last week, in order to counter a QAnon strategy of using the issue to recruit and organize.

“We expect renewed attempts to evade our detection, both in behavior and content shared on our platform,” Facebook said in its update Tuesday, “so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary.”
