Meta takes action against 8,000 Facebook accounts tied to Chinese disinformation campaign

Meta has removed nearly 8,000 Facebook accounts linked to a Chinese disinformation campaign, in what the company describes as the largest known cross-platform covert influence operation. Alongside the accounts, Meta deleted 954 Facebook pages and 15 Instagram accounts for coordinated inauthentic behavior.

Meta’s crackdown extended beyond its own platforms, as the network had infiltrated more than 50 social media platforms and forums, including YouTube, Pinterest, Reddit, and X (formerly Twitter). The operation pushed positive commentary about China, particularly its Xinjiang region, where China has been accused of genocide and crimes against Uyghur Muslims. It also criticized the United States, Western foreign policy, and critics of the Chinese government, including journalists and researchers.

The campaign also promoted false narratives, such as the claim that Covid-19 originated in the United States, spreading them through spam, links, memes, and text posts. The operation even spent approximately $3,500 on Facebook ads. Its reach was global, targeting regions including the UK, Taiwan, the United States, Australia, and Japan.

Meta’s investigation revealed links between the operation and individuals associated with Chinese law enforcement. The California-based company first became aware of the network in late 2022, after reports that it had targeted a human-rights non-governmental organization. Further scrutiny tied it to a previously known covert operation called Spamouflage, which has been responsible for numerous clusters of spammy activity since August 2019. Meta also found similarities between Spamouflage’s tactics and those of a pro-Russian disinformation campaign known as Secondary Infektion. While the reasons for these parallels remain unclear, Meta suggests that operators of coordinated inauthentic behavior learn from one another, potentially aided by public reporting on covert influence operations by industry professionals and security researchers.

Denial of responsibility! VigourTimes is an automatic aggregator of Global media. In each content, the hyperlink to the primary source is specified. All trademarks belong to their rightful owners, and all materials to their authors. For any complaint, please reach us at – [email protected]. We will take necessary action within 24 hours.