Meta removes misleading Facebook information from Russia and China
The Facebook parent company Meta took down two political influence operation account networks.
Meta has removed “influence operations” run from China and Russia across its platforms. The two disabled networks aimed to sway views on the upcoming U.S. midterm elections and the war in Ukraine. Meta searches for accounts that violate its policy against coordinated inauthentic behavior, or CIB, and removes them to prevent further harm.
Russian network targets
The Russian network targeted Germany, France, Italy, Ukraine, and the UK, hoping to spread propaganda about Russia’s invasion of Ukraine. The operation began in May of this year. A network of more than 60 websites impersonating European news organizations posted articles criticizing Ukraine and arguing against Western sanctions on Russia, Meta said. The operators then promoted these articles on various social media outlets, including Facebook, Instagram, Telegram, and Twitter, and on petition websites such as Change.org.
As Meta blocked the fake sites’ domains, the network kept creating new ones. This alarmed Meta because it showed “persistence and continuous investment in this activity across the internet.”
Chinese network targets
The Chinese-origin network targeted numerous social media platforms. Meta said it disabled a small network that originated in China and targeted the Czech Republic, the U.S., and a few Chinese- and French-speaking audiences around the globe. The campaign ran from fall 2021 to mid-September 2022.
It was also the first Chinese network Meta found targeting U.S. domestic politics ahead of the U.S. elections, along with the Czech Republic’s foreign policy toward Ukraine and China. This focus marked a shift: “Chinese influence operations that we’ve disrupted before typically focused on criticizing the United States to international audiences, rather than primarily targeting domestic audiences in the U.S.,” Meta stated. In the U.S., the network targeted people on both sides of the political spectrum.
Each account posted material at low volumes during working hours in China rather than at times when its targets would be awake. As a result, engagement was low, and the users who did interact called out the network as fake. Meta removed some of the fake accounts for impersonation and inauthentic content.
The largest operation since the start of the Ukraine war
“This is the largest and most complex Russian-origin operation that we’ve disrupted since the beginning of the war in Ukraine. It presented an unusual combination of sophistication and brute force,” said Ben Nimmo, global threat intelligence lead, and David Agranovich, director of threat disruption at Meta.
The network operated in various languages, including English, French, German, Italian, Spanish, Russian, and Ukrainian. This added to the difficulty of the investigation, since it “demanded both technical and linguistic investment.”
Research and investigation into this matter
Meta began investigating after reviewing reporting from investigative journalists in Germany, along with information from researchers at the Digital Forensic Research Lab. Both sources provided Meta with insights into the two networks.
The future of digital influence and accuracy
These operations provide significant insight into the dangerous ways countries can try to influence the public. They also show the importance of countering such campaigns quickly and accurately, before their content is viewed and spread as authoritative.