Russia and Iran are leading the way when it comes to pushing false and misleading information on one of the world's most popular social media platforms, and new analysis finds they are getting savvier at evading detection.
Facebook issued a report Wednesday looking at so-called coordinated inauthentic behavior over the past four years, warning that despite ongoing efforts to identify and remove disinformation networks, there is no let-up in attempts to exploit or weaponize conflict and crisis.
"Threat actors have adapted their behavior and sought cover in the gray spaces between authentic and inauthentic engagement and political activity," according to the Facebook report, which looked at the more than 150 networks from more than 50 countries that its security teams took down from 2017 to 2020.
"We know they will continue to look for new ways to circumvent our defenses," the report added, noting disinformation efforts were evenly split between foreign and domestic efforts.
"Domestic IO also continues to push the boundaries of acceptable online behavior worldwide," per @Facebook. "About half of the influence operations we’ve removed since 2017 – including in #Moldova, #Honduras, #Romania, #UK, US, #Brazil & #India – were conducted by locals..."

— Jeff Seldin (@jseldin), May 26, 2021
Russia, Iran influence efforts
Overall, Russia was the biggest purveyor of disinformation, according to the analysis, with 27 identified influence operations during the four-year timeframe. Of those, 15 were connected to the St. Petersburg-based Internet Research Agency (IRA) or other entities linked to Yevgeny Prigozhin, a Russian oligarch with close ties to Russian President Vladimir Putin.
Another four Russian networks were traced to the Kremlin's intelligence services and two more originated with Russian media sites.
Iran was second on the list, with 23 inauthentic networks, nine of which were connected to the government or Iranian state broadcasters.
Myanmar ranked third, with nine disinformation networks, followed by the United States and Ukraine.
Facebook said the culprits in the United States and Ukraine included public relations firms, fringe political actors, and in the case of Ukraine, two political parties.
China's 'strategic communication'
China, accused by U.S. intelligence officials of running multiple, intensive influence operations, did not make Facebook's list of illicit disinformation networks, but not because Beijing was not active.
"The China-origin activity on our platform manifested very differently than IO [influence operations] from other foreign actors, and the vast majority of it did not constitute CIB [Coordinated Inauthentic Behavior]," the Facebook report said. "Much of it was strategic communication using overt state-affiliated channels [e.g. state-controlled media, official diplomatic accounts] or large-scale spam activity that included primarily lifestyle or celebrity clickbait and also some news and political content."
#Election2020: "In the year leading up to the US 2020 election, we exposed over a dozen CIB operations targeting US audiences, including an equal number of networks originating from #Russia, #Iran, & the #UnitedStates itself," per @Facebook.

— Jeff Seldin (@jseldin), May 26, 2021
The Facebook report warned, however, that catching sophisticated disinformation actors like China and Russia is getting more difficult.
"They are showing more discipline to avoid careless mistakes," the report said. "Some are also getting better at avoiding language discrepancies."
Amplifying, outsourcing disinformation
Facebook further warned that countries like Russia and China "are getting better at blurring the lines between foreign and domestic activity by co-opting unwitting [but sympathetic] domestic groups to amplify their narratives."
Another concerning trend identified in the Facebook report: outsourcing.
"Over the past four years, we have investigated and removed influence operations conducted by commercial actors—media, marketing and public relations companies, including in Myanmar, the U.S., the Philippines, Ukraine, the UAE [United Arab Emirates] & Egypt," according to the report.
The report said that despite the growing number and sophistication of influence operations, many are being identified and taken down more quickly than in the past.
But Nathaniel Gleicher, the head of Facebook security policy, said the social media platform can only do so much by itself.
"Countering IO is a whole-of-society challenge. Defenders are most effective when gov'ts, industry, and civil society work together," Gleicher wrote on Twitter.
"We know threat actors are continuing to innovate, so we can't take our foot off the gas now," he added. "We have to keep pressing to stay ahead of adversarial innovation in 2021 and beyond."