Facebook Data ‘Does Not Contradict’ Intelligence on Russia Meddling
The social platform says it has a problem with government-run efforts to manipulate public opinion.
Less than six months ago, Mark Zuckerberg dismissed as “pretty crazy” the idea that the social publishing platform he founded was being used to manipulate voters.
But in a new report, Facebook now says it has data that “does not contradict” a key U.S. intelligence report that describes “information warfare” ordered by Russian President Vladimir Putin and carried out on Facebook and across the web.
“Russia’s goals were to undermine public faith in the U.S. democratic process, denigrate Secretary Clinton, and harm her electability and potential presidency,” officials wrote in a declassified version of the U.S. Director of National Intelligence report in January. Guided by the Russian government’s “clear preference” for Donald Trump, the DNI report said, Moscow followed a strategy “that blends covert intelligence operations—such as cyber activity—with overt efforts by Russian Government agencies, state-funded media, third-party intermediaries, and paid social media users or ‘trolls.’”

Scholars have long theorized about the possibility of people manipulating public opinion on Facebook—Facebook itself carried out a mood experiment on its users—but U.S. intelligence officials call Moscow’s latest meddling “unprecedented.”
Facebook stopped just short of naming Russia in its report, emphasizing that it is “not in a position to make definitive attribution to the actors sponsoring this activity,” which it said represented only a small portion of the activity Facebook tracks on its platform.
Facebook acknowledged more broadly that it has a problem with what it calls “information operations,” government-run efforts to use Facebook to manipulate public opinion, distort domestic or foreign political sentiment, and influence the outcome of elections.
In many cases, such information operations are aimed at gaming Facebook’s algorithm, using tactics like mass-creating fake accounts and building groups populated by those accounts.
Fake accounts and misleading groups then carry out coordinated campaigns designed to amplify a message: They do this by simultaneously sharing and liking the same Facebook posts en masse, rapidly posting the same information across multiple groups at once, and spreading sensationalistic or heavily biased headlines as a way to distort facts and fit a narrative. These groups can be hard to detect because they often post legitimate and unrelated content, as well, “ostensibly to deflect from their real purpose,” Facebook says.
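Facebook’s report doesn’t say how it spots this kind of coordination, but the pattern it describes, many accounts pushing identical content across many groups within a short window, lends itself to simple behavioral detection. Here is a minimal, hypothetical sketch in Python; the record format, thresholds, and function names are illustrative assumptions, not anything Facebook has disclosed.

```python
from collections import defaultdict

# Hypothetical post records: (account_id, group_id, text, timestamp_seconds).
# These thresholds are illustrative guesses, not Facebook's real criteria.
CROSSPOST_GROUPS = 5   # identical text in at least this many groups...
WINDOW_SECONDS = 600   # ...within this time window looks coordinated

def flag_coordinated_accounts(posts):
    """Return account ids that rapidly cross-post identical text."""
    # Collect, for each (account, text) pair, where and when it was posted.
    seen = defaultdict(list)  # (account_id, text) -> [(timestamp, group_id)]
    for account, group, text, ts in posts:
        seen[(account, text)].append((ts, group))

    flagged = set()
    for (account, _text), hits in seen.items():
        hits.sort()
        # Slide a time window over the posts; count distinct groups inside it.
        start = 0
        for end in range(len(hits)):
            while hits[end][0] - hits[start][0] > WINDOW_SECONDS:
                start += 1
            groups_in_window = {g for _, g in hits[start:end + 1]}
            if len(groups_in_window) >= CROSSPOST_GROUPS:
                flagged.add(account)
                break
    return flagged

if __name__ == "__main__":
    demo = [("troll_1", f"group_{i}", "Shocking headline!", 100 + i)
            for i in range(6)]
    demo += [("user_2", "group_0", "Dinner pics", 120)]
    print(flag_coordinated_accounts(demo))  # -> {'troll_1'}
```

Note that the sketch keys on behavior (identical text, many groups, tight timing) rather than on whether any individual post is true or false, which matters for the harder cases described next.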
Facebook says it detected several “subtle and insidious” kinds of coordinated attempts to harm the reputation of “specific political targets” during the 2016 campaign, describing “malicious actors leveraging conventional and social media to share information stolen from other sources, such as email accounts, with the intent of harming the reputation of specific political targets.”

Spokespeople for Facebook declined to answer my questions about which political targets and which stolen emails the report was referring to, but the intelligence report that Facebook links to describes “high confidence” in the assessment that Russian intelligence relayed hacked emails between senior Democratic officials to WikiLeaks in order to undermine Clinton. “Moscow most likely chose WikiLeaks because of its self-proclaimed reputation for authenticity,” the DNI report says. “Disclosures through WikiLeaks did not contain any evident forgeries.”
Without mentioning Russia or WikiLeaks, however, Facebook describes actors that create fake personas on Facebook as a way to direct people to the stolen data. “From there, organic proliferation of the messaging and data through authentic peer groups and networks was inevitable,” Facebook wrote.
At the same time, malicious actors would use fake Facebook accounts to “push narratives and themes that reinforced or expanded on some of the topics exposed from stolen data,” including attempts to seed stories with journalists and other third parties. Facebook didn’t describe the stories themselves. “We detect this activity by analyzing the inauthenticity of the account and its behaviors, and not the content the accounts are publishing,” its report said.
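Facebook doesn’t publish the behavioral signals it relies on, but the distinction it draws, judging accounts by how they act rather than by what they post, can be illustrated with a hypothetical sketch. Every feature, weight, and threshold below is an assumption made up for illustration; the point is simply that no post content appears anywhere in the scoring.

```python
from dataclasses import dataclass

@dataclass
class AccountActivity:
    # Behavioral signals only; note that no post content appears here.
    account_age_days: int
    posts_per_day: float
    distinct_groups_posted: int
    likes_within_1s_of_peers: int  # likes landing in lockstep with other accounts
    repeated_text_ratio: float     # fraction of posts duplicating earlier posts

# Illustrative weights and threshold; Facebook's actual models are not public.
def inauthenticity_score(a: AccountActivity) -> float:
    score = 0.0
    if a.account_age_days < 30:
        score += 1.0                              # brand-new account
    if a.posts_per_day > 50:
        score += 2.0                              # superhuman posting rate
    if a.distinct_groups_posted > 20:
        score += 1.5                              # wide fan-out across groups
    score += 0.1 * a.likes_within_1s_of_peers    # synchronized liking
    score += 3.0 * a.repeated_text_ratio         # mostly copy-paste output
    return score

def looks_inauthentic(a: AccountActivity, threshold: float = 4.0) -> bool:
    return inauthenticity_score(a) >= threshold

if __name__ == "__main__":
    bot = AccountActivity(7, 120.0, 35, 40, 0.9)
    human = AccountActivity(900, 2.0, 3, 0, 0.05)
    print(looks_inauthentic(bot), looks_inauthentic(human))  # True False
```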
In an attempt to mitigate state-run information operations, Facebook says it is, in some cases, sending notifications to specific people who have been targeted by sophisticated attackers—as well as sending warnings to people who “have yet to be targeted, but whom we believe may be at risk based on the behavior of particular malicious actors.”
Though Facebook says it is already taking steps to crack down on networks of fake accounts (it took action against at least 30,000 fake accounts aimed at manipulating the outcome of the recent French election), it may be more difficult for the social platform to fight back against a larger threat: state-sponsored efforts to sow distrust and spread confusion, including “purposefully muddying civic discourse and pitting rival factions against one another” as a way to weaken people’s faith in institutions.
“In this case, fake account operators may not have a topical focus, but rather seek to undermine the status quo of political or civil society institutions on a more strategic level,” Facebook said in its report. “In several instances, we identified malicious actors on Facebook who, via inauthentic accounts, actively engaged across the political spectrum with the apparent intent of increasing tensions between supporters of these groups and fracturing their supportive base.”
In other words, the problem isn’t just that the Kremlin may be spreading bad information on Facebook as a way to influence the outcome of U.S. elections. It’s that governments are manipulating ordinary Facebook users by getting them to act as unknowing agents of propaganda.
“Everyone is a potential amplifier,” Facebook said. Pretty crazy, maybe. But still true.