Facebook is Enlisting These Disinformation Detection Pros to Fight Fake News

The social behemoth takes new steps to protect the world’s elections.

Maciej Luczniewski/NurPhoto/ZUMA Press


Facebook has taken another step to reassure users and regulators in the wake of its central roles in Russia’s 2016 election disinformation campaign and the Cambridge Analytica data privacy scandal. On Thursday, the social media behemoth announced a partnership with the Atlantic Council’s Digital Forensic Research Lab (DFRLab) to help provide “real-time insights…on emerging threats and disinformation campaigns from around the world.”

“This will help increase the number of ‘eyes and ears’ we have working to spot potential abuse on our service,” Katie Harbath, Facebook’s global politics and government outreach director, wrote in a statement posted to the company’s website, “enabling us to more effectively identify gaps in our systems, preempt obstacles, and ensure that Facebook plays a positive role during elections all around the world.”

The DFRLab is a division of the Atlantic Council, a Washington, D.C.-based international affairs think tank. The lab documents disinformation efforts around the world—including in Syria, Europe, and the US—as they happen. It was on the leading edge of exposing the American alt-right’s disinformation efforts ahead of the 2017 French presidential election. Its researchers have also documented doctored photos designed to sway Germany’s 2017 parliamentary elections, as well as Twitter botnet activity ahead of Italy’s elections in March.

Past DFRLab investigations have focused on disinformation spread on Facebook. The lab’s host, the Atlantic Council, is funded by a range of donors, including large international and domestic corporations, foreign governments, government agencies, foundations, and individuals. The organization’s donor policy “stipulates that the Atlantic Council is accepting such contribution on the condition that the Atlantic Council retains intellectual independence.” Still, the sheer size, reach, and makeup of its donor base have drawn criticism, with some worrying that money from foreign governments and corporations inevitably influences the final research product.

The partnership is just the latest effort Facebook has announced to better police its platform for electoral disinformation. In April, the day before Facebook CEO Mark Zuckerberg sat for two days of Congressional questioning, the company announced it was partnering with several big-name foundations to form a commission to “provide independent, credible research about the role of social media in elections.” This commission will facilitate scholarship and research with the help of “privacy-protected datasets from Facebook,” which the company has promised not to review prior to publication. Facebook has also partnered with dozens of news organizations, civil society groups, and governmental organizations in Mexico to fight the spread of “fake news” ahead of that country’s July 1 election in a project called “Verificado 2018.”

The company has worked to burnish its image via a large television advertising campaign that laments the platform’s promotion of “fake news and data misuse” and promises to do better. Meanwhile, Zuckerberg has so far rebuffed calls from British lawmakers to come and answer questions. Facebook has also tightened public access to data across its platforms (it also owns Instagram and WhatsApp) through its Application Programming Interfaces (APIs), the process by which third parties, including researchers, get data from the company for review. That prompted at least 20 academics from around the world to write Facebook an open letter demanding more access for research.

“Facebook is under immense pressure from lawmakers and civil society organizations to stop enabling malicious actors through willful negligence,” Miranda Bogen, a policy analyst at Upturn, a nonprofit research organization that examines technology and public policy, tells Mother Jones. “While the election interference, predatory advertising, human rights abuses and rampant harassment that has spread across the platform in recent years doesn’t seem to have made a dent in the company’s profits, the platform is skating on thin ice with regulators and the public.”

Bogen recently co-authored a paper arguing that Facebook has made progress on offering users transparency about advertising on the site, but has not gone far enough. She says Facebook is “right to recognize that it won’t be able to spot emerging threats alone, but the specific terms of this partnership and the company’s other research initiatives are far from clear.”

Bogen suggests the public would benefit from additional ways to independently monitor Facebook’s behavior. “Think tanks have taken heat,” she says, “for close association with companies that have an interest in the policy matters those organizations study,” adding that some might question whether organizations like the DFRLab “who rely on a friendly relationship with Facebook will find it difficult to stay neutral” when publishing conclusions that might threaten their access.

Graham Brookie, the managing editor and acting director of the DFRLab, told Mother Jones that he is well aware of Facebook’s record and its critics. But he sees a role for the DFRLab, and its mission to document and expose disinformation, to play inside Facebook. “The whole point of it is that if we’re going to be more connected than at any point in human history then we need to be more resilient” to disinformation efforts, he says.

According to a blog post Brookie wrote on Thursday, the partnership will allow the lab to “focus more closely on the challenge of defending democratic debate in elections around the world.” He wrote that the lab’s researchers will still “cast an independent and critical eye on all platforms, including Facebook itself,” stressing it will not be monitoring elections for Facebook, but rather monitoring election disinformation activity across all online platforms.

“The research community is, for good reason, highly skeptical,” he tells Mother Jones, adding that that’s what he’d expect. “We’re looking forward to seeing what we can do and hopefully we can have an impact.”
