Facebook’s security and product leads have been acting globally in the company’s effort to regain user trust and protect the integrity of elections worldwide. The world’s largest social network cannot take back what occurred on its platform during the 2016 presidential election, but the team has been trying to prevent bad actors, like Russian trolls, from using Facebook to nefariously affect election results in the future.
“It’s really important for us while we’re solving the problems of the 2016 election that we’re not getting tunnel vision there,” Facebook’s Chief Security Officer Alex Stamos told reporters on Thursday. “We don’t just want to be fighting the last war.”
Facebook’s main goals include “combating foreign interference, removing fake accounts, increasing ad transparency, and reducing the spread of false news,” Stamos said on the call. But as Stamos shared on Thursday, fake news is more than just falsified stories shared on Facebook. His team is looking into fake identities, fake audiences, false facts, and false narratives. One major product change, for example, is allowing fact-checking partners to review and identify stories, photos, and videos as false before Facebook prompts them to do so.
Of course, election integrity isn’t the only problem on Facebook executives’ minds. Facebook is also in the midst of a data privacy scandal involving the firm Cambridge Analytica, which obtained data on 50 million unsuspecting Facebook users. That issue has led to a movement to #DeleteFacebook and forced Facebook to release more data privacy protections.
“We know we have work to do to earn people’s trust back,” Guy Rosen, Facebook’s vice president of product, said when asked how many people have deleted Facebook since the Cambridge Analytica scandal. “That’s why we’re here today.”
Meanwhile, the National Fair Housing Alliance sued Facebook this week for enabling discrimination via housing ads on its platforms. Beyond advertising, Facebook also has been blamed for spreading disinformation more generally, with deadly consequences in Myanmar. Facebook didn’t say exactly how it’s measuring progress but told reporters it will have more of these conversations.
Stamos, who is rumored to be leaving Facebook in August, and Rosen were two of several Facebook executives who spoke during a phone briefing on Thursday to share prepared remarks and answer reporters’ questions about their election work. The main takeaway: Facebook is invested in protecting election integrity all over the world. The company has been prepping for the 2018 U.S. midterms while also deploying staff in countries like Germany, France, Italy, and Kenya to localize its efforts.
“Each country we operate in will have a different range of actors,” Stamos said. “We’re working with countries to understand the particular actors.”
Money on the line
A Facebook executive told Mashable that election integrity could affect the company’s bottom line, similar to Facebook’s earlier willingness to lower profitability in order to make its products more enjoyable and less addictive to users.
“We’re willing to give up profitability to have this massive impact.”
“Working on this stuff is exceptionally important. We’re willing to give up profitability to have this massive impact. We’re massively growing our team, going from 10,000 to 20,000, that’s a very meaningful investment,” Rosen said.
Stamos and Rosen were joined by Samidh Chakrabarti, product manager, civic engagement; Tessa Lyons, product manager, News Feed; and Rob Leathern, director, product management. Facebook CEO Mark Zuckerberg wasn’t on the call, although he was the one who hosted a press conference, via Facebook Live, back in September when he admitted the company would hand over 3,000 Russia-linked ads to Congress.
To combat these efforts, Facebook is in the midst of doubling its staff committed to safety. Product manager Chakrabarti said Facebook has been taking more of a “proactive approach” to “prevent misleading or divisive memes from going viral.”
“We’re on track and our defenses are coming together for the U.S. midterms,” Chakrabarti added.
Facebook’s efforts aren’t all human-based, nor are they just internal. Lyons noted that Facebook continues to work with third-party fact-checkers, such as the Associated Press and FactCheck.org, and academics to identify fake news stories and determine what actions to take.
Facebook has been building a public database for all ads shared on Facebook, which will be available globally in the summer. That’s a far cry from Facebook’s previous strategy to allow brands to buy “dark posts,” a.k.a. posts that are never shared organically to Facebook and only targeted to a subset of users.
Facebook is still allowing political ads on the platform, but it’s working to make them more clearly labeled and limiting who can buy them. To run election-oriented ads, Facebook Page administrators will have to submit government-issued IDs and disclose what candidate they’re working for. Facebook will then mail them back a unique access code.
In the future, Facebook users will be able to see which ads are political ads, who sponsored them, how much was spent, how many impressions they received, and demographic information on their viewers.
“This will offer an unmatched view of paid political messages,” Leathern said.
But Facebook’s ad transparency effort doesn’t include every form of political advertising. When asked by USA Today’s Jessica Guynn about ads that are issue-based but not directly tied to a campaign, Leathern said the company is focused on federal election ads, for now. The Associated Press pressed on Facebook’s work beyond elections. For example, disinformation has spread on social networks about Myanmar.
“We’re very focused on making sure that our platforms are not abused between elections,” Stamos replied.
As for what Facebook wants most from the government in order to succeed: speed. Facebook once operated under the tagline, “Move fast and break things.” The government seemingly works more slowly, as the Facebook executives said, but Facebook is going to keep moving forward.
“We do it because civic discourse is something we at Facebook strongly believe in,” Rosen said, “and we know it can thrive on our platform when it’s safe.”