Charlie Warzel: What keeps Facebook’s election security chief up at night?

Facebook is not the entire internet. But it does reflect and account for some of the greater web’s chaos. With just days to go before the election, hyperpartisan pages on the platform are churning out propaganda to millions of followers.

In recent weeks, malicious actors both foreign and domestic have attempted to use inauthentic networks to push narratives meant to sow confusion and division. Others, including President Trump and his campaign, have used the platform to spread false information about voting, while some partisans try to undermine the public’s faith in the U.S. election system. Then there are the conspiracy theorists, and the long-running battle Facebook continues to fight against pandemic-related misinformation and disinformation.

Which is to say that all eyes are on Facebook. The security of the platform from outside interference as well as domestic manipulation is a crucial factor in ensuring a free and fair election. At the head of that effort is Nathaniel Gleicher, Facebook’s head of cybersecurity policy. I spoke with him on Friday afternoon.

Facebook head of cybersecurity policy Nathaniel Gleicher testifies on Capitol Hill in Washington, Wednesday, May 22, 2019, during the House Oversight and Reform National Security subcommittee hearing on "Securing U.S. Election Infrastructure and Protecting Political Discourse." (AP Photo/Carolyn Kaster)

This conversation has been condensed and edited for clarity.

Q: What’s your specific role with the platform as it relates to the upcoming election?

A: My work is to find and deal with two kinds of threats. The first is cybersecurity: hacking, phishing and exploiting Facebook’s technical assets. The other is influence operations: foreign actors (Russia, Iran, China) and domestic actors manipulating public debate with disinformation or in other ways.

Q: So you’re not involved in content moderation? Your team doesn’t take down specific posts because they violate a rule?

A: We tend to treat public debate problems online as one thing, but they’re very different. Camille François, a disinformation researcher, has a useful model that breaks the threat into three parts: actors, behaviors and content.

We are on the actor and behavior team. That’s intentional because in influence operations content isn’t always a good signal for what’s happening. We’ve seen Russian actors intentionally use content posted by innocent Americans. We see other people post and share content from Russian campaigns. It doesn’t mean they’re actually connected. In fact, most times they’re not. But that is the point. These foreign operations want to look more powerful than they are.

Q: There’s been a lot of debate on this topic. Namely, that the reach and potency of foreign interference is overstated or at least over-covered in the press and that the biggest problems are actually organic and domestic.

A: One thing Facebook started doing after I joined is publicly announcing takedowns of coordinated inauthentic behavior (a somewhat vague term that means using fake accounts to artificially boost information designed to mislead). We’ve found more than 100 of these operations in the last three years. We announce them, publicly share information and give it to third-party researchers so they can make their own independent assessment of what’s happening. As a result, these operations are getting caught earlier, reaching fewer people and having less impact. That’s also because government organizations, civil society groups and journalists are all helping to identify this.

What that means is that their tactics are shifting. Foreign adversaries are doing things like luring real journalists to create divisive content.

Q: Are these malicious actors trying to use fear to get us to manipulate ourselves?

A: Influence operations are essentially weaponized uncertainty. They’re trying to get us all to be afraid. Russian actors want us to think there’s a Russian under every rock. Foreign actors want us to think they have completely compromised our systems, and there isn’t evidence for that. In a situation like this, having the facts becomes extremely useful. Being able to see the effectiveness or ineffectiveness of these campaigns is useful. It’s a tool we can use to help protect ourselves. We know they’re planning to play on our fears. They’re trying to trick us into doing this to ourselves, and we don’t have to take the bait.

Q: It seems we as a nation are our own worst enemy in this respect.

A: It’s like you wake up in the morning on Election Day and the whole process is this black box. It feels like jumping off a cliff and you land at the bottom when the votes are counted and you don’t really see the things that happened along the way. But really there’s a staircase you can take. There’s a bunch of steps. Voting starts, then officials begin counting ballots. There are controls and systems in place, and at the end you’ve made it to the bottom of the staircase. We need to do our part to show people the staircase and what happens in each moment to say, “There’s a plan to all this.” A threat actor wants to exploit the uncertainty in the election process to make us feel like the system is broken. But that’s harder to do if you can see the system.

Q: How do we protect ourselves and our democracy?

A: One of the most effective countermeasures in all of this is an informed public. So all of us, not just Facebook, have to do our best to amplify authentic information. One of the biggest differences between 2016 and 2020 is that you have teams in government, in tech and in civil society that understand the risks and challenges and are working together. We didn’t have this four years ago.

Q: What keeps you up at night?

A: A year ago we predicted some trends we thought we might see. A number of them have come to pass, and we’re ahead of a few things. We’ve seen threat actors target smaller communities in an attempt to hide from us. We predicted that bad actors would move from fake accounts to trying to target influencers. They’ve tried to do it but haven’t reached all that many people, because we saw it coming.

When we think of things to be worried about, the first is that our elections system is very complex, and there are so many opportunities to leverage that complexity to run a perception hack. A perception hack is an attempt to create the perception that there is a large-scale influence operation when in fact there is no evidence to support it. The other big piece is the very tense civic debate around the outcome as votes are counted. You can imagine malicious actors will try to accelerate that debate, and we’re certainly focused on that: efforts to claim, without any basis, that a spike in voting in a swing district or state is evidence of fraud, and to use that claim to inspire or incite conflict.

What you want to do is call it out right now ahead of time so that if they do it, people will say, “Hey, look at that claim. We were just hearing that claim might be made to hack our perception.” It’s part of why there is a large effort concentrated around debunking and prebunking.

Q: If an uninformed or rash or gullible public is working against its own interest, that seems potentially outside the scope of you and your team.

A: You have defenders and attackers. Defenders win when they control the terrain of debate. You won’t get bad actors to stop. But if you change the terrain, you make life harder and harder for them over time. Our platform is a piece — an important piece — of the public terrain. It’s our job to keep this debate as authentic as possible by putting more information and context out there. We can force pages that are pushing information to disclose who is behind them, and we do. We have an independent third-party fact-checking network. When we fact-check we put a label up (to say this is disputed information), and what we’ve seen is that 95 percent of people don’t click through the label. That’s a powerful tool as well.

The truth is this: Existing societal divisions are always going to be targeted by bad actors. What we’re doing is ensuring that this debate will be as authentic as possible. And that we can get as much context to people as possible so U.S. citizens can decide U.S. elections.

Q: Authenticity seems like a fundamental problem. Because some of that authenticity — let’s use some of the president’s comments as an example — could become a security threat.

A: I think it is easy to want to reduce this moment to one problem, and I think that’s a real trap. There are many things happening at once. Facebook is an important piece of this, but it’s not the whole puzzle. On the content side, it’s worth noting that whenever anyone makes a claim about the trustworthiness of ballots, we can put up a clear statement about the historical trustworthiness of mail-in ballots and why we expect them to be trustworthy in this election, too. We are living through a historic election with so many complex pieces to monitor. The piece that I and my team can help with is making sure we secure this debate.

Q: What will your evening look like on November 3?

A: We have an elections operation center that’s been running for some time now. It brings together the 40 teams inside Facebook that work to protect the election, and it builds into partnerships with law enforcement, Homeland Security, state and local elections officials, civil society groups and other platforms.

Q: You’re working with other platforms?

A: My counterpart at Twitter says I call him more than his mother does. We’re spending a lot of time exchanging information to try to stay ahead of this.

Q: So is it safe to say you’ll see things on November 3 that you’ve likely never encountered before?

A: You want to be ready for what you don’t expect. We do threat ideation, where we put ourselves in the shoes of the bad guys and ask what malicious tactics they might use in a specific situation. Then we wind the clock back to today and ask, “What can we do to put ourselves in a better place now?”

Q: Facebook has invested a lot in security around this election, but these threats will persist well after the decision desks call it for Donald Trump or Joe Biden. What is Facebook doing to ensure election security measures continue, especially in other countries?

A: Between 2016 and next week we’ll have worked to protect more than 200 elections across the world. It’s critical to focus on next week, but we also have to remember Myanmar has an election five days later.

The way you keep focus is this: to protect these elections, you have to protect the moments in between them. The question of public debate online doesn’t just come up every two years in the U.S. We’ll take the lessons from this election and bring parts of them to other elections. We’ve learned a lot. It’s been painful at times. But for my team, the focus isn’t going anywhere. We’ll continue. At least after we get a little sleep.

Charlie Warzel, a New York Times Opinion writer at-large based in Missoula, Montana, covers technology, media, politics and online extremism.



From The Salt Lake Tribune, November 3, 2020
