Since leaving the technology industry, former Facebook and Yahoo Chief Security Officer Alex Stamos has become a leading voice and researcher in new and emerging threats to democracy in a fully digital age. His candid talk on the mainstage covered what’s keeping him up at night when it comes to securing our digital democratic systems and suggestions on what we can do to protect them.
“We in tech said, ‘We are going to change the world.’ We deployed these incredibly powerful technologies and gave a huge amount of individual choice to people without thinking through all the consequences.”
Alex spoke about the advantages and disadvantages that the United States faces when it comes to conflicts in cyberspace. While we may have the best offensive capabilities on the planet, the openness of our society and the dependency of our economy on technology puts us at risk. Our free and open elections are the bedrock of our democracy, but also make us particularly vulnerable to cyber attacks. And our hugely powerful tech companies are facing the issue of public legitimacy. These companies have hard equities that need to be balanced: between privacy and safety, and between authenticity and the ability to give someone the power of anonymous speech.
“We need to figure out how to measure whether or not our products are making the world a better place, and not just whether people like them.”
While Alex’s expertise is in subjects that skew towards “doom and gloom,” he found a more hopeful atmosphere at Summit: one that comes from being in a room full of people dedicated to making government work better, to making our democracy fairer and more representative, and to putting their talents to good use. Because to protect our digital democracy, government, the tech sector, and citizens will need to work together.
To hear the presentation in full, watch the video or read the transcript below.
Hey everybody. I appreciate Jen, Dan, and Corey inviting me to be the Cassandra who follows her uplifting speech there. But if you're good at a certain tone, then why play against type, right? It's probably just better to stick with what I'm good at, which is doom and gloom. I'm being a little facetious. There are a lot of wonderful things going on. I'm really glad to be here, because it's great to be among a group of people who are dedicated to making our government work better, to making our democracy more representative and fairer for our citizens, and to be with people who are motivated by the right goals. This is not a group of people who are here to make money or optimize ads, which are the groups I'm often around. This is a group of people who want to put their talents to good use.
So I thought I could talk a little bit, though, about some of the risks we are facing from a cybersecurity perspective when it comes to our society overall, and specifically our elections, because there are a lot of things that we're going to need your help on, a lot of things that we've got to work on. The first thing I want to say is that the unfortunate fact is that conflict in cyberspace naturally reduces the advantages the United States has built over our adversaries over the last hundred years. Now, that being said, the United States has undoubtedly the best offensive capabilities on the planet. When you look at the offensive capabilities of groups like NSA and Cyber Command, especially combined with our allies in the Five Eyes, the United States is undoubtedly the 800-pound gorilla when it comes to offensive capability, but the use of those capabilities by the US is actually relatively restrained compared with elsewhere in the world.
There's a funny irony here: people who hack on behalf of the United States are socialists, and people who hack in Russia and China are capitalists. If you are hacking on behalf of the national interest of the United States, you're a civilian employee of NSA, you're a military employee of NSA or Cyber Command, or you work for a defense contractor like a General Dynamics or a Raytheon, but you're being paid by the American taxpayer. Whereas a number of countries have created economic incentives for groups to form, to build their own capabilities, and to fund themselves, groups that then do hacking on behalf of their national interests as a side project, in exchange for getting cover and protection from the government. So that puts us at a natural disadvantage, in that those capabilities that we have paid for as American taxpayers are rightfully reserved for a much narrower set of goals than perhaps are seen as national goals in other countries.
Another issue is that the nature of our society and our economy makes us vulnerable. We have the world's most technology-dependent economy, and the result is that it is much less robust against this kind of attack. An example of people who were robust against attack was Ukraine. There was a massive attack against the Ukrainian power grid in 2015, almost certainly by elements of the Russian military, and 230,000 people lost power. But they only lost power for six hours, and it's because the Ukrainian power grid, while modern in some ways, is still at its base a massively over-engineered, Soviet-era, analog system, and so they could unplug all of their computers, go throw these huge physical switches, and turn the power back on. We don't have that here. If the power went out here in Oakland, PG&E doesn't have a huge Soviet switch from 1957 that they can go flip. We have built our power grid in the name of efficiency and control in a way that is completely dependent upon computer control.
We're also uniquely vulnerable to information warfare because of the openness of our society. I want to do a quick review of what happened in 2016 from my perspective and why it specifically targeted the United States. There were four main areas of Russian information warfare in the 2016 election.
The first was the social media hacking, the online troll campaign, which took advantage of the fact that we have social networks that don't license people. You don't have to show an ID to get a SIM and to create an account on a social media network, like you do in some other countries. We allow people to have open and free discussion on our social media sites, and we have a very vibrant discourse in which groups of people can push political ideas that were once considered out of bounds of acceptable political discourse into its center. It's those specific movements, for example the Black Lives Matter movement in 2016, that the Russians hijacked and tried to subvert into a tool against our own democracy. So the openness we give people to find and to push the issues that are really important to them also makes us vulnerable.
Then there was the hack-and-leak campaign by a different part of Russia, in this case the GRU, the main intelligence directorate, and that was a direct technical attack. There was a core cybersecurity issue in the security of DNC systems: a number of people who used to be super important, who are still super important, used to be in the government and protected by professional IT folks, but are now just grandfathers with their own Gmail accounts, and those people are legitimate targets for military actors. That is the cyber component, but the real amplification of the hack-and-leak campaign, the ability for the Russians to change the entire conversation around the election, was really a vulnerability that is part of our free press: we don't have a National Secrets Act (although it looks like the DOJ is going to try to backdoor one in through the Espionage Act case against Julian Assange, which is another issue), and for the most part, we don't enforce national security laws against legitimate journalists.
Every single day we read important stories from leaked information, and every once in a while, we read a really important story that comes from information that was highly classified, and we don't put journalists in jail for that and we don't make them give up their sources. But the fact that we have this open and free information economy and that we protect the press in the United States also creates a fundamental vulnerability: if you have hacked information, you can strategically leak little parts of it in a way that changes the overall conversation, with the knowledge that some part of the American media ecosystem will cover the story, which means the entire ecosystem will eventually have to cover it.
Third, we had direct attacks against our election infrastructure, which exploit the fact that we have a distributed federal system. There are something like 10,000 authorities who run election systems in the United States, and that is very difficult for us to protect.
Then the fourth category is what is called gray propaganda, which exploits the fact that we don't license the press and we don't disallow foreign press from operating in the United States. So outlets like RT and Sputnik are allowed to be on cable stations at Best Westerns across the country, because we don't control whether media, even media we know is controlled by US adversaries, is allowed the freedom of speech that we give to our own media. So this is a set of fundamental issues in American democracy: those fundamental weaknesses are the weaknesses that we cherish, that we call strengths. I think we should see them as strengths, but it also means that there is a fundamental vulnerability in our system that we're going to have to be really careful about.
The other big issue, obviously, that we're going through is that the US tech industry is in a real crisis of public legitimacy. We in tech said, we are going to change the world. We didn't really believe it when we said that. That's usually a phrase that's used to get venture capital or to get 22-year-olds to sign up at a recruiting fair, but it turns out maybe we should have listened a little better when we said we'd change the world. We deployed these incredibly powerful technologies and gave a huge amount of individual choice to people without thinking through all the consequences. There are some legitimately hard problems here, and there are some really hard equities that need to be balanced: equities between privacy and safety, equities between authenticity and the ability to give people anonymous or pseudonymous speech. But the fact is that the companies balancing these are generally doing so in a manner that is not open to debate or transparent, and they're acting in a pseudo-governmental manner, without the trappings of legitimacy that we often see from democratic governments.
The other issue here is that, until recently, the major tech companies have resisted responsible regulation. Clearly, for these multinational companies, it is difficult to say that they want to be regulated by any one country, but the fact that they have successfully resisted regulation in the United States has built a huge back pressure elsewhere in the world for regulation that is perhaps not that well conceived. So one of the things that I've learned over the last couple of years is that people in the media and people in government react really well to the truth, to somebody saying, instead of "everything is great and wonderful," which is the standard line you'll hear from any major tech company: the world's actually a difficult place. We operate at a scale at which we know that there will be hundreds of thousands or millions of people who are going to use our product and who are going to try to use it for some kind of harm.
Harm that varies from abusive individuals all the way up to trying to foment religious violence or celebrate mass casualty attacks. All of these harms are real things. We're going to deal with them, and we want to have a partnership in doing so.
Then the third major weakness we have right now is that the United States does not really have a national cybersecurity strategy or a rational institutional framework. We have hyper-competent offensive units in the United States, like I said; the NSA and Cyber Command are very, very good at what they do. But all of their incentives are lined up on the offensive side. You don't get the secret photo of you shaking Obama's hand in the Oval Office because you patched 100,000 servers. You get that photo because you blew up Iranian centrifuges. So the entire incentive structure for our offensive units is on the offensive side.
That is reflected in how the government handles vulnerabilities it finds, what it does after those vulnerabilities leak out to the public, and the fact that these units spend a lot of their intellectual time and capital on finding new bugs, more than they do on defending the US. We have defensive units in DHS. They're highly focused on critical infrastructure, which is great, but it's difficult to say that they have the breadth and depth of experience that we have on the offensive side in places like NSA. So effectively, we have the FBI as our defensive cybersecurity agency in the United States. There are a lot of great people at the FBI who are very, very smart and talented. But as law enforcement, they are trained to watch something bad happen, take very detailed notes, and indict those people two years later.
That is a model that doesn't work in the cybersecurity world. It is not a model that allows you to lean into these threats and to have a whole-of-government response. Unfortunately, the people who were supposed to pull this all together in the White House, some really hyper-competent people, got fired. So as a result, we don't really have a national strategy for how we're going to deal with these issues. So here are some things I think we need to think about as a country and an industry, and some things we could work on together.
First, the tech industry needs to get our shit together, to be honest. We need to think adversarially in everything we do. Whenever we're building a product, we have to educate ourselves in the ways that those products, or similar products, were misused in the past and prevent those kinds of known bad things.
We also have to think a little more creatively about how they could possibly be misused in the future. There's an interesting parallel here between what we're going through in the last couple of years and what happened in the run-up to 9/11. If you read the 9/11 report, there's a section all about institutionalizing imagination: there was a lack of imagination in the defensive units in the government, a failure to think about the different things they would do if they were an adversary with the same lack of resources as the US's adversary at the time, al Qaeda. We need to do the same thing in tech: we've got to be much more imaginative about putting ourselves in the place of the people who want to cause harm with our products and use our imaginations about how that might happen.
We need to figure out how to measure whether or not our products are making the world a better place, and not just whether people like them. Effectively, most of the things that tech companies measure to design their products come down to customer satisfaction: do you like this product? Well, there's another product that has really good short-term customer satisfaction. It's called heroin. If you give somebody a shot of heroin and you ask them 30 seconds later, "Are you enjoying my product?", the answer is inevitably yes, if they can talk at all.
Customer satisfaction does not capture the actual impact of that product on that person or on society overall. That is one of our fundamental issues in Silicon Valley: we don't have the metrics to measure whether or not we're actually making people better informed, whether they're happier after they use our product, not just addicted to using it in the short term. That short-term metric focus is one of the things that got us into these troubles. We need to have predictability and transparency in the decisions we make. We're making really important decisions in tech. Decisions like: what are the acceptable boundaries of parody and satire in political speech in the United States? That is an incredibly difficult issue, and those decisions need to be made in a very transparent and rational manner, so that people get a little bit of predictability, so that it's not just a crapshoot what decisions get made in a secret board room on any of these issues.
When we do make these decisions, we need to base them upon individual rights and safety, and we need a good foundation, so that when companies decide to restrict the freedom of individuals, they're doing so for a definable impact. Not because it helps the company with PR in the short term, but because it actually makes the community healthier or helps keep people safe in the longer term. We in tech also need to figure out some kind of creative way to finance journalism, especially local journalism, in the long run. A lot of issues have led to the decline of local journalism, but the fact is that we're at a place where a huge amount of money has been diverted into the tech industry, and that is having real impacts on the ability of people to stay educated. I'm really afraid that we're going to move to a world where you pay for good journalism and crap journalism is free, and that is a really bad world for our democracy.
This is not in the best interest of these companies either. It's great for billionaires to go buy outlets and to support them for a while, and I'm glad that some billionaires are doing that, but that is not a sustainable solution; we have to figure out a way to share in some of this revenue. We also need tech to work with Congress on some reasonable regulatory framework. I don't have time to go into a lot of the details here, but my colleagues at Stanford and I are publishing 108 pages, which in academia is considered a short white paper; I'm still getting used to how much people in academia can write in a weekend. So we have the short beach read, 108 pages, on the election, coming out on June 6th, and we are going to be looking at all kinds of areas of attack against the election.
It is effectively a follow-up to the Mueller report, with recommendations for how we deal with the facts that Mueller laid out. We'll have a lot more detail in there, but one of the things I'd like to preview now is that we need reasonable federal US privacy regulation.
We need a federal privacy regulator who is competent and has the ability to interpret laws to make them easy to comply with. One of the real challenges for people trying to operate in Europe right now is that nobody really knows what European privacy law says, because it has to be interpreted by 28 different regulators, and those regulators are loath to give you a get-out-of-jail-free card before all the litigation that they want to go through over the next 10 years. That's a really bad thing for startups. If you're a 20-person company, you shouldn't have to have a full-time privacy lawyer on staff, trying to decide how the privacy law applies to every single decision. We need a privacy law that gives people predictability, so that they can make the right decisions, decisions that are privacy-preserving and good for users, with the confidence that they will not be punished if the goalposts change later.
We need cybersecurity legislation that helps small and medium businesses. We talk all the time about the attacks against the big companies, but the truth is the big companies have to take care of themselves. There's a whole host of organizations in the United States who are completely screwed if they go up against one of these adversaries. That is a serious issue that we don't talk about enough. Around elections, we need to pass something like the Honest Ads Act, with a lot more teeth around non-electioneering ads. This is something we go into in great detail in the report, but one of the challenges we have right now is that interference by folks like the Russians is not actually classified as political advertising under current US law, because it doesn't say vote for or vote against; it is about issues. So we need to control that issue advertising, restrict the hyper-targeting of it, and have real transparency requirements. There have been changes here, but those changes have all been voluntary, and they've only been made by a handful of companies. So we need to bring some kind of standard to the entire advertising ecosystem.
As companies ask for regulation, we need them to do so in a way that is compliant with a consistent human rights framework. That's one of the real fears I have about what's going on right now: pivoting to push all of these problems and decisions onto governments means that a number of governments that are not respectful of human rights are also going to be making some of these decisions. So when we say there needs to be more regulation, we need to put an asterisk on that, saying more regulation compliant with basic human rights standards, because we don't want the things that we do in the US and other developed democracies to be used as an excuse to turn the internet into a tool of oppression globally.
In the US, we need a defensive cybersecurity agency that does only defense and no law enforcement. Other democracies have this: in Germany they have a body called the BSI, and in France they have a group called ANSSI. The truth is that in other countries there are coordinating bodies that have the ability to plan ahead and to coordinate an all-of-government response on all of these issues.
We need to work towards better international norms. The truth is, we're in this weird world where nation states can do things to each other that, if done in the physical world, would be considered an act of war, but in the online world, we just consider it a Tuesday. That norm is really, really dangerous. It's dangerous for us as a country, but it's also dangerous for citizens globally, because the number of attacks by countries against their own citizens is going up continuously. We are allowing norms to exist where you can hack journalists, hack activists, hack students, and that's just considered a normal part of managing a democracy. That is a completely inappropriate norm.
There are also some things that we've got to do in academia, and we're working on this. We need to change how we educate people. One of the things we're doing at Stanford is teaching cybersecurity classes for non-CS majors, because we need people from all kinds of backgrounds and disciplines in the computer security world to help inform these decisions. These are not just technical problems; they are sociology problems, psychology problems, political science problems. We're also building tools to study these issues. We're building something called the Internet Observatory, and the goal there is to build the technology, tools, and processes necessary to support good public policy and social science research. And we're educating the next generation. We have a Master's in cybersecurity policy at Stanford now, so a number of people here might be interested in making a career pivot. This is a great opportunity for both current undergrads and mid-career professionals to study both the technical and the policy sides of cybersecurity.
We have the tools and we have the talent, to quote my favorite '80s movie. We have to execute properly, and we need to execute as a society overall. One of the things that really makes me afraid right now is that it seems that everybody is lined up in their corners, and we can't have that. We need civil society, citizens, corporations, and government all working on this together, because in the long run, making the United States secure against cyber attack is not just good for us; it will set good standards for how people around the world are treated, and for the protection of democracies against autocrats. We really are at war against creeping autocracy. We've got to keep that in mind when we make these decisions. So hopefully we can find ways to work together on this, because I'm really afraid that if we don't put all of our best people behind it, we are going to fall behind.
That being said, groups like this make me confident that we're on the right path, so I want to thank you guys. I want to thank Jen and the rest of the team for having me here. I hope you have a great Summit. Thank you very much.