Move Slow And Fix Things
Discussion between Alex Stamos (Stanford University) and Kara Swisher (Recode)
Description: Can big tech bring the shine back to their fallen star? Facebook’s former Chief Security Officer and cybersecurity professor Alex Stamos sheds some light.
NOTE: I tried to capture as much of this panel as possible; however, due to technical difficulties, some elements were omitted, so please pardon any errors or anything that was missed!
K: I’m gonna go off of our discussion. One of Alex’s contentions is that Facebook is misunderstood - is that correct?
A: I think there’s a lot of directionally correct criticism…I don’t think FB has ruined democracy. One, there’s a whole class of tech criticism that’s actually criticism of people - hell is other people. FB is other people. And when you talk about antivaxxers, crazy parents, that’s the collective decisions of millions and millions of people who didn’t have that ability before…we’re not teasing apart what companies are doing actively, and what societal problems have been unleashed by losing the gatekeepers. For those of us in tech, it feels like a lot of media people want to go back to the world where 38 white guys were the gatekeepers.
K: What you’re essentially arguing is that FB doesn’t kill people, people kill people, right?
A: People utilize speech sometimes for good things, and for bad things. When we give responsibility, we give power. I think in a lot of ways, FB is too powerful. And I’m very afraid of this moment when we’re assigning responsibility to half-trillion-dollar companies without accountability, or power, and then asking them to fix society-wide issues.
K: I think some of your fixes we do agree on. The misunderstood part - in what direction is the criticism correct, and where is it incorrect?
A: A great example is something I’m still active on, my colleagues and I at Stanford are releasing a report on June 6 on what to do about the Mueller report, and we’re asking for self-regulation by tech companies, and asking them to reduce ability to target advertising politically.
K: The ability to target people in unprecedented ways and manipulate them.
A: Whether it’s Russians or not, it’s a bad thing. This has been a problem for a while, from the 2012 election when Obama kicked the Republicans’ butts online, that was the first election… I saw the Obama tech team give a big speech where they were talking about pulling data from FB.
K: But there’s a difference between Obama’s team and a group of Russians in St. Petersburg manipulating an election.
A: Well, yes and no. The most effective use of FB in 2016 was by Trump… The Russian hacking influenced how the media covered the Hillary emails…it’s not that difficult to build a team of edgelords and run targeted advertising… all the Russian election hacking stuff could be done by an American billionaire, like the Koch brothers or George Soros or Reid Hoffman. My real fear is that in 2020 it will be the battle of the billionaires, secret groups who are trying to manipulate us at scale online…how do we tell the companies we want them to stop it? It’s easy to say ‘stop the Russians’ but harder to say that with Citizens United.
K: So you have attack of the billionaires, influenced by the Russians, who will continue to do it because it works, presumably, China, Iran…
A: The Taiwanese election in 2020, it will be all over that. India, the election’s over, the counting is going on now, and what’s fascinating about that is the misinformation is being driven by WhatsApp - and that’s the exception that is interesting, because it has no algorithmic ranking, it’s private, and yet it has the ability in India for hundreds of thousands of people to be enlisted to push propaganda.
K: I’ve said that I felt like these companies have amplified and weaponized these things: it’s like going from a gun that shoots 6 bullets to a semiautomatic machine gun.
A: I like your use of the term machine gun, because we tend to forget that these companies are many different products at once. In FB, the top of that inverted pyramid is advertising and recommendation engines, and when you block someone’s access to advertising, that’s the least concern for free speech. So if we’re looking at it from a regulatory perspective, we could expand the honest ads act, transparency requirements.
K: So this is Amy Klobuchar’s bill, and there are efforts both here and elsewhere on the globe.
A: And actually the real game in town is other countries. Post-Christchurch shootings, the most interesting moves are in the non-US anglophone countries, and because of the lack of a First Amendment, these countries can move much more quickly than the U.S. can. I think the US should lead on this, regulating ads, and we should have a US federal privacy law. Our reluctance to regulate is opening the door for other countries to do so, and we could help set an international standard. If the US came up with a broader definition of political ads - who you’d have to be to run them, transparency, and how much micro-targeting can be done - it would encourage other countries to do the same.
[missed these questions]
K: What about privacy bills, fines?
A: One of the interesting things about the US is we don’t have a competent privacy regulator. We don’t have privacy laws. The FTC moves the goal posts, and you see a similar problem in Europe because GDPR is being interpreted by 28 different authorities. I think in the US we could do a better job, and this is something you see in other countries - like the Irish, they can deal with things before they go nuclear... That allows you to have a negotiation.
K: And antitrust.
A: I think there are legitimate antitrust arguments for breaking up FB and Google. Those arguments are about competition…but breaking up the companies does not solve fundamental issues. Breaking up ExxonMobil won’t address climate change. I think there’s a lot of excitement for antitrust because it’s like “I hate these companies.”
K: So what’s the solution? You’ve proposed a couple, which I think are interesting.
A: FB needs to have an internal revolution in how products are built, and there’s a role model for this: Microsoft in 2002. They were facing significant pushback. But it’s hard to do that without making significant leadership changes. So if I were Mark - and a lot of it’s personalized on him, because he controls the company - he needs to give up some of that power. If I were him, I’d hire a new CEO. He’s already acting as chief product officer with Chris Cox gone. He should hire a new CEO - my recommendation would be Brad Smith from Microsoft, some adult who’s done this before. Change the management structure so that the product is not at the top.
K: Should we go a step further and not allow companies to have these kinds of stock where they have complete voting?
A: I worked for a company that had an investor who cared what Wall Street thought -
K: Yahoo! -
A: I think companies are too beholden to Wall Street. I think they should get rid of stock compensation…this is a crazy world, and if a CEO comes to you and says “You’ve done a terrible job making products that are good for the world, congratulations, our stock went up” - it’s a fundamental issue for Silicon Valley. For startups it makes sense, but for big companies, you’re paying full marginal tax rates. You should be bonusing, and [having that] based on a basket of metrics measuring whether you’re doing good long-term things for the world, versus recent numbers.
K: Because it gives you incentives to do the worst.
A: What kind of message does it send if your compensation is based on quarterly numbers?…
K: Do you think they’re committed to fixing things at FB?
A: I think Mark is serious about what people remember him for. I think one of the fundamental issues is that the metrics they use to measure the tens of thousands of people who make decisions are incorrect… I think changing up people, to change how they measure engagement, would be necessary.
K: Does FB actually call you?
A: Lower down people, but we’re not close.
K: If you were running FB on the product area, give me 3-4 things that have to be done to make it healthier for humanity.
A: I agree with the direction of moving toward smaller groups, ephemerality, encryption - putting data out of reach of both themselves and anyone who wants to get at it. The thing that has to happen with that is figuring out how they deal with safety in that situation. There’s a version of FB where encryption and groups happen, the problems go away from the press, but they still exist and the societal impact is bad. That’s a seductive future for the company, and they have to resist that. I believe in encryption and privacy, but we have to balance it. I’d like to see them spend the next year doing that. And I think they need to be honest about what they can and can’t do. They make content decisions based on external pressure, and they don’t base them on a constitution. And the truth is, there should be limits to the company’s power; they should say “there are certain things we won’t do, even if we get yelled at by the NYTimes.”
K: Or Ted Cruz or Donald Trump...
A: They’ve vacillated back and forth...They’ve indicated, like a ref, that they’ll call things if you flop. And that’s not good for democracy, if these decisions can be made secretly in a conference room. They should be made publicly, based on core fundamental values. What are the goals of their content moderation policy? Is it to keep people safe? They haven’t explained the fundamental goal, or the line they won’t go past.