
Can We Make Social Media Safe for Democracy?

For many politicians, policymakers, and voters, the 2016 election of Donald Trump was a shocking lesson in the massive role tech companies, like Facebook and Twitter, play in our politics. 

Since then, their role has only gotten bigger. And as our guest on this episode of Trending Globally explains: that’s a huge problem for democracies around the world. 

Frances Haugen worked as a product manager in Facebook's Civic Integrity Department from 2019 to 2021. While there, she saw firsthand how Facebook's algorithms are designed to maximize user engagement at all costs, with disastrous effects.

In 2021 Frances anonymously leaked tens of thousands of internal documents to The Wall Street Journal, and became known as the ‘Facebook Whistleblower.’ Since then she’s testified before Congress, and helped start a global movement to better understand and regulate ‘Big Tech.’

On this episode of Trending Globally, political economist and Rhodes Center Director Mark Blyth talks with Frances about the problems tech giants like Facebook pose to our politics, and what we can do to fix them. 

This episode was originally broadcast on the Rhodes Center Podcast, another show from the Watson Institute. If you enjoy this interview, be sure to subscribe to the Rhodes Center Podcast for more just like it. 

Find transcripts and more information about all our episodes on our website. 

Learn more about the Watson Institute's other podcasts.

Read The Wall Street Journal's exposé on Facebook.

Transcript

[MUSIC PLAYING] DAN RICHARDS: From the Watson Institute for International and Public Affairs at Brown University, this is Trending Globally. I'm Dan Richards. For many of us, the 2016 election of Donald Trump was a wake-up call to the massive role that tech companies like Facebook and Twitter now play in our politics. Since then, their role has only gotten bigger. And as our guest on this episode explains, that is a huge problem for democracies around the world.

FRANCES HAUGEN: Standing there about to cast your ballot, the things that we even get to decide between have already been vetted by the algorithm. What we don't realize is Facebook voted first.

[MUSIC PLAYING]

DAN RICHARDS: That's Frances Haugen. Frances worked as a product manager in Facebook's Civic Integrity Department from 2019 to 2021. While there, she saw firsthand how Facebook's algorithms are designed to maximize user engagement at all costs, with disastrous effects for individuals and society.

[MUSIC PLAYING]

In 2021, Frances anonymously leaked tens of thousands of internal documents to The Wall Street Journal, and became known as the quote, "Facebook whistleblower." Since then, she's testified before Congress and helped start a global movement to better understand and regulate big tech.

On this episode of Trending Globally, political economist and Rhodes Center Director Mark Blyth talks with Frances Haugen about the problems that tech giants like Facebook pose to our politics, and what we can do to fix them.

This episode was originally broadcast on the Rhodes Center Podcast, another show from the Watson Institute. If you enjoy this interview, be sure to subscribe to the Rhodes Center Podcast for more just like it. All right, here are Mark and Frances.

[MUSIC PLAYING]

MARK BLYTH: Hello Frances, and welcome to the podcast.

FRANCES HAUGEN: Thank you for inviting me. Happy to be here.

MARK BLYTH: So this is a bit of a change of pace for us, in the sense that we usually have economists, political economy types. And your work and your influence, particularly since your Senate testimony, are really spilling over into these areas.

For example, we have the Europeans bringing out the DSA, the Digital Services Act, which really has your fingerprints all over it. And I want to get to that at some point. But let's start with the basics on this. Let's go back to Facebook.

You decided that you had to say something because Facebook causes harm to vulnerable communities, especially young women, because its engagement-based, algorithmic rankings keep people on the site longer, where they can see more ads. That's the core of the business model.

Now many people out there in political economy land will think of this as a market problem, and say break them up. But you actually think it's much easier to kind of just change the angle, and keep what's good about it, rather than breaking up the firm. Can you explain that for us?

FRANCES HAUGEN: So I think it's less a question of whether I am against breaking up the companies, and more that I believe it's very important, when we talk about problems in the world, that we clearly articulate the cause of the problem before we try to articulate solutions.

And if we were to look just inside of Facebook, so we're not even comparing Instagram and Facebook, if we're looking at just Facebook, we see engagement-based ranking. That's the process of saying, I have these different objects: maybe there are different groups I could recommend to you, or there are different friends I could recommend to you, or there are different pieces of content I could recommend to you.

When I have to choose from thousands and thousands, tens of thousands, hundreds of thousands of things, the way that I prioritize what I show you first is based on how likely you are to interact with it: put a comment on it, put a like on it, put a reshare on it.

And what researchers inside of Facebook saw was that there were multiple forces at work. It's things like, the angrier a comment thread is, the more likely you are to click back to the publisher, which means they get more ad revenue, which means they produce more angry content. And the angrier things are, the more likely you are to reshare them, because anger actually gets you over that activation energy.

Inside of Facebook, we have seen problems not just in the content, we've seen them in things like group recommendations, where back in 2016, 65% of all people who joined neo-Nazi groups in Germany joined them because Facebook recommended the groups to them.

So this is not a problem of this company versus that company. It's that these systems, any place you put engagement-based ranking, are going to reproduce these problems. And so the question is, what do we get by breaking up the company?

So one, we do escape Mark Zuckerberg. Mark has 56% of the voting shares. He's basically impossible to dislodge. Yes, if you broke up the company, you could escape that center, that black hole. But beyond that, it doesn't necessarily mean you're going to fix these problems, because all the individual components are still going to have the problems.

MARK BLYTH: So how would you fix it?

FRANCES HAUGEN: So I'm a big proponent of chronological ranking, but you can't just take the current product form factor for Facebook and put chronological ranking in, because Facebook has been pushing us into giant groups for years. We're talking half-a-million-person groups. And those groups are fire hoses of content.

And so I think there's an opportunity to take a step back and say, what would human-scale social media look like? We have hundreds or thousands of years of institutions around how we organize people. How many people can be in a single conversation?

How do you hold a conference for 20,000 people? We break it up into smaller conversations. And I think there are some really good precedents we can look at. We can look at things like Slack. In Slack, it's chronological, with sub-feeds for comments, but it also has rooms.

And I think right now people sometimes come in and say, well, I don't want to give up my group. I'm a cancer survivor, my cancer support group is so critical to me. Guess what? If you had a thing that was like a Discord server or a Slack server, you could still get community. But a computer wouldn't be choosing what you focus on. And the biases of that computer, or the biases of that company, wouldn't be choosing it. It'd be other human beings choosing what you focus on.

MARK BLYTH: So rather than just changing the user experience, it really puts users in charge. And in the aggregate, when you scale, that overcomes exactly these problems that you've been describing.

FRANCES HAUGEN: Because remember, right now, when we have the algorithm choose what we focus our attention on, we lose the opportunity for good speech to counter bad speech. Freedom-of-speech people say the solution to bad speech is not to ban it, it's more good speech. Well, when we allow computers to focus our attention, we as humans no longer get to correct problems.

So if we're sitting in a room and we have 30 people, and someone says something that's just off the wall, they're a flat-Earther or something, people can calmly talk to them and say, hey, why don't we talk through that? Or, how are you doing today?

MARK BLYTH: Any particular reason you wanted to reject all human knowledge?

FRANCES HAUGEN: When we allow computers to focus our attention, when we allow engagement to be the assessment of quality, it means that if you write a really angry, enraged post that gets a whole fight going in the comments, and I write a very calm, detailed, methodical thing, like, hey, let's talk through the assumptions in your post, I will get no distribution.

MARK BLYTH: And if that's what's driving engagement, you can totally see how it just burrows down this funnel. I mean, that's amazing. All right, elections. So we get very vexed about Facebook and elections and foreign interference, and all that sort of stuff.

From your vantage point, should we be worried about foreign interference? And I believe the figure for the Russians was they spent $100,000 and managed to tip the election, which seems a little bit crazy. Or should we be more worried about these algorithms essentially deciding what should be an election issue and what we should be focused on? What's the bigger threat?

FRANCES HAUGEN: So back in 2018, Facebook made a major change to their product. Up until then, for years and years, they had just been trying to optimize for how long they could keep you on the site. So they were like, if we show you this, we predict you'll be on the site for this amount of time longer. And then they switched and said, hey, we have this problem.

Most people think of Facebook only in terms of consumption. I sit here and I consume off my feed. But in reality, Facebook is a two-sided marketplace. When you write a post, you are a content producer. And Facebook is a system for matching producers with consumers. And so--

MARK BLYTH: It's a shopping mall.

FRANCES HAUGEN: It's a shopping mall of ideas and memes. Have some pictures of my breakfast. The only problem is, if you lose the producers, you can't have consumers. And so what Facebook was facing in 2018 was that, over time, the amount of production was falling off.

MARK BLYTH: What was behind that?

FRANCES HAUGEN: I think part of it is so all social media platforms have an issue, which is that in the beginning, you have pretty broad participation, like people are trying out the platform. Over time, some people really get into it. So think of this as the people on Instagram who have drones. You're like oh, wow, like look at that drone video. It's stunning.

I guess Instagram isn't for me. My photos look bad. It's self-censorship. And this is actually one of the big pushes towards things like Instagram Stories. Instagram Stories creates a much, much lower threshold for what it means to produce content, because it's going to evaporate in 24 hours.

So Facebook was experiencing this thing where people were making less and less and less content over time. And they ran a bunch of experiments on content producers. This is one of the things we always have to assume is happening: they're always testing how they can influence you, how they can manipulate you.

And what they found was that by artificially giving people distribution, so making you 10 times as popular as you were before, five times as popular as you were before, you get more comments, more likes, more shares. Turns out, when you give people little drips of dopamine, they give you more things in return.

So a problem happened, though, which was that when you optimize for engagement, yes, you get more content production, but you also shift what content succeeds. So in 2018 they sent a bunch of researchers into Europe in preparation for the European parliamentary elections.

And across Europe, this is less than six months after this change, people are like, hey, it used to be that we could share the bread and butter of the democratic process, a white paper on agricultural policy. Stuff that's not sexy. It's not riveting. It's not clickbait.

But we could see people read it. We could look at the statistics and we knew people were consuming it. They just didn't leave comments. And now if we share that same thing, it's crickets. We are having to change the content that we distribute because only extreme content now gets distributed.

And we run positions now that we know our constituents don't like, because those are the only ones that get distributed. So by the time we show up at the ballot box, standing there about to cast our ballot, what we don't realize is Facebook voted first. The things that we even get to decide between have already been vetted by the algorithm.

MARK BLYTH: And it's a classic story of intended/unintended consequence.

FRANCES HAUGEN: 100%. 100%.

MARK BLYTH: So, I mean, to summarize that, right? Basically, you figured out how to do a dopamine hit. You put that out there, people figured out there is a dopamine hit. The dopamine hit really has an anger precursor, so unless you're doing that, it doesn't work. You've just changed the substance of democratic politics, and nobody really set out to do that.

FRANCES HAUGEN: Yeah, no one at Facebook said, we want to give the most reach to the most extreme ideas. No one did that. But they did do it. And it's fascinating, they called it meaningful social interactions.

And the hilarious part is, some of the documents in my disclosures show that when they polled users six months after it happened, they asked, how meaningful is your news feed? And this change made people's news feeds less meaningful. So we should not call it meaningful social interactions. It's just social interaction.

MARK BLYTH: Social interactions.

FRANCES HAUGEN: Yeah. There's also a secondary thing, which is that same system of saying high quality equals high engagement is also used in ads. So political parties run ads. When they're trying to decide, should they show you this ad, a little auction takes place.

Being like, how much is your attention worth? How much does this person want to get to you? But there's also a secondary feature, which is they ask, how quote, "high quality" is this ad? Because Facebook's only going to get paid if you interact with it.

And so angry, polarizing, extreme divisive ads, they get reactions. And so you end up having a thing where extreme polarizing ads are 5 to 10 times cheaper than compassionate, empathetic ads. And we cannot have a democracy if we subsidize division.

MARK BLYTH: Yeah, we're basically producing anger. Anger is our product, is basically what it ends up being.

FRANCES HAUGEN: Division becomes our product, yeah.

MARK BLYTH: Division, wow. So again, let's go back to the way this is normally framed. There are foreign people who are manipulating our elections. Should we be bothered about that? Because everything you're describing is completely endogenous. It has nothing to do with the outside.

FRANCES HAUGEN: So I want to be real clear. Influence operations, this is the weaponization of the platform, are an extreme, extreme concern. And the reason for that is Russia, China, Iran, Turkey, Ukraine, they're all investing in large information operations.

And the reason why this matters is the platforms have tools for detecting coordinated behavior. The thing that we don't have right now is enough people who actually take the outputs of those systems and pull down the networks.

So when I worked in threat intelligence at Facebook, that was the last eight or nine months I was there, I worked on the counter-espionage team, which is the sister team to influence operations. And our team only ever worked on a third of the cases we knew about, because we were so understaffed. And we never wrote detection software, because we already couldn't handle the cases we had.

MARK BLYTH: Why wouldn't Facebook engage more and provide those resources? For a firm under pressure, that seems like an easy lift, right? Why would they not go for it?

FRANCES HAUGEN: I think part of it is that these are cost centers, and there's no external accountability. So the question of how good a job is sufficient is not currently being negotiated with the public. It's Facebook deciding on its own.

MARK BLYTH: But with $40 billion in profits, you could throw a billion at it and buy yourself some really good press.

FRANCES HAUGEN: Totally. And I want to really flag for people, they're doing $75 billion, with a B, nine zeros, $75 billion in stock buybacks in this 12-month period. So they are lighting $75 billion on fire, and then bragging to the public that they spend $5 billion on safety.

MARK BLYTH: Yeah, there's something deeply wrong on so many levels.

FRANCES HAUGEN: I think part of it is, I went and checked. Last August, I went and checked every single time over the last five years that the Facebook stock price declined versus the NASDAQ by more than 5% over a 10-day period.

And there were a very limited number of instances, something like 27 over five years. A handful of them, maybe 20%, were ones where all the big tech stocks sold off at the same time, a small correction. But in general, the things that drove the price down were either declines in users or increases in expenses. So when Facebook spends more on safety, even small amounts more, the stock price crashes.

MARK BLYTH: But if you have that type of balance sheet, you can just buy the stock back and do a self-correction.

FRANCES HAUGEN: That's what they're trying to do.

MARK BLYTH: And that's what they are trying to do. Absolutely. So let's talk about the rest of the world. You once said that something we don't recognize as a problem is that for much of the Global South, a billion people or more, Facebook is the internet, because it's there, it's "free," quote unquote, and it works. Why is that a problem?

FRANCES HAUGEN: We need to unpack the "it's free," because the only reason it's quote "free" is that Facebook made a choice. They chose. Twitter didn't choose to do this. TikTok didn't choose to do this. They chose to go into some of the most fragile places in the world, places like Myanmar, and pay for people's data.

So in most places in the world, for every megabyte of data you consume, you pay money. And Facebook went in and said, if you use Facebook, the data is free. If you use anything on the open web, you have to pay for the data yourself. And surprise, surprise, market forces work.

And so for a majority of languages in the world, 80% or 90% of all the content in that language is going to be only on Facebook. And so in the United States, we can step in and say like, well, I made a choice. I stepped away. I don't want to support this. It's not in line with my values.

But for at least a billion people in the world who might say, I don't want to participate with Facebook, they don't support my language, they don't invest in security resources, I'm using the version of the product that Zuckerberg himself said was dangerous back in 2018, because I don't matter to Facebook. Those people don't get to leave, because any local web page is actually just a page on Facebook.

MARK BLYTH: And if they have, which I'm sure they do, exactly the same problems we were talking about earlier, in some of the most fragile parts of the world, let's think about, for example, languages. How would Facebook, or any company in that position, police content in languages where it's very hard to find specialists, never mind detect nuance, et cetera? How do you begin to solve those problems, even if you want to try?

FRANCES HAUGEN: Oh, I'm so glad you asked. So there's an implicit assumption in the way you framed your question, which is that the way to solve these problems is via content, and not via product design. One of the things I keep, keep, keep emphasizing, and I have an editorial coming out in the next couple of days that talks about this, is that there are lots of choices Facebook has made that have led to the situation we're in, where the most extreme ideas get the most distribution.

So you could imagine things like, should you have to click on a link before you can reshare it? If you do that, you get 10% or 15% less misinformation. And it works in every language. Or I'll give you another example. Let's imagine Alice writes something. She posts it. Her friend Bob reshares it. Carol reshares it. So now it's beyond friends of friends.

When it lands in Dan's news feed, if Facebook grayed out that reshare button and said, hey, it's beyond friends of friends now, you can totally share this content, you can do whatever you want, but you have to copy and paste it. That small change, putting a little bit of human in the loop, has the same effect on misinformation as the entire third-party fact-checking program. Only it works in every language in the world.

MARK BLYTH: Wow. So that one simple product design change--

FRANCES HAUGEN: Totally.

MARK BLYTH: One extra step.

FRANCES HAUGEN: And so there's lots and lots of little things like this, where we need to be taking a step back and saying, are there opportunities for intentionality?

MARK BLYTH: I can't help but wonder if part of this is, think about the current moment. We're all obsessed with inflation right now. A study came out today that The Guardian did, I just posted it on Twitter, oddly, which said the following: wages overall have risen by 1.6% in real terms, whereas corporate profits for the median firm are up 49%.

So this is basically price gouging. And what they did is they went through investor calls and whatnot. And the housing firms are like, yeah, we could build more houses, but why should we, when we're getting half a million a house if we restrict supply?

So that's what's really driving it, that sort of thing. So let's bring that inflation example back to the whole notion of product design and tweaking. Maybe part of the problem here is that we live in societies where profits are regarded as absolutely sacrosanct. You cannot touch them. And what you're essentially doing, it sounds innocuous, but you're asking them to earn less money. Ultimately, that's it. Is that really the problem, that we're just afraid to attack the holy grail of profits?

FRANCES HAUGEN: I think it's really, really important for us to understand how we got to where we are right now. And so one of the most critical problems is that we have lots of specialists you can hire who are lawyers or political scientists who have specialized in freedom of speech.

We literally graduate zero people, zero people, in the United States who can talk about the things I just mentioned: these product safety changes, product design changes, around how we build these systems. And the reason for that is intentional.

Right now, the only place in the world you can learn about these things is inside a large social media company, because right now we have zero classes in any university that really give students a firsthand ability to experience the structure of these products. And that's part of why I want to build simulated social networks, so that we can start to graduate 50,000 people every year who can talk intelligently about this.

MARK BLYTH: But you then still have to confront the fact that, I am an incredibly profitable business, almost zero marginal cost. Nothing really is, but almost, in a marginal cost sense. And what you want to do is complicate my business in the name of safety.

FRANCES HAUGEN: But I want to make you more successful over the long term, because I think part of what's happened at Facebook is they've deprioritized human judgment. Decisions to ship or not ship are so metrics-driven, or goals are defined so numerically, that people get locked into a very narrow frame of reference for defining success.

This is the quarter-by-quarter kind of financial mindset. Putting some constraints on that system, if it makes the product more pleasant, I think they'll have more users 10 years from now than they would otherwise.

MARK BLYTH: And there's less regulatory risk.

FRANCES HAUGEN: Yeah, 100%.

MARK BLYTH: So, a couple of things as I bring our conversation, sadly, to a close, with two points. You mentioned freedom of speech. And one of the things I particularly enjoyed about your testimony and other things you've said is that you just debunk the idea that this is about freedom of speech. It is not about freedom of speech. It's about protecting a business model.

How do you think we can make more people understand that this isn't going after freedom of speech? Because if profits are one of the third rails of our society, and going against freedom of speech is the other, how do you make people understand that that's a canard? It's not really about that.

FRANCES HAUGEN: I usually bring them back to, we talked about this idea of, should you put a human in the loop once a reshare gets beyond friends of friends? I always ask them, if you had to choose between that and fact checkers, well, we have no idea who the fact checkers are. We know that there are certain organizations, but we don't know how many fact checks they do.

We don't know which languages they do them in. Surprise, most of them are in English. So part of why Europe passed the DSA, the Digital Services Act, a couple of days ago is because they use the raw version of Facebook much more than Americans do.

87% of the misinformation budget for Facebook got spent on English, even though only 9% of Facebook users speak English. So I always ask them, would you rather say you have to copy and paste once a reshare gets beyond friends of friends? Or do you want this mysterious group of people deciding what's true and not true?

And everyone goes, I think we should focus on product safety. And so if we want to protect, say, places that are at risk of ethnic violence, which are often linguistically diverse and often speak smaller languages, if we want to have products that don't cause ethnic violence, we have to focus on product safety and not on censorship.

MARK BLYTH: It seems pretty convincing to me.

FRANCES HAUGEN: Yeah, that's my hope.

MARK BLYTH: It's a good hope. Thanks very much. It's been a fascinating conversation. And good luck.

FRANCES HAUGEN: Thank you so much.

[MUSIC PLAYING]

DAN RICHARDS: This episode was produced by me, Dan Richards, and Kate Dario. Our theme music is by Henry Bloomfield. Additional music by the Blue Dot Sessions. If you like this show, please help us spread the word. You can leave a rating and review on whatever podcast app you use, or just share an episode with a friend. We'd really appreciate it.

You can also find the show and transcripts for each episode on our website. We'll put a link to that in the show notes. And again, this episode was originally broadcast on the Rhodes Center Podcast, another show from the Watson Institute. If you enjoyed this interview, be sure to subscribe to the Rhodes Center Podcast for more just like it. We'll be back in two weeks with another episode of Trending Globally. Thanks.

[MUSIC PLAYING]

About the Podcast

Trending Globally: Politics and Policy
The Watson Institute for International and Public Affairs

About your host


Dan Richards

Host and Senior Producer, Trending Globally