
Full Transcript: Facebook COO Sheryl Sandberg On Protecting User Data

Facebook Chief Operating Officer Sheryl Sandberg speaks during the 2018 MAKERS Conference on Feb. 6 in Los Angeles. (Vivien Killilea / Getty Images for MAKERS)

Facebook has been under fire in recent weeks after it was revealed that Cambridge Analytica gained access to millions of users' data while working for President Trump's 2016 campaign. Facebook founder and CEO Mark Zuckerberg is expected to testify before Congress early next week.

In an interview Thursday, Sheryl Sandberg, the social network's chief operating officer, tells NPR's Steve Inskeep about the company's missteps, what it's doing to correct them, and the information being provided to affected users.

Inskeep: I know your time is short, so I'll dive right in. Thanks for doing this, it's great to meet you in person.

Sandberg: I'm really glad to have you.

I want people to know that you have been credited with being part of the reason that Facebook is so profitable. This is a story that's told about you — that the company was popular before you came, that it became much more profitable afterward. Have the events of the last year or two, though, shown that the business model of this company is part of the problem?

I'm going to take a step back and talk about what the problem is, and then get to the business model, because you're asking a really important question. We know that we did not do enough to protect people's data. I'm really sorry for that, Mark's really sorry for that. And what we're doing now is taking really firm action.

Starting Monday we're going to start rolling out to everyone in the world, right on the top of their news feed, a place where you can see all the apps you've shared your data with and a really easy way to delete them. We're being much more restrictive in the data that apps are going to have access to. And just yesterday we announced further steps shutting down data access points in groups and events and pages and search. And so we are in a process of evaluating all the ways data is used.

It's going to be long — it took us a long time to get here, it's going to take a long time to find all of this — but it's also going to be ongoing. Because safety and security is never done; it's an arms race. You build something, someone tries to abuse it, you build something. And so the commitment we have is a new approach: not just to protecting sharing, but also to privacy.

Then you asked about the business model. And on the business model we have an ads-based business model, just like TV, just like radio. Our content's available to anyone for free because it's ad-supported. And that we feel really proud of, and we're really — we think it's really important. We're trying to connect the whole world. Two billion people use our service; a lot of them would not be able to if they had to pay for the content itself.

And privacy and ads are not at odds. It's a good opportunity to remind everyone what we say all the time, but we need to keep saying so people understand it — which is that we don't sell data, period, and we don't give any advertisers your personal information.

But let's be clear: Part of the business is gathering enormous amounts of data about people, making use of it, and figuring out ways to monetize that. And that does make a lot of money and it may well serve a lot of people, but hasn't it also led to some of the abuses we've seen in recent years?

Well the abuses we've been talking about have not been with ads — they've been with other products. But that doesn't mean there haven't been abuses in ads, and that doesn't mean there won't be, going forward.

So let's talk kind of big-picture about what's happened here. Right? Because the underlying question is, why are we where we are, why are we so slow, do we have control of this massive system? Those fair?

Go for it.

Those are fair questions, and I think those are real questions. So if you take a step back, we had a very deep belief in the social experiences we would build — an ability for people all around the world to connect, and for people to have social experiences. [If] you and I are friends, we would know when each other's birthdays were, we were able to go to events together, we were able to share each other's playlists, and a lot of good got done there.

But I think what we weren't good enough at doing is thinking in advance about the ways misuse could happen on a platform. And when we found things — and we did find things — we shut down that thing. So the specific example in Cambridge Analytica, of friends-of-friends sharing — we shut down that specific case in 2015.

But what's different now is that we're taking a much broader view and a much more proactive approach. A few weeks ago, when the Cambridge Analytica thing happened, we made a commitment that we would go through and find data uses that were potentially too risky, or even ones we didn't want to do. And we have been proactively bringing things to the surface, shutting things down, telling people, and we're going to continue to do that.

Let's talk about what you discovered with Cambridge Analytica. Some people will know that you have disclosed this week that it was 87 million people whose information was shared. The overwhelming majority of them without having given explicit permission.

That's not exactly right — let me explain. So in 2015, when Cambridge — when we received word that this researcher gave the data to Cambridge Analytica, they assured us it was deleted. We did not follow up and confirm, and that's on us — and particularly once they were active in the election, we should have done that. And again, that is on us.

What's happening now is, we don't know what data, if any, they still have, or how much it was. We're trying to do a forensic audit with them, but the UK government is doing theirs first and we had to stand down — they get priority. The 87 million is anyone whose data Cambridge Analytica might have accessed. We're being super-conservative and careful. This is anyone who might have been connected, or might have been connected to someone who connected to them.

So you still don't know what the number is.

We still don't know.

Well let me ask if you know something else: Were there other firms, political or otherwise, who used data in the same way that Cambridge Analytica did?

We don't know. What we announced when we talked about Cambridge Analytica is that we're doing a thorough investigation and an audit. That is ongoing, and as we find those, we're going to notify people as we did with Cambridge Analytica. But we're not just looking at apps — that, you know, I think is what we would have done before: we have an app problem, we look at apps. Now we're looking much more broadly.

That's why this week we shut down a number of use cases in other areas — in groups, in pages, in events — because those are other places where we haven't necessarily found problems, but we think that we should be more protective of people's data in a much more proactive way.

Talk to me as somebody who's been on Facebook for more than a decade, though. I think about Cambridge Analytica, I think about the fact that this company that I had no idea about was gathering perhaps my data, or somebody that I know, and using it in ways I'd have no idea about. There's an issue there with consent.

But even beyond consent, from the average person's point of view, don't Facebook and anybody who deals with Facebook essentially do the same thing Cambridge Analytica did? They gather lots of data about people and use it in ways that, whether we formally consent or not, we really don't understand.

One of the things we're very focused on is making sure you do understand how all of your information is used, and you do understand what information you've shared with Facebook.

We announced we're rolling out privacy shortcuts. The controls have been there, and most have been there for a long time — but you're right, they're hard to understand and hard to find. So this is a very simple way to see where those controls are, and control them, including ad preferences. We just updated something and made it easier — it's been there for a long time — called "Download Your Information," where you can look at every bit of information you've shared with Facebook.

And so you are right that the system's been too hard for people to understand, and we're taking very assertive steps to make it easier, simpler, clearer, so that the controls are in one place and people can take the steps they want to take.

But the business is still the same, right? You're still going to have lots and lots of information about me, in ways that might make me uncomfortable.

Well we want you to know all the information we have about you, we want you to know all the controls we have, and we want to make sure you're not uncomfortable.

So let's talk about ads. On ads, we are able to show targeted ads to someone, which means something you might be interested in. We do that ourselves — and then we don't pass any of your personal information to any advertiser, they just get aggregate reports. You can also opt out of different ways we use data. You can opt out of seeing ads from different advertisers through ads preferences.

And again, we do not sell data, ever.

One quick question, then I want to move on a little bit here —

Can I? I just — I wanted to finish. This is important.

Please, please — go ahead.

One of the questions you might ask about the business model is: why use data at all for ads? Why don't we just show everyone in the U.S. the same ads? Then we could do it without any data.

Oh, no, I understand why you do that.

But this is a really important thing — I think it's really important for your listeners to understand — which is that that's why small businesses can participate in this. If you just do big ads that go to everyone, only large advertisers can participate.

We have 6 million customers around the world — a lot in the U.S. Forty-two percent of the small businesses on Facebook in the U.S. are hiring because they're growing on Facebook. And small businesses create the majority of the job growth. So the targeting — which we do in a privacy-safe way — is a big chunk of why small businesses are growing, and that's why we think it's so important. We can do it and protect your privacy, but it's part of what's making small business — and our economy — grow.

One detail I want to check on then move on to a bigger thing: Regarding Cambridge Analytica, you said "we don't know" if there were other political entities that used information in that way. Do you mean to say you do not at this time know of a single other company or firm that used data in that way?

We don't even know what data they had or used. I don't know what they did, so I don't — we don't know any others, because we don't even know what they did.

I'm curious when you're talking with Mark Zuckerberg or whoever else you may talk with around this company, have you had moments when you've asked the question, "are we as a company too powerful?"

It's an important question — and people have that question about us and others, particularly at our size and scope. And we've had a lot of long and thoughtful conversations about what that means. We know that a lot of regulators have that question. We know that consumers around the world have that question —

Do you take it seriously or does it seem ridiculous to you?

Oh, we take it very seriously. We've always had a deep responsibility for people, but at our size and scope, with billions of people using our products, we have a very deep responsibility.

We're having conversations with regulators around the world, but we're not even waiting for regulation. The most likely regulation in the United States right now is the Honest Ads Act — which may or may not pass. We're not waiting for it. We built a tool that shows every ad that any page is running on Facebook. It's live in Canada. It will be live in the U.S. before the election.

And that's really important because that law is about transparency. We're not going to get dragged into doing it — we're doing the transparency now, ahead of that law, whether or not it happens.

Will your tool allow for something that some privacy advocates seem to want and advocate for, which is that people can simply opt out of having so much of their data saved and shared at all?

There are lots of ways to opt out of different data parts on Facebook — and again, we are rolling up those controls and putting them into privacy shortcuts. We're also looking carefully at the GDPR legislation in Europe. And the majority of those controls and settings we are going to make available around the world — they'll have slightly different formats, because some of it's specific to that law, but we are making those controls and settings available throughout the U.S. through this year as well.

Let's make this clear: Europeans have different standards that in some ways would seem stronger than the United States'; you're saying you're going to follow them everywhere. Is that correct?

We're gonna follow all the controls and settings. Not every exact one — I'll give you one example where we won't: In Europe the age of consent is 16, here it's 13. So that's one difference. But the fundamental principle, and most of the controls — I can opt out of ads preferences, I can opt out of this form of ads targeting, I can opt out of using [third-party developers' tool] Platform. Those we're going to find a way, in a local way so people understand them, to roll out everywhere.

Given that the Federal Trade Commission reached a consent agreement with Facebook in 2011 to better protect people's privacy, should you have taken these steps years ago?

Well we're in constant conversation with the FTC, and that consent decree was important, and we've taken every step we know how to make sure we're in accordance with it.

But the bigger answer is, should we have taken these steps years ago anyway? And the answer to that is yes. Like a very clear, a very firm, yes. We really believed in social experiences, we really believed in protecting privacy, but we were way too idealistic. We did not think enough about the abuse cases. And now we're taking really firm steps across the board.

So let's talk about another one that I'm sure is really important to the people listening: Let's talk about election interference. Right? In 2016 the Russian Internet Research Agency interfered in the election on our platform — and that was something we should have caught, we should have known about, we didn't. Now we've learned.

Just this week we announced that we've taken down another 270 pages and accounts. They were in Russian, mostly, targeted at Russia. And our answer to that is, these are Russian troll farms, this is deception, and there's no place for it on our platform or any other, anywhere in the world. We're not going to show it in Russia, we're not going to show it in the U.S. — and we're looking for others. That was something we didn't understand then, but we are focused on finding now.

Similarly in the recent Alabama election, the special Senate one, we found Macedonian scammers who looked to be financially motivated, who were going to put up what would have been fake information — and we got those down.

So we learn, we adjust, and we are very focused on making sure that we get this right going forward.

Are you prepared for the 2018 congressional elections, which are essentially upon us?

Yeah. We are doing everything we can to be prepared, and I can talk about some of those specific steps. Fake news — fake news is a really important part of what people are concerned about. We are really going after fake accounts. It turns out that a lot of people think of fake news as politically motivated, and a lot of it is — but even more of it is financially motivated.

Sure.

People are trying to write outlandish headlines, right? Get you to click, make money. So we've made sure that we've taken them out of the ability to monetize on Facebook, taken them out of the ability to show ads and make money. That's really important.

We've also done a lot more on fake news. We have a partnership now with the AP, set up in all 50 states, where we can quickly respond when something looks like it might be false. When something looks false — whether we find it ourselves, which is new, we're doing it proactively, or someone reports it to us as false — we rely on third-party fact-checkers.

And if they say it's false we're dramatically decreasing its distribution. We're letting people know "this is false" right before they post it; if you already posted it, we're going back and saying "you posted something our fact-checkers say is false, and we're giving you alternative facts," in the form of related articles right there.

We're also asking people broadly, what news sources do you trust? And we're going to show people more news from the news sources they trust as well as less news from others.

People say they trust The New York Times, they'll get more New York Times — is that what you're saying?

If people in the country broadly trust The New York Times more, they will see more from The New York Times.

Richard Blumenthal, Democratic senator, called upon Facebook to contact 126 million people who were believed to have been touched in some way by Russian disinformation, tell them how they were misinformed, and make sure that they know what's going on. Have you done that?

That tool is up. I believe there is a place you can go where people can see if they might have seen some of that IRA content, so we have made that —

Privacy advocates feel like that's not a very accessible tool. Can you actively reach out to people, and tell them "excuse me, I'd like to let you know that this is something that happened"?

Yeah, I mean...

If there's a consumer watchdog, I might get a letter in the mail telling me that something happened, and I'm owed a rebate by an insurance company. You could actively reach out to users, couldn't you?

Yes. And you know, we're Facebook — we believe that in our interface, reaching out to people is a good way to do that, and you're going to see that on Monday as well.

You're going to see more of that.

Yeah, Monday it's going to start rolling out.

But are you going to tell the 126 million people who they were, and say "this is the thing that you were misinformed about"?

So we're trying to reach people the most effective way possible. Some people say they want things in the mail, but some people would say "I've not looked at my mail in years."

Let me make it clear. I don't mean for you to send snail mail. What I mean for you is, are you going to reach out individually to these people? Which seems to be what the senator wants.

We are making sure the information is available in either QPs — so at the top of news feed — or in other ways that they can find it. We're very focused on doing that.

What do you think your company's role is as a publisher in this year's election and in the presidential election that's coming in a few years?

Well we certainly know that people want accurate information, not false news, on Facebook and we take that really seriously, and we just talked about some of the steps we're taking. We also want to make sure that there's no foreign interference.

We are also really taking very aggressive steps on ads transparency. I mentioned how you're going to be able to see any ad a page is running; we're also building an archive of political ads that will run going forward and build up over four years. So you'll always have, once it builds up, four years of data. For any political ad, you'll be able to see who ran it, who paid for it, how much they spent, and the demographics of who saw it. Again, industry-leading transparency.

Oh, so that these groups that track campaign financing can come to you and readily get lots and lots of information about how —

It's going to be available online for anyone to see. Anyone's going to be able to see it.

Because it's clear to you that in 2016, it's hard for anybody to know — or it was hard at the time for anybody to know — just how money was being spent, and by whom.

Well this hasn't happened in our industry, and that's why, again, we're not waiting for the regulation to happen to do this, we're doing it. Because we think that transparency is really important.

What scares you about your role in this democracy at the moment?

We have an important responsibility and we have a big role, because people use Facebook. And I think what really matters is that we learn from what's happened; security is an ongoing game. You build something, someone tries to get around it. This is going to keep happening, and we need to learn and iterate quickly.

I'm also very focused on keeping the good that happens on Facebook. I was in Houston earlier this week, and I met these two brothers, Nathan and Austin, and when Hurricane Harvey happened they jumped in their boat and they rescued people — including one elderly woman whom an ambulance couldn't get to, who they say might have died. Do you know how they found those people? Those people posted their information on Facebook. They said where they were publicly, to strangers.

Now I'm not saying every day is Hurricane Harvey, but the good that happens when people share all around the world — the small businesses, I visited many small businesses in Texas this week that are growing, and they'll tell you, just because of Facebook. And so we're focused on preserving the good that we believe in so deeply, while protecting people's information.

But what scares you?

We have a big responsibility. We have to get it right. We didn't foresee the interference in the 2018 election —

2016, you mean?

Sorry, 2016. Yeah, in 2016 when you thought about security for elections, what you thought about was hacking, and people stealing your e-mails and publishing them. This was a new form. We are now focused on that form, but we're increasingly trying to see around the corner and make sure that we know the next form.

And I think it's going to take all of us — we're working much more closely with the other tech companies, we're working closely with election commissions all around the world, which is really important. We're going to have to figure out what the next form of the next IRA is and get ahead of it.

Is there something of a contradiction here? In that you want to have a community — or many, many, many communities, really — you want to be democratic, you want input from people, but no matter how much you talk like that, the reality is it's a company and you make the rules. And it is hard to be democratic in that way when this company is so influential and so vast.

I think the decisions we make are important — and you're right that they have major impacts, and that's why we need to be held accountable for them. We're working hard at explaining them better, at being much more transparent, and especially at showing people what's happening on Facebook.

So again to ads transparency: One of the big questions that happened in the [2016] election was who advertised to whom, because things are targeted to different people. That becomes completely open and transparent.

And it's interesting — a lot of the things that journalists will find on our site, we built the tool for people to find them. And that's good! That's good. Because as we open up for more ads, people are going to find bad ads — they're going to find ads that go against our policy, and that's part of how we're going to be able to get those down and get them down faster.

I want to mention that we're in the middle of this corporate headquarters — Mark Zuckerberg strolled by a few minutes ago. I'm curious: Having known him as many years as you have, how has he changed as a leader and as an executive, since he hired you?

Well, Mark was 23 when he hired me, so he's certainly changed a lot on a personal level — he's gotten married, he's had two children. And obviously this company has grown greatly. There are a lot of things that have changed.

We've learned a lot. We've learned a lot from the mistakes we've made, from the steps we need to take to be much more proactive — and much more suspicious of what can be done.

But there are things about Mark that haven't changed at all that I deeply admire. Mark stands up and takes responsibility. You know he runs this company, and through this whole situation he has said "the buck stops here — I own it." And all of us who work for him — because a lot of those mistakes were made by us — have deeply respected that. He cares about people connecting. When I got home from Houston the other night I called him and I told him about the brothers I met and the other people I met, and he cares.

You know, the other day I met a woman named Natalie, she's from Little Rock. She did one of our birthday fundraisers — a brand new product we have — and she raised $4,000 for a local women's shelter. And she volunteers at that shelter, and it takes them $1,500 to rescue a woman from an abusive home. And when I talked to Mark about that, Mark has the same belief he's always had: there are really good things that happen when you bring people together. And that kind of commitment I think is really admirable.

When you say he cares about people connecting, that is really great. And yet that's one of the reasons that I wonder if what is great about Facebook is also the problem.

You probably know that there was a leaked memo from 2016 from a Facebook executive who said "we care so much about connecting people that even if we connected people who used our platforms to coordinate a terrorist attack, we're fine with that, because we're still just connecting people." That was 2016. You still believe that?

We never believed that. The person who wrote it, named Boz, never believed it — he's a provocative guy who was trying to spark debate. But Mark never believed it, I never believed it.

OK, so maybe it was hyperbole, but he was leaning in a direction that he did believe — that maybe you cared too much about this, and too little about other things.

Let's go to the example: There's no place for terrorism on our platform. We've worked really hard on this — 99 percent of the ISIS content we're able to take down now, we find before it's even posted. We've worked very closely with law enforcement all across the world to make sure there is no terrorism content on our site. And that's something we care about very deeply.

But what about the broader point? Essentially he was saying "the company's values are out of whack — we're interested in one really big, important thing," perhaps to the exclusion of other things.

Again, that memo was wrong, and he said he didn't mean it, and Mark and I certainly never agreed. We never only cared about one thing. We cared about social sharing, and we cared about privacy — that's why we put the controls in place. I think the balance was off, because we didn't foresee as many bad use cases — and that balance has shifted, and shifted hard now.

One or two other questions and then there will be a photograph and we'll let you go. I have no idea what the time is ... oh yeah, we're getting to be about that time.

Let me just ask — yeah, a couple more minutes and then we'll do this.

People in Silicon Valley talk so often about changing the world. Do you believe that this company has changed the world, and is there a way that it's changed the world for the worse?

It's such a good question. I believe we've done some really good things, and I believe we've made some really bad mistakes. I believe that every single day good things happen on this platform. And I believe that a bunch of bad things have happened on this platform that we need to do better at getting off.

You know, someone today is going to find their mother on Facebook, and someone today is gonna find a friend they had lost touch with. And someone today is going to put up a piece of content that's really hate content that we don't want on there, and we have to find it and get it down fast. All of that's true.

On that last one, are you comfortable being the censor? Which is effectively what you would have to be, wouldn't you?

We're trying to have very good community standards — we're open about what those community standards are all around the world, and we're going to get increasingly open about this.

We want to make sure people understand — you know, there's no place for terrorism, there's no place for hate, there's no place for bullying. We don't sell your data, ever; we don't give your information to advertisers. You're not allowed to put, you know, hate content on our site. With news, we rely on third parties — we don't believe we can be the world's fact-checkers — but that doesn't mean we don't have a big responsibility.

And I think in all of this, what we want to do is make the shift we need to make to be more proactive in the protection, so that we can protect something we really believe in and love, which is the sharing that happens on Facebook.

But I think you know what I'm asking you — you have people who want Facebook not to be allowing such manipulation. But at the same time there's somewhere a libertarian listening to us who is saying "I don't want a company to be Big Brother, because no matter how good they get at it sometime it's going to be abused."

And we try to be really careful about that. That's why we do have a lot of free expression on Facebook. While we'll take down things that are absolute hate, boy, there's a lot of stuff on Facebook that I don't like. But someone said it, and if you believe in free expression, you've got to let them say what they say.

The most important thing for accountability that we can do — because you're right, we are a company — is we can publish those standards, make them open. As Mark said the other day, we're working on a much more open appeals process so people can appeal those decisions. But we're also going to make sure that people understand what those standards are and can see things transparently.

Sheryl Sandberg, thanks very much.

Thank you for being with me.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Steve Inskeep is a host of NPR's Morning Edition, as well as NPR's morning news podcast Up First.