Days after news broke that a political analysis firm with ties to President Donald Trump’s campaign had improperly gained access to data from 50 million Facebook users, Facebook CEO Mark Zuckerberg has broken his silence on the situation. Zuckerberg’s media appearances come as lawmakers in the United States and the U.K. are demanding he testify on the matter, while some are calling for Facebook to be more tightly regulated.
In various interviews, the Facebook founder discussed his plans to better protect users’ data moving forward, why his company didn’t take stronger and swifter action against Cambridge Analytica, and more.
Below are some of Zuckerberg’s key comments about the Cambridge Analytica scandal.
Speaking to Wired, Zuckerberg discussed his views on how Facebook might best be regulated in the future:
“There are some really nuanced questions, though, about how to regulate which I think are extremely interesting intellectually. So the biggest one that I’ve been thinking about is this question of: To what extent should companies have a responsibility to use AI tools to kind of self-regulate content? . . . I think there’s this really interesting question of: Now that companies increasingly over the next five to 10 years, as AI tools get better and better, will be able to proactively determine what might be offensive content or violate some rules, what therefore is the responsibility and legal responsibility of companies to do that? That, I think, is probably one of the most interesting intellectual and social debates around how you regulate this. I don’t know that it’s going to look like the US model with Honest Ads or any of the specific models that you brought up, but I think that getting that right is going to be one of the key things for the internet and AI going forward.”
In a CNN interview, Zuckerberg addressed his stance on testifying before Congress or other bodies:
“The short answer is I’m happy to if it’s the right thing to do . . . What we try to do is send the person at Facebook who will have the most knowledge. If that’s me, then I am happy to go.”
In that same CNN interview, Zuckerberg discussed plans to alert people who may have been affected by the Cambridge Analytica scandal:
“One of the most important things that I think we need to do here is to make sure we tell everyone whose data was affected by one of these rogue apps. And we’re going to do that. We’re going to build a tool where anyone can see if their data was a part of this. . . . We may not have all of the data in our system today, so anyone whose data might have been affected by this, we’re going to make sure that we tell. And going forward when we identify apps that are similarly doing sketchy things, we’re going to make sure that we tell people then too.”
Speaking with Recode, Mark Zuckerberg discussed the company’s failure to foresee this scenario even though Facebook’s data was long open to outside developers:
“You know, frankly, I just got that wrong. I was maybe too idealistic on the side of data portability, that it would create more good experiences. And it created some, but I think what the clear feedback was from our community was that people value privacy a lot more. And they would rather have their data locked down and be sure that nothing bad will ever happen to it than be able to easily take it and have social experiences in other places.”
In the same interview, Zuckerberg also talked about his role in moderating and policing content and activity on Facebook:
“I feel fundamentally uncomfortable sitting here in California at an office, making content policy decisions for people around the world. . . I have to, because [I lead Facebook], but I’d rather not. . . But I just wish that there were a way. . . a process where we could more accurately reflect the values of the community in different places. And then in the community standards, have that be more dynamic in different places. But I haven’t figured it out yet.”
Zuckerberg also discussed with Recode how Facebook plans to crack down on outside apps:
“So we can get a sense of what are reputable companies, what are companies that were doing unusual things. . . Like, that either requested data in spurts, or requested more data than it seemed like they needed to have. And anyone who either has a ton of data or something unusual, we’re going to take the next step of having them go through an audit. And that is not a process that we can control, they will have to sign up for it. But we’ll send in teams, who will go through their servers and just see how they’re handling data. If they still have access to data that they’re not supposed to, then we’ll shut them down and notify. . . and tell everyone whose data was affected.”
In an interview with The New York Times, the Facebook founder talked about how he intends to ramp up the company’s security efforts:
“One of the important things we’ve done is, we want to unify all of our security efforts. . . One of the big things we needed to do is coordinate our efforts a lot better across the whole company. It’s not all A.I., right? There’s certainly a lot that A.I. can do, we can train classifiers to identify content, but most of what we do is identify things that people should look at. So we’re going to double the amount of people working on security this year. We’ll have more than 20,000 people working on security and community operations by the end of the year, I think we have about 15,000 now. So it’s really the technical systems we have working with the people in our operations functions that make the biggest deal.”
He also discussed whether or not Facebook would consider rethinking its advertising-based revenue model given the risks presented by sharing data with third parties:
“That’s certainly something we’ve thought about over time. But I don’t think the ad model is going to go away, because I think fundamentally, it’s important to have a service like this that everyone in the world can use, and the only way to do that is to have it be very cheap or free.”
In that same Times interview, Zuckerberg addressed whether he’s worried people will abandon Facebook, as some have threatened to do:
“I don’t think we’ve seen a meaningful number of people act on that, but, you know, it’s not good. I think it’s a clear signal that this is a major trust issue for people, and I understand that. And whether people delete their app over it or just don’t feel good about using Facebook, that’s a big issue that I think we have a responsibility to rectify.”