Facebook CEO Mark Zuckerberg is testifying before Congress this week to discuss the company's data privacy practices, as well as issues like fake news and Russian tampering in the 2016 presidential election.
According to a transcript of his prepared testimony, he will admit that Facebook has made mistakes on these issues, including privacy: "It's not enough to give people control of their information, we have to make sure developers they've given it to are protecting it too." He will also speak specifically about the Cambridge Analytica scandal, in which a developer created a Facebook quiz that he used to build detailed user profiles, which were later passed to the political data analytics firm in violation of Facebook's policies.
But this is not a new issue. Zuckerberg has been thinking and talking about privacy ever since he built the predecessor to Facebook, called Facemash, at Harvard back in 2003. When users of the service complained that their pictures were being used without permission, Zuckerberg took the site down and apologized.
Since then, a fairly consistent pattern has emerged. Zuckerberg and other Facebook employees rarely talk about "privacy," but rather operate from the assumption that people want to share information, as long as they can control how it's used. And as was the case with Facemash, sometimes the company goes too far -- in which case Zuckerberg apologizes, Facebook makes changes, and life goes on.
Here's a comprehensive look at what Zuckerberg has said about privacy and controlling data, according to CNBC research and Michael Zimmer's "Zuckerberg files."
Harvard's student newspaper interviewed Zuckerberg in 2003 about his pre-Facebook project, facemash.com, which asked students to rate the attractiveness of their classmates. Pictures were scraped from housing websites and uploaded.
The website quickly caused outrage and was permanently taken down. A then-19-year-old Zuckerberg said:
"I don't see how it can go back online. Issues about violating people's privacy don't seem to be surmountable. The primary concern is hurting people's feelings. I'm not willing to risk insulting anyone."
In an apology letter, he wrote, "I hope you understand, this is not how I meant for things to go, and I apologize for any harm done as a result of my neglect to consider how quickly the site would spread and its consequences thereafter...I definitely see how my intentions could be seen in the wrong light."
Zuckerberg built the first version of Facebook from the ashes of Facemash. In 2004, with just a few hundred users signed up, Zuckerberg made sure to clarify that the extensive search capabilities were countered by privacy options for members who didn't want everyone to be able to look up their information.
"You can limit who can see your information, if you only want current students to see your information, or people in your year, in your house, in your classes. You can limit a search so that only a friend or a friend of a friend can look you up. People have very good control over who can see their information."
Early Facebook investor Jim Breyer interviewed Mark Zuckerberg in 2005, during which Zuckerberg responded to an audience question about his approach to the ethical and legal implications of monetizing Facebook.
"But in terms of the ethical implications of creating this? I mean what I kind of saw this as is enabling out for your flow of information.... We're not asking anyone to put anything out there that they wouldn't be comfortable putting out. We're not forcing anyone to publicize any information about themselves. We give people pretty good control over their privacy. I mean you can make it so that no one can see anything, or no one can see your profile unless they're your friend. And I think that we encourage people to use that stuff. We point people to it."
You can watch the full presentation online; Zuckerberg's comments on ethics start around 41:20.
Zuckerberg delivered a guest lecture moderated by Professor Michael D. Smith at Harvard in 2005, where he talked about Facebook's expansion from Harvard to other schools. The idea of privacy and data usage came up:
"We have a lot of stuff that we put in place to make sure that people don't aggregate information off of Facebook. Obviously, you can't see profiles of people at other schools. But also, if you try to view a lot of profiles, it picks up that you're just viewing an abnormal number of profiles. By analyzing user activity, we've built Bayesian filters that let us pick out abnormal activity really quickly and just show very limited information to those users .... We're obviously really sensitive to people's privacy."
Business Insider reported an exchange between Zuckerberg and a friend that occurred shortly after Zuckerberg launched The Facebook in his dorm room:
Zuck: Yeah so if you ever need info about anyone at Harvard
Zuck: Just ask.
Zuck: I have over 4,000 emails, pictures, addresses, SNS
[Redacted Friend's Name]: What? How'd you manage that one?
Zuck: People just submitted it.
Zuck: I don't know why.
Zuck: They "trust me"
Zuck: Dumb f--ks.
The New Yorker later reported on the same message, and said Zuckerberg later told investor Jim Breyer that he regretted sending it and similar ones.
In 2007, Facebook faced its first big privacy flap after it had expanded beyond universities to the general population. The company introduced Beacon, which allowed third-party sites to publish user purchases unless users opted out. That meant that some users unwittingly broadcast to their friends information like their movie purchases from Blockbuster or plane ticket purchases from Travelocity.
Zuckerberg addressed it in a Facebook blog post that has since been archived:
"We've made a lot of mistakes building this feature, but we've made even more with how we've handled them. We simply did a bad job with this release, and I apologize for it. . . . Instead of acting quickly, we took too long to decide on the right solution. I'm not proud of the way we've handled this situation and I know we can do better....
We were excited about Beacon because we believe a lot of information people want to share isn't on Facebook, and if we found the right balance, Beacon would give people an easy and controlled way to share more of that information with their friends.
But we missed the right balance. At first we tried to make it very lightweight so people wouldn't have to touch it for it to work. The problem with our initial approach of making it an opt-out system instead of opt-in was that if someone forgot to decline to share something, Beacon still went ahead and shared it with their friends."
Zuckerberg addressed sharing and public posts at the Web 2.0 Summit in 2008:
"I would expect that next year, people will share twice as much information as they share this year, and next year, they will be sharing twice as much as they did the year before," he said. "That means that people are using Facebook, and the applications and the ecosystem, more and more."
His remarks were later dubbed "Zuckerberg's Law."
Zuckerberg spoke to Wired in 2009 about the creation of public profiles:
"Just a couple of weeks ago we announced this open privacy setting where prior to that it was impossible for someone to take their profile and say that they wanted it to be open. Now they can do that. They can say it's open to everyone. And what I would just expect is that as time goes on, we're just going to keep on moving more and more in that direction....Just from the launches that we've had, it's pretty clear that we haven't mastered the art of moving people along in terms of change, making these changes; but I think we're getting better at it."
Zuckerberg responded to a question about "pushing the envelope" on privacy during an award speech in 2010:
"It's interesting looking back, right? When we got started — just a night in my dorm room at Harvard — the question a lot of people asked is, 'Why would I want any information on the internet at all? Like, why would I want to have a website?' And then, in the last five or six years, blogging has taken off in a huge way, and all these different services that have people sharing more information. And people have really gotten comfortable not only sharing more information — and different kinds — but more openly with more people. And that social norm is just something that's evolved over time. And we view it as our role in the system to constantly be innovating and updating what our system is, to reflect what the current social norms are. A lot of companies would be trapped by the conventions, and their legacy of the systems that they've built. Doing a privacy change for 350 million users is really it's not about the type of thing that a lot of companies would do. But I think we view that is a really important thing to always kind of keep a beginner's mind and think, 'What would we do if we were starting about the company now, and the site now? ' We decided that these would be the social norms now and we just went for it."
Zuckerberg told Time in 2010:
"The way that people think about privacy is changing a bit ... What people want isn't complete privacy. It isn't that they want secrecy. It's that they want control over what they share and what they don't."
Zuckerberg wrote an op-ed in The Washington Post in 2010, in which he outlined Facebook's principles for privacy:
"We have also heard that some people don't understand how their personal information is used and worry that it is shared in ways they don't want. I'd like to clear that up now. Many people choose to make some of their information visible to everyone so people they know can find them on Facebook. We already offer controls to limit the visibility of that information and we intend to make them even stronger.
Here are the principles under which Facebook operates:
— You have control over how your information is shared.
— We do not share your personal information with people or services you don't want.
— We do not give advertisers access to your personal information.
— We do not and never will sell any of your information to anyone.
— We will always keep Facebook a free service for everyone."
Zuckerberg appeared on NPR in 2010 saying:
"There's this false rumor that's been going around which says that we're sharing private information with applications and it's just not true. The way it works, is ... if you choose to share some information with everyone on the site, that means that any person can go look up that information and any application can go look up that information as well. ... But applications have to ask for permission for anything that you've set to be private."
He discussed the "serendipitous connections" that Facebook enables on stage in 2010:
"Privacy is very important to us. I think there are some misperceptions. People use Facebook to share and to stay connected. You don't start off on Facebook being connected to your friends, you've got to be able to find them. So having some information available broadly is good for that. Now, there have been misperceptions that we're trying to make all information open, but that's false. We encourage people to keep their most private information private. But some of the most basic information, we suggest that people leave public.
"We recommend settings for people, and we asked that everyone review their settings and make a choice about what they wanted them to be. We didn't simply change them….The big feedback that we got was that the privacy settings had become too complex. Over the years we'd just accumulated many, many settings.
"More than 50 percent of Facebook users have changed their privacy settings at one point. That demonstrates that our users understand the tools, he says. 'To me, that's a signal that on the whole, we're getting it right and giving people the control they want.'
Zuckerberg told The New Yorker in 2010:
"If I could choose to share my mobile-phone number only with everyone on Facebook, I wouldn't do it. But because I can do it with only my friends I do it.
A lot of people who are worried about privacy and those kinds of issues will take any minor misstep that we make and turn it into as big a deal as possible," he said. "We realize that people will probably criticize us for this for a long time, but we just believe that this is the right thing to do."
Zuckerberg shared a Facebook note to his personal page in 2011 after Facebook signed a consent decree with the FTC governing its use of personal data. He said:
"I founded Facebook on the idea that people want to share and connect with people in their lives, but to do this everyone needs complete control over who they share with at all times....
This idea has been the core of Facebook since day one. When I built the first version of Facebook, almost nobody I knew wanted a public page on the internet. That seemed scary. But as long as they could make their page private, they felt safe sharing with their friends online. Control was key. With Facebook, for the first time, people had the tools they needed to do this. That's how Facebook became the world's biggest community online. We made it easy for people to feel comfortable sharing things about their real lives....
I'm the first to admit that we've made a bunch of mistakes. In particular, I think that a small number of high profile mistakes, like Beacon four years ago and poor execution as we transitioned our privacy model two years ago, have often overshadowed much of the good work we've done... not one day goes by when I don't think about what it means for us to be the stewards of this community and their trust."
On the decree and the hiring of chief privacy officers, Zuckerberg said:
"[T]his means we're making a clear and formal long-term commitment to do the things we've always tried to do and planned to keep doing -- giving you tools to control who can see your information and then making sure only those people you intend can see it .... As a matter of fact, privacy is so deeply embedded in all of the development we do that every day tens of thousands of servers worth of computational resources are consumed checking to make sure that on any webpage we serve, that you have access to see each of the sometimes hundreds or even thousands of individual pieces of information that come together to form a Facebook page....
We do privacy access checks literally tens of billions of times each day to ensure we're enforcing that only the people you want see your content. These privacy principles are written very deeply into our code. .... We will continue to improve the service, build new ways for you to share and offer new ways to protect you and your information better than any other company in the world."
When Facebook went public in 2012, the company wrote to shareholders:
"[W]e hope to rewire the way people spread and consume information. We think the world's information infrastructure should resemble the social graph — a network built from the bottom up or peer-to-peer, rather than the monolithic, top-down structure that has existed to date. We also believe that giving people control over what they share is a fundamental principle of this rewiring."
In response to reports about the PRISM surveillance program and other revelations from former NSA contractor Edward Snowden, Zuckerberg wrote:
"When governments ask Facebook for data, we review each request carefully to make sure they always follow the correct processes and all applicable laws, and then only provide the information if it is required by law. We will continue fighting aggressively to keep your information safe and secure.
We strongly encourage all governments to be much more transparent about all programs aimed at keeping the public safe. It's the only way to protect everyone's civil liberties and create the safe and free society we all want over the long term."
In early March 2014, reports surfaced that the National Security Agency posed as Facebook in a controversial surveillance program. Zuckerberg responded to the allegations by telling users that he was "so confused" and frustrated that the U.S. government might have undermined his security engineers.
"To keep the internet strong, we need to keep it secure. That's why at Facebook we spend a lot of our energy making our services and the whole internet safer and more secure. We encrypt communications, we use secure protocols for traffic, we encourage people to use multiple factors for authentication and we go out of our way to help fix issues we find in other people's services....
Unfortunately, it seems like it will take a very long time for true full reform. So it's up to us -- all of us -- to build the internet we want. Together, we can build a space that is greater and a more important part of the world than anything we have today, but is also safe and secure. I'm committed to seeing this happen, and you can count on Facebook to do our part."
Zuckerberg spoke about privacy on the company's earnings call in July 2014:
"I think something that's misunderstood about Facebook. One of the things that we focused on the most is creating private spaces for people to share things and have interactions that they couldn't have had elsewhere. So, if you go back to the very beginning of Facebook, rewind 10 years, I mean there were blogs and things where you could be completely public and there were e-mails, right? So, you could circulate something completely privately. But there was no space where you could share with just your friends, right? I mean it wasn't a completely private experience, but it's not completely public and it's 100 or 150 of the people that you care about.
And creating that space which was a space that had the kind of privacy that no one had ever seen before was what enabled and continues to enable the kind of interactions and the content that people feel comfortable sharing in this network that don't exist in other places in the world.
So, we're constantly looking for new opportunities to create new dynamics like that and open up new different private spaces for people where they can then feel comfortable sharing and having the freedom to express things that you otherwise wouldn't be able to."
Reposting a 2016 comment from subsidiary WhatsApp, Zuckerberg said: "Facebook stands with many technology companies to protect you and your information."
The original post, written by WhatsApp cofounder Jan Koum, said:
"The purpose of security is to safeguard privacy. Billions of people share their most personal, intimate information using services like ours, and they expect all of us to keep it safe from criminals and other bad guys. Asking a single company to undermine the security of its product for an investigation threatens the security of all of us in the long run.
Today, WhatsApp and other companies are asking a U.S. court to overturn an order that would require Apple to weaken the security of its product. We are proud to stand together to demonstrate how these efforts go beyond what the law allows and how they compromise the values upon which our country is built."
In a long 2017 post about Facebook's evolving mission, Zuckerberg wrote:
"As we discuss keeping our community safe, it is important to emphasize that part of keeping people safe is protecting individual security and liberty. We are strong advocates of encryption and have built it into the largest messaging platforms in the world — WhatsApp and Messenger. Keeping our community safe does not require compromising privacy. Since building end-to-end encryption into WhatsApp, we have reduced spam and malicious content by more than 75%.The path forward is to recognize that a global community needs social infrastructure to keep us safe from threats around the world, and that our community is uniquely positioned to prevent disasters, help during crises, and rebuild afterwards. Keeping the global community safe is an important part of our mission — and an important part of how we'll measure our progress going forward."
Zuckerberg posted a lengthy statement to his Facebook page on Wednesday following the misuse of personal data by the data analytics firm Cambridge Analytica.
He said, in part, "We have a responsibility to protect your data, and if we can't then we don't deserve to serve you."
Watch: Mark Zuckerberg's 2004 interview on CNBC