Chamath Palihapitiya: 'Anybody that pretends that they know the future is fundamentally either lying or stupid'

Key Points
  • Palihapitiya got his big break at Facebook.
  • The venture capitalist said that recent reports of fake news only "scratch the surface" of the unintended consequences that will be created by the tech industry.
  • Palihapitiya said he plans to spend the next 40 years or so getting better at anticipating the downsides of new technologies.

Outspoken venture capitalist Chamath Palihapitiya said that there has to be regulation to deal with the unintended consequences of technological disruption.

In an interview at the recent Black Enterprise TechConnext event, Palihapitiya told CNBC's Jon Fortt that, until now, disruptive tech companies have not grappled with their effect on "disrupted" industries or with phenomena like "fake news."

"I want to be clear, we are making a lot of this stuff up as we go along, and anybody that pretends that they know the future is fundamentally either lying or stupid," Palihapitiya said. "We are going to uncover all kinds of unintended consequences for the things that we built. ... We are only starting to scratch the surface now."

Palihapitiya got his big break at Facebook, where he led a team that he says collected data to replace emotional "lore" with fact-based decision-making.

"Humans are telling you, in passive and active ways every day, what's in their heart and mind," Palihapitiya said. "And when you collect those signals and you learn, you can become really good at giving them what they want. But then you also become really good at understanding their behavior. And now, unfortunately, in some rare-edge cases, in manipulating it."

Palihapitiya said people are just now coming to terms with how complicated and nuanced technology products are.

"As much as we all believe we're individuals — we are — we all act within a range of outcomes that frankly, can be modeled mathematically and be highly predictable," Palihapitiya said. "When you go to Google and start typing in a few letters, and it instantly knows, you must think to yourself, 'Is this the same search result that everybody else gets?' And the answer is no. And then you must think to yourself, 'Well, how creepy is it that Google knew what I was thinking?'"

Palihapitiya said that advances in data collection might help companies predict diseases and eradicate them using gene editing, for example. But that same technology could invite other types of abuse.

"This is where I think you get this intersection of governance and politics and technology that we've never had to contend with before," he said.

Palihapitiya's company is aptly named Social Capital, founded on the idea that data can create its own powerful narratives, just as a "visionary" might. His team has invested in companies like Slack, Box and Bustle, and recently launched an unusual effort to take at least one yet-to-be-named start-up public under the Social Capital brand.

He said he plans to spend the next 40 years or so getting really good at anticipating the downsides of new technologies and building fail-safe measures into future products.

"What I'm trying to do is say, 'OK, on the one hand, can we build the next great chip for machine learning? Yes. But can we also help the company that employs hundreds of thousands of people in the world [know] how to leverage that technology to keep those employees fully employed? That's a great challenge'," he said.