Facebook's troubles with fake news and faulty metrics might show the limits of algorithms, and the advantage of humans, when it comes to nuanced judgment calls, one industry watcher said.
"I sometimes wonder whether humanity can survive the internet," said Walter Isaacson, president and CEO of the Aspen Institute and Steve Jobs biographer. "It's been such a wonderful 20-year, 30-year run on the internet, and now you see the proliferation of fake news, hate sites, that sort of thing."
Critics have homed in on Facebook's role in distributing misinformation. After analyzing 1,000 posts, BuzzFeed News found that 38 percent of posts on hyperpartisan right-wing Facebook pages were a mixture of true and false or mostly false, compared with 19 percent of posts on hyperpartisan left-wing pages.
Facebook's Mark Zuckerberg has wavered on the social media platform's role in swaying the recent presidential election, calling the idea "pretty crazy." But internally, Facebook employees have reportedly developed tools to stem the spread of false news and have organized a renegade task force. And Facebook and Google have vowed to cut off fake news sites from advertising tools.
After Google surfaced false results following the election, CEO Sundar Pichai told the BBC that "there should be no situation by which fake news gets distributed."
Having humans vet which stories to feature could help, as Yahoo and Facebook have done in the past, Isaacson told CNBC's "Squawk Alley" on Wednesday. Another solution is to have algorithms favor stories that are widely "pointed to," as Google does, Isaacson said.
"It sounds very reasonable until we try to figure out how to do that," Isaacson said. "As much as we think that machine learning and artificial intelligence and algorithms can sort out the wheat from the chaff, it's actually something that human judgment is better at. I would think we might want to go back to having some humans in the loop."
Still, that solution also has serious drawbacks. Facebook axed human-written descriptions from its "trending topics" after facing accusations of suppressing conservative news.
"This is not one with an easy solution," Isaacson said. "There are all sorts of shades of truth, and frankly, I don't want anybody, whether it's Facebook or CNBC, telling me this is the only news I can hear. I want to have a wide range of choices."
Meanwhile, Facebook, the No. 2 power in digital advertising, according to eMarketer, has struggled to correct errors in its measurements of its reach. The company announced on Wednesday that it would open up more third-party verification of its data after the discovery of a bug.
Isaacson defended Facebook, which also reported a bug in its video metrics earlier this year.
"This does not look like it was an intentional mistake," Isaacson said. "It certainly didn't affect ad rates at all, ... so I think that they're trying to get it right and we're in a new age here, and they're being open and honest. They're the ones who revealed this information."
The pressure on Facebook comes as it faces looming competition from ephemeral messaging app Snapchat. Snapchat's parent company, Snap, confidentially filed for an IPO that could value it at $20 billion to $25 billion.
"It's much broader than just quick conversations that disappear," Isaacson said. "To me, the ultimate test of a tech company like this is not just, 'Do you have a cool service,' but 'Do you have a platform upon which other people can build good products and good services?' That's, of course, what made the iPhone so great, ... it became a platform. I think what they're doing with Snap now, is making it a platform."
Over the next two months, Snap will show investors whether it can really execute on great new products and prove it is more than a flash in the pan, Isaacson said.
"Customers are going to value knowing what's true or not, and eventually it will sort out," Isaacson said.