Futurist Mike Walsh says Facebook's fake news problem shows the need to balance humans and automation
The current furor over "fake news," while worrying in itself, could reveal valuable insights into the changing media industry, according to acclaimed futurist Mike Walsh.
"From the fake news issue, we potentially see the future of news, which is algorithms and human beings working together, where the job of humans is designing the parameters and boundary conditions that drive algorithms," Walsh told CNBC on the sidelines of the inaugural Global Entrepreneurship Community conference in Kuala Lumpur, Malaysia.
Walsh, a consultant who specializes in advising corporate leaders on dealing with disruptive technology, is a leading voice on designing businesses for the 21st century, with two books - "The Dictionary Of Dangerous Ideas" and "Futuretainment" - under his belt.
News outlets needed to strike a balance between delegating jobs to automation and to humans, Walsh said, using Facebook as an example.
In August, Facebook laid off the employees who curated popular news on the network's "trending topics" section, amid accusations that the workers' selections were biased against conservative U.S. news outlets. In their place, Facebook automated trending topics, leaving algorithms to select which stories appeared.
But that approach wasn't enough, Walsh warned. "Unless you have humans monitoring them, algorithms can lead to distorted results," he cautioned.
This was what happened to Facebook shortly after the decision to remove human curation of the "trending" list, as fake news stories, including one notable report that claimed Fox News anchor Megyn Kelly had been fired by the network, began surfacing on the section.
During the height of the U.S. presidential election campaign, inaccurate or entirely fabricated reports were widely circulated on digital platforms and promoted on Google through the search engine's algorithm. Many believe the phenomenon influenced the U.S. election result, placing pressure on outlets such as Facebook and Twitter — among the most popular networks for media consumption — to combat the issue.
Facebook founder and chief executive Mark Zuckerberg, who declines to identify Facebook as a media company, has denied this was the case, insisting that "this really isn't a problem," while Twitter co-founder and CEO Jack Dorsey said the issue was "complicated."
"We have a role and responsibility to make sure that people are seeing what they need to see and they can have easy conversations and really get to the truth, and that's complicated," Dorsey said.
The issue was brought into sharp focus this week when an American man named Edgar Welch opened fire in a Washington pizzeria after reading online content that claimed the restaurant was at the center of a child trafficking ring operated by Democratic presidential nominee Hillary Clinton and her campaign chief John Podesta.
The 28-year-old said he was conducting a "self-investigation" and wanted to rescue the child sex slaves he believed might be held in the restaurant.
Oxford Dictionaries captured the new media environment by choosing "post-truth" - relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief - as its Word of the Year for 2016.
"The future of news may not be humans collecting news, it may be an algorithm that determines truths from post-truths," Walsh said, adding, "Google's probably trying to figure out that problem now."
Just hours after Walsh spoke to CNBC, The Verge reported that this month the U.S. Patent and Trademark Office published Facebook's application for Patent 0350675: "systems and methods to identify objectionable content."