Facebook's attempt to remove bias from its Trending feature by automating it may not solve the problem, according to one artificial intelligence start-up.
"It's an interesting challenge, because it speaks to the problem of our algorithms," said Byron Galbraith, chief data scientist of Talla. "The data that drives the algorithm is inherently biased."
The problem with using a formula to curate news is that it relies on the data it is given, Galbraith said. For example, studies have shown that current algorithms tend to associate feminine names with topics around the home or with occupations like nursing or teaching, he said, while connecting masculine names with professions like law or medicine. Neither slant may ultimately reflect the interests of an individual reader.
"Even though the math part of the data is unbiased, if the data encodes these social biases, it's hard to get away from that," Galbraith said.
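The bias Galbraith describes is typically measured with word embeddings, where each word is a vector and cosine similarity captures how strongly the model associates two words. The sketch below uses tiny hand-made vectors (illustrative values only, not real embeddings, and not Facebook's system) to show how such an association would surface:

```python
# Toy illustration of measuring word-embedding bias. The vectors are
# hand-made to mimic the pattern the studies describe; real embeddings
# are learned from large text corpora and have hundreds of dimensions.
import math

def cosine(a, b):
    """Cosine similarity: 1.0 means strongly associated, 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical 3-dimensional vectors encoding the stereotyped association.
embeddings = {
    "emily":  [0.9, 0.1, 0.2],   # stereotypically feminine name
    "john":   [0.1, 0.9, 0.2],   # stereotypically masculine name
    "nurse":  [0.8, 0.2, 0.3],
    "lawyer": [0.2, 0.8, 0.3],
}

print(cosine(embeddings["emily"], embeddings["nurse"]))   # high similarity
print(cosine(embeddings["emily"], embeddings["lawyer"]))  # noticeably lower
```

Because the vectors are learned from human-written text, the "math part" is neutral while the associations it encodes are not, which is exactly the difficulty Galbraith points to.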
On Friday, Facebook announced in a blog post that it would be automating its Trending section. The change removes the need for people to write descriptions for trending topics.
Instead of seeing a topic and a sentence about why it was newsworthy, users will now see a key term and the number of people talking about that topic on Facebook. Hovering over the phrase will show a description pulled from one of the articles being shared on the social network. Clicking on a topic will show a list of news articles and posts shared by users about the issue.
"It will help distance Facebook from accusations of bias," said Matt Lang, a senior digital strategist at digital agency Rain.
People are up in arms partly because Facebook, long regarded as a platform that simply serves up content, is now being seen as a curator of it, Rain's Lang said. As users spread fictitious stories and exaggerations through their personal news feeds, the social network will need quality assurance from human teams as it "trains" the technology to figure out what is reliable and what is not.
"I think it's early days, and they have work to do to clean it up," Lang said. "The sources themselves, which are the primary source for Trending content, those publishers and sites have bias. It's always going to have to involve some element of oversight to make sure the publication that is trending is not overly biased."
Already, this past weekend a false story about Megyn Kelly being fired from Fox News began circulating. It was accepted by Facebook as a legitimate topic, but was ultimately discovered to be a hoax.
The only way to counter these problems is to employ people to skew the equations to be more "balanced," said Talla's Galbraith.
"You can let the algorithm have its bias; we all know it's there," Galbraith said. "But, you can do an additional step. If you're trying to suggest news articles and 10 liberal-leaning articles show up, the algorithm may create a model that makes everything liberal. You can have people throw away some of the liberal articles or add conservative ones to the pool, so it collects a balanced number."
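The human-in-the-loop step Galbraith describes can be sketched as a rebalancing pass over the algorithm's suggestions. The function and field names below are hypothetical, as is the assumption that each article carries a leaning label; this is an illustration of the idea, not Facebook's or Talla's implementation:

```python
# Hypothetical rebalancing pass: if the algorithm's suggestions skew one
# way, trim the over-represented side and backfill the other side from a
# reserve pool until both leanings are equally represented.
from collections import defaultdict

def rebalance(suggestions, backfill, per_side):
    """Keep at most `per_side` articles per leaning, drawing extras from
    `backfill` for any under-represented leaning."""
    buckets = defaultdict(list)
    for article in suggestions:
        buckets[article["leaning"]].append(article)
    for article in backfill:
        buckets[article["leaning"]].append(article)
    balanced = []
    for leaning in ("liberal", "conservative"):
        balanced.extend(buckets[leaning][:per_side])
    return balanced

# The skewed case from the quote: 10 liberal suggestions, none conservative.
suggested = [{"title": f"L{i}", "leaning": "liberal"} for i in range(10)]
reserve = [{"title": f"C{i}", "leaning": "conservative"} for i in range(5)]
pool = rebalance(suggested, reserve, per_side=3)
# pool now holds three articles of each leaning
```

The design choice here matches the quote: the underlying model keeps its bias, and a cheap post-processing step imposed by people restores balance in what readers actually see.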
Facebook told CNBC that people are still involved in the Trending feature, but more in a guidance role. Rather than editorial skills, the team is focused on operations and technical work.
"There are still people involved in this process to ensure that the topics that appear in Trending remain high-quality — for example, confirming that a topic is tied to a current news event in the real world," a spokesperson wrote. "The topic #lunch is talked about during lunchtime every day around the world, but will not be a trending topic. These changes mean that we no longer need to do things like write topic descriptions and short story summaries since we're relying on an algorithm to pull excerpts directly from news stories. Our team will still strictly follow our guidelines, which have been updated to reflect these changes."
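The spokesperson's #lunch example reflects a standard approach to trend detection: a topic counts as trending only when its current mention rate spikes well above its own historical baseline, not when it is merely popular. The sketch below illustrates that idea with a simple z-score test; the threshold and data are hypothetical, and this is not Facebook's actual method:

```python
# Hypothetical spike detection: compare a topic's current hourly mention
# count to its own recent history. Perennially popular topics (like
# "#lunch") have a high but stable baseline, so they never spike.
from statistics import mean, stdev

def is_trending(history, current, z_threshold=3.0):
    """history: recent hourly mention counts; current: this hour's count.
    Returns True only if `current` is a large deviation from the baseline."""
    baseline = mean(history)
    spread = stdev(history) or 1.0  # guard against zero variance
    return (current - baseline) / spread > z_threshold

lunch_history = [500, 480, 510, 495, 505, 490]  # busy every day, stable
print(is_trending(lunch_history, 512))           # routine volume: not trending

news_history = [5, 3, 4, 6, 2, 5]                # normally quiet topic
print(is_trending(news_history, 300))            # sudden spike: trending
```

Under this framing, the human reviewers Facebook describes sit on top of the detector, confirming that a flagged spike corresponds to a real-world news event.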
The company said the decision to automate the Trending feature was an effort to make it available to as many people as possible. Relying on humans to summarize news topics would not be efficient. Instead, using an algorithm could make the process faster and allow more stories to be included.
Earlier, in May, Gizmodo reported that Facebook's Trending feature was skewed because the "news curators" it employed let their personal preferences shape which news stories and media outlets were featured. Some employees said they were told not to use conservative news sources, while others said they artificially injected topics that were not actually popular, Gizmodo said.