As Google gears up for the move to voice-powered devices, the company is tweaking its search engine to give users the best answer quickly, even if it's not exactly sure what they're asking.
Google said on Wednesday that it's rolling out a new format for "featured snippets" that will try to answer multiple interpretations of a vague search query. Featured snippets are the boxed results that Google puts at the top of the page, based on an algorithmic determination of the best answer to a query. Try searching "is glitter biodegradable" or "is chocolate healthy" to see what featured snippets look like.
While the new snippets technology is first coming to mobile searches, the long-term push is into voice.
When a person asks the Google Home smart speaker a question, the Assistant pulls an answer from featured snippets or its Knowledge Graph (a database of facts that Google has built from trusted sources like Wikipedia). As Google goes up against Amazon's Echo and Apple's HomePod, those answers need to be precise, because a voice user doesn't have a list of blue links to choose from.
"Obviously voice is a very important surface for us," said Emily Moxley, the director of product management for featured snippets, in an interview. "It's the reason why we're doubling down on making sure that we're providing high-quality, authoritative answers from Knowledge Graph as well as featured snippets."
Google's new "multi-faceted snippets" will provide several actionable answers to a broad query. A search for "garden needs full sun?" will lead to two different answers, one for "What garden plants need full sun?" and another for "What counts as full sun?"
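The idea can be pictured as a toy lookup: a vague query fans out into a few sub-questions, each paired with its own answer passage. This is purely an illustrative sketch, the facet table, function name, and passages below are invented for this example; Google's actual system relies on learned language models, not a hand-built dictionary.

```python
# Hypothetical sketch of multi-faceted snippet lookup. The FACETS table and
# its passages are invented for illustration and are not Google's data.
FACETS = {
    "garden needs full sun?": [
        ("What garden plants need full sun?",
         "Vegetables such as tomatoes and peppers want six or more hours of direct sun."),
        ("What counts as full sun?",
         "'Full sun' generally means at least six hours of direct sunlight per day."),
    ],
}

def multi_faceted_snippets(query):
    """Return (sub-question, passage) pairs for a vague query,
    or an empty list when no facets are known."""
    return FACETS.get(query.strip().lower(), [])

for question, passage in multi_faceted_snippets("garden needs full sun?"):
    print(f"{question}\n  {passage}\n")
```

The point of the sketch is the shape of the output: one query in, several labeled sub-answers out, rather than a single snippet.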
Google is putting its heavy investment in natural language processing and machine learning into action.
"This was something that we spotted as an opportunity early on as we were developing featured snippets, but we only now have the technology to actually be able to address those sorts of questions," said Moxley. "The actual ability to take a specific query, understand that there are specific sub-topics that a user might be interested in, and then find high-enough quality passages that help the user understand those topics is new."
For now, Google is rolling out multi-faceted snippets only for queries with different possible implied questions, but Moxley said the plan is to expand the format to guidance-seeking questions with multiple components, like "Is it worth fixing my foundation?" That query could return snippets about cost, duration, and methods.
Google is still figuring out how these instances will work with voice, where Google's Assistant is in a heated battle with Amazon and Apple, as well as Microsoft's Cortana.
'If it's wrong, it's really wrong'
"In a search engine, if the first result isn't perfect but the second one, or the third one, is, it's not a catastrophe for the user," said Eric Enge, CEO of Stone Temple Consulting, which has conducted several studies on both smart assistants and featured snippets. "But with voice, you only get one answer. And if it's wrong, it's really wrong. And it's a really bad user experience."
Because of its featured snippets and Knowledge Graph, Google's Assistant could answer 68 percent of queries, with 90.6 percent of its answers correct, according to Stone Temple's most recent survey, compared to 56.5 percent of questions with 81.9 percent accuracy for Cortana, 21.7 percent and 62.2 percent for Siri, and 20.7 percent and 87 percent for Alexa.
But while most of its answers are correct, Google has still made some unpleasant mistakes, like returning cringe-worthy answers for "Are women evil?" or "Are Republicans fascists?" Most recently, Google ran into a public relations snafu when people discovered that Home speakers could tell a user about Buddha or Mohammed, but not Jesus. The company has since disabled answers for all religious figures, since those answers are susceptible to "vandalism or spam."
Moxley said that Google will proactively turn featured snippets off in cases where it can't understand a question well, can't find a highly authoritative or correct source, or notices that it's not providing sufficient answers. In the immediate aftermath of the shooting in Parkland, Florida, Google Home wouldn't answer questions about David Hogg, a student at the school who survived the massacre.