Google will stop giving snappy answers to stupid questions, the company has announced, as it seeks to improve its search engine’s “featured snippets” service.
That means users should see fewer answers to questions such as “When did Snoopy assassinate Abraham Lincoln?”, to which the service would once merrily respond with “1865” – the right date, but very much the wrong assassin.
“This clearly isn’t the most helpful way to display this result,” said the company’s head of search, Pandu Nayak, in a blogpost announcing the changes. “We’ve trained our systems to get better at detecting these sorts of false premises, which are not very common, but there are cases where it’s not helpful to show a featured snippet. We’ve reduced the triggering of featured snippets in these cases by 40% with this update.”
Snippets, which sometimes show up as a featured response to direct questions asked of Google Search, have long been a cornerstone of the company’s AI strategy. The same technology powers its smart speakers and voice assistants, and lets the search engine answer queries directly, without visitors clicking away to other websites.
But the snippets, which are automatically generated from the contents of websites, have also been a thorn in Google’s side for just as long. In 2017, the company was accused of spreading “fake news” after a featured snippet for the query “Is Obama planning a coup?” resulted in its voice assistant cheerily telling users: “Obama may in fact be planning a communist coup d’état at the end of his term in 2016” – information it had drawn from a conspiracy website.
Other errors have been more comical. The company would gamely tell users that stairs were invented in 1946 – after reading a website that attributed a particular US safety regulation to that date – or unknowingly repeat a Monty Python joke when asked: “Why are firetrucks red?”
In an effort to address the root cause of such mistakes, Google is also rolling out new warnings for times when a search term has hit a “data void” – a question where a good answer might simply not exist.
“It looks like there aren’t many great results for this search,” the site now warns users who hit such a query.
“This doesn’t mean that no helpful information is available, or that a particular result is low-quality,” said Nayak. “These notices provide context about the whole set of results on the page, and you can always see the results for your query, even when the advisory is present.”