Forbes
Technology
Kevin Murnane, Contributor

Dumb And Dumber: Comparing Alexa, Siri, Cortana And The Google Assistant

Jim Carrey and Jeff Daniels in ‘Dumb and Dumber’

Answering questions posed by users is one of the primary functions of voice-activated personal assistants. Alexa, Cortana, Google’s Assistant and Siri have access to powerful, AI-driven search engines that put the information on the internet at your fingertips. All you have to do is ask.

At least that’s the way it’s supposed to be. Stone Temple put the question-answering ability of the high-profile digital assistants to a rigorous test. The results were discouraging to say the least. Performance ranged from marginally acceptable to abysmal. Only one received what would be considered a passing grade in school. Two couldn’t answer even half the questions posed. These things are zip codes away from smart. Welcome to the world of dumb and dumber.

Amazon’s Echo with Alexa.

The test

Stone Temple asked Amazon’s Alexa, Microsoft’s Cortana, Google’s Home and smartphone Assistants, and Apple’s Siri 4,942 questions. An answer was considered “fully and completely” correct if it gave the user “100% of the information that they asked for in the question, without requiring further thought or research.”

This is a demanding test. Providing information that’s related to the question but doesn’t answer it, or that only partially answers it, counts as a fail. Stone Temple’s criterion of success sets a high bar, but this is as it should be. When we ask a digital assistant a question, we want the answer, not irrelevant or partial information that leaves us no more informed than we were before we asked.

Test results

The results

Stone Temple summarized their results with the histogram shown above, which gives the percentage of questions each assistant tried to answer and the percentage of those attempts that it answered correctly. The percentage of answers attempted is a good index of a digital assistant’s capabilities, and more outfits that test these assistants should report it. For example, knowing that Alexa only tried to answer 53.4% of the questions asked, and that Siri attempted less than half, doesn’t give you much confidence in either assistant. (Stone Temple provided exact percentages for the bars in the histogram upon request.)

This method of summarizing the results doesn’t provide a straightforward answer to the basic question of how many of the 4,942 questions each of the digital assistants answered correctly. Here are those results based on the raw data which I received from Stone Temple.

Digital Assistant               # correct answers   % correct answers
Google Assistant (smartphone)   3,639               74.6
Cortana                         2,944               59.5
Google Assistant (Home)         2,868               58.0
Alexa                           2,195               44.3
Siri                            1,616               32.7
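The arithmetic behind these figures is simple: the overall number of correct answers is the total number of questions times the share attempted times the accuracy on those attempts. A minimal sketch using Alexa’s numbers, where the ~83% accuracy-of-attempted figure is back-derived from the article’s totals rather than reported by Stone Temple:

```python
# Sketch: how the two reported percentages (share attempted, accuracy on
# attempts) combine into an overall correct-answer count.
# The 0.830 accuracy-of-attempted value for Alexa is an assumption derived
# from the article's figures (44.3% / 53.4%), not a number Stone Temple gave.

def overall_correct(total_questions: int,
                    attempted_share: float,
                    accuracy_of_attempted: float) -> int:
    """Correct answers = total questions x share attempted x accuracy."""
    return round(total_questions * attempted_share * accuracy_of_attempted)

# Alexa: attempted 53.4% of 4,942 questions, ~83% of attempts correct.
alexa_correct = overall_correct(4942, 0.534, 0.830)
print(alexa_correct)  # roughly 2,190, in line with the 2,195 in the table
```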

While Alexa and Siri can be fairly characterized as dumb and dumber, none of the digital assistants covered themselves in glory. Google’s smartphone Assistant is the best of the lot, but it only manages a mediocre C on a traditional grading scale. That’s not good, but it’s miles ahead of the others which all received failing grades of less than 60%.

Answering questions is not only one of the main things digital assistants are supposed to do, it’s one of the main things users ask them to do. None of them do it well, two of them do it poorly, and another two can barely do it at all.

‘Dumb and Dumber’

Has the bar been set too high?

Maybe I’m asking too much. After all, the user typically doesn’t know the answer to the question either. If they did, they wouldn’t have asked. If neither the assistant nor the user knows the answer, does that mean they’re both dumb? Failing to answer even 60% of the questions asked isn’t good, but getting answers to some things you don’t know is impressive, right?

Ask someone who knows about something you don’t, and they’ll appear smart when they provide the answers. Con artists and manipulators use this strategy all the time because most people don’t consider that being able to answer basic questions about something you already know about isn’t much of an achievement. Everybody can do it, and the only reason it looks impressive is that the “smart” guy knows about something you don’t.

Digital assistants can appear smart for the same reason – they can answer questions the user can’t. But stop and think about the resources the assistants can draw on to answer those questions. Individuals draw on their life experience to answer questions. The digital assistants are accessing the internet. People rely on memory which is notoriously prone to error. The assistants are using powerful AI-driven search engines. We’re talking about Google for crying out loud! Google is so good, Cortana and Siri use it to answer general questions.

Amazon, Microsoft, Google and Apple are tech industry giants. They have billions of dollars, massive server farms and world-leading expertise at their disposal. And the best all of them except the Google smartphone Assistant can do is answer less than 60% of the questions asked? Stone Temple gave several examples of questions an assistant tried to answer and got wrong. In almost every case, the answer is on the internet and easy to find. Yet the digital assistants can’t provide it and often don’t even try. Students who perform at this level flunk out of school.

Google’s Assistant.

I love the idea of digital assistants and interact with them every day. We use Alexa and Google’s Home in our house, and I use Google Assistant on my phone. I think they’re useful now and I expect them to get better. However, none of that changes the fact that if you’re going to sell people a device that’s advertised as a resource for accessing information on the internet, you’re going to have to do a lot better than this. None of these assistants is smart when it comes to answering questions, and most of them look like the modern-day dumb and dumber.

If you’re interested in voice-activated digital assistants, here are some other articles you might enjoy.
