Google Assistant is not intuitive. Gizmodo reporter Michael Nunez found that out the hard way when he tried to arrange a romantic date with it. I had less lofty goals when I played around with Google Assistant. I just wanted to see how well it stacked up to Apple’s Siri. While Google’s AI bot is extraordinary in some ways, ultimately, it’s still a very dumb digital assistant that fails to live up to its own hype.
When Google showed off Assistant at its October 4th event, the company made a lot of promises. Google Assistant wasn’t just going to be a Siri competitor. It was going to be the next step towards a Her-like future, only with fewer dystopian notes and more magic. Google Assistant wouldn’t just learn your habits; it would be so intuitive that it could become a seamless part of your life.
Google Assistant certainly understands simple questions, and as evidenced in the video above, it often returns results for those questions much faster than Siri. Need the time of day, or to find out where the hell you are? Google Assistant returns the answer about two to three times faster than Siri does.
And Google Assistant learns diligently. Siri can pick up a personal preference or two—I trained the assistant to call me Master of the Universe, for example. Yet what it can learn, and how it can learn it, feels absurdly limited next to Google Assistant. I asked both assistants how the Cowboys were doing and then asked who my favorite team was. Siri was confused by the follow-up question, but Google Assistant immediately understood and asked if I wanted it to remember that I love the Cowboys forever. I said yes. It remembered. It seemed magic was in the air.
But that’s where the wow factor of Google Assistant ended. It’s fast, and it can learn, but it fails to understand complex questions, and that’s a dealbreaker. Most users aren’t versed in computer logic. They’re my mom, who accidentally activates Siri every time she plays Mahjong, or my friends, who chat Siri up after a night of drinking.
People want to converse with their digital assistants. They want a relationship, as narrow and limited as it might be. A digital assistant that can’t understand questions as simple as “I need a doctor” or “where can I get some cookies” is not a good digital assistant. Google Assistant only understood “I need a doctor” 50 percent of the time I asked, while “where can I get some cookies” returned a bad YouTube video 100 percent of the time. And it didn’t want to engage in gentler conversation either.
Siri is sweetly sassy when you ask it if it loves you or tease it with a reference to Google Assistant. Google Assistant is coldly robotic. “I do not understand the question” is its favorite response. And for a digital assistant that was supposed to change the world and how we communicate with our devices, that’s a really dumb answer.