Google's speech recognition error rate keeps getting lower - yesterday, the company said it's now under 5%, down from 8.5% at this time last year. And I find that to be increasingly true in my own use: Google seems to recognize almost everything I throw at it now, even Lebanese/Arabic names from my contacts list that I wouldn't expect it to get right.

But if you're wondering how Google's speech recognition fares against other voice assistants, Wired has made a video in conjunction with Andy Wood and Matt Kirshen (from Probably Science) to show you just that. They tested 4 questions containing difficult-to-spell words like Ouagadougou and Benedict Cumberbatch on Siri, Alexa on an Echo, and Google Home (no Cortana, though), with 8 different accents.

It's a funny video, so don't expect anything scientific or any serious data to come out of it, and don't expect accuracy... at all. Some of the accents were stereotypical and not even remotely correct (the Italian one is nothing like what I hear from my Italian friends), and the people were urged to go as crazy as possible with their pronunciation. Still, it's amazing to see how many questions were correctly answered by all 3 systems, and especially how well Google Assistant fared with the most ridiculous enunciations.

Watch and laugh. (You can also cringe at some of the accents; that's OK.)