In the demo, Google CEO Sundar Pichai shares recordings of phone calls made by Google Assistant, and they're remarkable for a number of reasons:
- The conversations sound very natural; the hair salon and restaurant on the other end have no idea they're talking to a machine
- The voices aren't 'Google', they sound like real people
- The language is full of fillers like ums and ahs; when the hair salon says "Give me a minute," Google Assistant replies "Mmm-hmm," which the audience greets with whoops and cheers
- In one case it deals with someone with a strong foreign accent, who misunderstands the original request and can't provide what was asked for, and Google Assistant copes admirably (the restaurant doesn't take bookings for small groups, so Google Assistant instead asks how busy it will be on the day they'd like to visit)
Apparently, it's being rolled out over the next few weeks, though what that means in terms of geographic or device availability wasn't stated. As far as natural language processing, AI and the future of work go, however, it would seem that the future is closer than ever.
Not everyone's convinced. Wired compares this to the unveiling of Google Glass and Pixel Buds, neither of which has delivered quite what futurists had hoped for. But assuming nothing in business will change is not a prudent approach.
For the time being, Duplex promises to fix problems – booking haircuts and making restaurant reservations – that are trivial at best. Its real promise – being able to hold a huge range of open-ended conversations in a human-like way – remains a decades-away dream reliant on an artificial general intelligence breakthrough. And if anyone is capable of making that breakthrough, it's probably Google.