How many times do you say "Hey Google" on a day-to-day basis? At its Google I/O 2022 keynote, the company announced two new Assistant features designed to cut that number down: "Look and Talk" and expanded "Quick Phrases."
Look and Talk lets you look directly at your Nest Hub Max and ask it a question without using the hotword. The feature is opt-in, and once enabled, all video and audio processing happens on-device, meaning footage of your face and recordings of your commands never leave the device for Google, or anyone else, to see or hear.
So how does it work, exactly? Google says Look and Talk uses six different machine learning models, processing signals like proximity, head orientation, and gaze direction, to distinguish a passing glance from sustained eye contact. The feature shines when you have follow-up questions: in Google's demo, a user asked, "Show me beaches near Santa Cruz," then asked follow-up questions about specific results without repeating the hotword.
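Google hasn't published the pipeline's internals, but conceptually the job is fusing several per-frame signals into a single "sustained attention" decision. Here is a minimal, hypothetical Python sketch of that idea; the signal set, frame rate, and dwell threshold are illustrative assumptions, not Google's actual implementation:

```python
# Hypothetical sketch of Look and Talk's attention gating; the signals,
# frame rate, and dwell threshold are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class FrameSignals:
    face_detected: bool    # output of an assumed face-detection model
    facing_device: bool    # assumed head-orientation model: user faces the display
    gaze_on_screen: bool   # assumed gaze model: eyes are on the screen
    within_range: bool     # assumed proximity estimate, e.g. under ~1.5 m

SUSTAINED_FRAMES = 15  # e.g. ~0.5 s at 30 fps; a passing glance won't last this long

def wants_to_talk(frames: list[FrameSignals]) -> bool:
    """Return True only for sustained attention, never a passing glance."""
    streak = 0
    for f in frames:
        if f.face_detected and f.facing_device and f.gaze_on_screen and f.within_range:
            streak += 1
            if streak >= SUSTAINED_FRAMES:
                return True  # open the mic without waiting for "Hey Google"
        else:
            streak = 0  # broken eye contact resets the dwell counter
    return False
```

The key design point the demo implies is the dwell requirement: any single frame of eye contact is ambiguous, but attention held across many consecutive frames rarely is.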
Next up is expanded Quick Phrases, a set of commands that Assistant responds to immediately without the hotword. These include common requests like setting a timer, pausing the music, and more.
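Conceptually, this plausibly behaves like a small, fixed allowlist matched on-device, so only pre-approved commands skip the hotword. A minimal sketch, with made-up phrases and handlers standing in for the real command set:

```python
# Hypothetical on-device allowlist for Quick Phrases; the phrases and
# handlers below are illustrative, not Google's actual command set.
QUICK_PHRASES = {
    "set a timer for ten minutes": lambda: print("Timer set for 10:00"),
    "stop the timer": lambda: print("Timer stopped"),
    "pause the music": lambda: print("Music paused"),
}

def handle_transcript(transcript: str) -> bool:
    """Run a Quick Phrase on an exact match; otherwise require the hotword."""
    action = QUICK_PHRASES.get(transcript.strip().lower())
    if action:
        action()
        return True
    return False  # not an allowlisted phrase; fall back to the "Hey Google" flow

handle_transcript("Pause the music")  # prints "Music paused"
```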
Powering all of these upgrades are new speech models that run on the device itself, cutting latency. More comprehensive neural networks can better understand human speech, even when it's broken and choppy. In the demo, Google Assistant gave the speaker a gentle nudge to finish their question, yet still worked out what they wanted from the incomplete phrasing. Google says its Tensor chip is the driving force behind the update, providing the computing power needed to make on-device speech processing possible.
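To picture that "gentle nudge" behavior, imagine the on-device recognizer tracking how long its partial transcript has been stalled, prompting the speaker once, and only then giving up. The recognizer and assistant interfaces, thresholds, and prompt text below are all assumptions for illustration, not Google's actual API:

```python
import time

# Hypothetical turn handler for a speaker who trails off; every interface
# and threshold here is illustrative, not Google's on-device API.
NUDGE_AFTER_S = 2.0    # silence before a gentle prompt to finish the question
GIVE_UP_AFTER_S = 6.0  # silence before the turn is abandoned entirely

def run_turn(recognizer, assistant):
    partial = ""
    last_change = time.monotonic()
    nudged = False
    while True:
        hyp = recognizer.poll_partial()  # assumed: latest partial transcript, or None
        now = time.monotonic()
        if hyp and hyp != partial:
            partial, last_change, nudged = hyp, now, False
        elif recognizer.is_final():      # assumed endpointer: the user finished
            return assistant.fulfill(partial)
        elif now - last_change > GIVE_UP_AFTER_S:
            return None                  # speaker trailed off for good; drop the turn
        elif not nudged and now - last_change > NUDGE_AFTER_S:
            assistant.prompt("Mm-hmm?")  # the gentle nudge from the demo
            nudged = True
        time.sleep(0.05)                 # avoid busy-waiting between polls
```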
For more news about Google I/O 2022, check here.