
Updates to Google Assistant could make it the most natural digital helper yet

Google Assistant will reportedly soon be able to obey three commands at once

Google is making Google Assistant a whole lot better. At Google I/O 2018, the company announced several changes to Assistant that should make conversations with it feel far more natural.

Most recently, on Monday, June 11, the Made by Google Twitter account posted a since-deleted tweet noting that Google Home “can perform up to three queries at a time, so you can get more done,” CNET reports. For the time being, Multiple Actions is only available to Home users who have set their language preference to U.S. English, but support for other languages is planned.


Google previously announced that Assistant will be getting six new voices later in 2018 — and one of those is none other than musician John Legend. In other words, if you’re not a fan of the current Google Assistant voice, you will soon be able to change it to make it a little more personal.


New voices aren’t the only update to Assistant. Users have long wanted to have more natural conversations with Assistant, and Google is making that happen. For starters, the company is adding a feature called “Continued Conversation,” which lets you chat with Google without having to say “Hey Google” before every request. You’ll say it the first time, but after that, you can keep talking to Google just as you would in any other conversation.

Google has also rolled out a feature called Multiple Actions. As the name suggests, it lets you ask Google to perform multiple actions in a single sentence. No longer will you need to make separate requests to change the thermostat and turn on the TV. Now, simply say, “Hey Google, set the thermostat to 68 degrees and turn on the TV,” and Assistant should recognize the two separate actions it needs to perform. And you’re no longer limited to stringing together just two actions: Assistant will now obey up to three commands at a time.


Ethics in technology was also a big theme at Google I/O, and along those lines, Google Assistant will now reward polite interactions with positive reinforcement. That’s particularly important for kids, who are still learning good manners. Now, if someone says “please” in a request, Assistant will respond with something like “thank you for asking so politely.” The feature, called “Pretty Please,” is rolling out later this year.

On your phone, Google Assistant is also getting a few visual tweaks that should make it more useful. For starters, when you make a request, you won’t just get audio responses; you’ll also see rich visual responses in Assistant on your phone. Swipe up from a response and you’ll get a rundown of your day, including things like flight schedules and the weather. Along with those visual changes, Google is making Assistant more helpful when you’re navigating in Maps. Assistant is now built into Maps, so if you’re driving and ask for music, it won’t switch to a different screen while you’re navigating. It should be a handy feature, and it will make using Assistant in the car a little safer.

Google Assistant will also soon be able to order food for both pickup and delivery. You’ll still be able to order using the chat-style interface as before, but you’ll also be able to simply ask Assistant to order your usual from Starbucks, for example.

As a digital assistant, Google Assistant can do things like book restaurant tables and haircut appointments, and soon it will get a whole lot better at doing so. The new feature, called “Google Duplex,” has Assistant place phone calls to businesses on your behalf. In other words, if you ask Assistant to book you a haircut, it may call the hair salon and schedule the appointment by talking to whoever picks up. The calls sound remarkably natural, and the feature represents a big step forward for digital assistants in general. Google has put a lot of work into ensuring that Assistant can understand the nuances of language, and based on the examples Google gave, it seems to handle them pretty well.

Updated on June 12: Google Home can now respond to up to three queries at once.
