At Google I/O today, Google revealed big changes to how we'll interact with Google Assistant going forward. Perhaps the biggest for most users is that you'll no longer need to say "Hey Google" to trigger the Assistant. This was previously rumored, but Google has now made it official.

Owners of the Nest Hub Max will start seeing the new options beginning today in the United States. The first is called Look and Talk, and it operates exactly as it sounds: you simply look at the Nest Hub Max and start talking. The device uses Face Match and Voice Match to recognize you, so you'll still get personalized results, and all processing happens entirely on the device. None of the facial recognition data is sent to the cloud, and the feature is opt-in.


The second is Quick Phrases, an expansion of how you can interact with Google Assistant. It also forgoes the familiar trigger phrase, letting you do things such as set a timer, ask for the time, or turn your lights on and off.

This comes alongside improvements to how Google Assistant understands you. Assistant is now smarter about natural speech, including the umms and errs we often toss in. The speech models are moving on-device to speed up processing, a gain Google attributes to building better neural networks on the Google Tensor chip.

Looking ahead, Assistant will better understand the imperfections of human speech without getting tripped up, including pauses, "umms," and interruptions, making your interactions feel much closer to a natural conversation.

The example used on stage was asking for a song while pausing, not quite remembering the artist's full name. Google Assistant is now smart enough to parse the speech around the pause and figure out the missing part.