Google announces new developer features at the “Hey Google” Smart Home Summit
This week, Google is hosting its virtual “Hey Google” Smart Home Summit, a two-day event focused on new tools and features for the smart home developer community. At the event, Google is announcing several platform tools, features, and routine enhancements that developers should be aware of.
New Smart Home for Entertainment Device (SHED) types
In case you’re unaware, Google categorizes devices that can work with the Assistant into device types. Each device type supports certain traits, which are sets of commands related to that device type. Back in April, Google announced a set of Smart Home for Entertainment Device (SHED) types covering devices like set-top boxes, speakers, and consoles from brands like Xbox, Roku, Dish, and LG. Today, Google is making those APIs public for any smart TV, set-top box, or game developer to use. Furthermore, Google is expanding the SHED options to include AV receivers, streaming boxes, streaming sticks, soundbars, streaming soundbars, and speakers. The company is also introducing a new trait called “Channel” that lets the Google Assistant recognize commands to change the channel.
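To give a rough idea of how device types and traits fit together, here is an illustrative sketch of the SYNC response a smart home integration might return for one of the new SHED types that supports the Channel trait. The identifiers follow Google's `action.devices.*` naming scheme, but the device details (IDs, names, channel list) are made up for illustration, and the exact attribute shape should be checked against Google's docs.

```javascript
// Hypothetical SYNC response for a streaming stick (one of the new SHED
// device types) that supports the new Channel trait. All device details
// below are invented for illustration.
const syncResponse = {
  requestId: '6894439706274654512', // hypothetical request ID
  payload: {
    agentUserId: 'user-123', // hypothetical user identifier
    devices: [
      {
        id: 'stick-1',
        type: 'action.devices.types.STREAMING_STICK', // new SHED type
        traits: [
          'action.devices.traits.OnOff',
          'action.devices.traits.Channel', // new channel-change trait
        ],
        name: { name: 'Living Room Stick' },
        willReportState: true,
        attributes: {
          // The Channel trait advertises which channels the device knows
          // about, so the Assistant can match "change to PBS".
          availableChannels: [
            { key: 'kqed', names: ['KQED', 'PBS'], number: '9' },
          ],
        },
      },
    ],
  },
};

console.log(syncResponse.payload.devices[0].traits);
```

With the SHED APIs now public, any smart TV, set-top box, or game developer could declare devices along these lines rather than shoehorning them into generic media types.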
For more information on these new device types and traits, check out Google’s webpage under the Smart Home category of the Google Assistant docs.
Smart Home Controls in Android 11
Next, Google reiterates its work on the smart home Device Controls feature in Android 11. As you may know, the power menu in Android 11 can now display controls for smart home devices. With a long-press of the power key, you can quickly access these controls from anywhere. The controls are customizable and can be accessed from the lockscreen as well. It’s one of Android 11’s best features, in our opinion.
Improved state reporting and reliability
To coincide with the new Device Controls feature in Android 11, Google wants to make sure that smart home controls accurately report the state of the connected IoT device. The company will introduce more tools to measure an app’s reliability and latency to help developers improve and debug state reporting. Google says this will reduce query volume on your servers and “improve the user experience with an accurate device state across multiple surfaces.” Back in April, the company launched the Local Home SDK to support local execution of certain Assistant commands over the local network. The Local Home SDK supports both Chrome and Node.js runtime environments, and apps can be built and tested on local machines or personal servers. All developers can use the Local Home SDK through the Actions on Google Console.
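The mechanism behind accurate state across surfaces is proactive state reporting: instead of Google querying your server every time a surface needs fresh state, your integration pushes state changes to Google's Home Graph as they happen. Below is a hedged sketch of what such a report body could look like; the device ID, user ID, and state fields are hypothetical, and the exact request shape should be verified against the HomeGraph API reference.

```javascript
// Sketch of a state report pushed to Google's Home Graph when a device
// changes state. Proactively reporting state lets Google answer QUERY
// requests from its own cache, reducing query volume on your servers.
// All identifiers and values below are hypothetical.
const reportStateBody = {
  requestId: '6894439706274654512', // hypothetical request ID
  agentUserId: 'user-123',          // hypothetical user identifier
  payload: {
    devices: {
      states: {
        'light-1': {
          online: true,   // device is reachable
          on: true,       // current power state
          brightness: 65, // current brightness percentage
        },
      },
    },
  },
};

console.log(Object.keys(reportStateBody.payload.devices.states));
```

Reporting state like this is what keeps the Android 11 power-menu tiles, smart displays, and the Google Home app all showing the same device state without each of them hitting your backend.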
Improved discovery with AppFlip
Adding new smart home integrations can help reduce user churn, but getting users to discover those new integrations can be a challenge. To that end, Google is launching “AppFlip” in the developer console to cut the standard account linking flow down to two steps. Users will be able to jump from the Google Home app to your app without needing an additional sign-in.
Google also wants developers to know about recent enhancements to logging tools. The company integrated event logging and usage metric tools from Google Cloud Platform to give developers visibility into their smart home integrations. The Local Home SDK, account linking flow, and smart home events have received enhancements in project logging, and developers can analyze aggregated metrics from the developer console or build logs-based metrics to find trends. Developers can also create custom alerts in GCP to catch production issues.

Lastly, the Smart Home Analytics Dashboard in the developer console comes pre-populated with charts for metrics like Daily Active Users (DAU) and Request Breakdown. Developers can set alerts and get notified if an integration has any issues. This dashboard can be accessed from the “Analytics” tab in the Actions console or the Google Cloud console.
Updates to Device Access program
Last year, Google announced a change from the “Works with Nest” program to the “Works with Google Assistant” program. As part of that shift, Google created the Device Access program for partners to integrate directly with Nest devices. To support the Device Access program, Google will launch the Device Access Console, a “self-serve console that guides commercial developers through the different project phases.”
This console allows developers to manage their projects and integrations, provides development guides and trait documentation for all supported Nest devices, and allows for creating custom automations, but only for the homes they’re a member of.
Lastly, since routines are a big part of smart home technology, Google is expanding what they can do. Later this year, more Google Assistant devices will gain presence detection so they can automatically trigger routines based on whether the user is home or away, much like Nest devices already do. The new Light Effects trait has also gone public, allowing developers to slowly brighten or dim smart lights at specific times or when a morning alarm goes off. Later this year, Google will also enable the Gentle Sleep and Wake effects out of the box for any smart light; Google first launched this feature on Philips Hue lights last year.
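For light developers, supporting these effects likely comes down to declaring the trait and handling its commands. The sketch below shows one plausible shape of this, assuming Google's `action.devices.*` naming; the attribute and command names here are assumptions to be checked against the trait documentation, and the device details are invented.

```javascript
// Hypothetical smart light advertising a light-effects trait in its SYNC
// response. Trait/attribute/command names are assumptions based on
// Google's action.devices.* conventions; device details are made up.
const lightDevice = {
  id: 'bedroom-lamp', // hypothetical device ID
  type: 'action.devices.types.LIGHT',
  traits: [
    'action.devices.traits.OnOff',
    'action.devices.traits.LightEffects',
  ],
  name: { name: 'Bedroom Lamp' },
  willReportState: true,
  attributes: {
    // 'sleep' slowly dims the light; 'wake' slowly brightens it.
    supportedEffects: ['sleep', 'wake'],
  },
};

// A wake effect ramps brightness up over the given duration (seconds),
// e.g. a 30-minute sunrise-style ramp before a morning alarm.
const wakeCommand = {
  command: 'action.devices.commands.Wake',
  params: { duration: 1800 },
};

console.log(lightDevice.attributes.supportedEffects, wakeCommand.params.duration);
```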
Personal routines will also be extended with support for custom routines designed by smart home partners. Per Google, developers will be able to create and suggest custom routines that can even work with other devices in a user’s home. Users can browse and opt in to suggested routines and then choose to have their Nest or other smart home devices participate in that routine.
Be sure to tune in to the “Hey Google” Smart Home Summit to learn more about Google’s smart home plans.