Google shows off concepts for “socially intelligent” smart displays and tablets with Soli sensors
Google’s Advanced Technology & Projects group, also known as ATAP, is one of the main research and development groups inside Google. ATAP originally developed the Soli sensor, which recognizes gestures using radar and was later included in the Pixel 4 series and Google Nest Hub Max. Now we have a look at what ATAP is currently working on, thanks to a new YouTube video.
Google ATAP is currently working on a documentary series called “In the Lab with Google ATAP,” where the group shows off its latest research. The first video explores “how the combination of new sensing and machine learning techniques can be used to capture the submillimeter motion of our fingers to create expressive and subtle hand gestures to interact with a variety of products.”
Put simply, ATAP wants to use Soli sensors to detect subtle movements, like head turns and hand gestures. The video shows off a smart display that pauses a video when someone walks away, or updates the screen with additional weather information when someone turns to glance at the device. Google says its machine learning technology can “estimate head orientation,” which opens the door for more impressive interactions.
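To make the idea concrete, here is a minimal, purely illustrative sketch (not Google’s implementation) of how a device might map hypothetical radar-derived signals, such as presence and estimated head orientation, to the display behaviors described in the video:

```python
# Illustrative sketch only: the SoliReading fields and thresholds below are
# hypothetical, chosen to mirror the behaviors shown in ATAP's video.
from dataclasses import dataclass


@dataclass
class SoliReading:
    present: bool        # someone is detected near the device
    head_yaw_deg: float  # estimated head orientation; 0 = facing the display


def choose_action(reading: SoliReading, playing: bool) -> str:
    """Pick a display action from a single sensor reading."""
    if not reading.present:
        # User walked away: pause playback, otherwise stay idle.
        return "pause" if playing else "idle"
    if abs(reading.head_yaw_deg) < 20.0:
        # User glances at the screen: surface extra detail
        # (e.g. an expanded weather card).
        return "show_details"
    # User is nearby but not looking: stay in a low-key ambient mode.
    return "ambient"


# Walking away mid-playback pauses the video:
print(choose_action(SoliReading(present=False, head_yaw_deg=0.0), playing=True))
```

A real system would smooth noisy readings over time rather than react to a single frame, but the sketch captures the basic mapping from sensed context to interface state.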
There’s no telling when, or if, any of this technology will appear in commercially available products. Google’s current products with Soli sensors are the Nest Hub Max, which uses Soli to detect hand gestures (like raising a hand to pause media), and the second-generation Nest Hub, which uses it for sleep tracking. The Pixel 4 and Pixel 4 XL were the only phones with Soli — Google removed the sensor with the Pixel 5 in 2020 to reduce manufacturing costs (and the bezels), and the Pixel 6 and Pixel 6 Pro also don’t have Soli.
ATAP was also the group behind Project Ara, an early attempt at a customizable and modular smartphone design. Project Ara aimed to create modules for cameras, storage, displays, and other components that could be swapped and upgraded as needed.