Here’s how Google designed the Soli RADAR gestures on the Pixel 4

Google introduced its revolutionary “Motion Sense” technology on the Google Pixel 4 devices. The feature uses a tiny Soli RADAR chip to detect hand gestures, letting you interact with the phone without touching it. The Pixel 4’s Soli motion gestures can change tracks in a long list of music apps, silence calls and alarms, or wake the screen when you reach for your phone. The RADAR technology was conceptualized by Google’s ATAP team, and while we saw initial prototypes as early as 2014, there has been some uncertainty about how Motion Sense actually works. To elaborate, Google has published a detailed blog post on how the Soli gestures work.



Unlike typical RADAR systems, Soli detects motion instead of sensing the spatial positioning of a solid object. The Soli RADAR chip, which measures just 5 mm x 6.5 mm x 0.873 mm, emits a frequency-modulated 60GHz signal. When that signal strikes the surface of a hand, it is reflected back to the receiver. From the reflected signal, the Soli chip detects displacements smaller than a millimeter in the user’s hand and determines both the hand’s gesture and the velocity of the movement, after taking the Doppler effect into account.
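The Doppler relationship described above is straightforward to sketch. The snippet below is an illustration of the underlying physics, not Google's implementation; the function name and the example numbers are assumptions for demonstration.

```python
# Illustrative sketch: estimating the radial velocity of a reflector (e.g. a
# hand) from the Doppler shift of a 60 GHz radar signal. Not Soli's actual
# code; the names and numbers are assumptions for demonstration.

C = 3.0e8          # speed of light, m/s
F_CARRIER = 60e9   # Soli's carrier frequency, Hz (wavelength ~5 mm)

def radial_velocity(doppler_shift_hz: float) -> float:
    """Radial velocity of a reflector from its two-way Doppler shift.

    For a monostatic radar the reflection is shifted by
    f_d = 2 * v / wavelength, so v = f_d * wavelength / 2.
    """
    wavelength = C / F_CARRIER
    return doppler_shift_hz * wavelength / 2.0

# A hand moving toward the sensor at 0.5 m/s shifts the 60 GHz carrier by:
shift = 2 * 0.5 / (C / F_CARRIER)
print(shift)                    # 200.0 (Hz)
print(radial_velocity(shift))   # 0.5 (m/s)
```

The ~5 mm wavelength at 60 GHz is also why sub-millimeter displacements are detectable: they correspond to a measurable fraction of a full phase cycle of the carrier.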

An early Soli prototype by Google ATAP

The Soli receiver maps the energy in the reflected signal and plots this information against the object’s velocity, with different indicators showing whether the object is moving toward the sensor or away from it. It can also measure the object’s distance from the sensor. Together, the distance and velocity of the moving hand or body part can be used to estimate its spatial position in 3D. Google has also optimized the signal chain to improve the signal-to-noise ratio and the chip’s power efficiency.
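The energy-versus-velocity-and-distance map described above is commonly computed in FMCW radars as a range-Doppler map: one FFT across the samples within a chirp resolves distance, and a second FFT across successive chirps resolves velocity. The sketch below plants a single synthetic target and recovers its range and Doppler bins; the signal model and dimensions are assumptions for illustration, not Soli's actual processing chain.

```python
# Hedged sketch of a range-Doppler map for an FMCW-style radar. A synthetic
# target is placed at a known range bin (beat frequency within a chirp) and
# Doppler bin (phase progression across chirps), then recovered with a 2D FFT.
import numpy as np

n_samples, n_chirps = 64, 32
range_bin, doppler_bin = 10, 5        # where we plant the synthetic target

t = np.arange(n_samples)              # "fast time" within one chirp
c = np.arange(n_chirps)               # "slow time" across chirps
signal = np.exp(2j * np.pi * (range_bin * t[None, :] / n_samples
                              + doppler_bin * c[:, None] / n_chirps))

# Range FFT along fast time, then Doppler FFT along slow time.
range_doppler = np.fft.fft(np.fft.fft(signal, axis=1), axis=0)
energy = np.abs(range_doppler)

peak = np.unravel_index(np.argmax(energy), energy.shape)
print(peak)   # (5, 10): Doppler bin 5, range bin 10
```

In this picture, the sign of the Doppler bin (relative to the center of the spectrum) is the "indicator" of whether the reflector is approaching or receding, and the range bin gives its distance.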

To effectively tackle the multitude of signal reflections and draw meaningful patterns out of them, Google uses machine learning. These ML models were trained with TensorFlow on “millions of gestures recorded from thousands of Google volunteers.” With the Android 11 Developer Preview, we also saw improvements in Motion Sense’s accuracy on the Google Pixel 4, and the same can be expected from future updates.
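Google's production models are trained in TensorFlow on that large gesture corpus; as a toy stand-in for the idea of mapping radar-derived features to gesture labels, here is a minimal nearest-centroid classifier. The features, gesture names, and classification rule are all illustrative assumptions, not Soli's actual pipeline.

```python
# Toy stand-in for gesture classification: each example is reduced to two
# assumed features (mean Doppler velocity, total reflected energy) and
# classified by the nearest class centroid. Purely illustrative.
import numpy as np

training = {
    "swipe_left":  np.array([[-0.50, 1.0], [-0.40, 0.9]]),
    "swipe_right": np.array([[0.50, 1.0], [0.45, 1.1]]),
    "reach":       np.array([[0.10, 2.0], [0.15, 1.9]]),
}
centroids = {label: feats.mean(axis=0) for label, feats in training.items()}

def classify(feature: np.ndarray) -> str:
    """Return the gesture whose centroid is nearest to the feature vector."""
    return min(centroids, key=lambda g: np.linalg.norm(feature - centroids[g]))

print(classify(np.array([-0.45, 1.0])))   # swipe_left
```

A real model would of course learn from raw range-Doppler frames rather than two hand-picked features, but the structure is the same: signal features in, gesture label out.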

Early Soli prototype vs the final RADAR chip (rightmost)

Google ATAP worked closely with Google’s Pixel and other hardware design teams to ensure that other signals don’t interfere with Soli’s operation. This ensures that signals like audio from the speakers don’t distort or impede the Pixel 4’s Motion Sense abilities. An even bigger challenge was shrinking the apparatus so it could fit in the forehead of the Pixel 4 and the 4 XL.

While this is the first noted instance of RADAR being used in a consumer device, the teams intend to keep working on the technology, although there are no promises of support for any future device.

Source: Google AI Blog

About author

Tushar Mehta

I am a Senior Editor at XDA. I have been reviewing gadgets for over five years and gained experience by working for prominent tech publications in India before joining XDA. I enjoy fiddling with all smart objects and my love for software customization dates back to the Symbian S60 days. I like to devote my spare time idealizing the romantic union of technology and philosophy or spacing out on Pink Floyd. You may email me at [email protected]
