Google ARCore, recently renamed to Google Play Services for AR, is Google's attempt to bring augmented reality experiences to more devices without the specialized, niche hardware that the erstwhile Project Tango required. Google is now making ARCore more immersive across a wider variety of devices through the new Depth API.

Project Tango devices, like the Lenovo Phab 2 Pro, relied on dedicated hardware in the form of sensors and cameras to perceive depth and 3D space. That specialized hardware, however, meant devices had to be purpose-built for optimal AR, which in turn compromised the rest of the smartphone experience. ARCore flipped the equation by removing the need for dedicated hardware, bringing optimal AR experiences to smartphones that had already nailed down the user experience.

ARCore is now expanding the availability of these AR experiences through the new ARCore Depth API. The new API improves immersion on devices with a single RGB camera by letting developers use Google's depth-from-motion algorithms to create a depth map: as the user moves the phone, multiple images taken from different angles are compared to estimate the distance to every pixel.
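To make that concrete, here is a minimal Java sketch of how an app might read a single value from such a depth map. The method name acquireDepthImage and the 16-bit-millimeter sample layout mirror the pattern ARCore's Java samples use for camera images, but since the Depth API is still in limited preview, treat the exact surface here as illustrative rather than final:

```java
import android.media.Image;
import com.google.ar.core.Frame;
import com.google.ar.core.exceptions.NotYetAvailableException;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

/** Illustrative sketch: reading per-pixel depth from an ARCore frame. */
final class DepthReader {

  /**
   * Returns the estimated distance in millimeters from the camera to the
   * scene at depth-image coordinates (x, y), or -1 if no depth map is
   * available yet.
   */
  static int depthMillimetersAt(Frame frame, int x, int y) {
    try (Image depthImage = frame.acquireDepthImage()) {
      // The depth image is a single 16-bit plane; each sample is the
      // estimated distance to the camera plane in millimeters.
      Image.Plane plane = depthImage.getPlanes()[0];
      ByteBuffer buffer = plane.getBuffer().order(ByteOrder.nativeOrder());
      int byteIndex = y * plane.getRowStride() + x * plane.getPixelStride();
      return Short.toUnsignedInt(buffer.getShort(byteIndex));
    } catch (NotYetAvailableException e) {
      // Depth-from-motion needs a few frames of device movement before
      // the first depth map can be produced.
      return -1;
    }
  }
}
```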

Depth data is useful for enabling features like occlusion: the ability for digital objects to accurately appear in front of or behind real-world objects.

[Embedded demo: ARCore Depth API occlusion]
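At its core, occlusion is a per-pixel depth comparison: if the real world is closer to the camera than the virtual object at a given pixel, the virtual object should be hidden there. In a real renderer this test runs in a fragment shader against a depth texture; the plain-Java sketch below exists only to illustrate the logic:

```java
/**
 * Illustrative occlusion test. A virtual fragment at a given camera-space
 * depth is hidden when the real world is measured to be closer at that
 * pixel. A non-positive depth sample means no estimate is available.
 */
static boolean isOccluded(int realDepthMillimeters, float virtualDepthMeters) {
  if (realDepthMillimeters <= 0) {
    return false; // No depth estimate here; draw the virtual object.
  }
  float realDepthMeters = realDepthMillimeters / 1000.0f;
  return realDepthMeters < virtualDepthMeters;
}
```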

Occlusion is now available on over 200 million ARCore-enabled Android devices through Scene Viewer, the developer tool that powers AR in Google Search.

Beyond occlusion, 3D depth data also enables other possibilities, such as more realistic physics, path planning, and surface interaction. The Depth API can thus let developers create experiences in which objects accurately bounce and splash across surfaces and textures, as well as new interactive game mechanics that let players duck and hide behind real-world objects.
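For physics and surface interaction, each depth pixel can be unprojected into a 3D point using the camera intrinsics ARCore already exposes, and such points can then seed colliders or surface effects. The sketch below shows standard pinhole unprojection (X = (x - cx) * Z / fx, Y = (y - cy) * Z / fy); for brevity it ignores the resolution mismatch between the depth map and the camera image, and ARCore's camera-space axis conventions:

```java
import com.google.ar.core.CameraIntrinsics;
import com.google.ar.core.Frame;

/**
 * Illustrative sketch: turn a depth sample at image coordinates (x, y)
 * into a 3D point in the camera's coordinate frame using the pinhole
 * camera model.
 */
static float[] depthPixelToCameraSpace(Frame frame, int x, int y, int depthMm) {
  CameraIntrinsics intrinsics = frame.getCamera().getImageIntrinsics();
  float[] focal = intrinsics.getFocalLength();        // {fx, fy} in pixels
  float[] principal = intrinsics.getPrincipalPoint(); // {cx, cy} in pixels
  float z = depthMm / 1000.0f; // depth in meters
  float px = (x - principal[0]) * z / focal[0];
  float py = (y - principal[1]) * z / focal[1];
  return new float[] {px, py, z};
}
```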

Since the Depth API does not depend on specialized hardware, it works across a wider range of devices, though better hardware will still improve the experience. Additional depth-mapping sensors, such as time-of-flight (ToF) sensors, will let developers unlock new capabilities like dynamic occlusion: the ability to occlude virtual objects behind moving real-world objects.
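Because support will vary by device, apps will presumably need a capability check before relying on depth. A sketch of that pattern follows, using the isDepthModeSupported and DepthMode names from ARCore's session configuration API; again, with the Depth API still in limited preview, these names should be read as illustrative:

```java
import com.google.ar.core.Config;
import com.google.ar.core.Session;

/** Illustrative sketch: enable depth only on devices that support it. */
static void enableDepthIfSupported(Session session) {
  Config config = session.getConfig();
  // Reports whether this device can produce depth maps, whether from
  // motion alone or with help from dedicated hardware like a ToF sensor.
  if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
    config.setDepthMode(Config.DepthMode.AUTOMATIC);
  }
  session.configure(config);
}
```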

If you would like to try out the new Depth API, Google asks interested developers to fill out its Call for Collaborators form. Google will then reach out to the collaborators it feels are the best fit to push the technology forward.


Source: Google Developers Blog