Google’s augmented reality API will soon use two cameras to make better depth maps

ARCore is Google’s SDK for creating augmented reality apps, and unlike Google’s failed Project Tango experiment, it doesn’t require specialized hardware. All your phone needs is a single RGB camera, an IMU sensor providing accurate gyroscope and accelerometer readings, and extensive calibration data. Using these basic sensors and a single camera, ARCore’s Depth API can generate depth maps that enable features such as occlusion, more realistic physics, path planning, and surface interaction. It’s impressive how immersive the experience already is with just one camera, but adding a second camera to the mix could make it better still. That seems to be exactly what Google is planning to do with the latest version of its augmented reality SDK.

Using ARCore’s Depth API to create occlusion using a single RGB camera. Source: Google
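Occlusion, one of the features the Depth API enables, boils down to a per-pixel comparison: a virtual object is hidden wherever the real scene is closer to the camera than the object. A minimal sketch of that test in NumPy, with made-up depth values (this is an illustration of the general idea, not ARCore’s actual implementation):

```python
import numpy as np

# Depth of the real scene at each pixel, in meters (illustrative values,
# not real ARCore output).
scene_depth = np.array([[0.5, 2.0],
                        [3.0, 1.0]])

# Depth of a rendered virtual object at the same pixels, in meters.
virtual_depth = np.array([[1.0, 1.0],
                          [1.0, 1.0]])

# The virtual object is occluded wherever real geometry sits in front of it.
occluded = scene_depth < virtual_depth
print(occluded)  # [[ True False]
                 #  [False False]]
```

A renderer would then skip (or blend out) the virtual object’s pixels wherever this mask is true, making it appear to pass behind real-world objects.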

As spotted by AndroidPolice, the changelog for version 1.23 of the ARCore SDK mentions “dual camera stereo depth on supported devices.” Interestingly, the release notes for ARCore 1.23 on GitHub don’t mention dual camera stereo depth support, but it is mentioned in the release notes on the Google Developers page. The release notes on the Google Developers page point to Google’s list of devices that support the SDK, which was recently updated to state that “dual camera support will be rolled out in the coming weeks” for the Pixel 4 and Pixel 4 XL.

Google’s Pixel 4 and 4 XL are the only Pixel phones to sport a secondary telephoto camera, while the Pixel 4a 5G and Pixel 5 sport a secondary ultra wide-angle camera instead. Given that adding support for a second camera likely requires a lot of calibration work, it’s possible that some existing devices won’t get support for dual camera stereo depth maps. Meanwhile, multiple phones have time-of-flight sensors that improve depth mapping quality by reducing scanning time and improving plane detection. It remains to be seen how much improvement dual camera depth support will bring, but hopefully, Google will show off some demos after making an announcement later this month.
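Stereo depth from two cameras comes down to triangulation: the same point lands at slightly different pixel positions in each camera, and depth is inversely proportional to that disparity (Z = f·B/d, where f is the focal length in pixels and B the baseline between the two lenses). A quick sketch with hypothetical numbers, not actual Pixel 4 camera parameters:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulated depth in meters: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 1500 px focal length, 12 mm baseline, 30 px disparity.
print(stereo_depth(1500, 0.012, 30))  # 0.6 (meters)
```

This also hints at why the calibration work mentioned above matters: the computed depth is only as accurate as the known focal length and baseline, and errors in either scale directly into the depth estimate.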

About author

Mishaal Rahman

I am the former Editor-in-chief of XDA. In addition to breaking news on the Android OS and mobile devices, I manage all editorial and reviews content on the Portal. Tips/media inquiries: [email protected]