“Seurat”, Google’s tool to create high-fidelity mobile VR scenes, is now open source
Google has announced that it is open-sourcing its “Seurat” tool, which it revealed at last year’s Google I/O. Seurat is a tool for creating high-fidelity mobile virtual reality scenes while reducing the complexity and computing power required to run them, which means developers can squeeze more performance out of the same standalone hardware. The open-source announcement comes the same day the Lenovo Mirage Solo with Daydream, a standalone virtual reality headset, finally went on sale. We’ve been told for years that virtual reality is going to change the world, and slowly but surely, maybe we’re getting there.

“Today, we’re open sourcing Seurat to the developer community. You can now use Seurat to bring visually stunning scenes to your own VR applications and have the flexibility to customize the tool for your own workflows”, Manfred Ernst, a software engineer at Google, said in an announcement post on the Google Developers blog.

Google isn’t the only company investing in virtual reality; others like HTC and, of course, Oculus have been at it for years as well. Open-sourcing Seurat will help further all of these platforms, not just the ones Google wants to help. Seurat works through a number of complex algorithms involving geometry and textures; one of its most powerful aspects is that geometry the viewer can never see effectively doesn’t exist, which dramatically reduces the rendering load.

“Seurat works by taking advantage of the fact that VR scenes are typically viewed from within a limited viewing region, and leverages this to optimize the geometry and textures in your scene. It takes RGBD images (color and depth) as input and generates a textured mesh, targeting a configurable number of triangles, texture size, and fill rate, to simplify scenes beyond what traditional methods can achieve”, Ernst further explained in the announcement post.
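The idea Ernst describes can be sketched in a few lines of code. The following is an illustrative simplification, not Seurat’s actual implementation: it assumes the viewing region can be summarized by a center point and a maximum visible distance, and shows how geometry invisible from that region can be discarded while a configurable triangle budget is enforced. The function and parameter names are invented for this example.

```python
# Illustrative sketch of limited-viewing-region scene simplification.
# NOT Seurat's real algorithm -- the headbox center, distance cutoff,
# and triangle budget below are simplifying assumptions.

def simplify_scene(triangles, headbox_center, max_distance, triangle_budget):
    """Keep only triangles potentially visible from a limited viewing
    region, then enforce a configurable triangle count by dropping the
    most distant geometry first."""
    def dist(tri):
        # Distance from the headbox center to the triangle's centroid.
        cx = sum(v[0] for v in tri) / 3
        cy = sum(v[1] for v in tri) / 3
        cz = sum(v[2] for v in tri) / 3
        dx = cx - headbox_center[0]
        dy = cy - headbox_center[1]
        dz = cz - headbox_center[2]
        return (dx * dx + dy * dy + dz * dz) ** 0.5

    # Geometry that can never be seen from the viewing region is dropped.
    visible = [t for t in triangles if dist(t) <= max_distance]
    # Enforce the configurable triangle budget, nearest-first.
    visible.sort(key=dist)
    return visible[:triangle_budget]


# Example: a nearby wall survives, a distant mountain is culled.
near = [(0, 0, 1), (1, 0, 1), (0, 1, 1)]
far = [(0, 0, 100), (1, 0, 100), (0, 1, 100)]
kept = simplify_scene([near, far], (0, 0, 0), 50.0, 10)
```

The real tool goes much further, of course, resampling the whole scene from RGBD captures into a new textured mesh rather than merely culling existing triangles, but the payoff is the same: work that a headset would otherwise waste on invisible geometry is eliminated ahead of time.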

If you’re interested, Seurat’s code is available on GitHub, along with documentation on how to use it.

Source: Google Developers Blog