
VR & AR with Realistic Images – Coming soon!

People have described ray tracing (RT) as the holy grail of computer imaging. Done properly, RT generates a physically accurate representation of an imagined or reconstructed scene.

The number of variables is bewildering: the location and number of light sources, the style of each individual light (point source, diffused, flickering candle, etc.), and the materials of the elements in the scene (leather chairs, glass table tops, a gravel road, a broken wooden-shingled roof, etc.). The materials are the trickiest part of the problem, and the place where most shortcuts are taken; a multi-layered automobile fender, for example, has layers with different indices of refraction that reflect, and may diffuse, the various frequencies of light differently.
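
To make the index-of-refraction point concrete, here is a minimal sketch (illustrative only, not Adshir's code) using Schlick's approximation to the Fresnel equations, which estimates how much light a boundary between two media reflects rather than refracts. The function name and the air/glass indices are our own choices for the example:

```python
def schlick_reflectance(cos_theta: float, n1: float, n2: float) -> float:
    """Schlick's approximation to the Fresnel equations: the fraction of
    incident light reflected at a boundary between media with refractive
    indices n1 and n2, given the cosine of the angle of incidence."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

# Air-to-glass boundary (n ~ 1.0 to 1.5):
print(schlick_reflectance(1.0, 1.0, 1.5))  # head-on: ~0.04, mostly refracted
print(schlick_reflectance(0.1, 1.0, 1.5))  # grazing: ~0.61, mostly reflected
```

A renderer runs a calculation like this at every surface interaction, per layer of the material, which is why complex materials dominate the cost.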

Regional or hybrid rendering is a trick in which only part of an image gets the full ray tracing treatment: perhaps just the foreground, a foveated region based on where the viewer is looking, or certain highly reflective elements. Those tricks buy time, giving the processors as many cycles as possible to run the algorithm.
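
As a hypothetical illustration of the hybrid idea (no vendor's actual implementation), a renderer might ray trace only the pixels near the gaze point and fall back to a cheap rasterized shade everywhere else. The stub functions, gaze point, and radius below are invented for the sketch:

```python
import math

def trace_rays(x, y):
    # Stand-in for a full, physically based ray-traced shade (expensive).
    return (1.0, 1.0, 1.0)

def rasterize(x, y):
    # Stand-in for a cheap conventional rasterized shade.
    return (0.5, 0.5, 0.5)

def shade_pixel(x, y, gaze=(400, 300), fovea_radius=120):
    """Spend ray-tracing cycles only where the viewer is looking;
    everywhere else, settle for the rasterizer's approximation."""
    if math.dist((x, y), gaze) <= fovea_radius:
        return trace_rays(x, y)
    return rasterize(x, y)
```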

The ray tracing algorithm itself is relatively straightforward, but characteristically it is easily interrupted, and branch prediction isn't much help, so a GPU can be tripped up by ray tracing while the CPU just grinds on and on until the GPU catches up again.
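
A toy, scalar sketch of that core loop shows why: whether a ray hits anything, which object it hits, and whether the bounce recursion continues are all data-dependent branches that a wide, lockstep GPU handles poorly. This grayscale sphere tracer is invented for illustration and is nobody's shipping algorithm:

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Ray/sphere intersection (direction assumed normalized);
    returns the nearest hit distance, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None                      # miss: one branch
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None       # hit behind the ray: another

def trace(origin, direction, spheres, depth=0):
    """Each bounce branches on what, if anything, the ray hit."""
    if depth > 3:                        # recursion cutoff: a third branch
        return 0.0
    hits = [(t, s) for s in spheres
            if (t := hit_sphere(origin, direction, s[0], s[1])) is not None]
    if not hits:
        return 0.2                       # ray escaped to the "sky"
    t, (center, radius, reflectivity) = min(hits, key=lambda h: h[0])
    point = [o + t * d for o, d in zip(origin, direction)]
    normal = [(p - c) / radius for p, c in zip(point, center)]
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    bounced = [d - 2.0 * d_dot_n * n for d, n in zip(direction, normal)]
    return (1.0 - reflectivity) * 0.8 + reflectivity * trace(point, bounced, spheres, depth + 1)

# One shiny sphere: spheres are (center, radius, reflectivity) tuples.
print(trace((0, 0, -3), (0, 0, 1), [((0, 0, 0), 1.0, 0.5)]))
```

Neighboring rays can take entirely different paths through those branches, which is exactly the divergence a GPU's SIMD lanes are worst at.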

Because of those characteristics, a few companies over the years have tried making ASICs for ray tracing, and while they did a decent enough job, they never reached the production volumes, and therefore the economies of scale, needed to make the chips commonly affordable. As peculiar as it may seem, not everyone needs ray tracing. Almost everyone may want it, but few will pay extra for it; we've become that good at rendering tricks.

So ray tracing is one of the great challenges in computer imaging, maybe even a grand challenge. Those types of challenges attract smart people, and dumb people who didn't know it couldn't be done, so they did it. Dr. Reuven Bakalash, founder and CEO of Adshir, is one of the smart ones, and he's figured out a way to shortcut the projection process of ray tracing. We reviewed the approach in previous issues of TechWatch (V16, No. 16, 2 Aug 2016, p. 6, and V17, No. 20, 27 Aug 2017, p. 6).

Ray Tracing Comes to AR and VR

The company has since adapted its LocalRay technology to smaller handheld devices and applied the capability to AR and VR applications, bringing it to a hotel in Las Vegas during CES. Running on a Microsoft Surface tablet, the demo showed a miniaturized dinosaur walking across surfaces of different materials, reflected appropriately in each.

Adshir AR ray tracing demo running on a PC

The demo is significant because it shows the light in the room reflected in real time as the creature ambles across the table, casting shadows and appearing in reflections on the surrounding objects. It even walked across a phone on the table and left footprints on the black touchscreen.

The dinosaur is a 20k-polygon model, built and animated in Unity using the PTC Vuforia AR toolset, and was rendered in real time at 60 fps.

Adshir AR ray tracing demo running on a mobile phone

LocalRay uses proprietary, patented algorithms designed for physically accurate ray tracing in VR and AR applications. Adshir's approach is based on eliminating the acceleration structures (AS) that are a core component of every ray tracing system. The elimination removes the time-consuming traversal step and avoids the repeated reconstruction of the AS after every major change in the scene.

Both the traversals and the reconstructions, which are stoppages for real-time rendering, are now a thing of the past, claims Adshir, and the net result is that LocalRay requires fewer rays. The AS replacement is called the DAS (dynamically aligned structures); it is proprietary to Adshir but runs on a conventional GPU pipeline. LocalRay is also battery-power aware, and Adshir says that awareness doesn't affect performance.
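
For readers unfamiliar with acceleration structures, the sketch below shows a conventional bounding volume hierarchy (BVH), one common form of AS, and the per-ray traversal it entails. The DAS itself is proprietary and undisclosed, so this illustrates only what Adshir claims to be replacing; the class and function names are our own:

```python
class BVHNode:
    """A node in a bounding volume hierarchy (BVH), one common form of
    the acceleration structure LocalRay claims to do without."""
    def __init__(self, bounds, left=None, right=None, primitives=None):
        self.bounds = bounds          # ((min_x, min_y, min_z), (max_x, max_y, max_z))
        self.left, self.right = left, right
        self.primitives = primitives  # set only on leaf nodes

def ray_hits_box(origin, inv_dir, bounds):
    """Slab test: can the ray possibly pass through this box?
    (inv_dir holds 1/d per axis; direction components assumed nonzero.)"""
    tmin, tmax = 0.0, float("inf")
    for o, i, lo, hi in zip(origin, inv_dir, bounds[0], bounds[1]):
        t1, t2 = (lo - o) * i, (hi - o) * i
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return tmin <= tmax

def candidates(node, origin, inv_dir, out):
    """Per-ray traversal: prune whole subtrees the ray cannot touch.
    This per-ray traversal cost, plus rebuilding the tree whenever the
    scene animates, is what Adshir says the DAS eliminates."""
    if node is None or not ray_hits_box(origin, inv_dir, node.bounds):
        return
    if node.primitives is not None:
        out.extend(node.primitives)   # leaf: exact intersection tests follow
    else:
        candidates(node.left, origin, inv_dir, out)
        candidates(node.right, origin, inv_dir, out)
```

Every ray repeats that descent, and an animated scene forces the tree to be refitted or rebuilt each frame; skipping both is where the claimed savings come from.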

Adshir has 10 granted patents and another 11 pending, most of them around LocalRay. The company will soon introduce an SDK for licensing; it will be a plug-in for Unity, Unreal, ARKit, ARCore, Vuforia, and more.

What Does Jon Think?

Real-time ray tracing is, in and of itself, an amazing thing to contemplate. Running on a smartphone seems almost like something out of Star Trek. Yes, compromises still have to be made (the model, for instance, was reduced from 40k+ polygons to 20k; notice its smoothness in the phone example above), and the resolution has to be dropped a bit, but those are parameters tied to Moore's law, and so will get better over time. And then there's the magic factor: Adshir could just surprise us again and come up with a tweak to the algorithm that improves everything. And lest we forget, this is a full-screen rendering, not a zonal rendering. –J.P.

This article is republished with the kind permission of Dr. Jon Peddie and originally appeared in Jon Peddie's TechWatch. Jon recently had a book, "Augmented Reality: Where We Will All Live," published by Springer; it is available on Amazon.