Tuesday, July 27, 2010

Real-time ray-tracing

When I came back from Siggraph in 2001, I told my colleagues at Ubisoft, where I was working back then, that games and real-time 3D applications would render exclusively with ray-tracing within 5 years. I was wrong about only one thing: it was going to take 10 years or so. Siggraph 2001 had been all about the new capabilities of graphics cards that allowed them to be programmed. I thought it was only a question of time before the programming languages became flexible enough to do ray-tracing.

For the less technical readers, ray-tracing is a 3D rendering technique that computes the path of the light reaching each pixel of the image to render. Because it is based on the physical laws governing the propagation of light, it produces images of far better quality than the polygonal rendering techniques traditionally used for real-time rendering. Ray-tracing has been used for years to render animated movies, but it has always been too computationally intensive for real-time applications.
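To make the idea concrete, here is a minimal sketch of the technique in Python (nothing to do with any particular product; the scene, a single sphere, and the light direction are made up for the example): for every pixel, cast a ray from the eye through the image plane, find where it hits the scene, and shade the hit point.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None.

    Solves the quadratic |origin + t*direction - center|^2 = radius^2,
    assuming direction is normalized (so the t^2 coefficient is 1).
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None            # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def render(width, height):
    """Trace one primary ray per pixel against a single sphere."""
    center, radius = (0.0, 0.0, -3.0), 1.0
    light = (0.577, 0.577, 0.577)        # direction toward the light
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point on an image plane at z = -1,
            # then normalize to get the ray direction from the eye.
            px = 2 * (x + 0.5) / width - 1
            py = 1 - 2 * (y + 0.5) / height
            norm = math.sqrt(px * px + py * py + 1)
            d = (px / norm, py / norm, -1 / norm)
            t = ray_sphere((0.0, 0.0, 0.0), d, center, radius)
            if t is None:
                row.append(0.0)          # miss: black background
            else:
                hit = tuple(t * di for di in d)
                n = tuple((h - c) / radius for h, c in zip(hit, center))
                # Simple diffuse (Lambert) shading at the hit point.
                row.append(max(0.0, sum(ni * li for ni, li in zip(n, light))))
        image.append(row)
    return image
```

A full ray-tracer adds secondary rays from each hit point (toward lights for shadows, along reflection and refraction directions), which is exactly what makes the physical simulation expensive.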

For the last 2-3 years, the idea of doing real-time ray-tracing on the graphics card has gained some interest, and a couple of simple hardware-assisted ray-tracing projects have been successful. This is the case with some volume rendering features in medical imaging applications. But today, the technique is going mainstream with the release of NVidia's OptiX. I attended NVidia's technical presentation this afternoon and it was very convincing. With only a few lines of code, it is possible to write your own custom ray-tracer.
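The key idea behind a programmable ray-tracing framework like this is that you write only a handful of small programs (ray generation, closest-hit, miss) and the framework handles traversal. The toy CPU analogue below sketches that structure in plain Python; all the names and the callable-based design are my own illustration, not the actual OptiX API (which uses CUDA programs on the GPU):

```python
# Toy analogue of a programmable ray-tracing pipeline: the user supplies
# small "programs" (plain Python callables standing in for ray-generation,
# closest-hit, and miss programs) and the framework only runs the
# traversal loop. Illustrative names, not any real API.

def trace(ray, scene, closest_hit, miss):
    """Find the nearest intersection and dispatch to the right program."""
    best = None
    for obj in scene:
        t = obj["intersect"](ray)   # each primitive brings its own test
        if t is not None and (best is None or t < best[0]):
            best = (t, obj)
    return closest_hit(ray, *best) if best else miss(ray)

def launch(width, height, raygen, scene, closest_hit, miss):
    """Analogue of a 2D launch: run the ray-generation program per pixel."""
    return [[trace(raygen(x, y, width, height), scene, closest_hit, miss)
             for x in range(width)] for y in range(height)]

# Example: a degenerate scene with one primitive that every ray hits at t = 1.
scene = [{"intersect": lambda ray: 1.0, "color": "red"}]
img = launch(2, 2,
             raygen=lambda x, y, w, h: (x, y),  # a "ray" is just the pixel here
             scene=scene,
             closest_hit=lambda ray, t, obj: obj["color"],
             miss=lambda ray: "sky")
# img is [["red", "red"], ["red", "red"]]
```

Swapping in different intersection, hit, or miss programs changes what gets rendered without touching the traversal code, which is why so little code is needed to build a custom ray-tracer on top of such a framework.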

For the world of 3D computer animation, it means that what they see in Maya or Blender when they animate will get a step closer to the look of the final rendering.

In the world of medical imaging, this kind of tool will be very interesting to create volume rendering applications that consider principles of human perception (how about some depth of field effect to help surgeons perceive depth?).

At the end of the afternoon, I passed by NVidia's booth in the exhibition hall, where they are showing real-time stereoscopic volume rendering of a beating heart (scanned with an MRI) with some overlaid blood flow simulation. Everything is computed on the new 6800 Quadro card (including the flow simulation). The woman at the booth said the dataset is 4 GB and it all resides on the card.
