Nvidia and Weta Digital have developed an advanced ray-tracing engine capable of quickly and accurately processing billions of polygons.
The engine, dubbed PantaRay, was designed to accelerate pre-computation of scene occlusion data and dynamic image-based lighting in James Cameron’s Avatar.
“The complexity of Avatar motivated us to think about rendering differently. We do our final beauty-pass renders with RenderMan, but to optimize artistic iterations on Avatar’s huge data sets, we moved the bulk of the calculation to a pre-computation step,” explained Sebastian Sylwan, Weta’s head of research and development.
“The issues we needed to solve weren’t as much about rendering as they were about high-performance computing and we realized that using the massively parallel power of a GPU to solve problems is Nvidia’s expertise.”
According to Sylwan, this approach allowed Weta to render complex scenes in less time, while using less memory and fewer processors.
Consequently, artists gained the “critical ability” to iterate faster, which facilitated higher-quality, photorealistic renders.
The development of the PantaRay engine has also prompted Weta to “further embrace” Nvidia’s approach to GPU computing.
“Nvidia ported Weta’s PantaRay engine to a CUDA-based GPU driven version that runs 25 times faster, utilizing an Nvidia Tesla S1070 GPU-based server instead,” Nvidia spokesperson Mark Priscaro told TG Daily in an e-mailed statement.
“Weta [now] plans to incorporate PantaRay running on Nvidia Tesla GPUs into its pipeline for the upcoming Steven Spielberg/Peter Jackson film, Tintin, as well as exploring new ways in which PantaRay and GPUs can further accelerate its overall visual effects pipeline.”
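To give a sense of the kind of work PantaRay offloads to a pre-computation step, here is a minimal, hypothetical sketch of occlusion pre-computation. This is not Weta’s code or algorithm; it simply illustrates the general idea the article describes: for each shading point, shadow rays are cast against static geometry and the visibility result is cached, and because every point is processed independently, the workload maps naturally onto a massively parallel GPU.

```python
# Illustrative sketch only (NOT PantaRay): pre-compute, for each shading
# point, the fraction of sample directions that escape a set of occluding
# spheres. The per-point loop is fully independent, which is what makes
# this style of pre-pass a good fit for GPU acceleration.
import math


def ray_hits_sphere(origin, direction, center, radius):
    """Return True if the ray origin + t*direction (t > 0) hits the sphere.

    `direction` is assumed to be unit length, so the quadratic's leading
    coefficient is 1.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-6  # only count intersections in front of the origin


def precompute_occlusion(points, directions, spheres):
    """Cache, per point, the fraction of sample directions left unblocked."""
    cache = []
    for p in points:
        unblocked = sum(
            1
            for d in directions
            if not any(ray_hits_sphere(p, d, c, r) for c, r in spheres)
        )
        cache.append(unblocked / len(directions))
    return cache
```

For example, a point directly beneath an occluding sphere would cache a low visibility value, while an unobstructed point caches 1.0; at render time the expensive ray casting is replaced by a cheap lookup, which is what enables the faster artist iterations Sylwan describes.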