Friday, March 9, 2012

Epic Demos Samaritan on Nvidia's Next-Gen Kepler GPU


During this year's Game Developers Conference, Epic showed the famous Samaritan demo running on Nvidia's next-generation Kepler GPU.

At the 2011 GDC conference, Epic introduced the Samaritan demo, which gave attendees a look at the next generation of videogame graphics. The demo used a host of advanced rendering techniques to create a realistic environment. The catch in 2011 was that it took three GeForce GTX 580s to run the demo in real time. At this year's GDC, Epic showed the demo running on just one next-generation Nvidia Kepler GPU.

Updated Unreal Engine 3 Tech Demo "Samaritan"
In addition to the power of the Kepler GPU, we see the benefit of an Nvidia-developed anti-aliasing technique, Fast Approximate Anti-Aliasing (FXAA), which aims to improve upon the established success of Multisample Anti-Aliasing (MSAA), the form of anti-aliasing most commonly seen in today's games. FXAA smooths out jagged edges and improves visual fidelity, and anti-aliasing is key to creating the incredible sights of Samaritan.

"Without anti-aliasing, Samaritan's lighting pass uses about 120MB of GPU memory. Enabling 4x MSAA consumes close to 500MB, or a third of what's available on the GTX 580. This increased memory pressure makes it more challenging to fit the demo's highly detailed textures into the GPU's available VRAM, and leads to increased paging and GPU memory thrashing, which can sometimes decrease framerates." FXAA, however, "is a shader-based anti-aliasing technique" and as such "doesn't require additional memory, so it's much more performance-friendly for deferred renderers such as Samaritan," according to Ignacio Llamas, a Senior Research Scientist at Nvidia who worked with Epic on the FXAA implementation.
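The roughly fourfold jump Llamas describes (120MB to nearly 500MB) is what simple arithmetic predicts: with 4x MSAA, every render target in a deferred lighting pass stores four samples per pixel. Here is a minimal back-of-the-envelope sketch; the resolution and buffer formats are illustrative assumptions, not Samaritan's actual configuration:

```python
# Rough estimate of deferred-shading render-target memory with and
# without MSAA. Resolution and buffer layout are assumptions chosen
# for illustration, not the demo's real settings.

WIDTH, HEIGHT = 1920, 1080          # assumed render resolution
BYTES_PER_PIXEL = {                 # hypothetical G-buffer layout
    "albedo (RGBA8)": 4,
    "normals (RGBA16F)": 8,
    "depth/stencil (D24S8)": 4,
    "lighting accum (RGBA16F)": 8,
}

def buffer_mb(samples_per_pixel: int) -> float:
    """Total size of all render targets in MiB at the given MSAA level."""
    per_pixel = sum(BYTES_PER_PIXEL.values())
    return WIDTH * HEIGHT * per_pixel * samples_per_pixel / (1024 ** 2)

no_aa = buffer_mb(1)   # about 47 MiB for this simplified layout
msaa4 = buffer_mb(4)   # 4x the storage: every target holds 4 samples
print(f"no AA: {no_aa:.0f} MiB, 4x MSAA: {msaa4:.0f} MiB")
```

FXAA, by contrast, runs as a post-process shader on the already-resolved image, so it adds no per-sample storage, which is why it spares a deferred renderer this 4x memory multiplier.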

As reported earlier, we recently saw the first images of the Nvidia GK104 Kepler card, and now we are starting to see some of Kepler's visual performance benefits. Most reports have Kepler slated for release later this month, though there is still no official word from Nvidia.

