Unreal Engine + $60,000 GPU = Amazing, real-time raytraced Star Wars [Updated]

SAN FRANCISCO—In the computer graphics community this week, companies from Nvidia to Microsoft have been stressing just how important real-time raytracing will be to making games look more movie-like in the near future. Epic Games used a striking demo at a Game Developers Conference keynote presentation this morning to show just how much better raytracing can make real-time, interactive graphics look with top-of-the-line hardware right now.

The Star Wars “Reflections” demo, made with the cooperation of Nvidia and ILMxLAB, showed two extremely realistic-looking and talkative Stormtroopers clamming up in an elevator when the shiny Captain Phasma pops in. Running on what Epic still refers to as “experimental code” (planned to be introduced to Unreal Engine for production later this year), the raytracing in the demo allows background elements like the guns and the opening elevator doors to reflect accurately off Phasma’s mirror-like armor in real time. These are the kinds of effects that Epic CTO Kim Libreri says they’ve “never been able to do before [with rasterized graphics].”

With raytracing, developers can change the shape of the light in a scene on the fly, in turn changing the character and the diffuse “softness” of the shadows in the scene. The way the raytraced shadows allow for gentle gradations of light as characters and objects block parts of the scene is one of the most apparent improvements over the harsher, “pre-baked” light in rasterized graphics. The raytracing technology also gives the scene a realistic, cinematic depth-of-field effect automatically, with no need for fancy shaders or tricks implemented manually by developers.

Getting a “cinematic” 24fps with real-time raytracing still requires some serious hardware: it’s currently running on Nvidia’s ultra-high-end, four-GPU DGX Station, which lists for $60,000. [Update: After publication, Epic reached out to clarify that the demo was running on a DGX Station, not a DGX-1 as originally stated during the interview.] Even with that, some elements of the scene, like the walls, need to be rasterized rather than made fully reflective, Libreri told Ars. And the Volta technology that is key to powering this kind of performance isn’t available in any consumer GPU cheaper than the $3,000+ Titan V, so don’t expect this kind of scene to run on your home gaming rig in the near-term.

But Libreri adds that the tech requirements have been lessened considerably by a de-noising process that dynamically raises or lowers the number of simulated light rays used for various parts of the scene, based on their importance to the scene as a whole. Nvidia’s GameWorks de-noising code can get things down to as low as two rays per pixel (from a maximum of six), dramatically reducing the GPU overhead needed to do these complex lighting calculations in real time.
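To make the idea concrete, here is a minimal, purely illustrative sketch of that kind of adaptive ray budgeting. This is not Nvidia’s actual GameWorks code; it just assumes each pixel has been assigned a normalized 0-to-1 “importance” score (say, higher for mirror-like surfaces like Phasma’s armor than for diffuse walls) and maps that score onto the two-to-six-ray range the article describes:

```python
# Illustrative sketch only -- not Nvidia's GameWorks denoiser.
# Maps a hypothetical per-pixel importance score (0.0-1.0) to a
# ray count between the floor and ceiling mentioned in the article.

MIN_RAYS = 2  # low-importance pixels (e.g. diffuse walls)
MAX_RAYS = 6  # high-importance pixels (e.g. mirror-like armor)

def rays_for_pixel(importance: float) -> int:
    """Return the number of rays to trace for a pixel, given its importance."""
    importance = max(0.0, min(1.0, importance))  # clamp to [0, 1]
    return MIN_RAYS + round(importance * (MAX_RAYS - MIN_RAYS))

# A dull surface gets the minimum budget; a perfect mirror gets the maximum.
print(rays_for_pixel(0.0))  # 2
print(rays_for_pixel(1.0))  # 6
```

Spending most of the ray budget only where reflections and shadows actually matter, then de-noising the sparsely sampled result, is what lets the scene stay interactive instead of rendering overnight like a film frame.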

“We’ll be in a hybrid place for a few years where part is rasterized, part is raytraced,” Libreri said. “The film business was based on rasterization for two decades [and] finally started making the move to raytracing a decade ago.”

He added, “It’s just a matter of time” before raytracing is the standard in games, too.

In the meantime, though, we can enjoy this glimpse of just how realistic PC games could look in the future.

via Ars Technica https://arstechnica.com

March 21, 2018 at 12:24PM
