
Next-Generation Game Engines Revealed

Published on 05 April 2013
by Mattias Peresini

On the occasion of the Game Developers Conference 2013, which took place a few days ago in San Francisco, many new developments in video games were presented. Among them were the impressive latest evolutions of game engines and their real-time rendering capabilities. Below are some demos of the next-generation game engines that will power the video games of tomorrow.

Next-Generation Game Engines

Unreal Engine 4

These two technical demos are rendered 100% in real time by the Unreal Engine 4 rendering engine, developed by Epic. The first one, Infiltrator, runs on a consumer PC with a standard processor, 16GB of RAM, and a GTX 680. The second, Elemental, runs on a PlayStation 4. Pretty amazing, right?

CryEngine 3

CryEngine 3 is the engine developed by Crytek, the studio behind Crysis. You can even download the development kit for free from their website for non-commercial use. Once again, the real-time rendering is impressive. The lighting effects, depth of field, flares and HDR light management, vegetation rendering, and physics simulations all work together to immerse the player in a lifelike environment.

Fox Engine

Fox Engine is the game engine developed by Konami, which will power Metal Gear Solid V. Its facial system, which uses 3D scans of clay sculptures, produces increasingly realistic characters. The similarities between real photos and 3D renders are also quite striking.

Frostbite 3

Frostbite 3 is the engine developed by DICE that will power, among others, Battlefield 4. There's no technical demo yet, but there are 17 minutes of gameplay footage available, offering a concrete look at the game, and therefore at what is rendered in real time.

Luminous Studio

For its next game engine, Square Enix highlights a complex hair simulation, showcased in the video below.

Techniques Used

Just like the Element 3D plugin, game rendering engines rely on GPUs (graphics processors) to compute images in real time and do not support raytracing. The main drawback is that they cannot render reflections, refractions, and transparency realistically. Today, only raytracing can handle these physical calculations, but it is very resource-intensive. We'll have to wait a few more years before real-time raytraced rendering becomes possible.
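To make the cost concrete, here is a toy ray–sphere intersection in Python: this is the kind of visibility test a raytracer repeats for every pixel against every object in the scene, which is why it is so expensive. The function names are purely illustrative, not taken from any of the engines above.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t
    # (a quadratic in t); a positive real root means the ray hits.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearest intersection
    return t if t > 0.0 else None

# A ray from the origin pointing down -z toward a sphere at (0, 0, -5):
print(ray_sphere_hit([0, 0, 0], [0, 0, -1], [0, 0, -5], 1.0))  # hits at t = 4.0
```

Multiply this test by millions of pixels, thousands of objects, and secondary bounces for reflections and refractions, and the render times quickly explode.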

In the meantime, developers are finding ways to simulate realistic reflections, lighting, and shading effects in real time, without raytracing.
SSAO (Screen Space Ambient Occlusion) and SSDO (Screen Space Directional Occlusion) are examples of techniques used to approximate ambient occlusion in real time. Bump, normal, and parallax mapping are different techniques for creating complex relief effects on simple geometry (low polygon counts), all with very fast render times.
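As a minimal sketch of the idea behind normal mapping: instead of modeling fine relief with extra polygons, each texel of a texture encodes a perturbed surface normal, which the shader then uses for lighting. The helper names below are illustrative, not from any real engine.

```python
import math

def decode_normal(r, g, b):
    # Map 8-bit texel channels [0, 255] to a tangent-space vector in [-1, 1],
    # then normalize it to unit length.
    n = [c / 255.0 * 2.0 - 1.0 for c in (r, g, b)]
    length = math.sqrt(sum(c * c for c in n))
    return [c / length for c in n]

def lambert(normal, light_dir):
    # Diffuse intensity: clamped dot product of normal and light direction.
    return max(0.0, sum(a * b for a, b in zip(normal, light_dir)))

# A "flat" texel (128, 128, 255) encodes a normal pointing straight out (+z),
# so a light shining along +z gives full diffuse intensity.
flat = decode_normal(128, 128, 255)
print(round(lambert(flat, [0.0, 0.0, 1.0]), 2))  # → 1.0
```

The geometry stays low-poly; only the per-pixel lighting changes, which is why the technique is so cheap.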

These new engines introduce additional techniques, enabling even more realistic rendering. Thanks to Tim for the list:

  • Tessellation (subdividing polygons for smoother and more detailed surfaces)
  • Bokeh depth of field (like the After Effects plugin Lenscare by Frischluft, for instance)
  • Particle engines (often animated by vortices). They should appear in every game, much as Particular did when it arrived in After Effects.
  • Procedural animation
  • Greatly improved shaders (materials), including SubSurface Scattering for realistic skin translucency
  • Moving from Blinn–Phong shading to physically based BRDFs (bidirectional reflectance distribution functions, which model precisely how light interacts with a material, so surfaces respond like their real-world counterparts)
  • Hair simulation
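To give an idea of what a BRDF is concretely, here is a small Python sketch evaluating a normalized Blinn–Phong specular BRDF. The normalization factor is what makes the lobe energy-conserving, one of the first steps engines take toward physically based shading. This is a generic textbook formula, not code from any of the engines above.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong_brdf(n, l, v, shininess):
    # Normalized Blinn-Phong specular BRDF: (s + 2) / (2*pi) * (N.H)^s
    # The (s + 2) / (2*pi) factor keeps the lobe energy-conserving as
    # shininess changes, unlike the classic unnormalized version.
    h = normalize([l[i] + v[i] for i in range(3)])  # half-vector
    ndoth = max(0.0, dot(n, h))
    return (shininess + 2.0) / (2.0 * math.pi) * ndoth ** shininess

# Light and viewer mirrored 45 degrees about the surface normal:
# the half-vector aligns with the normal, so the lobe peaks here.
n = [0.0, 0.0, 1.0]
l = normalize([1.0, 0.0, 1.0])
v = normalize([-1.0, 0.0, 1.0])
peak = blinn_phong_brdf(n, l, v, 32)
```

Physically based models like GGX go further, deriving the lobe shape from a statistical description of surface microgeometry, but the interface is the same: a function of light direction, view direction, and material parameters.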

With these new techniques, real-time effects calculated by game engines are becoming increasingly realistic, both in visual rendering and in physical simulation. What remains to be developed is even more advanced Artificial Intelligence, bringing us ever closer to the experience of an “interactive movie.”

The Convergence of Video Games and Cinema

Given the ever-improving quality of these rendering engines, one might wonder when we’ll see animated films made entirely with game engines.
Crytek has already recognized the potential of these engines for the film and VFX industries, developing CryEngine Cinebox.
Cinebox is a platform based on CryEngine, enabling real-time audiovisual content creation. Whether for previsualizations (low-quality previews used to plan scenes before shooting) or final renders, there’s no doubt that game engines have a bright future in post-production studios. Digital Domain and Fox CineDev are among the beta testers of this new tool.
Game cinematics, which are already visually stunning, might soon be created directly from in-game images — who knows!

When Will We See Real-Time in Our Apps?

Even though things are evolving, in the world of 3D animation as we know it, most of our software still relies on raytracing, and few use GPU-based computation.
Our render times vary from a few seconds per frame to several minutes for 3D. The more powerful our machines become, the more complex our simulations get, so render times haven’t really improved. In contrast, game engines can compute up to 60 frames per second.

But that will likely change soon. New GPU-based render engines are emerging, such as Octane, available in beta for Cinema 4D. The renders are realistic and computed in a fraction of the usual time.
Of course, it’s not perfect. You need to use proprietary materials and lights, which makes reverting to traditional setups difficult. Still, GPU rendering is very likely to gain traction in the coming years and eventually be integrated natively into 3D suites — even if that requires partial rewrites of current render engines.

It doesn’t take a fortune teller to imagine that, in a few years, we’ll be able to render photorealistic images almost in real time on affordable consumer machines. Then we’ll be able to fully focus on the creative side of production and spend less time on settings and optimizations, constantly striving to reduce render times. The dream of our clients — and their insatiable appetite for changes!

About the author

I am the Founder of Mattrunks.
I work as Creative Director and Motion Designer in my studio. I also create video tutorials to share my passion for motion design.
