Unreal Engine 5 for Driving Simulation

Unreal Engine 5 is out, with quite a few amazing features. Today we’ll explore what this means for our platform, and for driving simulation overall.

Before we get into specifics, you can check out Sébastien Lozé’s post titled “Simulation benefits coming from UE5”, which gives a great overview of UE5 for simulation applications such as ours.

The Matrix Awakens (CitySample)

If you don’t know what UE5 is about, one way to see what’s new is just watching (or playing) The Matrix Awakens: An Unreal Engine 5 Experience. That tech demo is an absolutely amazing and jaw-dropping masterpiece of realtime rendering.

Not only are there a video, a demo (on consoles), and insightful tech talks, but you can actually download the whole project from the Marketplace! It just blows my mind that all of that is available for free.

So, as someone who develops driving simulators, when I see a big city with tens of thousands of vehicles and pedestrians and stunning visuals, all of it running in realtime on modern hardware and available right now: I am very excited.

Nanite and Lumen

Nanite and Lumen are probably the two headlining features of UE5, and rightly so. However, their impact on driving simulation might not be as significant as it will be in other fields.

Nanite is obviously a big deal: not having to worry about triangle count is huge. It will allow us to create even more immersive environments, with more detail and more accuracy. But Nanite really shines when super high-resolution meshes bring the smallest details of every object into the world, which, when you’re behind the wheel of a car moving at fairly high speed, doesn’t matter as much as it does in a first- or third-person game. Nanite also requires DirectX 12, which doesn’t really play nice with OpenXR and Varjo. Still, I’ll gladly take those free triangles, thank you very much!

[Image: lumen.png]

Lumen brings a new level of realism when it comes to lighting, something we’ve truly never seen in any driving simulator out there. But… it’s still considered Experimental for nDisplay, and doesn’t really work with VR headsets right now, from what I understand. Obviously, Epic is working on it, and I can’t wait for Lumen to be truly available for our use case.

World Partition and OFPA

When UE5’s features were announced in more detail, World Partition was the one that really caught my eye. It’s a set of tools and features that make it easier to create and manage large worlds. Once again, the Matrix demo does a great job of showing them all off: I couldn’t imagine any studio creating a city that big without those tools.

World Partition mostly targets open-world development, which, when you think about it, is not really our use case. In most of our experiments, we know exactly the itinerary the participant will follow, and we can build our whole environment around it. We can divide our sublevels around our scenario, load and unload them at strategic points in the simulation, and so on.
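
To make that concrete, here is a minimal sketch of what swapping sublevels at a scripted point could look like, using the engine’s standard level-streaming functions; the helper name and the sublevel names are hypothetical, not actual code from our platform.

```cpp
// Minimal sketch (not actual platform code): swapping scenario sublevels
// at scripted points along the participant's itinerary, using the engine's
// standard level-streaming API. Level names are hypothetical.
#include "Kismet/GameplayStatics.h"
#include "Engine/LatentActionManager.h"

void SwapScenarioSublevel(UObject* WorldContext, FName LevelToLoad, FName LevelToUnload)
{
	// Stream in the next sublevel; non-blocking load to keep the frame rate stable.
	FLatentActionInfo LoadInfo;
	LoadInfo.UUID = 1; // any unique id per latent call
	UGameplayStatics::LoadStreamLevel(WorldContext, LevelToLoad,
		/*bMakeVisibleAfterLoad=*/true, /*bShouldBlockOnLoad=*/false, LoadInfo);

	// Unload the sublevel the participant just left.
	FLatentActionInfo UnloadInfo;
	UnloadInfo.UUID = 2;
	UGameplayStatics::UnloadStreamLevel(WorldContext, LevelToUnload, UnloadInfo,
		/*bShouldBlockOnUnload=*/false);
}

// E.g. called from a trigger placed where the scenario leaves the city:
// SwapScenarioSublevel(this, TEXT("Sublevel_Rural"), TEXT("Sublevel_City"));
```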

[Image: worldpartition.jpg]

But that doesn’t mean World Partition has no use for us, far from it actually. First of all, One File Per Actor will make it so much easier to work on and collaborate around actors and levels; it’s one of the features I’m most excited about in UE5 (though it doesn’t have full Git support at the moment). Level Instancing (and Packed Level Instance Blueprints) will also allow easier and more efficient reuse of existing assets. We have a very large collection of buildings and other roadside structures (mostly coming from the Marketplace) that would greatly benefit from being converted to PLIBs (is that what we call them?).

And even if we don’t use World Partition right now, we’ll probably end up using it in the future. It gives us the ability to create large environments that could potentially be reused and modified (Data Layers?) across multiple projects and experiments. It’s something we’ve talked about internally: why not build our own large open world, with a city, suburbs, rural areas, highways, etc.? One single scene for all our experiments. World Partition certainly makes that option much more realistic.

Mass and ZoneGraph

Mass is UE5’s Entity Component System, similar to Unity’s DOTS. It’s still very experimental and poorly documented at the moment, but it’s what powers the tens of thousands of moving actors in the Matrix demo.

For driving simulation, we probably don’t really need an ECS: we don’t need a huge number of living actors in our scenarios, and we usually want fine control over their behavior, per experiment protocol constraints. But Mass could still be interesting, notably for pedestrians in situations where they don’t really interact with the participant. Smart Objects, for example, could definitely find a use there.

[Image: zonegraph.png]

As an OpenDRIVE lover, I’m really looking forward to ZoneGraph. It probably won’t replace OpenDRIVE for all our use cases, as the latter holds more data that is relevant to us; but it could still bring new opportunities, for example to add secondary moving actors. I’m just waiting for the feature to get some official documentation, so that I can write an OpenDRIVE-to-ZoneGraph conversion tool for our plugin and see what I can do with it.
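
To give an idea of what such a tool could start from, here is a rough sketch of the OpenDRIVE-reading half only, using the engine’s FXmlFile parser; the FRoadCenterline type and the coordinate convention are my own assumptions, and the ZoneGraph-building half is deliberately left out until the API gets documented.

```cpp
// Hedged sketch of the OpenDRIVE-reading half of a converter. It only samples
// the start point of each <geometry> element of a road's <planView>; arcs and
// spirals would need the full OpenDRIVE maths, and the ZoneGraph-building step
// is omitted since that API is still undocumented.
#include "XmlFile.h" // requires the "XmlParser" module in Build.cs

struct FRoadCenterline
{
	FString RoadId;
	TArray<FVector> Points; // reference line samples, in Unreal units (cm)
};

bool LoadOpenDriveCenterlines(const FString& XodrPath, TArray<FRoadCenterline>& OutRoads)
{
	FXmlFile Xodr(XodrPath);
	if (!Xodr.IsValid())
	{
		return false;
	}

	const FXmlNode* Root = Xodr.GetRootNode(); // <OpenDRIVE>
	for (const FXmlNode* Road : Root->GetChildrenNodes())
	{
		if (Road->GetTag() != TEXT("road"))
		{
			continue;
		}

		FRoadCenterline Centerline;
		Centerline.RoadId = Road->GetAttribute(TEXT("id"));

		if (const FXmlNode* PlanView = Road->FindChildNode(TEXT("planView")))
		{
			for (const FXmlNode* Geometry : PlanView->GetChildrenNodes())
			{
				const double X = FCString::Atod(*Geometry->GetAttribute(TEXT("x")));
				const double Y = FCString::Atod(*Geometry->GetAttribute(TEXT("y")));
				// OpenDRIVE is in metres, right-handed; one common convention is
				// to scale to centimetres and flip Y for Unreal's coordinate frame.
				Centerline.Points.Add(FVector(X * 100.0, -Y * 100.0, 0.0));
			}
		}

		OutRoads.Add(MoveTemp(Centerline));
	}
	return true;
}
```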

Chaos Vehicles

UE5 has (finally) made the move to Chaos, Epic’s own physics system, replacing PhysX. In terms of physics, it shouldn’t have a large impact on driving simulation; however, the change broke all existing vehicles, which now need to be migrated to the new solver.

There’s a guide for that, and the operation isn’t too complex. But since we mostly rely on Marketplace vehicles for our platform, we now have to wait patiently for each creator to update their products to Chaos, if they choose to do so at all. And for the dozens of vehicles I made myself, I have no choice but to migrate each one individually, which is definitely not something I look forward to doing.
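
For reference, here is roughly what the Chaos side of a migrated vehicle looks like in C++, assuming the ChaosVehicles class names from 5.0; the wheel bone name, the values, and the class names below are hypothetical, and a real migration would also carry over the engine, transmission, and suspension settings.

```cpp
// Rough sketch of a vehicle pawn wired to the new Chaos solver (hypothetical
// names/values; assumes the "ChaosVehicles" plugin and module dependency).
#include "GameFramework/Pawn.h"
#include "Components/SkeletalMeshComponent.h"
#include "ChaosWheeledVehicleMovementComponent.h"
#include "ChaosVehicleWheel.h"
#include "MyChaosCar.generated.h"

UCLASS()
class UMyFrontWheel : public UChaosVehicleWheel
{
	GENERATED_BODY()
public:
	UMyFrontWheel()
	{
		WheelRadius = 35.f; // cm, illustrative values only
		WheelWidth = 20.f;
	}
};

UCLASS()
class AMyChaosCar : public APawn
{
	GENERATED_BODY()
public:
	AMyChaosCar()
	{
		Mesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("Mesh"));
		SetRootComponent(Mesh);
		Mesh->SetSimulatePhysics(true);

		// UChaosWheeledVehicleMovementComponent replaces the PhysX-era
		// UWheeledVehicleMovementComponent4W.
		VehicleMovement = CreateDefaultSubobject<UChaosWheeledVehicleMovementComponent>(TEXT("VehicleMovement"));
		VehicleMovement->UpdatedComponent = Mesh;

		VehicleMovement->WheelSetups.SetNum(4);
		VehicleMovement->WheelSetups[0].WheelClass = UMyFrontWheel::StaticClass();
		VehicleMovement->WheelSetups[0].BoneName = TEXT("Wheel_FL"); // hypothetical bone name
		// ...remaining wheels, plus engine/transmission/steering setups
	}

	UPROPERTY() USkeletalMeshComponent* Mesh;
	UPROPERTY() UChaosWheeledVehicleMovementComponent* VehicleMovement;
};
```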

It also doesn’t help that Sound Cues are broken at the moment, meaning you can’t have proper car engine sounds. A fix is scheduled for 5.0.2; but in the meantime, it’s pretty hard to do anything with cars in UE5.

Human Skeleton

UE5 has a lot of new features around animation, including new default Mannequins, which now use the MetaHuman skeleton hierarchy.

[Image: CompatibleSkeletons.gif]

This means that there are now two different humanoid skeletons in Unreal: the legacy Mannequin, used in UE4, and the MetaHuman one, used in UE5. They aren’t mutually exclusive, so you can have both at the same time and the world won’t collapse (unlike Chaos and PhysX). But it means that all the assets you have for the “old” skeleton won’t work out of the box with the new one, which can be expected to become the new norm for Marketplace characters and animations.

Among the myriad of new animation features, Skeleton Compatibility aims to solve this potential issue by allowing different skeletons to share their related data (e.g., animations). I haven’t tried it yet, but you can find some information on the subject on the forums.

Conclusion

So, to sum things up, Unreal Engine 5 brings a lot of amazing features that will allow us to create better, more immersive simulations at a lower cost. But it might still be a bit early for prime time in driving simulation, as some of the engine’s new features aren’t yet fully compatible with, or optimized for, our workflows.

On our side, we’re still on UE4 at the moment, as we started working on a new experiment earlier this year. We do have a working UE5 branch of our simulation platform, mostly for testing purposes and to start migrating content (Marketplace assets and our own vehicles). The plan right now is to wait for our main Marketplace content to be updated for UE5 (especially vehicles), run some tests on our nDisplay and VR setups, and then decide whether it’s worth migrating the project. But we expect most (if not all) future projects to start on UE5, so exciting times are coming.

Written on April 23, 2022