Ask Unreal Anything: Rendering | June 15, 2022 at 1PM EDT

Say a hearty hello to Brian, Daniel and Sébastien, our guests for the inaugural Ask Unreal Anything all about rendering in Unreal Engine.

This event is your opportunity to receive real-time answers to your questions on Nanite, Lumen, global illumination, and other features, as well as overarching development guidance, career advice, industry questions, and anything you’d like to know about our guests themselves.

The team will join us for two hours, starting at 2022-06-15T17:00:00Z.

MEET THE TEAM

Brian Karis is an Engineering Fellow - Graphics. Most recently, he’s led the development of Nanite for Unreal Engine 5. He is best known for his work on physically based shading and temporal anti-aliasing, although he has touched most areas of real-time computer graphics throughout his career. Prior to Epic, he worked at Human Head Studios.

Daniel Wright is an Engineering Fellow in graphics, and Technical Director of the ‘Lumen’ dynamic Global Illumination and Reflections system in UE5. Prior to that, he developed lighting and shadowing techniques for Unreal Engine 3 and 4, which shipped in Gears of War, Fortnite, and a multitude of games licensing UE technology. Daniel’s main passion is real-time Global Illumination.

Sébastien Hillaire is a Principal Rendering Engineer. He pushes for visual quality, performance, and innovation in areas such as physically based lighting and materials, volumetrics, and visual effects. He developed the water shading, atmosphere, and cloud systems supporting real-time sky environment lighting, and is now focused on updates to the material system. Before joining Epic Games, he was at Frostbite / Electronic Arts, where he worked on Battlefield, Star Wars Battlefront, FIFA, Need for Speed, and more.

In his free time, he likes to chill with his family and play board, card, or video games, especially rogue-likes. He enjoys playing guitar and composing metal music, is a fan of horror and sci-fi, and is amazed by space and nature.

GUIDELINES

  • Ask one question at a time by replying to this topic, using the button at the bottom. Please read through the existing questions before posting your own, to make sure it hasn’t already been asked.
  • Start with one question to give others a chance to participate.
  • Please do not reply to anyone else’s post—the purpose of replies in this topic is to receive answers from our guests. If you’d like to discuss a related topic in more detail, create a new topic.
  • Keep in mind, this is not a support session. Questions that are specific to your project or troubleshooting will be removed.
  • And don’t forget to have fun :slight_smile:

Posts not following these guidelines may be removed by moderators to keep the AMA flowing smoothly. Thank you!


I’m hoping you can provide some color around the future of Nanite and foliage support. I think initial assumptions were that Nanite would struggle with foliage, due to the way Nanite reduces triangles and the inherent size of leaf geometry. That being said, we’ve now seen WPO and Two-Sided material support added to Nanite on the UE5-main branch in GitHub, leading to speculation that Nanite is becoming more of a possibility for foliage. After testing the newest branch, Nanite has proven mindblowing for up-close foliage, but we still see the challenge of leaf geometry loss at longer distances.

I absolutely appreciate the wizardry that you’re pulling off with Nanite, but it would be nice to have a clearer sense of what you all feel is possible with Nanite and foliage in the near-ish term. As you can see from the popularity and engagement on Nanite/Foliage-related discussions and videos, this is a big deal, but at the end of the day we’re a bunch of devs excitedly speculating about a feature that we know requires a lot of very difficult work on your part.

If the loss of leaf geometry at distance is a challenge you feel is still a long way from being solved, effectively limiting Nanite foliage to scenes with higher screen-space values, can you provide a few details about what you could see as alternatives or hybrid approaches? Maybe giving devs the ability to add our own octahedral imposter at the end of the Nanite LOD train? Or an easier way to swap between Nanite and imposter meshes in a Blueprint based on screen size? (A rough sketch of the swap we mean follows.)
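
To be concrete about that last idea, here’s a minimal sketch of the kind of screen-size swap we’re imagining, assuming two pre-assigned components and a hand-tuned threshold (all the names here are ours, not engine API):

```cpp
#include "Camera/PlayerCameraManager.h"
#include "Components/StaticMeshComponent.h"
#include "Kismet/GameplayStatics.h"

// Hypothetical hybrid LOD: hide the Nanite mesh and show an octahedral
// imposter card once the actor's projected size gets small. Call this from
// an actor's Tick or a timer; ScreenSizeThreshold is hand-tuned.
static void UpdateFoliageImposterSwap(UStaticMeshComponent* NaniteMesh,
                                      UStaticMeshComponent* ImposterMesh,
                                      float ScreenSizeThreshold)
{
    if (!NaniteMesh || !ImposterMesh)
    {
        return;
    }

    APlayerCameraManager* Camera =
        UGameplayStatics::GetPlayerCameraManager(NaniteMesh, /*PlayerIndex=*/0);
    if (!Camera)
    {
        return;
    }

    // Crude screen-size proxy: bounding-sphere radius over camera distance.
    const FBoxSphereBounds& Bounds = NaniteMesh->Bounds;
    const float Distance = FVector::Dist(Camera->GetCameraLocation(), Bounds.Origin);
    const float ScreenSize = Bounds.SphereRadius / FMath::Max(Distance, 1.0f);

    const bool bUseImposter = ScreenSize < ScreenSizeThreshold;
    NaniteMesh->SetVisibility(!bUseImposter);
    ImposterMesh->SetVisibility(bUseImposter);
}
```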

Knowing what Nanite is capable of now, it’s brutal to think how much we’d be leaving on the table by skipping it for trees, so we’re desperate to find a solution that accommodates longer draw distances and smaller screen sizes.

  • Q: Nanite and Vertex Painting, if and when?

  • Q: Nanite and WPO, if and when? - Edit: I see this has been partially asked; what’s the timeline for it, then?

  • Q: Large-scale landscapes tend to exhaust video memory by loading all masks at full resolution, so if you’re working on a landscape of 16k or larger, everything dies as soon as you hit edit. This brute-force requirement that heightmap size equal landscape size feels unsustainable. Are there any plans for large-scale landscape improvements, and for more modular terrain detailing on Epic’s side?

  • Q: Updating Material Layer & Blend parameters within a DMI from Blueprints is a real pain at the moment: the construction script doesn’t like it when a DMI parameter’s association type is changed from global to layer/blend, and the same kind of DMI changes don’t seem to work with IsValid checks either. Do you have any recommendations for updating Layer & Blend DMI parameters from a Blueprint construction script? (A sketch of what we’re attempting follows this list.)
    Also curious whether this is shining a light on a larger issue with setting parameters on level actors from the construction script, or if it’s specific to this case.

  • Q: Will Lumen receive any performance improvements, and if so, when? Currently the only viable way to use it in a shipping game is to reduce every Lumen-related scalability setting to the lowest possible value, short of disabling features outright.
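
Regarding the Layer & Blend question above, here is the C++ equivalent of what we’re attempting from the construction script; a minimal sketch, with the parameter name and layer index as placeholders:

```cpp
#include "Materials/MaterialInstanceDynamic.h"

// Set a scalar parameter that lives inside a specific material layer of a
// dynamic material instance (rather than a global parameter).
static void SetLayerScalarOnDMI(UMaterialInstanceDynamic* DMI,
                                FName ParameterName, int32 LayerIndex,
                                float Value)
{
    if (!DMI)
    {
        return;
    }

    // LayerParameter targets a parameter in layer LayerIndex; use
    // BlendParameter instead for parameters inside a layer blend.
    const FMaterialParameterInfo Info(ParameterName, LayerParameter, LayerIndex);
    DMI->SetScalarParameterValueByInfo(Info, Value);
}
```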


Hello,

Q: Is there any way to activate/switch from Lit to Path Tracing at runtime (and vice versa), the same way we can display Wireframe (F1), Unlit (F2), and so on, or with a console command?

If not, does this feature have any chance of being added to the UE5 editor at some point? I would love to see that happen.

Example: we can launch Movie Render Queue at runtime and render in Lit mode or Path Tracing, but we cannot previsualize the Path Tracing result at runtime the way we can with Lit mode (F3 key).

We can press F8 during runtime (editor mode?) and change the view mode to Path Tracing there; after pressing Pause, Path Tracing will start “refining”. But most of the Blueprint functionality, the current camera position, etc. is gone while in “F8 mode”, so it doesn’t seem like a viable workaround.
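
For completeness, the closest thing we’ve found from code is executing the view-mode console command ourselves; a minimal sketch, with the big assumption that "viewmode pathtracing" is honored outside the editor viewport (in our experience it only behaves in editor contexts):

```cpp
#include "Engine/Engine.h"
#include "Engine/World.h"

// Experiment: request the Path Tracing (or Lit) view mode by executing the
// same console command the editor viewport uses. Whether this is honored in
// a packaged game is an open question; it is not a documented runtime path.
static void TrySetPathTracingViewMode(UWorld* World, bool bPathTracing)
{
    if (GEngine && World)
    {
        GEngine->Exec(World, bPathTracing ? TEXT("viewmode pathtracing")
                                          : TEXT("viewmode lit"));
    }
}
```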

As an aside, it would be great to have a proper runtime-rendering tutorial at some point, explaining how to set up most of the render settings with Blueprints or Python (or both), and diving into the possibilities for driving Movie Render Queue.

Thank you!

Two related questions about the awesome Path Tracer.

Q: When will it support (per-light) indirect lighting intensity?

Since GPU Lightmass uses Path Tracing code and CPU Lightmass supports indirect lighting intensity, I hope GPU Lightmass (and hence the Path Tracer) will support it in the future.

Q: Any chance there will be very preliminary AOV (Arbitrary Output Variables) support for MRQ?

It may not even be on the TODO list, but it would be extremely helpful for advanced compositing.

Since there are many kinds of AOVs (see other GPU renderers, e.g. Redshift), I don’t expect Unreal to ever have them all, but I hope for some minimal infrastructure to ease further custom development.

I hope it’s not too much work for Epic’s experts, because Albedo and Normal are already computed and fed into OIDN, but there is no (easy) way to get them out of MRQ.

We actually tried to implement AOV support on our own, but it feels easy for non-expert graphics programmers like us to destroy performance and/or exhaust CPU/GPU memory.

We’re planning to implement (besides the Albedo/Normal mentioned above): Direct Diffuse, Direct Specular, Reflections, Refractions, and GI. No arbitrary LPE (light path expression) support is needed.

BTW: some renderers (like Redshift and the open-source LuxCoreRender) support a Light Groups AOV, which allows changing the power/color of individual light groups without re-rendering. It would greatly improve the workflow for some people (like us), but I understand it’s harder to implement efficiently.

If you don’t think these features will be implemented in the foreseeable future, any suggestions for implementing them on our own would be greatly appreciated!

EDIT: Forgot a small question: when using the Path Tracer with MRQ, GPU utilization (from Task Manager) is usually under 10%, while other GPU renderers usually reach 70–80%. Is that normal? Any chance of improving GPU utilization and saving some render time?

We tried Nsight Graphics but failed even to capture a frame with the Frame Profiler or GPU Trace Profiler, so we have no idea what’s going on.

Maybe there are too many different materials (we’re using a lot of assets exported from 3ds Max). If that’s the main problem, will the “updates to the material system” (mentioned in the original post) help in this case?

Two questions about Lumen from a friend of mine who is developing an in-house app with lots of meshes procedurally generated at runtime.

  1. Will it be possible to build the data Lumen needs for procedurally generated meshes at runtime?

First of all, our generated meshes are guaranteed to be “modular” and can satisfy whatever future restrictions arise, because we have full control (the opposite of many teams, who have difficulties here), so they are likely very suitable for Lumen.

However, at runtime, when building a UStaticMesh from MeshDescriptions, bFastBuild must be true, so none of these meshes can bounce light, which makes Lumen almost completely useless for us. I understand that building the data Lumen needs (MeshCard-related data, SDFs, etc.) is time-consuming, but our procgen meshes are relatively simple, and it’s acceptable to stall our app while generating meshes (it’s not a game), so very rudimentary support would be sufficient for our case.

It would be perfect if you could add a flag to FBuildMeshDescriptionsParams that allows building Lumen data with UStaticMesh::BuildFromMeshDescriptions at runtime. It would also be good enough to require us to provide some convex shapes/AABBs approximating the mesh and to build the Lumen data from those approximations, or even to require us to compute the data ourselves and just provide an interface for importing it. (A sketch of the call site we mean follows.)
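
For context, here is (roughly) our runtime build path today and where we’d hope such a flag would go; a minimal sketch, with material setup omitted and the Lumen flag purely hypothetical:

```cpp
#include "Engine/StaticMesh.h"
#include "MeshDescription.h"

// Build a UStaticMesh from an already-filled FMeshDescription at runtime.
static UStaticMesh* BuildRuntimeStaticMesh(UObject* Outer,
                                           const FMeshDescription& MeshDescription)
{
    UStaticMesh* StaticMesh = NewObject<UStaticMesh>(Outer);
    StaticMesh->GetStaticMaterials().Add(FStaticMaterial()); // default slot

    UStaticMesh::FBuildMeshDescriptionsParams Params;
    Params.bFastBuild = true;            // required for runtime builds...
    Params.bBuildSimpleCollision = true;
    // ...but the fast path skips distance fields / mesh cards, so the result
    // is invisible to Lumen. Something like the next line is what we're
    // asking for (hypothetical; it does not exist today):
    // Params.bBuildLumenData = true;

    TArray<const FMeshDescription*> MeshDescriptions{ &MeshDescription };
    StaticMesh->BuildFromMeshDescriptions(MeshDescriptions, Params);
    return StaticMesh;
}
```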

With the increasingly powerful GeometryCore and GeometryFramework available at runtime, we really want to make good use of them.

  2. Any chance that Lumen will work on high-end iPads?

While I know Lumen will not support general mobile renderers in the near future, is it possible to use Lumen (software ray tracing) on high-end (A15/M1/M2) iPads?

Last time, we manually added “bSupportsLumenGI=true” to the “ShaderPlatform METAL_MRT” section of IOS/DataDrivenPlatformInfo.ini, but it crashed.

Since our app is an in-house app, we have full control of which iPads and OS we use. I wonder whether this is “theoretically possible but no one tried and will not be officially supported” or “there are known problems not easily resolvable”.

We don’t use Nanite (it doesn’t support runtime procgen meshes anyway) or VSM, and we can sacrifice some lighting features that require HWRT or other capabilities iOS doesn’t have, as long as the GI (roughly) matches DX12 for the same scene.

In general, we’d be happy to hear your thoughts on supporting more graphics features on high-end iPads. The game industry may not need this much, but for ArchViz and product design we use iPads whenever possible.

Thanks!

Thanks so much for taking the time to answer questions. UE5 is incredible; Lumen and Nanite are amazing, and I’ve been really enjoying using the engine.

One of the only major disappointments for me is the lack of support for tessellation (especially on landscapes, but also on other meshes, since it was more flexible and, unlike Nanite, doesn’t require building everything in external software). I know tessellation was not an ideal solution and came with a lot of its own problems, but it was flexible and solved a lot of displacement use cases in a non-destructive way. Virtual Heightfield Mesh (VHFM) as an alternative has too many issues of its own (more difficult to implement, RVT limitations, very buggy). I’ve heard rumors of Nanite supporting WPO in the future, but even that could still mean pre-tessellating models in an external program (yuck), and it might not apply to landscapes anyway.

At the moment Nanite is amazing, but aside from the foliage limitations described in the questions above and the limits on landscapes, it also, very importantly, isn’t non-destructive. If you need to change a mesh, say, move a texture that drives a displaced surface, you’re forced to send that model back to 3D modelling, since the texture’s displacement has to be baked in ahead of time. This is a huge workflow problem.

So my question is: with so many use cases that Nanite just can’t handle lost from the engine along with tessellation, will those gaps ever be filled (with Nanite or otherwise), and will displacement ever work non-destructively (i.e. texture-driven on meshes, without pre-tessellating or pre-displacing externally)?


Where are the lighting outputs (the results of each light) of DeferredLightPixelShaders.usf added together in the source code?

As with most systems in Unreal, we have modularity through, you guessed it, modules.
Would there be any possibility of a shader cross-compilation module that allows the use of Open Shading Language as a replacement for HLSL (and its underlying cross-compilation to GLSL), or would that break too many systems that rely on the existing shader models?

When will Unreal Engine implement ReSTIR GI? Is it different from Lumen?

Are there plans to improve mobile dynamic lighting? Why is there still a limit of four dynamic lights per scene now that we can use deferred rendering?

Software occlusion culling was essential for Meta Quest 2 developers (or any mobile VR developers, for that matter), and UE5 no longer appears to offer such a visibility-culling option.
In most cases Quest 2 games are GPU-bound, leaving almost no room for hardware occlusion culling.
Will you bring software occlusion culling back in UE5.1?


Any updates on when Nanite/Lumen will be usable in VR?

Has Epic ditched the “Vulkan Parallel Renderer”? It was archived years ago and never mentioned again.

Why isn’t Vulkan the default rendering API?

Will UE ever support a Forward+ rendering path (e.g. Doom Eternal)?


I have some questions about the DX12_2 feature set.

5.1 supports Mesh Shading for polygons bigger than a pixel. I wonder, could you use mesh shading for deformable meshes and foliage?
Are there plans to enhance or speed up Virtual Texturing with Sampler Feedback? Any plans to integrate DirectStorage directly into the engine? What about Variable Rate Shading? Obviously, every developer is free to integrate these features themselves since the UE5 source code is open, but I would like smaller developers without coding expertise to benefit from them too, so direct integration into the engine would be nice.

I’ve seen in a lot of places that the ability to add volumes to a Blueprint actor (in the construction script) is “in progress”. Just wondering whether that’s still true? It would be great to be able to add post-process volumes for procedurally generated areas, or physics volumes for our own custom water sources, as hand-placing volumes for those can be a real pain :sweat_smile: (As a stopgap we spawn volumes at runtime; see the sketch below.)
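
The workaround we use now is spawning an unbound post-process volume at runtime instead of adding one in the construction script; a rough sketch (the exposure override is just an example, and we haven’t verified that brush-bounded volumes can be created this way):

```cpp
#include "Engine/PostProcessVolume.h"
#include "Engine/World.h"

// Spawn a post-process volume at runtime for a generated area. A volume
// spawned this way has no brush geometry, so we make it unbound and toggle
// it per-region from game code instead of relying on its bounds.
static APostProcessVolume* SpawnAreaPostProcess(UWorld* World, const FVector& Location)
{
    if (!World)
    {
        return nullptr;
    }

    APostProcessVolume* Volume =
        World->SpawnActor<APostProcessVolume>(Location, FRotator::ZeroRotator);
    if (Volume)
    {
        Volume->bUnbound = true; // no brush to bound it at runtime
        Volume->BlendWeight = 1.0f;
        Volume->Settings.bOverride_AutoExposureBias = true; // example override
        Volume->Settings.AutoExposureBias = 1.0f;
    }
    return Volume;
}
```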

Also, this isn’t directly rendering-related, so feel free to ignore it, but could we potentially have a demo project that uses World Partition, runtime Data Layers, and the water tool? Learning how to use these things in combination is a bit tricky with the current documentation. I’m using a custom river system at the moment but would love to use the built-in one; it just gets very fiddly with World Partition and Data Layer switching at runtime. In particular, could a demo show an “On Data Layer Loaded” function? The way it’s done in VOTA (Valley of the Ancient) is a continuous timer that checks whether the layer is loaded. Is there a better solution than that?

Thanks! :slight_smile:

Any plans to add more options to “break” Lumen’s physicality? Things like more AO, or per-mesh/per-material diffuse or emissive boosts, like we had in previous systems such as the CPU baker?

This would really enable great ways to achieve specific moods and art styles/direction while still using something as great as Lumen.


Q: How is the new Landscape system coming along? Will the new system be connected to the Nanite system?


What, if anything, is Epic doing to improve the situation regarding Shader/PSO compilation hitching and stuttering in PC versions of games that utilize Unreal Engine?

This issue is becoming dire and prevalent in UE games on PC, and it seems to only be getting worse. People are starting to notice the trend.
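
For what it’s worth, the main mitigation we’re aware of today is shipping a bundled PSO cache and compiling it behind a loading screen; a minimal sketch, assuming the project has the shader pipeline cache enabled and a recorded cache bundled with the build:

```cpp
#include "ShaderPipelineCache.h"

// During a loading screen, push the bundled PSO cache into its fast batch
// mode so pipeline compilation happens behind the load rather than as
// hitches during gameplay; drop back to background mode afterwards.
static void SetAggressivePSOPrecompile(bool bLoading)
{
    FShaderPipelineCache::SetBatchMode(bLoading
        ? FShaderPipelineCache::BatchMode::Fast
        : FShaderPipelineCache::BatchMode::Background);
}
```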
