Are there any plans to support Lumen in a more static or precomputed lighting way?
It would be great if it could be a single solution for a wide scalability range and simply forgo some of the realtime features.
How can we prevent a mesh from drawing into the Lumen scene?
We have muzzle flashes that draw into the Lumen scene for one frame and then leave a screen-space GI afterimage that fades away.
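One possible workaround (an assumption on our part, not something confirmed in this thread) is to keep the flash mesh from contributing to dynamic indirect lighting at all. A minimal sketch, assuming the existing UPrimitiveComponent flags bAffectDynamicIndirectLighting / bAffectDistanceFieldLighting are what gate a primitive's contribution to the GI data; verify in-engine before relying on it:

```cpp
// Hedged sketch: keep a transient effect mesh (e.g. a muzzle flash) from
// feeding dynamic GI. These are existing UPrimitiveComponent flags; whether
// they fully exclude the mesh from the Lumen scene should be verified.
#include "Components/StaticMeshComponent.h"

void ConfigureMuzzleFlashMesh(UStaticMeshComponent* MuzzleFlashMesh)
{
    if (!MuzzleFlashMesh)
    {
        return;
    }

    // Stop the mesh from contributing to dynamic indirect lighting.
    MuzzleFlashMesh->bAffectDynamicIndirectLighting = false;

    // Keep it out of distance field lighting data as well.
    MuzzleFlashMesh->bAffectDistanceFieldLighting = false;

    // Apply the change if the component is already registered.
    MuzzleFlashMesh->MarkRenderStateDirty();
}
```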
Can we get more documentation on how to configure Lumen quality and performance?
Sadly, tessellation is no longer supported in UE5, but I suppose an alternative is coming in the future, given the broad range of use cases that pre-tessellated geometry can't cover.
What can we expect from such an alternative (if it comes)?
What approaches are being looked into (compute shaders, or still hardware tessellation shaders)?
Are there any plans to support Lumen and Nanite with instanced stereo for VR? During the NASA Mars collaboration there were some comments that Lumen and Nanite might work with instanced stereo in the near future.
Are water and translucent shaders actively being addressed so that they render correctly in instanced stereo? This seems to be a significant VR rendering challenge among all engines currently.
Question 1: Is Mass visualization (the instanced static mesh rendering) planned to have support for mobile/Vulkan/etc.?
I have Mass running on a Quest 2 using Vulkan, but for some odd reason all of the Mass visualization is squashed on one axis. As in, it looks like one axis was scaled down to zero, making the characters totally flat.
Also, trying to run that Mass visualization in the Vulkan Android preview in the editor crashes the editor.
Question 2: What is the state of the mobile deferred renderer?
I can successfully build an .apk and launch on a Quest 2 with mobile clustered deferred rendering enabled, but the scene is entirely black. I can hear my game audio in the background and get haptic feedback, but I essentially get no image on the screen. Where are we on the timeline to get deferred rendering for devices like the Quest 2 / Vulkan-compatible Android?
Question 3: When will mesh distance fields become available in the mobile pipeline outside of Mobile HDR?
I could create a plethora of effects without needing post-processing if I had access to mesh distance fields. This is ideal for the mobile VR development pipeline.
Question 4: Can we get support for multiple vertex color channels please?
Q1: What is currently on the roadmap in terms of refractive materials? Will real-time caustics be a thing any time soon? Are there any other new material features we can expect?
Q2: Coming from an offline rendering background and specializing in product viz / configurators, the ability to control objects’ “primary visibility” is essential for rendering pipelines at scale in that field. Is this feature anywhere on the roadmap?
How would I go about setting up lighting to work in a retail type environment?
Lots of products, lots of lights, and in a closed space.
In the past, we tried working in UE4, but having so many spotlights for focus points caused the framerate to drop pretty heavily. Does Lumen fix that? Are there hard limits on the number of lights in a scene with Lumen?
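For reference, and independent of whatever Lumen itself changes here, the usual levers for many-light retail scenes are keeping attenuation radii tight (so few lights overlap any one pixel) and turning off shadow casting on small accent lights. A minimal sketch of such a product spotlight, using a hypothetical AProductSpotlight actor; the radius and cone values are placeholders, not recommendations:

```cpp
// ProductSpotlight.h -- hedged sketch of a cheap accent light for dense
// retail-style lighting. USpotLightComponent and its setters are standard
// UE API; the specific values are illustrative only.
#include "GameFramework/Actor.h"
#include "Components/SpotLightComponent.h"
#include "ProductSpotlight.generated.h"

UCLASS()
class AProductSpotlight : public AActor
{
    GENERATED_BODY()

public:
    AProductSpotlight()
    {
        SpotLight = CreateDefaultSubobject<USpotLightComponent>(TEXT("ProductSpot"));
        RootComponent = SpotLight;

        // Keep the influence volume small so few lights overlap any pixel;
        // overlap is what drives per-pixel lighting cost up.
        SpotLight->SetAttenuationRadius(300.f); // placeholder, in cm

        // A tight cone aimed at a single product focus point.
        SpotLight->SetInnerConeAngle(15.f);
        SpotLight->SetOuterConeAngle(30.f);

        // Accent lights usually don't need dynamic shadows; shadow-casting
        // local lights are typically the biggest per-light expense.
        SpotLight->SetCastShadows(false);
    }

private:
    UPROPERTY(VisibleAnywhere)
    TObjectPtr<USpotLightComponent> SpotLight;
};
```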
Hi there team
GPU rendering performance at the moment seems very efficient and fast given the complexity we are seeing with Lumen and Nanite, but CPU performance leaves quite a bit to be desired, specifically with regard to how limited performance becomes in more complex scenes due to single-thread speed and to real-time PSO compilation causing rather large CPU-related stutters.
Is the team looking at improving the situation on both of these fronts? PSO compilation being a bit more transparent, automatic, and consistent (hitting all particle types, for example), and not happening just-in-time, could be a great benefit!
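In the meantime, one mitigation available today (an assumption on our part, not an official recommendation) is leaning on the existing shader pipeline cache and letting it precompile cached PSOs aggressively while a loading screen hides the cost. A minimal sketch using the r.ShaderPipelineCache.BatchSize console variable; the variable exists in UE, but the values and the loading-screen hookup are illustrative only:

```cpp
// Hedged sketch: precompile cached PSOs harder while a loading screen is up,
// then throttle back for gameplay. Values are illustrative.
#include "HAL/IConsoleManager.h"

static void SetPSOPrecompileBudget(bool bLoadingScreenActive)
{
    if (IConsoleVariable* BatchSize =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.ShaderPipelineCache.BatchSize")))
    {
        // Large batches during loading, tiny batches during gameplay.
        BatchSize->Set(bLoadingScreenActive ? TEXT("50") : TEXT("1"));
    }
}
```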
Thank you for UE 5.0, and when I saw UE 5.1 I realized that you continue to surprise us with more and more features!
I have seen Nanite Landscape, Nanite displaced meshes (I assume Nanite Landscape will support displacement?), and Nanite material support for masking, WPO, and PDO.
But my main question is about Strata materials. I played with them a bit, and there are a lot of new features. I see there is a conversion from Strata to legacy materials; would you please explain a bit about your future directions with it? It feels more like a new programmable rendering pipeline to me.
The Light Complexity view mode shows reduced overlap on lights configured with IES profiles; is this accurate?
Distant shadows: what do you recommend for long-range open-world shadows? Are distance field shadows expected to be the most performant? Are far shadow cascade(s) also needed?
Light channels with/without Lumen: Is light channel support planned for Lumen? Currently, disabling light channels on an atmospheric directional light preserves the atmospheric influence, which can be used with Lumen to add ambient light. Is this technique viable, or does it break anything or cost a lot?
We know Lumen doesn’t support “translucent” materials, but is this specifically the Translucent blend mode? What about Additive and the others? If we wanted to explore using additive shaders for fake distant light effects or something, would this conflict with Lumen?
It’s a simple career-guidance question, but I would like to hear from you industry professionals too. What projects/readings would you recommend to a junior university student? I’ve done a ray tracing project via Peter Shirley’s books and now I’m working on my OpenGL renderer.
Lumen is great. However, in interiors, when there isn’t a lot of directional light to bounce around, a lot of areas get really dark. Using point lights and other lights is a solution but it results in spotty lighting, which is not ideal for artistic reasons.
Unlike Lightmass, there isn’t an ambient light boost setting for Lumen (attached image). A setting like this would greatly help so that dark places would still have controllable ambient light, and it would greatly benefit games made with Lumen. It would give us another parameter to control how dark an area gets, apart from using local exposure as a solution. The issue with local exposure is that it is too global, as it affects both dark and bright areas together.
In regard to Nanite meshes and overall game performance, is there a scenario where using lower-polycount Nanite meshes is necessary? Specifically, if there are a lot of characters in the scene mixed with many high-poly Nanite meshes of 1 million polygons or more. I understand that Nanite rendering cost is fairly stable regardless of how many polygons are in the assets, but does having million-polygon assets impact other parts of the rendering pipeline, like Lumen or the rendering of non-Nanite geometry? In other words: is it okay to use million-polygon Nanite meshes for the environment even when we have a crowd of 50 characters next to them in the scene? Thank you.
We’re constantly working on improving Lumen performance and quality; however, our target budgets for Lumen are probably not going to change: the Epic GI setting targets 30 fps on next-gen consoles, while the High GI setting targets 60 fps.
Dynamic Global Illumination is always going to cost something as it’s very difficult to solve in realtime.
We would like to add support for Lumen’s import-time generated data (card placement, mesh SDF generation) for procedurally generated meshes. As you note, the two main problems would be around the modularity of the meshes, for Lumen’s surface cache, and the polycount of the meshes. I’m not sure when we’ll get to it, though.
Foliage is not expected to work perfectly in Nanite. Foliage is an aggregate geometry case, and aggregate geometry is known to disrupt various parts of how Nanite achieves what it does, in particular mesh simplification and occlusion culling. Solving that will require a fairly different technical solution from what Nanite currently does, and a large research project to explore it.
“…but we still see the challenge of leaf geometry loss at longer distances.”
We have seen this too and it is on the list to address in the near term. I have ideas that should help but the issue with fixing it may be that Nanite foliage will end up even more expensive than it is now. Hard to say how it will go.
Yes, we’d like to support per-instance mesh painting on Nanite in the future. It won’t ever be in the form of vertex colors, as with high-poly meshes that becomes a ton of data. Instead, it will probably be some form of UV-less texturing, perhaps volumetric. That way the resolution of the painting isn’t tied to the resolution of the mesh.