Mesh shaders are already used by Nanite for larger triangles on hardware that supports them. WPO support is coming to Nanite, which doesn't conflict with the mesh shader support we already have.
If you are asking whether we plan to exploit mesh shaders outside of Nanite, I believe the answer currently is no. Instead we are going the opposite direction and trying to support everything in Nanite, bit by bit. That is a very long road, so please don't read that as "everything can be Nanite soon", but that is the long-term vision and where we are investing our resources.
Thanks so much for taking the time to provide this thoughtful response; I’m glad to hear at least that this is not an afterthought and that some work is going into bringing lost use-case functionality back to the engine <3
To better support that path, we are currently working on ways to pre-tessellate and displace static meshes more conveniently and with higher quality.
Research on that is in progress now, i.e., proprietary Nanite tessellation and displacement mapping.
I am particularly excited about both of these, and especially #2, since this would be a huge leap forward for us 3D artists in terms of easing the workflow between external software and UE. Have a great afternoon
Are there any plans to integrate color and luminance histograms, similar to DaVinci Resolve's, inside of Unreal? The current workflow involves taking screenshots in Unreal and then bringing those into Resolve or Photoshop to check the values using their histograms.
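Until something like that ships in-engine, the offline half of that workflow can also be scripted instead of going through Resolve or Photoshop. A minimal sketch, assuming numpy and the standard Rec.709 luma weights (the image below is synthetic, standing in for an actual screenshot):

```python
import numpy as np

# Hypothetical stand-in for a screenshot: a 4x4 8-bit RGB image, pure red.
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[..., 0] = 255

# Rec.709 luma weights, the usual choice for a luminance scope.
luma = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]

# 256-bin luminance histogram over the 8-bit range.
hist, _ = np.histogram(luma, bins=256, range=(0, 256))
# All 16 pixels share one luma value (0.2126 * 255, about 54.2),
# so a single bin holds them all.
```

The same per-channel trick (`np.histogram` over `img[..., c]`) gives the individual color histograms.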
Read the Physically Based Rendering book for an understanding of the fundamentals.
Then write your own renderer, implementing basic versions of each effect, on a modern graphics API. Once you get the basics working (lighting, shadows, AO, bloom, deferred shading, PBR, foliage, terrain, etc.), then build an understanding of where the GPU cost is going and try to optimize. You'll learn a lot along the way.
Regarding mobile rendering, specifically for VR with Vulkan:
Are there any plans to bring mesh distance fields to mobile? Using them for soft distance-field shadows on static lighting is really nice in SM5, and it would bring a lot to mobile VR Vulkan projects.
So far we've been focusing on fixing the issues that make Lumen GI too dark indoors compared to the path tracer. We still have a lot of over-occlusion with small windows when using Software Ray Tracing, and with foliage. The Local Exposure feature can help here too.
I do think we'll be adding some ambient lighting controls for the problem you are describing, like allowing the skylight to leak in to a controllable degree (similar to how skylight + AO works). Thanks for your feedback.
Just in case you missed it, there are some tips in Volumetric Clouds | Unreal Engine Documentation .
Cost will be very high if you execute too many texture fetches, so try to reduce those as much as possible, especially on low-end platforms, and use compressed texture formats. You could also try using the cloud shadow map instead of secondary tracing (secondary tracing is a lot more expensive), if that works for your use case. Try to use the conservative input (which must be cheap to evaluate to be worthwhile) in order to avoid more expensive lighting evaluation down the line, and reuse that input when evaluating your material.
We do render the clouds and the storm using volumetric clouds in Fortnite at 120 Hz full screen (with some simplifications), so it is doable.
We have other optimization ideas in the pipeline, but no ETA as of today.
For volumetric fog: this is supported. We bake light functions as 2D tiles into a simple atlas. It can be enabled with r.VolumetricFog.LightFunction 1 in UE5, at the cost of some light-function baking overhead.
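For reference, that cvar can be set per project as well as typed into the console. A minimal sketch, assuming the usual [SystemSettings] section of DefaultEngine.ini (verify against your project's config layout):

```ini
; DefaultEngine.ini -- enable light function support in volumetric fog (UE5)
[SystemSettings]
r.VolumetricFog.LightFunction=1
```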
For volumetric clouds: this could be achieved by following the same process (sharing the atlas), but it has not been done so far. Also, the light function for the atmospheric light will likely need to encompass a large area, I imagine.
What can we expect from such an alternative (if it comes)?
What are the approaches that are being looked into? (compute shader or still tessellation shader)
Wish I could say more, but it's too early. All I can say is that the research is in the realm of Nanite, would require Nanite to be supported by the platform, would be a novel and proprietary approach, and is focused on displacement mapping as the top priority, with other use cases perhaps also fitting. Sorry for being a tease, but the potential solutions I've explored have run the gamut, and I couldn't possibly talk about details of a specific approach when I'm not 100% sure that's where we'll land. I'm definitely narrowing in at this point, though. I've probably said too much already
Can't say much on a new Landscape, but I can say we've done some work in the last couple of weeks on converting Landscape heightfields into Nanite meshes and rendering that Nanite mesh instead when Nanite is supported. Whether that is a good idea is still TBD.
Q: Will you add more features to the Path Tracer, like real path-traced subsurface scattering, and maybe support for all the nice volumetric stuff? And related to volumetrics, would it be possible to get a higher resolution for noise, to e.g. realize much more detailed clouds and (space) nebulae like this one here?
Some of these requests are actually not possible. The DBuffer is basically a premultiplied-alpha texture which is then composited with premultiplied-alpha blending. That means the final blend equation is SceneColor.rgb * DBuffer.a + DBuffer.rgb. Add can work with DBuffer.rgb += Color. Grayscale multiply can work with DBuffer.a *= Color.r. Colored multiply can't. Subtract would only work if the DBuffer pixel format were signed, which it isn't, and that would double its size.
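The blend equation above can be sketched in a few lines to see why add and grayscale multiply compose cleanly while colored multiply doesn't; scalars stand in for per-pixel channels, and none of this is actual engine code:

```python
# Final DBuffer composite: SceneColor.rgb * DBuffer.a + DBuffer.rgb
def composite(scene, dbuffer_rgb, dbuffer_a):
    return scene * dbuffer_a + dbuffer_rgb

scene = 0.5

# "Add" decal: accumulate into DBuffer.rgb, leave alpha at 1.
assert composite(scene, 0.2, 1.0) == scene + 0.2

# Grayscale multiply: scale DBuffer.a, leave rgb at 0.
assert composite(scene, 0.0, 0.4) == scene * 0.4

# A colored (per-channel) multiply would need a separate alpha per channel,
# which the single DBuffer.a cannot express.
```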
Picking individual decals to apply to individual meshes could maybe be done, but that data would need to be known before the base pass, since once the DBuffer has accumulated there is no way to separate out individual decal contributions.
I don’t think we currently have any specific plans for expanding DBuffer decal functionality.
Thank you for the quick response! I am really happy to hear that you guys are working on adding some ambient lighting controls. It will be great to get some color control over the skylight feature that you described, to help light up interiors. Once again, thank you so much for all the hard work you guys are putting in to make Lumen even better and for taking suggestions from users of your engine!
Lumen does support GTX 1080 through its Software Ray Tracing (which is the default). Software Ray Tracing through Mesh Distance Fields just requires a Shader Model 5 GPU and enough performance to afford Lumen’s passes.