FYI, the water plugin (Single Layer Water) is fully supported in NvRTX Caustics 5.2.1. I’m gonna give a live stream session to talk about what’s inside the NvRTX Caustics branch: Level Up with NVIDIA
Busy week. Came across this site and found a “Fast Path tracer”
Then a “fast GI” solution from the same guy.
Pretty darn good looking if you ask me, and the performance is actually crazy good (more on that later).
That’s interesting. Super fast path tracer on my old computer - and in a web browser.
I also did a fair bit of research into real-time PT a while ago (before NVIDIA released their RTX remix toolkit and similar). I have good news, and I have bad news.
Good news: real-time path tracing is viable, and NVIDIA has made a branch of UE that supports it. Depending on hardware and scene configuration, it can actually be faster than Lumen, but that really depends on the scene and what shaders are being called.
Bad news: path-tracing, by and large, won’t be faster than rasterization (saying this with many many caveats), for a few different reasons, but one of them is occupancy.
(Super technical here, also very simplified):
When rendering a scene (and if I misunderstand a computer graphics concept, someone please call me out), the pixel shaders in your graphics card are provided information about the scene from buffers (normal, albedo, etc.), and perform math to calculate a lighting value. The color of the metal, its specularity, and its normal are computed with a lighting vector to generate a specular highlight, just as an example. Every pixel shader is working simultaneously in some functional block, and every pixel shader would have to finish its task before that functional block could receive new instructions (again, a massive simplification by someone who isn’t a graphics programmer). When the pixel shaders are performing simple lighting math, like a specular highlight (N·L), they all generally finish up at roughly the same time, which means the entire functional block isn’t stalled out waiting for just a few pixel shaders to finish up.
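To make that concrete, here’s a minimal C++ sketch of the kind of fixed-cost per-pixel lighting math I mean (my own illustration, not actual engine or shader code):

```cpp
// A minimal sketch (not engine shader code) of fixed-cost per-pixel lighting:
// every pixel runs the same short, branchless sequence of operations, so the
// threads in a wave all finish at about the same time.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One pixel's worth of work: Lambert diffuse (N·L) plus a Blinn-Phong-style
// specular highlight (N·H)^power. Constant instruction count, no loops,
// no data-dependent iteration.
float ShadePixel(const Vec3& N, const Vec3& L, const Vec3& H,
                 float Albedo, float SpecularPower) {
    const float NdotL = std::max(Dot(N, L), 0.0f);
    const float NdotH = std::max(Dot(N, H), 0.0f);
    return Albedo * NdotL + std::pow(NdotH, SpecularPower);
}

int main() {
    const Vec3 N{0, 1, 0}, L{0, 1, 0}, H{0, 1, 0};
    std::printf("lit value: %.3f\n", ShadePixel(N, L, H, 0.8f, 32.0f));
}
```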
Path-tracing, however, isn’t like that.
If you were to path-trace a scene, the visibility and direct lighting portion of the rendering (direct light and shadow) wouldn’t be too different from rasterization in either the look or the performance. But once you start tracing rays for indirect lighting, things get very messy. Maybe a ray will bounce into a light source and resolve with very little computational time. Maybe a ray will shoot off into a distant portion of the scene and take forever to finish. Since all the rays are traversing the scene in very different ways, suddenly your occupancy gets really wonky, and most of that functional block could be stalled out waiting for those long rays to resolve themselves.
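Here’s a toy C++ illustration of that occupancy problem (again my own sketch, nothing engine-specific): a group of 32 “pixels” traced together waits on whichever ray takes the most bounces, not the average one.

```cpp
// A toy sketch (not engine code) of the occupancy problem: each indirect ray
// runs a data-dependent number of bounces, and a group of threads executing
// together is effectively held up by its slowest ray.
#include <algorithm>
#include <cstdio>
#include <random>

// Stand-in for "trace one bounce": returns true if this bounce resolved the
// path (hit a light / the sky), false if the ray has to keep going.
bool BounceResolves(std::mt19937& Rng) {
    std::uniform_real_distribution<float> Dist(0.0f, 1.0f);
    return Dist(Rng) < 0.3f; // arbitrary 30% chance per bounce
}

int PathLength(std::mt19937& Rng, int MaxBounces) {
    int Bounces = 1;
    while (Bounces < MaxBounces && !BounceResolves(Rng)) {
        ++Bounces; // some paths exit immediately, others grind on to the cap
    }
    return Bounces;
}

int main() {
    std::mt19937 Rng(42);
    // Treat 32 "pixels" as one functional block: its latency is set by the
    // longest path in the group, not the average one.
    int Longest = 0, Total = 0;
    for (int Pixel = 0; Pixel < 32; ++Pixel) {
        const int Len = PathLength(Rng, 16);
        Longest = std::max(Longest, Len);
        Total += Len;
    }
    std::printf("average path: %.1f bounces, block waits for: %d bounces\n",
                Total / 32.0f, Longest);
}
```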
And there’s also the issue of coherence: ray tracing a scene requires a lot of data, more than can fit into your GPU’s cache at any one time. If a ray is traversing the scene and suddenly realizes it doesn’t have the data to figure out what it actually hit, then it has to pull that data from the (much slower) GPU memory, which again stalls out the GPU and creates what’s known as ‘cache thrashing’ as memory is brought in and out of GPU cache to render a scene.
And then there’s still the cost and issues with shader coherence, which I won’t get into, but it’s another can of worms.
That demo running on Shadertoy basically runs into none of these issues because it’s small. Don’t get me wrong, it’s definitely using some inventive and intelligent sampling algorithms, but if all you have in your scene is a light, three spheres, and a cube, you can easily fit that into cache, and it won’t cost much of anything to traverse. But when you have massive scenes with billions of triangles, that cost explodes. You suddenly need really powerful algorithms to cut those costs down, and even they can only do so much. Then there’s a conversation about culling and BVH construction methods and skinned geometry support and a whole lot else.
Lumen does everything it can to avoid the problems real-time PT suffers from. Tracing rays from probes keeps traversal cost predictable and low, caching GI to textures lets it reuse data and avoid crazy occupancy issues, and denoising means it can do a lot with very little. Even NVIDIA’s real-time path tracer still uses plenty of caches and denoising trickery.
You could probably make a 60 FPS path-traced game on console pretty easily, if the scene were trivial. Game scenes are complex though, which generally puts them above what the current technology can handle. With more progress in denoising, intelligent sampling, and memory management, who knows, but that’s the current state of the art.
None of this is meant as a criticism of your desire; I still really want real-time PT and I can’t wait until everyone has it. These are just the current roadblocks that keep us from being there.
I’ll keep things tidy
Could you help me understand how voxels would be able to solve the cost that comes from traversing larger data sets?
Conveniently, the fine people at NVIDIA (among many, many others who have worked on the problem) had the same thought. If a ray exceeds a certain traversal distance, it has a few different options (these are the general approaches, at least):

- A. Call a hit and return no lighting (resulting in over-occlusion under some circumstances)
- B. Call a hit and return a skybox color (resulting in light leaking under some circumstances)
- C. Sample a nearby probe or coarser data structure

Remedy’s Control has its RTGI sample the baked lighting if a ray travels too far, and NVIDIA has a DDGI-esque solution in Cyberpunk 2077. Lumen avoids long diffuse rays by basically not tracing them, and having its distant probes trace lighting in their own neighborhood and interpolate (roughly speaking). Approximations abound.
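For illustration, here’s a rough C++ sketch of that distance clamp and fallback (my own code, not Lumen’s or RTXGI’s; TraceScene, SampleProbeGI, and SampleSkybox are hypothetical placeholders):

```cpp
// A rough sketch of the "clamp long rays and fall back" idea. This is my own
// illustration, not any engine's actual code; the helpers below are
// hypothetical placeholders for real BVH traversal and probe/sky lookups.
#include <cstdio>
#include <optional>

struct Vec3 { float x, y, z; };
struct Hit  { Vec3 Radiance; float Distance; };

// Placeholder stubs so the sketch compiles.
std::optional<Hit> TraceScene(const Vec3&, const Vec3&, float /*MaxDistance*/) {
    return std::nullopt; // pretend the ray blew past its distance budget
}
Vec3 SampleProbeGI(const Vec3&, const Vec3&) { return {0.2f, 0.2f, 0.25f}; }
Vec3 SampleSkybox(const Vec3&)               { return {0.5f, 0.7f, 1.0f}; }

Vec3 ShadeIndirectRay(const Vec3& Origin, const Vec3& Dir, float MaxTraceDistance) {
    if (std::optional<Hit> H = TraceScene(Origin, Dir, MaxTraceDistance)) {
        return H->Radiance;            // resolved within the distance budget
    }
    // Ray exceeded the budget; pick one of the cheap fallbacks from the post:
    //   A) return black     -> risks over-occlusion
    //   B) return the sky   -> risks light leaking indoors
    //   C) sample a coarser cached structure (probes, baked lighting, etc.)
    return SampleProbeGI(Origin, Dir); // option C
}

int main() {
    const Vec3 Color = ShadeIndirectRay({0, 0, 0}, {0, 0, 1}, /*MaxTraceDistance=*/200.0f);
    std::printf("fallback radiance: %.2f %.2f %.2f\n", Color.x, Color.y, Color.z);
}
```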
What do you define as this? Seeking to understand.
There’s also the issue of the overall size of the workload. A single shader with a simple scene setup can run quite fast, but once you layer in gameplay logic, AI, perhaps procedural FX, particles, and any number of other things, you suddenly have much more overall stress on the hardware.
Hi! So this is my current noise level in UE5.3. This is happening on all materials with mixed roughness or a very low roughness value. I tried playing around with the BilateralFilter settings but it didn’t help. The only thing that fixes it is basically not using any metallic/low-roughness shaders at all.
Also, translucent objects are casting a shadow in the Lumen scene but are not considered by screen traces at all?
Unfortunately, your problems are pretty much engine limitations at the moment; Lumen essentially can’t render any surface with a roughness of roughly 0.15–0.39 at a usable quality. This has been an outstanding problem since 5.0, and the devs haven’t been able to offer up a solution.
In terms of hacks, you have a few options, but none of them are great. You can bias Lumen to treat the surface as more or less rough, which will obviously mess up your art direction but will remove the noise. You can also use the skylight leaking toggle in the PPV, which basically hides noise by making the skylight influence the scene more heavily. There’s also a command that makes the radiance cache accelerate reflection traces, which can save you time and quality. Finally, you can adjust how strongly Lumen reconstructs hits in screen space, which can win back detail in reflections and marginally reduce noise.
TL;DR: rough reflections are basically unusable, and there’s no clear solution in sight. I can’t tell if the problem is fundamentally intractable (without AI) or if the Lumen team just has much higher priorities, but it severely limits material choices and breaks legacy content.
As for the translucent objects and the screen traces, that’s also pretty normal. Screen traces work by tracing rays against the opaque depth buffer of the scene. Translucent objects are drawn on top of that depth buffer, meaning screen traces can’t interact with them. Lumen can still be aware of them however, so they can both receive reflections and cast shadows.
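If it helps, here’s a simplified, generic screen-space march in C++ that shows why that is (not Lumen’s actual implementation): the trace only ever tests against the opaque depth buffer, so anything that didn’t write depth simply can’t be hit.

```cpp
// A simplified sketch of why screen traces can't see translucency: the trace
// marches against the *opaque* depth buffer, and translucent surfaces never
// write into it. Generic screen-space march, not Lumen's actual code.
#include <cstdio>
#include <vector>

struct ScreenRay { float X, Y, Depth, StepX, StepY, StepDepth; };

// OpaqueDepth[y * Width + x] holds depth written by opaque geometry only;
// anything translucent was composited later and left no trace here.
bool MarchScreenTrace(const std::vector<float>& OpaqueDepth, int Width, int Height,
                      ScreenRay Ray, int MaxSteps) {
    for (int Step = 0; Step < MaxSteps; ++Step) {
        Ray.X += Ray.StepX;
        Ray.Y += Ray.StepY;
        Ray.Depth += Ray.StepDepth;
        const int Px = static_cast<int>(Ray.X);
        const int Py = static_cast<int>(Ray.Y);
        if (Px < 0 || Px >= Width || Py < 0 || Py >= Height) {
            return false; // left the screen without hitting anything
        }
        // A "hit" means the ray went behind something in the opaque depth buffer.
        if (Ray.Depth > OpaqueDepth[Py * Width + Px]) {
            return true;
        }
    }
    return false;
}

int main() {
    const int Width = 4, Height = 1;
    // Opaque wall at depth 10; an imaginary glass pane in front of it wrote nothing.
    std::vector<float> OpaqueDepth = {10.0f, 10.0f, 10.0f, 10.0f};
    ScreenRay Ray{0.0f, 0.0f, 1.0f, 1.0f, 0.0f, 4.0f};
    std::printf("hit opaque geometry: %s\n",
                MarchScreenTrace(OpaqueDepth, Width, Height, Ray, 8) ? "yes" : "no");
}
```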
Hi guys,
I don’t see the r.Lumen.HardwareRayTracing.MaxTranslucentSkipCount console command anymore in 5.3. It was a really useful one for us, since we had windows with more than two translucent faces. Setting it to 4 solved our issues with sunlight not coming through the window glass. Why was it removed, and is there any way to edit it, an equivalent command, or another solution?
Hey all, sorry for disappearing for so long. It has been intimidating to figure out how to dive back into this thread while balancing all of my other work. I’ll make an attempt to reply to some of the ones I know the answer to. I probably won’t be able to get into anything that requires me to debug or investigate further, unfortunately.
Skipping translucent materials is supposed to be handled automatically for you now via Lumen’s AnyHitShader, rather than requiring us to restart the ray a fixed number of times (r.Lumen.HardwareRayTracing.MaxTranslucentSkipCount), which was a slower approach. Is it not working in your scene? I just did a test and it does seem to be working correctly - meshes with a translucent section have that section skipped in Lumen Scene with HWRT, and it doesn’t block light sources.
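For anyone curious about the difference, here’s a conceptual C++ sketch (my own illustration, not the actual Lumen/DXR code; the Surface type and TraceClosestHit are hypothetical placeholders) contrasting the old restart-based skipping with filtering translucent hits during traversal, the way an any-hit shader can:

```cpp
// Conceptual sketch only, not Lumen's implementation. Old approach: restart
// the ray a fixed number of times when it lands on a translucent surface.
// New approach: ignore translucent hits during traversal itself, so no
// restarts are needed.
#include <cstdio>
#include <optional>
#include <vector>

struct Surface { float Distance; bool bTranslucent; };

// Placeholder "scene": surfaces along a ray, sorted by distance.
std::optional<Surface> TraceClosestHit(const std::vector<Surface>& Scene, float MinDistance) {
    for (const Surface& S : Scene) {
        if (S.Distance > MinDistance) return S;
    }
    return std::nullopt;
}

// Old style: restart after each translucent hit, bounded by a skip count
// (what r.Lumen.HardwareRayTracing.MaxTranslucentSkipCount used to control).
std::optional<Surface> TraceWithRestarts(const std::vector<Surface>& Scene, int MaxTranslucentSkipCount) {
    float MinDistance = 0.0f;
    for (int Skip = 0; Skip <= MaxTranslucentSkipCount; ++Skip) {
        std::optional<Surface> Hit = TraceClosestHit(Scene, MinDistance);
        if (!Hit || !Hit->bTranslucent) return Hit;
        MinDistance = Hit->Distance; // restart just past the translucent face
    }
    return std::nullopt; // ran out of skips; translucent faces wrongly block the ray
}

// New style: filter translucent hits inside the traversal, so any number of
// translucent faces can be skipped in a single trace.
std::optional<Surface> TraceSkippingTranslucent(const std::vector<Surface>& Scene) {
    for (const Surface& S : Scene) {
        if (!S.bTranslucent) return S; // conceptually what the any-hit shader does
    }
    return std::nullopt;
}

int main() {
    // Four translucent window faces in front of an opaque wall.
    std::vector<Surface> Scene = {{1, true}, {2, true}, {3, true}, {4, true}, {10, false}};
    std::printf("restarts (skip=2) found opaque: %s\n",
                TraceWithRestarts(Scene, 2) ? "yes" : "no");
    std::printf("any-hit skip found opaque:      %s\n",
                TraceSkippingTranslucent(Scene) ? "yes" : "no");
}
```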
We’re not happy with it either. Honestly it’s just a matter of priority; there are too many things going on internally requiring our attention. We had the wet roads and emissive signs in Fortnite CH4 Season 2, and a lot of noise in Lumen Reflections there. In the end we had to use tonemapped weighting to work around the noise (r.Lumen.Reflections.ScreenSpaceReconstruction.TonemapStrength 1). That’s not a solution really; it effectively removes bright spots in reflections from rough surfaces, but at least it works around the noise. It’s currently only usable as on/off, but after CL 28056606 (future 5.4) you can set it to fractional values as well.
TLDR: if you are struggling with Lumen Reflection noise on rough materials, try out ‘r.Lumen.Reflections.ScreenSpaceReconstruction.TonemapStrength 1’
We did change Lumen screen traces to no longer pick up translucency in 5.3, and unfortunately this didn’t make it into the Release Notes. Translucency in screen traces is always going to be broken in one way or another, since screen traces are intersecting the depth buffer, which only opaque materials write to. There were some catastrophic cases where foreground transparency would light up the background. Now translucency doesn’t show up in reflections, instead of showing up in the wrong spot.
To confirm, does this mean there is a technical solution, at all, or is this impossible to solve?
‘hardware ray tracing’ is not deprecated in UE5, but rather the Standalone Ray Traced Reflections method. Open your Project Settings, search ‘Reflection Method’ and read the descriptions. Standalone Ray Traced Reflections are replaced with Lumen Reflections with ‘Use Hardware Ray Tracing when available’ enabled.
Have you guys looked into what NVIDIA is doing with Ray Reconstruction in DLSS 3.5? It seems to do a great job of reducing noise, and I heard them say they generate specular motion vectors and combine denoising and upscaling in a single pass.
Are there any plans to implement backscattering support for SSS materials in the surface cache? Noticed that this doesn’t work for Legacy or Substrate materials and screen traces are skipped for everything but subsurface profile.
Wow, that works amazingly well. Honestly from a development perspective that’s more than acceptable, lighting and scene geo can be tweaked but the lack of a solution to noise was quite bad. This changes things.
Pardon me for asking, but if glossy reflection denoising isn’t high on the priority stack, what’s on the (preliminary) docket for the Lumen team for 5.4?
If you need a shipping example of how jarring it can be, go down an elevator in Elden Ring with a lantern on your belt and look at the walls.
Elden Ring isn’t UE5?
Hi Daniel,
Unfortunately, no. This is what I see with double-layered, two-sided windows (so four faces together) with a thin translucent shader. The sun is not coming through the glass; instead, it seems to be handled as opaque.
Previously this worked with software ray tracing, and for hardware ray tracing we just used the above console command to solve the issue. Now, in 5.3, it is not working properly even with software ray tracing.