Inside Unreal: DLSS and RTXGI with NVIDIA

How long does it take to hear back about RTXGI? I’ve been waiting since early last week.

I applied Sept 4th, but I’m guessing they’ve had a sudden uptick in submissions due to the livestream (and the viral nature of that caustics prism splitting imagery circulating, even though that isn’t RTXGI).

I also noticed in the fine print of the application at https://developer.nvidia.com/nvidia-rtxgi-sdk-get-started:

Additionally:

So I’m guessing that they want to vet each applicant’s organization and project(s), and this could take a significant amount of time if they received 10s/100s of new applications over the past week…

Has anyone applied and been denied? I wonder if they’re giving reasons in cases of rejection, or if you simply never hear from them…

On a previous page, Richard said the criteria are looser for RTXGI than for DLSS, so I hope that means I can get in. Not having a reasonable solution for dynamic GI is really holding me back…

So I never got the instructions to work, but I was able to apply the patch by leaving the patch file in the main folder, right-clicking the folder, and opening Git Bash. Then I ran this command in the Git Bash window:


$ patch -p1 < rtxgi-nvrtx4.25.3.patch
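For anyone unsure what the `-p1` does: it strips the leading `a/` or `b/` path component from the file paths in the patch header. Here’s a throwaway demo I used to convince myself of that (the file names are invented, nothing to do with the RTXGI patch itself):

```shell
# Minimal, self-contained demo of the -p1 strip level (invented file names).
mkdir -p a b tree
printf 'old line\n' > a/greeting.txt
printf 'new line\n' > b/greeting.txt
printf 'old line\n' > tree/greeting.txt
# diff exits 1 when files differ, so mask that for strict shells.
diff -u a/greeting.txt b/greeting.txt > demo.patch || true
# The patch header says a/greeting.txt; -p1 strips the "a/" so it matches
# greeting.txt relative to wherever patch runs -- here, inside tree/.
(cd tree && patch -p1 < ../demo.patch)
cat tree/greeting.txt
```

So the command has to be run from the engine’s root folder, where the paths inside the patch (minus one leading component) line up with the actual files.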

No reply from either the form or that email…

Same with me.

I don’t get it. If it’s available as a plug-in why not just release it publicly?

I filled out both forms on Sep 5; no reply since…

If I try to patch the engine source with the provided RTXGI patch, after running git apply rtxgi-nvrtx4.25.3.patch I get a bunch of “patch does not apply” messages in the command prompt. Any idea how to successfully apply the patch?
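For reference, here’s a sanity check I ran on a throwaway repo to make sure the commands themselves work on my machine (repo and file names are invented); against the real engine tree I’ll try the variants in the comments:

```shell
# Build a tiny repo so the commands below actually run anywhere.
git init -q patchdemo && cd patchdemo
printf 'line one\n' > engine.txt
git add engine.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm init
printf 'line one\nline two\n' > engine.txt
git diff > add-line.patch
git checkout -q -- engine.txt       # back to the clean base state

git apply --check add-line.patch    # dry run: silent + exit 0 if it applies
git apply -v add-line.patch         # verbose apply, names each hunk
cd ..

# Against the engine tree, the same ideas apply (untested sketch):
#   git apply --check rtxgi-nvrtx4.25.3.patch   # diagnose before applying
#   git apply --3way rtxgi-nvrtx4.25.3.patch    # 3-way merge on mismatch
#   patch -p1 < rtxgi-nvrtx4.25.3.patch         # plain patch as a fallback
```

My understanding is that “patch does not apply” usually means the context doesn’t match the checkout: wrong base commit/tag, local modifications, or CRLF line endings.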

Is there any update on this yet, I still haven’t heard back.

I wrote to Richard Cowgill, hoping to hear back… will give updates once I know what’s going on with this program.

Hey guys, I’ll repost what I told decksounds13 here (more or less). But right off the top, I’m sorry if you haven’t been able to get access. As someone else noted, it says on the page that we no longer accept applications from personal emails. The approval team got flooded with gmails, hotmails, etc., so they changed their criteria to look for professional or work email addresses.

They also would prefer it to be for a specific, promising-looking project and not just for experimentation right now. If it’s for something specific that you’re working on, even a 1-person indie project can be accepted. Please let them know about it; I think that would improve the odds. They like to see cool projects and good-looking candidates that can show off and make good use of GI. I’m personally helping some 1-man indies with RTXGI on their projects, so I know this is possible.

We are in kind of an early-access mode right now with the plugin. The reason it’s released this way, and not simply put on the Epic store, is that it requires code changes to how UE4 handles plugins. It’s doing something very fundamental with the engine code, and it will require some changes to how plugins work in order to make a general marketplace release possible. We’re working with Epic on that, but no timeline or promises can be given. It is true that we want this to be as available as we can make it, and it may take a bit of time to really open it up.

So to circle back, I’m sorry about the confusion I caused by stating all you had to do was apply and get it. We do have every intention of making it more available over time, I know that’s not much consolation for people who want it right now. If you apply in the manner I outlined above, you’ll improve your odds of being accepted. I really hope this helps!

I applied with my work email, I’m working on a project backed by an Epic MegaGrant, and I haven’t received anything.

Thanks for the clarification!

Ironically, it’s specifically experimentation with game-changing tech like this that can lead to innovative new projects (or major course changes on current projects)… but I respect Nvidia’s approach on this, as much as it pains me to be unable to play with it myself. Just opening the flood gates could potentially be a support nightmare, especially with tech under such heavy active development.

Can’t wait to see more as this progresses!

One more thing guys, the Caustics branch went live today. You can get access to it on our site: Unreal Engine | NVIDIA Developer

You’ll need GitHub developer access, but that’s all; then you can build the branch yourself. You’ll notice the documentation on the GitHub page for this one includes little demos and example projects for download.

It’s also worth noting that the Caustics branch includes a new, improved version of RTGI. If you’re having a difficult time getting access to RTXGI right now, you could go for the new version of RTGI in this branch instead. It has the advantage of including caustics and improved translucency effects as well, plus all of our other NvRTX branch enhancements for ray tracing. So for many people this might be the branch to get. Good luck! Let me know if you have any questions.

That fix is in the Caustics branch because it’s based on the latest NvRTX branch, 4.25.3. The command is r.raytracing.volumefogmode 1. That should be in the NvRTX page documentation; if not, let me know.
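If you want that setting to stick instead of retyping it each session, console variables can be set in your project’s DefaultEngine.ini. A minimal sketch (the cvar name is as quoted above; the section is standard UE4 config behavior):

```ini
; Config/DefaultEngine.ini -- persists the console variable across sessions
[SystemSettings]
r.raytracing.volumefogmode=1
```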

And yes the plan is to get these technologies as mainstream as possible. Today these are custom branches (or in the case of RTXGI, a plugin that requires some engine changes to integrate) but we would like to eventually make them more available than that.


Hi all, I know this isn’t an official feedback thread, but it seems to be the place to throw in opinions for now, so here I go. Hopefully some of this finds its way to the developers’ ears.

I have been playing around with the RTXGI branch for a few days (SDK v1.1) and I’m really impressed by what can be achieved. It could be the primary lighting solution for a project I’ve been working on for over a year (I’ve tried hundreds of different setups to get it right, each with its own caveats).

Just to avoid confusion about my assessment so far, here are my objectives when testing lighting solutions for my project -

  • Lighting should work for indoor and outdoor locations simultaneously in an open world environment.
  • The main lighting source should be a skylight using an HDRI.
  • The HDRI has no direct sunlight (in other words overcast) so light is travelling and bouncing from all parts of the sky (at least during the day).
  • The lighting solution should allow for graceful transition between day and night in realtime.
  • Light from point / spot / area / emissive lights should also bounce within a scene accordingly in realtime.
  • Light & shadows should interact with volumetric fog, including blocking it out when indoors.

Simple, right?

I think RTXGI achieves almost all of this on its own which is amazing.
Here are a few things I’ve noticed whilst playing with the system (that I presume are already being worked on as we speak), plus some suggestions (just my own opinions, and I’m sure there are probably better ways to do things).

  • **Skylights** - After fiddling around for several hours I accidentally discovered that RTXGI does actually work with a Skylight! This is something I did not expect, as most other realtime systems seem to prefer direct light only (mainly a directional light) to perform their calculations from. Having watched the UE5 demo (which I’m presuming uses a version of this same tech), I’d assumed RTXGI would only use a directional light too. But no, it can take in light from a Skylight with an HDRI.
    The thing is, it only seems to work with a Static Skylight?! I’m presuming this has just been overlooked for now and will be remedied. Surely you’d want a moveable (rotatable) skylight with a realtime GI solution? Otherwise, what’s the point?
    If this is the case, then I’d really hope there’ll be a way to have the skylight only emit light into the RTXGI volumes - otherwise, at the moment, a moveable skylight emits a very simple illumination into the whole scene, which illuminates every surface and destroys all shadows.
  • **Translucency** - so far it seems like it doesn’t work with translucency (such as simple glass in windows). Masking seems to be OK, but yeah, it’s pretty much dead in the water if it can’t penetrate glass.
  • **Ray Traced Reflections** - or lack thereof. The light emitted by the DDGI volumes does not seem to be captured by ray-traced reflections. Maybe I missed a command that enables this, but I couldn’t get it to work.
  • **Reflection Captures** - the light from DDGI volumes does seem to be captured by these, but only when the camera is nearby during the capture. Otherwise big black areas are produced. I can only imagine it’s because the GI is only local to the player camera for now.
  • **Color Saturation** - I’ve noticed that DDGI bounced light tends to pick up the saturation of surfaces it touches quite strongly. Very desaturated surfaces like bricks become glowing red when exposed to the GI volumes.
  • **SSGI** - although the video shows SSGI and RTXGI working together, I could not get them to co-exist on the latest branch; one cancelled out the other. I presume that’s because the public branch is behind the one in the video. SSGI could add some smaller details at close range that RTXGI misses, without having to add many smaller DDGI volumes.
  • **Rotation** - the DDGI volumes don’t seem to rotate. I hope this isn’t a restriction of the light being calculated with world-aligned cubes. If the volumes can’t be rotated in future, the whole system becomes useless for any geometry that isn’t world-aligned: light probes would end up poking out of rooms/buildings and produce severe artifacts.
  • **Light Contribution** - most lights have an ‘indirect lighting intensity’ slider that could be used to influence a particular light’s contribution to the GI. Having the ability to make certain lights bounce more than others would be a great tool for location design.
  • **Volumetric Fog** - having RTXGI influence volumetric fog would be great too, especially if it were additive AND subtractive. For example, a smaller DDGI volume within a larger one could subtract the fog (to clear a room of fog, in my case). On a side note, I saw in the presentation that ray-traced lights were creating volumetric shadows - please make this available!
  • **Probe Placement** - this is a very clever system with a lot of potential. However, I think the current method of choosing the number of probes per axis is a little awkward. A much better system would derive the count from the overall DDGI volume’s size: e.g. a ‘probe spacing’ slider which places probes a set distance apart, then adds/subtracts probes as the volume is resized and new probes are needed. A max and min density could then be set within the engine to avoid memory overload.

Anyway, that’s my two pence for now. I’ll keep playing around with it and see if I can come up with anything else.
Amazing work and a huge breakthrough in light fidelity! Much appreciated.

I applied for DLSS weeks ago and the application shows that it hasn’t even been read yet.
Is there a way to at least view DLSS 2.0 results in the editor, or do we need the unlock code even to check the performance?

Thanks for this… I’ve been very eager to try mesh caustics since the prism light-splitting technique was first demoed.

However, upon cloning and compiling the NvRTX_Caustics branch, and opening the prism sample map, I get caustics but no color spectrum:

https://i.imgur.com/7QfOMIj.png

I haven’t changed any settings in the project or postprocess volume. I’m using an RTX 2070 with latest drivers (456.55), and UE4 DXR raytracing works fine in other projects.

Any suggestions for settings to check? Has anyone else gotten this to work?

Nice info. I really wonder why it wouldn’t work with ray-traced reflections.
Anyway, from the pictures, and also from what they showed in the demo, it does appear that it doesn’t have enough bounces to properly light up an interior.
Also, the blurry indirect shadowing doesn’t look too good, but I guess that’s expected from this type of lighting (maybe UE5’s Lumen can solve this better with distance fields).