I don’t have any special tips for gaining RTXGI access; I haven’t spoken to anyone at NVIDIA directly, and I applied for access using that website too. In my case it was earlier this week, and approval took about a day. I don’t know whether it helped that my GitHub account was already connected to some other NVIDIA UE4 stuff in the past.
Ignore the following if you only need RTXGI and don’t need to try DLSS right now.
For DLSS, I eventually found the following page to apply to the DLSS early access program. I don’t know if this is still necessary, but if it is, I couldn’t complete the application because I didn’t have a specific project in mind. Also, when you’re ready to release your application or game you need to “collaborate with NVIDIA to remove the watermark”, whatever that means (“collaborate” is rather vague), and I generally don’t try technology that still requires that kind of special access.
If that program isn’t the only way to try DLSS in UE4 right now, someone should probably tell us.
Thanks for the clarification. I hadn’t gone as far as reading the readme yet – that’ll teach me to RTFM before asking questions.
I agree that Richard Cowgill’s on-stream remarks made the registry file thing sound like a minor detail, but after following your link it looks like this is a hard blocker for people like me who are just experimenting.
Hopefully this will change in the near future? Considering how much they’re playing up DLSS, this seems like an unnecessary speed bump to wider RTX adoption: no one will buy the cards if there aren’t enough games, and no one will make the games if there aren’t enough cards in users’ hands…
Edit: as an experiment, I went ahead and submitted an application to the DLSS Early Access Program, using one of my WIP projects. Crossing my fingers that they’ll be nice to an indie dev who’s not close to shipping but willing to battle test and give feedback (and maybe I can then use the reg keys to experiment with DLSS in all my projects)…
This stuff is really crazy. I’m writing my university thesis on virtual production and would like to implement RTXGI and DLSS (instead of using TAA) in my pipeline. I’ve already applied for both, but since I’m a student I don’t know if my application is valid… It would be nice to test this in that field. I hope my application will be accepted; I can’t wait…
I sent them this form 3 days ago XD
That branch contains the RTX ray tracing stuff that has been in UE4 for ages, plus various changes NVIDIA has made to enhance its RTX features. It does not include RTXGI.
If NVIDIA gives you access to the RTXGI code, it comes as a patch file that you apply to the source code of either the 4.25.3 branch from Epic (currently the release branch) or the NVIDIA branch that you mention. Once the RTXGI patch is applied, you can build the engine. Then you need to do all the normal things to enable RTX in the project (such as making sure UE4 is running in DX12 mode), and to use RTXGI you also need to enable the plugin, disable precomputed lighting, and drag a DDGI volume into your scene.
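For reference, the “normal things” above roughly correspond to a few project settings. A minimal sketch of the relevant DefaultEngine.ini entries (these are standard UE4 ray tracing settings, not RTXGI-specific; exact values assume a typical 4.25 Windows project):

```ini
; Config/DefaultEngine.ini
[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
r.RayTracing=True
r.SkinCache.CompileShaders=True
r.AllowStaticLighting=False
```

`r.SkinCache.CompileShaders` is required by UE4’s ray tracing pipeline, and disabling static lighting matches the “disable precomputed lighting” step; the RTXGI plugin itself still has to be enabled from the editor’s Plugins window.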
Thank you very much for such detailed information. I’m so hyped for this that it’s hard for me to be patient.
Has anyone had problems compiling this? I get an error saying that I don’t have VS installed, but I do. Any ideas?
OK - so apparently I’d missed the information that this needs VS 2017, not 2019.
They could at least have made the plug-in easily accessible… jeez
Has anyone gotten DLSS or RTXGI approval yet? I’m thinking about the amazing samples/demos we could put out that would sell more cards and push the industry forward. It baffles me that the wall is so high. Just ask us for a name and email and send it. I highly doubt you could use any of this in a real project without testing it for months. @NVIDIA, let the battle testing begin!!
Still waiting for approval… applied last Wednesday. We are currently doing evaluations for our next game to make use of a fully dynamic lighting setup, and RTXGI seems like a good fit, but at this point I’m not sure if it just takes very long or if I’m hitting a brick wall and they are restricting access for some reason. It didn’t sound like it from the parts of the stream that I watched…
Has anyone received the RTXGI code? I’ve been waiting 5 days for it :confused:
This is not the way to make the dev community like you, NVIDIA. Just release the ■■■■ thing.
Once this tech gets going, I can just imagine how it could be used with Unreal 5.
AI can already increase resolution and fill in missing gaps, turning an old image shot on a potato into a high-resolution image, or even an animation based on that single reference.
Imagine what this will mean for Unreal 5 and tessellation: taking any low-res texture to almost 16K…
or taking a lower-resolution final render to a high-resolution video…
all of it on a dedicated GPU, or something solely for the AI…
Also applied 5 days ago, still waiting. It would definitely help if someone could explain the delay.
We’ve been using UE4 with raytracing in production on a few animation projects alongside Redshift, so far with good results – but RTXGI would massively benefit our workflow.
Also excited to check out the full version of Omniverse with USD support (though the machinima one looks cool too). Hopefully it will be publicly available soon!
Looks awesome! Looking forward to trying it in action!
If there are 2000 lights at the same time and the caustics do not cast shadows, the performance shouldn’t be too bad, right?
I’ve compiled the branch: 100 GB of data. Now what?
How do I get an NGX ID?
RTXGI intensity is very weak when the materials are not white, and it does not respond to UE4’s standard GI boost settings. I modified the code to add a boost value: just find all occurrences of NormalBias and add a similar parameter, IrradianceBoost, alongside it.
There are still glitches and light bleeding in some cases, but this is the first GI solution for Unreal that runs at usable speed. Even without DLSS, if you’re making a narrative game (where FPS isn’t so important), you can activate RTAO, RTXGI, and RT shadows and get 70 fps at 1080p on the Medieval Dungeon level, which is perfectly good.
One problem is that a volume inside a volume cannot use additive blending, so it’s actually not that simple to have a higher-resolution volume that blends in nicely. That should be easy to add as well.