Global Illumination alternatives

Check this out… this is it, it’s called VXGI. During the 980 reveal, I was pretty shocked to find out that the demo they showed off was in UE4!!! :wink: Here’s hoping they put this in 4.5, but it looks like integration is already happening. Just got an EVGA GTX 980 SC from Newegg and I hope we get to play with this real soon.

I think that if they hope for rapid adoption, this library (at least a binary version) will be available as a free download so that anyone can attempt an integration.

I don’t see any big news about anyone integrating GI Works, which means people don’t give a ****. I’m sure that if Epic integrated GI Works into the engine, NVIDIA would scream from the rooftops to announce the big news ;).

But if a binary version of GI Works were available to everyone… well, I’d guess there are enough people in the UE4 community to jump on board and integrate it for everyone’s benefit.

Though it depends on how many changes to the renderer are needed to maintain a reasonable integration. If it can be non-invasive and make use of existing systems, I think it’s more than reasonable to integrate it into the main branch.

Note: VXGI and GI Works refer to the same technology.

Without Epic integrating it as part of the engine, even if Nvidia provides it for free, people would still have to do some work to make sure it functions correctly with the version of the engine they’re using and doesn’t break or conflict with other features. One of the reasons SVOGI wasn’t kept in the engine was that Epic didn’t want to keep doing work to make sure it worked properly with each engine update. So ultimately it would be nice if it were officially integrated, since many people don’t want to mess with coding.

I will note, though, that while this is being touted as a feature of the new graphics cards, the old ones still support it; the new ones are just optimized for it.

Tiled volume resources are supported on GPUs from the Radeon 7xxx series and the GTX 6xx series up.

At least in theory. The OpenGL extension AMD_sparse_texture provides the same functionality.

But it depends on the specific implementation, i.e. whether or not the voxel data is stored in a 3D texture.
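To make the sparse-texture idea concrete, here’s a minimal sketch (my own hypothetical helper, not anything from VXGI) of how a large voxel volume could be allocated through the ARB_sparse_texture extension: the full virtual size is reserved up front, but physical memory is only committed page by page where voxels actually exist. It assumes a GL 4.x context with the extension available and GLEW for function loading.

```cpp
#include <GL/glew.h>

// Hypothetical helper: reserve a dim^3 voxel volume as a sparse 3D texture.
GLuint CreateSparseVoxelVolume(GLsizei dim)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);

    // Query the driver's page (tile) size for this internal format.
    GLint pageX = 0, pageY = 0, pageZ = 0;
    glGetInternalformativ(GL_TEXTURE_3D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_X_ARB, 1, &pageX);
    glGetInternalformativ(GL_TEXTURE_3D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_Y_ARB, 1, &pageY);
    glGetInternalformativ(GL_TEXTURE_3D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_Z_ARB, 1, &pageZ);

    // The texture must be flagged sparse *before* storage is allocated.
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_SPARSE_ARB, GL_TRUE);
    glTexStorage3D(GL_TEXTURE_3D, 1, GL_RGBA8, dim, dim, dim);

    // Commit physical memory only where the voxelizer found geometry;
    // here just one page at the origin, as an example.
    glTexPageCommitmentARB(GL_TEXTURE_3D, 0,
                           0, 0, 0,              // page-aligned offset
                           pageX, pageY, pageZ,  // one page
                           GL_TRUE);             // GL_FALSE would decommit
    return tex;
}
```

Empty regions never get backing memory, which is where the big savings for mostly-empty voxel volumes would come from.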

In any case, it’s better to have a community-maintained dynamic GI than to have none at all :D.

Interesting, I wonder what the performance difference would be. Does that include the entire Kepler line from the GK104 and up or is it just the GK110?

One unrelated piece of tech in the 9xx line that caught me a little off guard, in a good way, is the Direct to VR stuff they have, which will really benefit the Oculus Rift.

I’m not sure how far back it goes; I think it might even work on AMD graphics cards as well, so it could be unrelated to the hardware. In the arguments between Nvidia and AMD, Nvidia has said that while its GameWorks tools are designed for Nvidia GPUs, there’s nothing stopping developers from getting them to work on AMD as well.

We’re interested in collaborating with Nvidia to see how something like VXGI could be made more easily available for interested developers, but for now it will continue to be developed/maintained by Nvidia so you’ll have to work with them directly. If VXGI gains wider adoption across more platforms then we’d be in a better position to maintain an official integration with UE4.

Thanks for the reply Ray! Always great to have the devs commenting here with us.

Technically this feature is feasible on GPUs starting from the Radeon 7xxx series.
The main innovation I see in this technique is greatly reduced memory usage, via something mysteriously called… Tiled Volume Resources.

If you do some digging around, you will quickly realize that this feature was first proposed by AMD in the form of AMD_sparse_texture for OpenGL, and is now standardized by the ARB as ARB_sparse_texture.

Sparse resources are actually a big part of OpenGL, and DirectX still lacks important parts of them, like sparse buffers.

In any case, such an implementation should be perfectly viable on OpenGL; it would be multi-platform, and the only advantage of the 9xx series would be a faster rasterization step. But that is not that important, as even today’s GPUs and algorithms can voxelize a dynamic scene in under 2 ms.

I was actually thinking about using sparse_texture for access to voxels packed into a 3D texture. But as it happens, Tiled Resources on DirectX do not support 3D textures, which is ridiculous…
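For reference, here’s a minimal sketch (a hypothetical helper of mine, assuming a Windows 8.1 / D3D11.2 device) of how an engine would detect tiled-resources support at all. Note that even when this check passes, D3D11.2’s Tier 1 and Tier 2 only cover buffers and 1D/2D textures; a tiled Texture3D is exactly what is missing, as described above.

```cpp
#include <d3d11_2.h>

// Hypothetical helper: returns true if the device exposes any
// tiled-resources tier. Even Tier 2 in D3D11.2 does not tile Texture3D.
bool SupportsTiledResources(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                           &opts, sizeof(opts))))
        return false;
    return opts.TiledResourcesTier != D3D11_TILED_RESOURCES_NOT_SUPPORTED;
}
```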

But making the necessary changes would require access to the full GameWorks source, and I doubt NVIDIA will just give it away ;p.

In any case, I wonder if the NVIDIA integration will be available to the public. That would be great.

Thanks for chiming in, Ray, and I hope you’re enjoying your Friday! It would be amazing to see a feature like this in UE4. Is there a branch that NVidia is working on that we’re able to get access to? Also, from your POV, what differences/benefits have you noticed with this technique compared to LPV or even SVOGI? Maybe someone else can chime in on this one too. Either way, the future is looking bright (pun intended) :wink:

VXGI looks amazing, and it’s already integrated with UE4! Though it feels like tech-demo fodder… at the least, the way they’re hyping it up as “only possible because of Maxwell” is troubling. Their official product page even says the only supported cards are the GTX 980 and GTX 970.

Also, as great as the results are, it’s still a small scene. I hope something comes out of it though!

my gawd, this is amazing.

Wow, impressive! I’m not on a sub right now; is this feature dynamic?!

WTF… VXGI Lunar Demo running in UE4???
https://youtube.com/watch?v=O9y_AVYMEUs
http://abload.de/img/gidemoue4k0bdj.png

YOU’RE THE REAL MVP!

I’m afraid UE4 is gonna have the worst dynamic lighting once Unity 5 comes out. It’s a shame SVOGI was removed (it was the biggest selling point).

I’m anxious to see what will be implemented in UE4 in the end, whether VXGI or a custom alternative, but I’m confident that Epic’s devs will do a great job, and soon.

But I have to say I would love to test this VXGI implementation :rolleyes:

Looks great! Still needs indirect light bounce.

I don’t understand why people are talking about the “biggest selling point” etc. in relation to SVOGI/VXGI.

First of all, if you think the biggest selling point of UE4 is/was SVOGI, you are either incredibly ignorant or simply dishonest.
Second, you do realize that SVOGI is an extremely resource-intensive technique and VXGI requires hardware support (you can use it ONLY on 980 and 970 cards). That means it’s unusable for 95% of the PC market. And no, you can’t just make it look less appealing on weak systems; it needs a high baseline level of performance or it doesn’t work properly.

??
We are talking about the need for dynamic global illumination. Exactly which technique that will be is much less relevant.

VXGI actually doesn’t show much that hasn’t been thought of before. Voxel packing can be achieved in OpenGL using a 3D texture and the sparse_texture extension. It should work from the Radeon 7xxx and GeForce GTX 5xx series up (bear in mind that newer GPUs have better support for sparse textures).

Rasterization is interesting, and it’s the only real piece not present on older GPUs. But this issue can be worked around by using bigger voxels, using cascades, pre-voxelizing the scene only once for static objects, and voxelizing only dynamic objects at runtime; see the sketch below.
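A minimal sketch of that scheme, under my own assumptions (the types and pass functions here are stand-ins, not VXGI’s API): each cascade doubles the voxel size, static geometry is voxelized once at load, and only dynamic objects are re-voxelized each frame.

```cpp
#include <vector>

struct Scene {};                       // stand-in for the engine's scene data

struct VoxelCascade {
    float voxelSize = 0.0f;            // voxels get coarser with each cascade
    // handles to static and dynamic voxel volumes would live here
};

// Stubs standing in for the actual GPU voxelization passes.
void VoxelizeStatic(const Scene&, VoxelCascade&)  { /* expensive, done once */ }
void ClearDynamicVoxels(VoxelCascade&)            { /* cheap, per frame */ }
void VoxelizeDynamic(const Scene&, VoxelCascade&) { /* moving objects only */ }

// At load time: build each cascade's static voxel volume exactly once.
void InitCascades(const Scene& scene, std::vector<VoxelCascade>& cascades,
                  float finestVoxelSize)
{
    float size = finestVoxelSize;
    for (VoxelCascade& c : cascades) {
        c.voxelSize = size;
        VoxelizeStatic(scene, c);
        size *= 2.0f;                  // each cascade doubles the voxel size
    }
}

// Every frame: static volumes stay untouched, only dynamics are redone.
void UpdateCascades(const Scene& scene, std::vector<VoxelCascade>& cascades)
{
    for (VoxelCascade& c : cascades) {
        ClearDynamicVoxels(c);
        VoxelizeDynamic(scene, c);
    }
}
```

The per-frame cost then scales with the amount of moving geometry rather than with the whole scene.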

Either way, dynamic GI, or at least GI that doesn’t require hours of precomputation (Enlighten in Unity 5, for example, is fast to precompute), is a big selling point. It either makes more dynamic games possible, or at least spares you from spending hours waiting for lights to bake, just to discover that you don’t really like the lighting setup after all…

That’s not true. The 980 and 970 architecture was built specifically with it in mind, but it will work with other Nvidia cards and even AMD cards. Also, Nvidia’s VXGI is a lot cheaper than SVOGI.
Nvidia’s implementation also comes with quality settings you can change that affect computation time:
GTX 770: averages between 7.4 and 12.9 ms
GTX TITAN: averages between 6.6 and 9.6 ms

For comparison, DFAO costs 4.5 ms on a 7970 at 1080p.