Without Epic integrating it into the engine, even if Nvidia provides it for free, people would still have to do some work to make sure it works correctly with the version of the engine they’re using and that it doesn’t break or conflict with other features. One of the reasons SVOGI wasn’t kept in the engine was that Epic didn’t want to keep doing work to make sure it worked properly with each engine update. So ultimately it would be nice if it were officially integrated, since many people don’t want to mess with coding.
I will note, though, that while this is being touted as a feature of the new graphics cards, the old ones still support it; it’s just that these new ones are optimized for it.
Interesting, I wonder what the performance difference would be. Does that include the entire Kepler line from the GK104 up, or just the GK110?
One unrelated piece of tech in the 9xx line that caught me a little off guard, in a good way, is the Direct to VR stuff they have, which will really benefit the Oculus Rift.
I’m not sure how far back it goes; I think it might even work on AMD graphics cards as well, so it could be unrelated to the hardware. In the arguments between Nvidia and AMD, they’ve said that while their GameWorks tools are designed for Nvidia GPUs, there’s nothing stopping developers from getting them to work on AMD as well.
We’re interested in collaborating with Nvidia to see how something like VXGI could be made more easily available for interested developers, but for now it will continue to be developed/maintained by Nvidia so you’ll have to work with them directly. If VXGI gains wider adoption across more platforms then we’d be in a better position to maintain an official integration with UE4.
Technically this feature is feasible on GPUs starting from the Radeon 7xxx series.
The main innovation I see in this technique is greatly reduced memory usage, via something mysteriously called… Tiled Volume Resources.
If you do some digging around, you will quickly realize that this feature was first proposed by AMD in the form of AMD_sparse_texture for OpenGL, and is now part of ARB as ARB_sparse_texture.
Sparse resources are actually a big part of OpenGL, and DirectX still doesn’t have important parts of them, like sparse buffers.
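To put a rough number on why sparse/tiled resources matter here: a dense 3D texture allocates every texel up front, while a sparse one only costs memory for the pages you actually commit. The page dimensions and occupancy below are illustrative assumptions (real page sizes are implementation-defined and queried at run time), so treat this as a back-of-the-envelope sketch, not VXGI’s actual numbers:

```python
# Back-of-the-envelope: dense vs. sparse memory for a voxel volume.
# Page size and occupancy are assumptions for illustration only.

DIM = 512                # 512^3 voxel volume
BYTES_PER_TEXEL = 4      # e.g. RGBA8
PAGE = (32, 32, 16)      # assumed 64 KiB page for a 4-byte 3D format
OCCUPANCY = 0.10         # assume ~10% of pages touch geometry

dense_bytes = DIM ** 3 * BYTES_PER_TEXEL

texels_per_page = PAGE[0] * PAGE[1] * PAGE[2]
total_pages = DIM ** 3 // texels_per_page
committed_pages = int(total_pages * OCCUPANCY)
sparse_bytes = committed_pages * texels_per_page * BYTES_PER_TEXEL

print(f"dense:  {dense_bytes / 2**20:.0f} MiB")   # 512 MiB
print(f"sparse: {sparse_bytes / 2**20:.1f} MiB")  # ~51 MiB committed
```

Since most of a voxelized scene is empty space (voxels cluster near surfaces), the savings grow with volume resolution.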
In any case, such an implementation should be perfectly viable on OpenGL, and it would be multiplatform. The only advantage of the 9xx series would be a faster rasterization step, but that is not that important, as even today’s GPUs and algorithms can voxelize a dynamic scene in under 2 ms.
I was actually thinking about using sparse_texture for access to voxels packed into a 3D texture. But as it happens, Tiled Resources on DirectX do not support access to 3D textures, which is ridiculous…
But to make the necessary changes you would need access to the full source of GameWorks, and I doubt Nvidia will just give it away ;p.
In any case, I wonder if the Nvidia integration will be available to the public. It would be great.
Thanks for chiming in, Ray, and I hope you are enjoying your Friday! It would be amazing to see a feature like this in UE4. Is there a branch that Nvidia is working on that we are able to get access to? Also, from your POV, what are the differences/benefits you’ve noticed with this technique over LPV or even SVOGI? Maybe can also chime in on this one. But either way, the future is looking bright (pun intended).
VXGI looks amazing, and it’s already integrated with UE4! Though it feels like tech demo fodder… At the least, the way they’re hyping it up as “only possible because of Maxwell” is troubling. Their official product page even says the only supported cards are the GTX 980 and GTX 970.
Also, as great as the results are, it’s still a small scene. I hope something comes out of it though!
I’m anxious to see what will be implemented in UE4 in the end, whether VXGI or a custom alternative, but I’m confident that Epic’s devs will do a great job, and really soon.
But I have to say I would love to test this VXGI implementation :rolleyes:
I don’t understand why people are talking about the “biggest selling point” etc. in relation to SVOGI/VXGI.
First of all, if you think the biggest selling point of UE4 is/was SVOGI, you are either incredibly ignorant or simply dishonest.
Second, you do realize that SVOGI is an extremely resource-intensive technique, and that VXGI requires hardware support (you can use it ONLY on 980 and 970 cards)? That means it’s unusable for 95% of the PC market. And no, you can’t just make it look less appealing on weak systems; it has a baseline level of performance it needs, or it doesn’t work properly.
??
WE are talking about the need for dynamic Global Illumination. Exactly which technique that will be is much less relevant.
VXGI actually didn’t show much that hadn’t been thought of before. Voxel packing can be achieved in OpenGL using a 3D texture and the sparse_texture extension. It should work from the Radeon 7xxx and GeForce GTX 5xx onward (bear in mind newer GPUs have better support for sparse textures).
The rasterization step is interesting, and it is the only real thing not present on older GPUs. But this issue can be worked around by using bigger voxels, rasterizing only static objects, using cascades, pre-voxelizing the scene only once for static objects, and voxelizing only dynamic objects at run time.
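The cascade idea above can be made concrete with a little arithmetic: each cascade keeps the same grid resolution but doubles the voxel size, so it covers twice the radius of the previous one, and a handful of cascades spans a large scene at bounded memory. The base voxel size, grid resolution, and level count below are assumed for illustration:

```python
# Clipmap-style voxel cascades: fixed grid resolution per level, with
# voxel size (and thus covered radius) doubling each level.
# All values are illustrative assumptions.

GRID = 64            # 64^3 voxels per cascade
BASE_VOXEL = 0.125   # metres, finest cascade
LEVELS = 5

for level in range(LEVELS):
    voxel = BASE_VOXEL * 2 ** level
    radius = voxel * GRID / 2   # half-extent covered by this cascade
    print(f"cascade {level}: voxel {voxel:g} m, covers +/- {radius:g} m")
```

With these numbers, five cascades of 64³ voxels each go from 12.5 cm voxels near the camera out to a 64 m radius, instead of needing one enormous uniform grid.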
Either way, dynamic GI, or at least GI that doesn’t require hours of precomputation (Enlighten in Unity 5, for example, is fast to precompute), is a big selling point. It either makes it possible to create more dynamic games, or at least means not spending hours waiting for lights to bake, just to discover that we don’t really like this lighting setup after all…
That’s not true. The 980 and 970 architecture was built specifically with it in mind, but it will work with other Nvidia cards and even AMD cards. Also, Nvidia’s VXGI is a lot cheaper than SVOGI.
Also, Nvidia’s implementation comes with quality settings you can change that affect computation time.
GTX 770 averages between 7.4 and 12.9 ms.
GTX TITAN averages between 6.6 and 9.6 ms.
In comparison, DFAO costs 4.5 ms on a 7970 at 1080p.
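For context, it helps to translate those millisecond figures into a share of the frame budget (about 16.67 ms per frame at 60 fps, 33.33 ms at 30 fps). The costs are the ones quoted above; the budget math is plain arithmetic:

```python
# Express the quoted GI/AO costs as a fraction of common frame budgets.

BUDGET_60FPS = 1000 / 60   # ~16.67 ms per frame
BUDGET_30FPS = 1000 / 30   # ~33.33 ms per frame

costs_ms = {
    "GTX 770 (best case)": 7.4,
    "GTX 770 (worst case)": 12.9,
    "GTX TITAN (best case)": 6.6,
    "DFAO on 7970": 4.5,
}

for name, ms in costs_ms.items():
    print(f"{name}: {ms} ms = {ms / BUDGET_60FPS:.0%} of a 60 fps frame, "
          f"{ms / BUDGET_30FPS:.0%} of a 30 fps frame")
```

So in the worst quoted case the effect alone eats roughly three quarters of a 60 fps frame on a GTX 770, which explains why it reads as tech-demo territory on mid-range hardware.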
That was the biggest selling point for UE4 in 2012. All the other engines now have all the features: CE has better shaders and Unity 5 is getting better lighting, which puts UE4 behind in terms of graphics. (All this just to please the consoles.)
The game that brought Global Illumination back to UE4 is being made for Xbox One (Fable Legends). UE4 also supports mobile phones (which are weaker than consoles).