
NVIDIA GameWorks Integration

Hi, it’s at: https://github.com/NvPhysX/UnrealEngine/blob/VXGI-4.9/UE4_VXGI_Overview.pptx
Changelog: https://github.com/NvPhysX/UnrealEngine/tree/VXGI-4.9


Ah thanks, I suppose it’s r.VXGI.DiffuseTracingEnable and r.VXGI.SpecularTracingEnable
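
For anyone wanting to flip those from game code rather than the console, here’s a rough, untested sketch; it assumes those two cvars are registered like ordinary UE4 console variables (I haven’t checked the VXGI branch source to confirm):

```cpp
// Rough sketch (untested): toggle both VXGI tracing passes from C++,
// assuming the cvars behave like standard UE4 console variables.
#include "HAL/IConsoleManager.h"

static void SetVXGITracingEnabled(bool bEnabled)
{
    static const TCHAR* CVarNames[] = {
        TEXT("r.VXGI.DiffuseTracingEnable"),
        TEXT("r.VXGI.SpecularTracingEnable")
    };

    for (const TCHAR* Name : CVarNames)
    {
        if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
        {
            // Same effect as typing e.g. "r.VXGI.DiffuseTracingEnable 0" into the console.
            CVar->Set(bEnabled ? 1 : 0);
        }
    }
}
```

For quick testing, just typing the two cvars into the in-game console with a 0 or 1 should do the same thing.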

So I’ve been working on a small project with VXGI for a while now, and now that I’ve gotten used to it, it’s actually a really nice system. One concern, though, is that lower-spec machines don’t have a hope of running the game. Aside from the obvious settings like 4 cones, sparse tracing at 4, and a map size of 32, what kinds of things can I do to get it working well on lower-end hardware? It doesn’t even need to look better than manually placed point lights, which is the alternative for now: hand-placing a set of point lights to fake GI that are only enabled when VXGI is off. That’s obviously not an ideal solution, since I essentially need to do the scene’s lighting twice, and you lose a lot of the benefits of having a dynamic system in the first place.

Speaking of which, if that really is the limit of how far you can turn it down for performance, it would be a good feature to add a flag to the lights, like the one I’ve been faking in my project. Basically, any light that I want on only when VXGI is disabled gets a name ending in _novxgi, essentially making a blacklist of fill lights that turn off when VXGI is on. It works great, except that the editor viewport always shows both sets of lights, even with VXGI enabled. I’m sure a more elegant solution is possible than what I did there.
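
For reference, the rough shape of what I hacked together is below. It’s only a sketch: UpdateFillLights is my own hypothetical helper (not anything from the VXGI branch), and it just assumes the _novxgi naming convention and that r.VXGI.DiffuseTracingEnable reflects whether VXGI is on.

```cpp
// Sketch of the "_novxgi" blacklist: disable the hand-placed fill lights while
// VXGI diffuse tracing is on, and re-enable them when it's off.
// UpdateFillLights is a hypothetical helper; call it at level start or whenever
// the VXGI cvars are toggled.
#include "EngineUtils.h"          // TActorIterator
#include "Engine/Light.h"         // ALight
#include "HAL/IConsoleManager.h"  // IConsoleManager / IConsoleVariable

static void UpdateFillLights(UWorld* World)
{
    const IConsoleVariable* Diffuse =
        IConsoleManager::Get().FindConsoleVariable(TEXT("r.VXGI.DiffuseTracingEnable"));
    const bool bVXGIOn = Diffuse && Diffuse->GetInt() != 0;

    for (TActorIterator<ALight> It(World); It; ++It)
    {
        // Lights named "*_novxgi" are the fake-GI fill lights.
        if (It->GetName().EndsWith(TEXT("_novxgi")))
        {
            It->SetEnabled(!bVXGIOn);
        }
    }
}
```

It doesn’t fix the editor viewport showing both sets of lights, but at least the runtime behaviour stays consistent.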

I thought the main reason Epic removed SVOGI was the fact that it didn’t scale well! I really want to see VXGI take off, but maybe it’s just a bit before its time.

**Please explain: where should the GFSDK_VXGI_x64.dll or GFSDK_VXGI_x86.dll file be copied?**

And what does “Add a reference to GFSDK_VXGI_x64.lib or GFSDK_VXGI_x86.lib.” mean?

It sounds like you’re trying to follow the directions for using VXGI in a standalone application; that will not work for UE4. You need to obtain the UE4 VXGI branch from NVIDIA: https://github.com/NvPhysX/UnrealEngine

Destructibles and Flex interaction

Is there currently any way to have APEX destructibles interact with Flex particles? Destructible objects only seem to collide with Flex objects before they’ve been fractured. There was a demo NVIDIA released a while back with water in a glass tank; the tank is then shattered in various locations, releasing the water. Any details on how that was achieved would be greatly appreciated. Thank you in advance.

Or is that not necessary to do? Is it handled when you build the engine with VXGI?

As I understand it, in the “Branch” dropdown at the top of the left column I should choose “VXGI” instead of “release”, right?
After that, there are two options, either “Download ZIP” or “Clone in Desktop”, right?
This time I downloaded the ZIP file; could that be a problem?

GalaxyMan2015, could you describe step by step how you get the engine + VXGI?

In particular, I do not understand what to do after unpacking the archive. Perhaps something else needs to be downloaded (maybe additional components from NVIDIA)? Do I need to make changes in VS (create references, change the code, etc.)?

PS: Thank you in advance for your help!

  1. Download: https://github.com/NvPhysX/UnrealEngine/archive/VXGI-4.9.zip

  2. Extract to your computer somewhere

  3. Run Setup.bat in the root path

  4. Once the above has completed (it needs to download about 3.4 GB of prerequisites), run GenerateProjectFiles.bat

  5. Open UE4.sln and compile the UE4 solution.

  6. Once compilation has completed, recompile ShaderCompilerWorker (under Programs in the Solution Explorer)

  7. Launch UE4Editor.exe (or Start Debugging)

  8. Wait for the launch procedure to complete; it will get stuck at 45% for a while, since it’s compiling shaders.

GalaxyMan2015, thank you for the help! Your instructions made the build process much clearer to me!

I believe so… let us know if it doesn’t work.

I believe VXGI is a Maxwell-onwards thing. Maxwell has lots of “stuff” (as best I can describe it) that accelerates VXGI quite a lot. Nvidia Pascal is around the corner. I’m sure AMD will look at how to optimise for things like VXGI, though AMD is clearly behind Nvidia, in terms of power consumption at least.

There probably aren’t going to be any tweaks for VXGI that will really work for pre-Maxwell cards, unless Nvidia or AMD put a lot of effort into it.

Unless they screw it up, Nvidia Pascal will blow away anything we’ve seen with Maxwell; we’re talking the GTX 1080 Ti (or whatever they call it) doing 4K high-quality VXGI at 90fps+ and so on.

With Nvidia, I don’t know how they’ve done it, but GPU rendering for non-realtime work is really blowing up right now and sidelining the CPU.

Cryptocurrency GPU mining is on the wane (AFAIK) because of dedicated ASICs, and cryptocurrency mining had significantly kept AMD purchases ticking along.

Nonetheless, GPU compute, which had a rocky start, is taking off now.

So I admit I’m a long-time Nvidia fanboy… but the GPU world, more or less dominated by Nvidia at the high end for PC + VR, is going to change very significantly over 2015-2020+.

So the bottom line is that Intel Kaby Lake + Nvidia Pascal + VR will drive the high-end 3D and gaming world with 4K and full GI like VXGI, BUT it will take several years to filter “down” to mainstream PC gaming and then to consoles and mobile.

Hey guys, has anybody tried the HairWorks integration in the 4.9 version?

I tried the HairWorks test map with GalaxyMan’s latest build. Why do you ask?

Oh, I totally missed GalaxyMan’s repo… Just wondered if it could be merged successfully with the 4.9 branch before I actually try that. I see now, thank you.

UE4 + VXGI worked!!!
I used the instructions from GalaxyMan2015 and it worked.
Probably the problem was me: I downloaded version 4.9 and it works (before, I had tried to build version 4.8).

Alexey.Panteleev and GalaxyMan2015, thank you for your help!

https://developer.nvidia.com/gameworksdownload#?dn=nvidia-hairworks-1-1-1
NVIDIA HairWorks 1.1.1

Maxwell and Pascal run the tech great, and yes, that’s what it’s built for, but there must be some way to offer a cut-down, less accurate version for slower hardware. Lionhead’s LPVs were very performance-heavy at first as well; now they’re more or less free in comparison, without any change in the hardware running them. Even if the lower-spec settings for VXGI are as problematic as LPVs are when it comes to issues like light leaking, an option like that at least opens up the VXGI workflow a bit more. Right now, if you make a scene focused around VXGI, even if it runs great on high-end machines, then without doubling the workload the scene flat out does not run well enough on something like a GTX 680 to call it playable. Setting a GTX 780 as the game’s minimum spec in 2015/2016 is an unreasonable ask of consumers right now.

Even a change like lowering the minimum map size from 32 to maybe 16 or even lower probably wouldn’t need a significant rewrite of the system, while still reducing the performance hit. The Sci-Fi Hallway example included with the build is a great illustration of why disabling VXGI in a scene built for it isn’t going to work out too well.

I’m all for future-proofing the visuals in games; that’s part of why I’m trying to make this tech work in the first place. But it is very difficult to convince non-artists on the team that the system is a good idea when their own computers can’t comfortably play the game they’re working on. Not to mention, as you said, VR, which when combined with VXGI will ensure nothing until 2020 can even consider hitting the 90 fps that something like that requires (unless Nvidia has a trick up their sleeve to avoid a performance hit when using the tech in VR). I do want this tech to work; I’ve been messing with the settings for over a week trying to get it going well enough on lower-end systems, now that I have a few scenes that can use it well, but it’s a challenge to get it into a state where it runs well. For now at least, my solution is just to fill the scene with fill lights that only turn on with VXGI disabled and try to make it look as close as I can.

I reckon that, at the current polycount of today’s games, an Nvidia Pascal by the end of 2016, say the GTX 1080 Ti, will be able to run 4K at 90fps in VR with VXGI. At least the Nvidia Pascal Titan Y, or whatever they might call it, will.

So the very high-end Nvidia cards of 2016-2017, I believe, will handle 4K VXGI VR at high frame rates, particularly with improvements to UE4, VXGI and the drivers. Not to mention the tons of optimisation being done for VR right now, including Nvidia’s GameWorks VR efforts like “crushing” the edges of a scene (https://www.youtube.com/watch?v=1Dn2JKfje2o).

You rightly point out the issue of the next five years for mainstream PC gamers. I’m still learning UE4, so from a layperson PC gamer’s perspective it’s indeed great if developers can “fill in the gaps”.

What’s been brilliant about PC game developers is the scalability they’ve (almost) always implemented in games. I’ve always enjoyed how you could play through a game at low, low settings and still get a feel for it, while gamers with the top-notch equipment enjoy all the maximum graphics possible (“really bad console ports” not included, of course).

Can you share how you are working with the scenes with VXGI on and off? I think that’s a very noble approach to this whole matter. What’s the method for toggling “force no precomputed lighting” on and off at runtime? I think that’s a better approach than trying to use really low VXGI settings for low-to-mid GPUs.

So to sum up, at the end of the day the effort in making a game that has VXGI and non-VXGI all-in-one can pay off in terms of advertising and marketing opportunities. The game could (rightfully) be shown as, hey, this is how it looks on mainstream graphics cards (and it should still look good), but then, BOOM! with an Nvidia Pascal and better check out full dynamic GI with VXGI. Yes, it goes back to the controversial “The Way It’s Meant To Be Played”… but if you’re a game developer trying to deliver the best experience for gamers, as long as the experience for mainstream Nvidia and AMD is good and not purposefully crippled, why not? I don’t know the industry but I suspect incorporating UE4, VR and VXGI in any game in the next five years will get a lot of support (and let’s be honest, free marketing) from Epic, Nvidia and Oculus.

So all the best, you’re at the very cutting edge and potentially looking at good rewards for your efforts, Lord willing! (That said, there is a very dark side of people being lost in photorealistic VR and not being able to adapt back to the real world… but that is something I suppose an individual developer has to consider themselves).

PS. As for VR, Oculus has already specified (IIRC) the Nvidia 970 as the minimum… So for PC VR we’re already in high-end Maxwell territory. Obviously at this point VXGI is not suitable for Maxwell VR, but high-end Pascals should be capable, as I estimate above.

Unless they completely break Moore’s Law with Pascal, I doubt it’ll be more than the standard 25% gap between the 900 and 1000 series cards. A GTX 1080 Ti (or whatever they choose to call it; I think the numbering scheme is still unannounced, and they might reset it like they did after the 9000 series) will not be able to run 4K at 90 fps in VR with VXGI in anything but a test scene. Even the 980 Ti right now struggles to run 4K in a fully populated scene at 60 fps; it’ll be a few generations of cards before we reach that level. VXGI might be cheaper on Maxwell and later cards, but even at the lowest settings it certainly isn’t free, and a lot of VR games will likely continue relying on double rendering, like 3D Vision does, for quite some time, doubling the performance hit.

I do double the lighting work for every single scene. For VXGI, I get it looking good with that first; then I turn off VXGI and use as many point lights as I need to simulate the look as closely as I can, completely defeating the point of dynamic lighting. Nothing that I’m doing, and nothing I can do, will let me use baked lighting as a fallback. The only solution I see so far is to make a second set of lights with shadows disabled to manually simulate the light bounces, and at that point, if I’m going through so much effort, there’s little reason to continue using VXGI at all. By manually placing lights, you’re getting rid of the biggest advantage it offers: the ability to change the content of the scene in a major way without breaking the lighting setup.

It’s debatable whether it is or not without a lower settings option to scale down to. It’s a lot of extra work doing the lighting for every single room twice, and you always run the risk of having to make compromises in one lighting setup to better complement the other. It’s not like PhysX particles or WaveWorks, where turning it off just turns off a few visual effects; using realtime GI fundamentally changes how an artist will go about lighting levels. It can be a great change, too, letting level designers have more freedom with how the environments in the game react to the events going on in the scene. I see a lot of great things coming out of systems like VXGI in the future; it just needs to be a bit more adaptable to current hardware to be a viable option to develop a game around today.