NVIDIA GameWorks Integration

I reckon that, at the current polycount of today’s games, an Nvidia Pascal card by the end of 2016, say the “GTX 1080 Ti”, will be able to run 4K at 90fps in VR with VXGI. At the very least the Nvidia Pascal Titan Y, or whatever they might call it.

So I believe the very high-end Nvidia cards of 2016-2017 will handle 4K VXGI VR at high frame rates, particularly with improvements to UE4, VXGI and the drivers. Not to mention the tons of optimisation being done for VR right now, including Nvidia’s GameWorks VR efforts like “crushing” the edges of a scene (https://www.youtube.com/watch?v=1Dn2JKfje2o).

You rightly point out the issue of the next five years for mainstream PC gamers. I’m still learning UE4, so from a layperson PC-gamer perspective it is indeed great if developers can “fill in the gaps”.

What’s been brilliant about PC game developers is the scalability they’ve (almost) always implemented in games. I’ve always enjoyed how you could play through a game at low, low settings and still get a feel for it, while gamers with the top-notch equipment enjoy all the maximum graphics possible (“really bad console ports” not included, of course).

Can you share how you are working with scenes with VXGI on and off? I think that’s a very noble approach to this whole matter. What’s the method to toggle “force no precomputed lighting” on and off at runtime? I think that’s a better approach than trying to use really low VXGI settings for low-to-mid GPUs.
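
From what I’ve read (and this is an untested guess on my part, not something from the docs), the branch seems to expose console variables that could drive such a toggle at runtime, something like:

```cpp
// A minimal sketch of a runtime VXGI on/off switch, assuming the branch
// exposes cvars along the lines of r.VXGI.DiffuseTracingEnable and
// r.VXGI.SpecularTracingEnable (the names are my assumption).
#include "HAL/IConsoleManager.h"

static void SetVXGITracingEnabled(bool bEnabled)
{
    static const TCHAR* CVarNames[] = {
        TEXT("r.VXGI.DiffuseTracingEnable"),
        TEXT("r.VXGI.SpecularTracingEnable"),
    };
    for (const TCHAR* Name : CVarNames)
    {
        if (IConsoleVariable* Var = IConsoleManager::Get().FindConsoleVariable(Name))
        {
            Var->Set(bEnabled ? 1 : 0); // 0 falls back to non-VXGI lighting
        }
    }
}
```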

So to sum up, at the end of the day the effort of making a game that has VXGI and non-VXGI all-in-one can pay off in terms of advertising and marketing opportunities. The game could (rightfully) be shown as: hey, this is how it looks on mainstream graphics cards (and it should still look good), but then, BOOM! with an Nvidia Pascal, check out full dynamic GI with VXGI. Yes, it goes back to the controversial “The Way It’s Meant To Be Played”… but if you’re a game developer trying to deliver the best experience for gamers, as long as the experience on mainstream Nvidia and AMD cards is good and not purposefully crippled, why not? I don’t know the industry, but I suspect any game incorporating UE4, VR and VXGI in the next five years will get a lot of support (and, let’s be honest, free marketing) from Epic, Nvidia and Oculus.

So all the best, you’re at the very cutting edge and potentially looking at good rewards for your efforts, Lord willing! (That said, there is a very dark side to people getting lost in photorealistic VR and being unable to adapt back to the real world… but that is something I suppose each developer has to weigh up for themselves.)

PS. As for VR, Oculus has already specified (IIRC) the Nvidia GTX 970 as the minimum… so for PC VR we’re already in high-end Maxwell territory. Obviously at this point VXGI is not suitable for Maxwell VR, but high-end Pascals should be capable, as I estimate above.

Unless they completely break Moore’s Law with Pascal, I doubt it’ll be more than the standard ~25% gap between the 900 and 1000 series cards. A GTX 1080 Ti (or whatever they choose to call it; I think it’s still unannounced what numbering system they’re going to use, and they might reset it like they did after the 9000 series) will not be able to run 4K at 90 fps in VR with VXGI in anything but a test scene. Even the 980 Ti right now struggles to run 4K in a fully populated scene at 60 fps; it’ll be a few generations of cards before we reach that level. VXGI might be cheaper on Maxwell and later cards, but even at the lowest settings it certainly isn’t free, and a lot of VR games will likely continue relying on double rendering like 3D Vision does for quite some time, doubling the performance hit.

I do double the lighting work for every single scene. I get the scene looking good with VXGI first, then I turn VXGI off and use as many point lights as I need to simulate that look as closely as I can, completely defeating the point of dynamic lighting. Nothing I’m doing, and nothing I can do, will let me use baked lighting as a fallback. The only solution I see so far is to make a second set of lights with shadows disabled to manually simulate the light bounces, and at that point, if I’m going through that much effort, there’s little reason to keep using VXGI at all. By manually placing lights, you’re giving up the biggest advantage it offers: the ability to change the content of the scene in a major way without breaking the lighting setup.
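
To illustrate the kind of manual switch I mean, here’s a rough sketch (not my actual code; the “VXGIFallback” tag is made up for the example): the fake-bounce rig gets a tag, and one function shows or hides it depending on whether VXGI is active.

```cpp
// Rough sketch: toggle a tagged fallback light rig in one place.
// The fallback lights exist only to fake the bounce VXGI would provide.
#include "EngineUtils.h"
#include "Engine/Light.h"
#include "Components/LightComponent.h"

void SetFallbackLightsVisible(UWorld* World, bool bVisible)
{
    for (TActorIterator<ALight> It(World); It; ++It)
    {
        if (It->ActorHasTag(FName(TEXT("VXGIFallback"))))
        {
            // Show the fakes only when VXGI is unavailable or disabled.
            It->GetLightComponent()->SetVisibility(bVisible);
        }
    }
}
```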

Debatable whether it is or not without a lower settings option to scale down to. It’s a lot of extra work doing the lighting for every single room twice, and you always run the risk of having to make compromises in one lighting setup to better complement the other. It’s not like PhysX particles or WaveWorks, where turning the feature off just disables a few visual effects; using realtime GI fundamentally changes how an artist goes about lighting levels. It can be a great change too, letting level designers have more freedom in how the game’s environments react to the events going on in the scene. I see a lot of great things coming out of systems like VXGI in the future; it just needs to be a bit more adaptable to current hardware before it’s a viable option to develop a game around today.

Thanks for the insight into the on-the-ground reality… If it is difficult to “switch” between VXGI and Lightmass, then VXGI and similar tech for the mainstream is still a few years away, but no more than five years, I reckon.

One point about Nvidia Pascal is that it isn’t really following Moore’s Law anyway, because it brings a lot of architectural changes and “jumps” from 28nm to 16nm.

Fair enough on Maxwell, and Volta is only scheduled for 2018. But I do believe that by the end of 2017, the Nvidia Pascal “Titan Y” will be able to do 4K VXGI at 90FPS in real-world scenes. That’s two years of quite a lot of optimisation across UE4, VXGI, GameWorks (including GameWorks VR), etc. *Edit: Don’t forget DirectX 12; in two years’ time the DX12 speed optimisations will be quite significant.*

So, time will tell, I guess it’s good that everyone is putting in the R&D right now!


Edit: Here’s a nice (optimistic) post about what Pascal could be:
http://forums.anandtech.com/showthread.php?t=2436009
“GP100: 550 sq. mm. die, 13.75 billion transistors, 6144 CUDA cores… In terms of performance, we’ll be looking at no less than a full doubling of the Titan X’s power.”

I’m personally guessing a 30%-50% improvement of the Titan Y Pascal over the Titan X Maxwell. If Nvidia totally hits it out of the park, then 50%-100%, but that’s very optimistic. The 50%-100% improvements would more likely be in the “advertised” areas like deep learning, etc.

I’m thinking back to the statements the CEO of Nvidia made: http://www.pcworld.com/article/2898175/nvidias-next-gen-pascal-gpu-will-offer-10x-the-performance-of-titan-x-8-way-sli.html
Not sure what that means for performance in gaming… they said they aim to make Pascal ten times more powerful than Maxwell in compute tasks, but I can’t translate that into anything useful for me. Maybe because I’m not good at speculating?

Yeah, the 10x (1000%) increase in speed is for specific “deep learning” type workloads which I barely understand. The 30%-50% increase is my personal guess based on what I detailed above… most notably going from 28nm to 16nm, along with Nvidia’s overall architectural track record (post-Fermi). There’s also the raw-specs side: if you’re talking about the Pascal Titan Y having ~4000 CUDA cores and their new 3072-bit (that’s not a typo, AMD is shipping 4096-bit cards now) HBM2 memory implementation… certainly for gaming there should be big increases.

There’s also all the GPU compute stuff, which will impact developers and media producers. Encoding and rendering are being revolutionised by GPU compute. Here’s a render from Daz3D with Nvidia Iray on my GTX 660M (Kepler). For argument’s sake I set the time limit to 5 minutes at 1920x1080 (the only post-processing was the grey background and a brightness increase via curves):

[attached image: 5-minute Iray test render]

This is of course nothing compared to what people are getting out of Octane with, say, two Titan Xs, let alone a GPU cloud rendering cluster (the latter is considered “realtime”).

Besides hardware, as mentioned there’s DX12, GameWorks VR and so on, so yes, I am very optimistic about Nvidia Pascal hardware and the related software. Take Lightmass, for example. Sure, the current quality when you crank up the settings is phenomenal. But if you took Nvidia Iray or VXGI or OpenCL in general and built a Lightmass-type baking system on it, you could get much faster baking.

To sum up: anything less than a 30% increase of the Titan Y over the Titan X in gaming (UE4, 4K, VR, DX12, etc.) and mainstream GPU rendering (Iray, Octane) would be an Nvidia misstep, in my opinion.

Edit: On the AMD side the Dual-Fiji should be out soon and that should do 4K VR 90FPS etc. Will have to be liquid-cooled though, I think: http://wccftech.com/amd-dual-gpu-fiji-gemini-spotted-shipping-manifest-codenamed-radeon-r9-gemini/

If Lightmass is an option, you’ll always get higher quality out of baked lightmaps than realtime GI. I have high hopes for VXGI, but unless you push it up to 64 cones or higher it can’t match baked lighting, and it’s unreasonable to expect it to: VXGI renders once per frame, whereas Lightmass has hours to process a scene at the highest quality possible. Realtime GI only makes sense in a scene that couldn’t normally be done with prebaked lighting for some reason, such as placing down entire buildings at runtime, or large maps populated with hundreds of objects (like a forest).

Switching between them is impossible as well, since Lightmass uses static/stationary lights while VXGI only works with dynamic lights. The only real way to switch is to turn all the dynamic lights into static ones and rebake the entire scene, which is not an option at runtime.
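
(To be clear about why: mobility is an editor-time property. You could flip it in bulk with something like the editor-only sketch below, using standard UE4 calls, but you’d still have to run a full Lightmass build afterwards, which is exactly what you can’t do in a running game.)

```cpp
// Hypothetical editor-only helper: flip every movable light to static so
// Lightmass can bake them. A full lighting rebuild is still required
// afterwards, which is why this can never be a runtime switch.
#include "EngineUtils.h"
#include "Engine/Light.h"
#include "Components/LightComponent.h"

void MakeAllLightsStatic(UWorld* World)
{
    for (TActorIterator<ALight> It(World); It; ++It)
    {
        ULightComponent* Light = It->GetLightComponent();
        if (Light && Light->Mobility == EComponentMobility::Movable)
        {
            Light->SetMobility(EComponentMobility::Static);
        }
    }
}
```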

LPVs are a good lower-performance-hit alternative right now, but they don’t support spotlights or point lights, so they only really work in outdoor scenes.

If LPVs added support for the other kinds of lights, or VXGI offered a low-performance-impact option, realtime GI would suddenly become viable for full projects overnight. The only reason someone wouldn’t use it right now for a dynamically lit scene is that it alienates a lot of the PC gaming audience (and the entire console audience, if you’re targeting that).

Hi all, just trying to get familiar with these Nvidia specific branches of UE4, particularly for ArchViz work and making things that much more realistic.

Is there a way to combine the features of the various Nvidia branches of UE4 or are we expected to compile each branch as we need those features? Really curious!

Thanks!

A.

I feel like I’m beating a dead horse at this point, but one thing I would like to mention is that a lot of the Maxwell performance improvements are for individual features, not the system itself. There are certain features of VXGI that only run well on Maxwell but don’t improve quality much. That being said, if you’ve already lowered StackLevels to 3 (less isn’t viable), disabled StoreEmittanceInHdrFormat, lowered the MapSize, AND set the voxelization to use the lowest LOD on your meshes, and still haven’t reached acceptable performance, then you need a second solution. VXGI is intensive, but there are a lot of optimizations that can make it run faster, including some within your assets, such as disabling it on certain materials or even reworking your meshes a bit to assist the voxelization process. I guess what I’m saying is that taking full advantage of VXGI is like taking full advantage of the PS3: you can do it, but it’s a pain in the ***.
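
For reference, here’s roughly what forcing those low-end settings from code would look like. The cvar names are my reading of the branch, so verify them against your build:

```cpp
// Rough sketch of applying the low-end VXGI settings described above.
// CVar names are assumptions based on the branch and may differ per version.
#include "HAL/IConsoleManager.h"

static void ApplyLowEndVXGISettings()
{
    struct FCVarSetting { const TCHAR* Name; int32 Value; };
    const FCVarSetting Settings[] = {
        { TEXT("r.VXGI.StackLevels"), 3 },               // fewer clipmap stack levels
        { TEXT("r.VXGI.StoreEmittanceInHdrFormat"), 0 }, // cheaper emittance storage
        { TEXT("r.VXGI.MapSize"), 64 },                  // smaller voxel map
    };
    for (const FCVarSetting& Setting : Settings)
    {
        if (IConsoleVariable* Var = IConsoleManager::Get().FindConsoleVariable(Setting.Name))
        {
            Var->Set(Setting.Value);
        }
    }
}
```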

Hi,
at the moment I’m trying to create my first VXGI build after a long testing period, but unfortunately the build always fails. Are there known problems with GalaxyMan’s VXGI branch? I’m currently using Unreal Engine 4.8.

Best regards, Andreas

Check out: https://github.com/GalaxyMan2015/UnrealEngine/tree/4.9.1_NVIDIA_Techs. That’s all the GameWorks techs compiled into one.
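
Note that you need a GitHub account linked to your Epic account to see the repo at all. Once that’s done, grabbing and setting up the branch is roughly:

```
git clone -b 4.9.1_NVIDIA_Techs https://github.com/GalaxyMan2015/UnrealEngine.git
cd UnrealEngine
Setup.bat
GenerateProjectFiles.bat
```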

None that I’m aware of; I haven’t touched the 4.8 build in god knows how long. I would try the 4.9.1 build if I were you. I use that same one and I’m unaware of any issues.

Thank you very much for your answer. I will try the 4.9.1 build. :slight_smile:

That is exactly what I was looking for.

Thanks so much!

GalaxyMan, you are awesome!!! Is there a way to donate to you?

Thanks and yes. Check this post for details: https://forums.unrealengine.com/showthread.php?53735-NVIDIA-GameWorks-Integration&p=391404&viewfull=1#post391404

I have tested the 4.9.1 build. Unfortunately I am not able to package my VXGI projects. I always get this error:

[attached image: screenshot of the packaging error]

At the moment I don’t know what I can do to solve this problem. I have no experience with the UnrealBuildTool…

Is that with NVIDIA’s VXGI branch? I have tested packaging with my branch (the all-merged GameWorks branch) and it works fine. I did have to make a couple of tweaks, which I will commit soon, but none for VXGI as far as I can remember. I’ve been working on a BP-only project all week using the engine, packaging each night just to make sure. No issues so far.

Strange… I tested the 4.9.1 version with two of my VXGI projects and I always get this error. I will try compiling the build again; perhaps that will help…

Epic, dude! I’m working on a project and I wanted to do something just like this. Mind giving me a few pointers on how you got that to work? (I’m relatively new to game dev, and even more of a noob at UE4.)

I generated a completely new 4.9.1 build, but I get the same error as last time: UE4 is missing the UE4Game binary. What could cause this problem? Is there anyone else who has the same problems packaging projects?

Here is the output log:

MainFrameActions: Packaging (Windows (64-bit)): Project.Cook: ********** COOK COMMAND COMPLETED **********
MainFrameActions: Packaging (Windows (64-bit)): Project.CopyBuildToStagingDirectory: ********** STAGE COMMAND STARTED **********
MainFrameActions: Packaging (Windows (64-bit)): BuildCommand.Execute: ERROR: BUILD FAILED
MainFrameActions: Packaging (Windows (64-bit)): Program.Main: ERROR: AutomationTool terminated with exception:
MainFrameActions: Packaging (Windows (64-bit)): Program.Main: ERROR: Exception in AutomationScripts.Automation: Stage Failed. Missing receipt ‘UE4Game-Win64-Shipping.target.xml’. Check that this target has been built.
MainFrameActions: Packaging (Windows (64-bit)): Stacktrace: bei Project.CreateDeploymentContext(ProjectParams Params, Boolean InDedicatedServer, Boolean DoCleanStage) in e:\Unreal_4_9_1VXGI\Engine\Source\Programs\AutomationTool\Scripts\CopyBuildToStagingDirectory.Automation.cs:Zeile 1572.
MainFrameActions: Packaging (Windows (64-bit)): bei Project.CopyBuildToStagingDirectory(ProjectParams Params) in e:\Unreal_4_9_1VXGI\Engine\Source\Programs\AutomationTool\Scripts\CopyBuildToStagingDirectory.Automation.cs:Zeile 1628.
MainFrameActions: Packaging (Windows (64-bit)): bei BuildCookRun.DoBuildCookRun(ProjectParams Params) in e:\Unreal_4_9_1VXGI\Engine\Source\Programs\AutomationTool\Scripts\BuildCookRun.Automation.cs:Zeile 211.
MainFrameActions: Packaging (Windows (64-bit)): bei BuildCommand.Execute()
MainFrameActions: Packaging (Windows (64-bit)): bei AutomationTool.Automation.Execute(List`1 CommandsToExecute, CaselessDictionary`1 Commands)
MainFrameActions: Packaging (Windows (64-bit)): bei AutomationTool.Automation.Process(String[] CommandLine)
MainFrameActions: Packaging (Windows (64-bit)): bei AutomationTool.Program.MainProc(Object Param) in e:\Unreal_4_9_1VXGI\Engine\Source\Programs\AutomationTool\Program.cs:Zeile 134.
MainFrameActions: Packaging (Windows (64-bit)): bei AutomationTool.InternalUtils.RunSingleInstance(Action`1 Main, Object Param)
MainFrameActions: Packaging (Windows (64-bit)): bei AutomationTool.Program.Main() in e:\Unreal_4_9_1VXGI\Engine\Source\Programs\AutomationTool\Program.cs:Zeile 53.
MainFrameActions: Packaging (Windows (64-bit)): ProcessManager.KillAll: Trying to kill 0 spawned processes.
MainFrameActions: Packaging (Windows (64-bit)): Program.Main: AutomationTool exiting with ExitCode=Error_MissingExecutable
MainFrameActions: Packaging (Windows (64-bit)): Domain_ProcessExit
MainFrameActions: Packaging (Windows (64-bit)): AutomationToolLauncher exiting with ExitCode=103
MainFrameActions: Packaging (Windows (64-bit)): copying UAT log files…
MainFrameActions: Packaging (Windows (64-bit)): RunUAT.bat ERROR: AutomationTool was unable to run successfully.
MainFrameActions: Packaging (Windows (64-bit)): BUILD FAILED
PackagingResults:Error: Error: Missing UE4Game binary. You may have to build the UE4 project with your IDE. Alternatively, build using UnrealBuildTool with the commandline: UE4Game <Platform> <Configuration>
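
If I read that last line correctly, I may have to build the UE4Game target manually before packaging, something like this from the engine root (just my guess at the exact invocation):

```
Engine\Binaries\DotNET\UnrealBuildTool.exe UE4Game Win64 Shipping
```

Can anyone confirm that this is the right approach?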

@GalaxyMan2015
Question, my friend:
I saw the build errors Maxwell had and your suggestion to use your branch. I’m having the same issues, but when I go to fork it, GitHub doesn’t seem to do anything (it just shows me the NVIDIA branch). Suggestions?
P.S. Git confuses me, so you might have to take it ‘slow’ with me XP