NVIDIA GameWorks Integration
-
NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
Feel free to Donate if you wish to support me
-
Destructibles and Flex interaction
Is there currently any way to have APEX destructibles interact with Flex particles? Destructible objects only seem to collide with Flex objects before they've been fractured. NVIDIA released a demo a while back with water in a glass tank; the tank is then shattered in various places, releasing the water. Any details on how that was achieved would be greatly appreciated. Thank you in advance.
-
Originally posted by GalaxyMan2015:
It sounds like you're trying to follow the directions for using VXGI in a standalone application; that will not work for UE4. You need to obtain the UE4 VXGI branch from NVIDIA: https://github.com/NvPhysX/UnrealEngine
After that, there are two options, either "Download ZIP" or "Clone in Desktop", right?
This time I downloaded the ZIP file; could that be a problem?
GalaxyMan2015, could you describe step by step how you get the engine + VXGI?
Beyond that, I do not understand what to do after unpacking the archive. Do I need to download anything else (perhaps additional NVIDIA components)? Do I need to make any changes in Visual Studio (create project links, change the code, etc.)?
PS: Thank you in advance for your help!
-
Originally posted by Lord-Kvento:
As I understand it, at the top of the left column, under "Branch", I should choose "VXGI" instead of "release", right? [...]
1. Download the ZIP of the VXGI branch from https://github.com/NvPhysX/UnrealEngine (the "Download ZIP" option is fine)
2. Extract it somewhere on your computer
3. Run Setup.bat in the root path
4. Once the above has completed (it needs to download about 3.4 GB of prerequisites), run GenerateProjectFiles.bat
5. Open UE4.sln and compile the UE4 solution.
6. Once compilation has completed, recompile ShaderCompilerWorker (under Programs in the Solution Explorer).
7. Launch UE4Editor.exe (or Start Debugging)
8. Wait for the launch procedure to complete; it will sit at 45% for a while, since it's compiling shaders.
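(On the ZIP question: as far as I know, downloading the ZIP is fine for building; the only real difference from cloning is that you won't be able to pull future updates with git. If you prefer to clone, something like `git clone -b VXGI https://github.com/NvPhysX/UnrealEngine.git` checks out the VXGI branch directly; note that, if I recall correctly, the NvPhysX fork, like the main UE4 repository, is only visible to GitHub accounts linked to an Epic Games account.)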
NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
Feel free to Donate if you wish to support me
-
GalaxyMan2015, thank you for the help! With your instructions, the build process is much clearer to me now!
Last edited by Lord-Kvento; 10-12-2015, 06:06 AM.
-
Originally posted by Kalman11:
Ah thanks, I suppose it's r.VXGI.DiffuseTracingEnable and r.VXGI.SpecularTracingEnable
Originally posted by MonsOlympus:
I thought the main reason Epic removed SVOGI was the fact it didn't scale well! I really want to see VXGI take off, but maybe it's just a bit before its time.
Originally posted by Daniel.Wenograd:
So I've been working on a small project with VXGI for a while now, and now that I've gotten used to it, it's actually a really nice system. One concern though is that lower-end spec machines don't even have a hope of running the game...
There's probably not going to be any tweaks for VXGI that will really work for pre-Maxwell cards, unless Nvidia or AMD put a lot of effort into it.
Unless they screw it up, Nvidia Pascal will blow away anything we've seen with Maxwell; we're talking the GTX 1080 Ti (or whatever they call it) doing 4K high-quality VXGI at 90fps+ and so on.
As for Nvidia, I don't know how they've done it, but GPU rendering for non-realtime work is really blowing up right now and sidelining the CPU.
Cryptocurrency GPU mining is on the wane (AFAIK) because of dedicated ASICs, and cryptocurrency mining had been a significant factor keeping AMD sales ticking along.
Nonetheless, GPU compute, which had a rocky start, is taking off now.
So I admit I'm a long-time Nvidia fanboy... but the GPU world for PC + VR, more or less dominated by Nvidia at the high end, is going to change very significantly over 2015-2020+.
So the bottom line is that Intel Kaby Lake + Nvidia Pascal + VR will drive the high-end 3D and gaming world with 4K and full GI like VXGI, BUT it will take several years to filter "down" to mainstream PC gaming and then to consoles and mobile.
Last edited by srmojuze; 10-12-2015, 09:42 AM.
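Going back to the console variables Kalman11 mentioned above: for anyone who wants a settings-menu toggle rather than typing them into the console, here is a minimal C++ sketch. It assumes the merged branch registers r.VXGI.DiffuseTracingEnable and r.VXGI.SpecularTracingEnable as ordinary console variables; the helper function name is just for illustration.

```cpp
// Hypothetical helper: flip VXGI cone tracing on or off at runtime,
// e.g. from a graphics-settings menu. The r.VXGI.* console variables
// only exist in the GameWorks/VXGI engine branch.
#include "HAL/IConsoleManager.h"

static void SetVXGITracingEnabled(bool bEnable)
{
	static const TCHAR* VXGICVars[] =
	{
		TEXT("r.VXGI.DiffuseTracingEnable"),
		TEXT("r.VXGI.SpecularTracingEnable")
	};

	for (const TCHAR* Name : VXGICVars)
	{
		// Same effect as typing "r.VXGI.DiffuseTracingEnable 0" into the console.
		if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
		{
			CVar->Set(bEnable ? 1 : 0);
		}
		// If a variable is missing, we are not running the VXGI branch; just skip it.
	}
}
```

Going through the CVars rather than hard-coding anything means the same build still runs on a stock engine, where those variables simply don't exist.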
-
Originally posted by Nigkdo:
I tried the HairWorks test map with GalaxyMan's latest build.
Why do you ask?
-
Originally posted by srmojuze:
I believe VXGI is a Maxwell-onwards thing. Maxwell has lots of "stuff" (as best I can describe it) that accelerates VXGI quite a lot. Nvidia Pascal is around the corner. I'm sure AMD will look at how to optimise for things like VXGI though AMD is clearly behind Nvidia in terms of power consumption at least.
There's probably not going to be any tweaks for VXGI that will really work for pre-Maxwell cards, unless Nvidia or AMD put a lot of effort into it.
Unless they screw it up, Nvidia Pascal will blow away anything we've seen with Maxwell, we're talking the GTX 1080 Ti (or what ever they call it) doing 4K high-quality VXGI at 90fps+ and so on.
So bottom line is Intel Kaby Lake + Nvidia Pascal + VR will drive the high-end 3D and gaming world with 4K and full GI like VXGI, BUT it will take several years to filter "down" to mainstream PC gaming and then consoles and mobile.
Even a change like lowering the minimum map size from 32 to maybe 16, or even lower, probably wouldn't need a significant rewrite of the system, while still lowering the performance hit it has. The Sci-fi Hallway example included with the build is a great example of why disabling VXGI in a scene built for it isn't going to work out too well.
I'm all for future-proofing the visuals in games; that's part of why I'm trying to make this tech work in the first place. But it is very difficult to convince non-artists on the team that the system is a good idea when their own computers can't comfortably play the game they're working on. Not to mention, as you said, VR, which combined with VXGI will ensure nothing until 2020 can even consider running at the 90 fps that something like that requires (unless Nvidia has a trick up their sleeve to avoid the performance hit when using the tech in VR). I do want this tech to work; I've been messing with the settings for over a week now, trying to get it running well enough on lower-end systems, since I have a few scenes that can use it well. It's just a challenge to get it into a state where it runs well. For now at least, my solution is to fill the scene with fill lights that only turn on with VXGI disabled and try to make it look as close as possible.
Last edited by Zero-Night; 10-12-2015, 09:56 PM.
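For illustration, here is a minimal sketch of that fill-light fallback, assuming the branch exposes r.VXGI.DiffuseTracingEnable as an ordinary console variable; the function name and the "VXGIFallbackLight" actor tag are hypothetical, just for the example.

```cpp
// Hypothetical fallback: call this once (e.g. from an actor's BeginPlay) to
// switch tagged fill lights on only when VXGI diffuse tracing is disabled,
// so non-VXGI machines still get some approximation of the bounce lighting.
#include "EngineUtils.h"               // TActorIterator
#include "Engine/Light.h"              // ALight
#include "Components/LightComponent.h" // ULightComponent
#include "HAL/IConsoleManager.h"       // IConsoleManager / IConsoleVariable

static void ApplyVXGIFallbackLights(UWorld* World)
{
	// The CVar only exists on the GameWorks/VXGI branch; treat "missing" as off.
	IConsoleVariable* VXGIDiffuse =
		IConsoleManager::Get().FindConsoleVariable(TEXT("r.VXGI.DiffuseTracingEnable"));
	const bool bVXGIEnabled = VXGIDiffuse && VXGIDiffuse->GetInt() != 0;

	for (TActorIterator<ALight> It(World); It; ++It)
	{
		// Only touch lights the artist explicitly tagged as VXGI stand-ins.
		if (It->ActorHasTag(TEXT("VXGIFallbackLight")))
		{
			It->GetLightComponent()->SetVisibility(!bVXGIEnabled);
		}
	}
}
```

The same check could drive other per-machine choices too, for example only raising the VXGI map size when the variable is actually present.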
-
Originally posted by Daniel.Wenograd:
Maxwell and Pascal run the tech great, and yes, that's what it's built for, but there must be some way to offer a cut-down, less accurate version for slower hardware...
So I believe the very high-end Nvidia cards of 2016-2017 will handle 4K VXGI VR at high frame rates, particularly with improvements to UE4, VXGI and the drivers. Not to mention the tons of optimisation being done for VR right now, including Nvidia's GameWorks VR efforts like "crushing" the edges of a scene (https://www.youtube.com/watch?v=1Dn2JKfje2o).
You rightly point out the issue of the next five years for mainstream PC gamers. I'm still learning UE4, so from a layperson PC gamer's perspective it is indeed great if developers can "fill in the gaps".
What's been brilliant about PC game developers is the scalability they've (almost) always implemented in games. I've always enjoyed how you could play through a game at low, low settings and still get a feel for it, while gamers with the top-notch equipment enjoy all the maximum graphics possible ("really bad console ports" not included, of course).
Can you share how you are working with the scenes with VXGI on and off? I think that's a very noble approach to this whole matter. What's the method to toggle "force no precomputed lighting" on and off at runtime? I think that's a better approach than trying to use really low VXGI settings for low-mid GPUs.
So to sum up, at the end of the day the effort of making a game that has VXGI and non-VXGI all-in-one can pay off in terms of advertising and marketing opportunities. The game could (rightfully) be shown as, hey, this is how it looks on mainstream graphics cards (and it should still look good), but then, BOOM! With an Nvidia Pascal or better, check out full dynamic GI with VXGI. Yes, it goes back to the controversial "The Way It's Meant To Be Played"... but if you're a game developer trying to deliver the best experience for gamers, as long as the experience for mainstream Nvidia and AMD is good and not purposefully crippled, why not? I don't know the industry, but I suspect incorporating UE4, VR and VXGI in any game in the next five years will get a lot of support (and, let's be honest, free marketing) from Epic, Nvidia and Oculus.
So all the best, you're at the very cutting edge and potentially looking at good rewards for your efforts, Lord willing! (That said, there is a very dark side of people being lost in photorealistic VR and not being able to adapt back to the real world... but that is something I suppose an individual developer has to consider themselves).
PS: As for VR, Oculus has already specified (IIRC) the Nvidia GTX 970 as the ~minimum~... so for PC VR we're already in high-end Maxwell territory. Obviously at this point VXGI is not suitable for Maxwell VR, but high-end Pascals should be capable, as I estimate above.
Last edited by srmojuze; 10-13-2015, 12:14 AM.
-
Originally posted by srmojuze:
I reckon that at the current polycount of today's games, an Nvidia Pascal by the ~end of 2016~, say the GTX 1080 Ti, ~will~ be able to run 4K at 90fps with VR with VXGI. At least the Nvidia Pascal Titan Y, or whatever they might call it.
So the very high-end Nvidia cards of 2016-2017 I believe will handle 4K VXGI VR at high frame rates, particularly with improvements to UE4, VXGI and the drivers. Not to mention tons of optimisations being done for VR right now including Nvidia's GameWorks VR efforts like "crushing" the edges of a scene (https://www.youtube.com/watch?v=1Dn2JKfje2o).
Originally posted by srmojuze:
Can you share how you are working with the scenes with VXGI on and off? I think that's a very noble approach to this whole matter. What's the method to toggle "force no precomputed lighting" on and off at runtime? I think that's a better approach than trying to use really low VXGI settings for low-mid GPUs.
Originally posted by srmojuze:
So to sum up, at the end of the day the effort in making a game that has VXGI and non-VXGI all-in-one can pay off in terms of advertising and marketing opportunities.
-
Originally posted by Daniel.Wenograd:
Unless they completely break Moore's Law with Pascal, I doubt it'll be more than the standard 25% gap between the 900 and 1000 series cards...
One point with Nvidia Pascal is that it's not really following Moore's law anyway because they have a lot of architectural changes and are "jumping" from 28nm to 16nm.
Fair enough with Maxwell, and Volta is only scheduled for 2018. But I do believe that by the end of 2017 the Nvidia Pascal "Titan Y" will be able to do 4K VXGI at 90 FPS in real-world scenes. That's two years of quite a lot of optimisation of UE4, VXGI, GameWorks (including GameWorks VR), etc. Edit: Don't forget DirectX 12; in two years' time the DX12 speed optimisations will be quite significant.
So time will tell; I guess it's good that everyone is putting in the R&D right now!
__________________________________________________
Edit: Here's a nice (optimistic) post about what Pascal could be:
http://forums.anandtech.com/showthread.php?t=2436009
"GP100: 550 sq. mm. die, 13.75 billion transistors, 6144 CUDA cores... In terms of performance, we'll be looking at no less than a full doubling of the Titan X's power."
I'm personally guessing a 30-50% improvement of a Titan Y Pascal over the Titan X Maxwell. If Nvidia totally hits it out of the park, then 50-100%, but that's very optimistic, and the 50-100% improvements would be more in the "advertised" areas like deep learning, etc.
Last edited by srmojuze; 10-13-2015, 02:47 AM.