NVIDIA GameWorks Integration


    Originally posted by Lord-Kvento View Post
    Please explain where the GFSDK_VXGI_x64.dll or GFSDK_VXGI_x86.dll file should be copied.

    And what does "Add a reference to GFSDK_VXGI_x64.lib or GFSDK_VXGI_x86.lib" mean?
    It sounds like you're trying to follow the directions for using VXGI in a standalone application; that will not work for UE4. You need to obtain the UE4 VXGI branch from NVIDIA: https://github.com/NvPhysX/UnrealEngine
    NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
    Feel free to Donate if you wish to support me

    Comment


      Destructibles and Flex interaction

      Is there any way currently to have APEX destructibles interact with Flex particles? Destructible objects only seem to collide with Flex objects before they've been fractured. There was a demo NVIDIA released a while back with water in a glass tank; the tank was then shattered in various locations, releasing the water. Any details on how that was achieved would be greatly appreciated. Thank you in advance.

      Comment


        Originally posted by GalaxyMan2015 View Post
        It sounds like you're trying to follow the directions for using VXGI in a standalone application; that will not work for UE4. You need to obtain the UE4 VXGI branch from NVIDIA: https://github.com/NvPhysX/UnrealEngine
        As I understand it, in the "Branch" dropdown at the top of the left column I should choose "VXGI" instead of "release", right?
        After that, there are two options, either "Download ZIP" or "Clone in Desktop", right?
        This time I downloaded the ZIP file; could that be a problem?

        GalaxyMan2015, could you describe step by step how you get the engine + VXGI?

        In particular, I do not understand what to do after unpacking the archive. Perhaps something else needs to be downloaded (maybe an additional component from NVIDIA)? Do I need to make changes in Visual Studio (add references, change the code, etc.)?

        PS: Thank you in advance for your help!

        Comment


          Originally posted by Lord-Kvento View Post
          As I understand it, in the "Branch" dropdown at the top of the left column I should choose "VXGI" instead of "release", right?
          After that, there are two options, either "Download ZIP" or "Clone in Desktop", right?
          This time I downloaded the ZIP file; could that be a problem?

          GalaxyMan2015, could you describe step by step how you get the engine + VXGI?

          In particular, I do not understand what to do after unpacking the archive. Perhaps something else needs to be downloaded (maybe an additional component from NVIDIA)? Do I need to make changes in Visual Studio (add references, change the code, etc.)?

          PS: Thank you in advance for your help!
          1. Download: https://github.com/NvPhysX/UnrealEng...e/VXGI-4.9.zip

          2. Extract to your computer somewhere

          3. Run Setup.bat in the root path

          4. Once the above has completed (it needs to download about 3.4 GB of prerequisites), run GenerateProjectFiles.bat

          5. Open UE4.sln and compile the UE4 solution.

          6. Once compilation has completed, recompile ShaderCompilerWorker (under Programs in the Solution Explorer)

          7. Launch UE4Editor.exe (or Start Debugging)

          8. Wait for the launch procedure to complete; it will appear stuck at 45% for a while, since it's compiling shaders. (A rough command-prompt summary of these steps is sketched below.)
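
          For anyone who prefers a condensed view, here is roughly what the above looks like from a Windows command prompt. The extraction folder name is only an example (use wherever you unzipped the branch), and building the solution is still done inside Visual Studio:

              rem Example folder only: change to wherever you extracted the VXGI-4.9 ZIP.
              cd C:\UE4-VXGI-4.9

              rem Downloads roughly 3.4 GB of prerequisite binaries.
              call Setup.bat

              rem Generates UE4.sln for Visual Studio.
              call GenerateProjectFiles.bat

              rem Open UE4.sln, build the UE4 solution (Development Editor, Win64),
              rem then rebuild ShaderCompilerWorker under Programs, and launch the editor:
              start Engine\Binaries\Win64\UE4Editor.exe
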
          NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
          Feel free to Donate if you wish to support me

          Comment


            GalaxyMan2015, thank you for the help! With your instructions, the build process became much clearer to me!
            Last edited by Lord-Kvento; 10-12-2015, 06:06 AM.

            Comment


              Originally posted by Kalman11 View Post
              Ah thanks, I suppose it's r.VXGI.DiffuseTracingEnable and r.VXGI.SpecularTracingEnable
              I believe so... let us know if it doesn't work.
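
              For anyone else hunting for these later, a minimal sketch of toggling them from the in-game console, assuming they take the usual 0/1 values (I haven't checked the defaults):

                  r.VXGI.DiffuseTracingEnable 0
                  r.VXGI.SpecularTracingEnable 0

              and to turn tracing back on:

                  r.VXGI.DiffuseTracingEnable 1
                  r.VXGI.SpecularTracingEnable 1

              The same variables should also be settable at startup from the [SystemSettings] section of DefaultEngine.ini (written as r.VXGI.DiffuseTracingEnable=0).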

              Originally posted by MonsOlympus View Post
              I thought the main reason Epic removed VGI was the fact it didn't scale well! I really want to see VXGI take off, but maybe it's just a bit before its time.
              Originally posted by Daniel.Wenograd View Post
              So I've been working on a small project with VXGI for a while now, and now that I've gotten used to it, it's actually a really nice system. One concern, though, is that lower-end spec machines don't even have a hope of running the game...
              I believe VXGI is a Maxwell-onwards thing. Maxwell has lots of "stuff" (as best I can describe it) that accelerates VXGI quite a lot. Nvidia Pascal is around the corner. I'm sure AMD will look at how to optimise for things like VXGI though AMD is clearly behind Nvidia in terms of power consumption at least.

              There's probably not going to be any tweaks for VXGI that will really work for pre-Maxwell cards, unless Nvidia or AMD put a lot of effort into it.

              Unless they screw it up, Nvidia Pascal will blow away anything we've seen with Maxwell; we're talking the GTX 1080 Ti (or whatever they call it) doing 4K high-quality VXGI at 90fps+ and so on.

              With Nvidia, I don't know how they've done it, but GPU rendering for non-realtime work is really blowing up right now and sidelining the CPU.

              Cryptocurrency GPU mining is on the wane (AFAIK) because of dedicated ASICs, and cryptocurrency had significantly kept AMD purchases ticking along.

              Nonetheless, GPU compute, which had a rocky start, is taking off now.

              So I admit I'm a long-time Nvidia fanboy... but the GPU world, more or less dominated by Nvidia at the high end for PC + VR, is going to change very significantly over 2015-2020+.

              So the bottom line is that Intel Kaby Lake + Nvidia Pascal + VR will drive the high-end 3D and gaming world with 4K and full GI like VXGI, BUT it will take several years to filter "down" to mainstream PC gaming and then consoles and mobile.
              Last edited by srmojuze; 10-12-2015, 09:42 AM.

              Comment


                Hey guys, has anybody tried the HairWorks integration in the 4.9 version?
                Alone: The Untold - a story driven horror game

                Comment


                  Originally posted by BiggestSmile View Post
                  Hey guys, has anybody tried the HairWorks integration in the 4.9 version?
                  I tried the HairWorks test map with GalaxyMan's latest build. Why do you ask?

                  Comment


                    Originally posted by Nigkdo View Post
                    I tried the HairWorks test map with GalaxyMan's latest build. Why do you ask?
                    Oh, I totally missed GalaxyMan's repo. I just wondered if it could be merged successfully with the 4.9 branch before I actually tried it. I see now, thank you.
                    Alone: The Untold - a story driven horror game

                    Comment


                      UE4 + VXGI worked!!!
                      I used the instructions from GalaxyMan2015 and it worked.
                      Probably the problem was on my end: I downloaded version 4.9 and it worked (before that, I had tried to build version 4.8).

                      Alexey.Panteleev and GalaxyMan2015, thank you for your help!

                      Comment


                        https://developer.nvidia.com/gamewor...airworks-1-1-1
                        NVIDIA HairWorks 1.1.1

                        Comment


                          Originally posted by srmojuze View Post

                          I believe VXGI is a Maxwell-onwards thing. Maxwell has lots of "stuff" (as best I can describe it) that accelerates VXGI quite a lot. Nvidia Pascal is around the corner. I'm sure AMD will look at how to optimise for things like VXGI though AMD is clearly behind Nvidia in terms of power consumption at least.

                          There's probably not going to be any tweaks for VXGI that will really work for pre-Maxwell cards, unless Nvidia or AMD put a lot of effort into it.

                          Unless they screw it up, Nvidia Pascal will blow away anything we've seen with Maxwell; we're talking the GTX 1080 Ti (or whatever they call it) doing 4K high-quality VXGI at 90fps+ and so on.

                          So the bottom line is that Intel Kaby Lake + Nvidia Pascal + VR will drive the high-end 3D and gaming world with 4K and full GI like VXGI, BUT it will take several years to filter "down" to mainstream PC gaming and then consoles and mobile.
                          Maxwell and Pascal run the tech great, and yes that's what it's built for, but there must be some way to offer a cut down, less accurate version for slower hardware. Lionhead's LPVs were very performance-heavy at first as well; now they're more or less free in comparison, without any change in the hardware running them. Even if the lower-spec settings for VXGI are as problematic as LPVs are when it comes to problems like light leaking, the option to do something like that at least opens up the VXGI workflow a bit more. Right now, if you make a scene focused around VXGI, even if it runs great on high-end machines, without doubling the workload the scene flat out does not run well enough on even something like the GTX 680 to call it playable. Setting a GTX 780 as the game's minimum spec in 2015/2016 is an unreasonable ask of consumers right now.

                          Even a change like lowering the minimum map size from 32 to maybe 16 or even lower probably wouldn't need a significant rewrite of the system, while still lowering the performance hit. The Sci-fi Hallway example included with the build is a great example of why disabling VXGI in a scene built for it isn't going to work out too well.

                          I'm all for future-proofing the visuals in games; that's part of why I'm trying to make this tech work in the first place. But it is very difficult to convince non-artists on the team that the system is a good idea when their own computers can't comfortably play the game they're working on. Not to mention, as you said, VR, which when combined with VXGI will ensure that nothing until 2020 can even consider running at the 90 fps that something like that requires (unless Nvidia has a trick up their sleeve to avoid a performance hit when using the tech in VR). I do want this tech to work; I've been messing with the settings for over a week to try to get it going well enough on lower-end systems, now that I have a few scenes that can use it well. It's just a challenge to get it into a state where it runs well. For now, at least, my solution is to fill the scene with fill lights that only turn on with VXGI disabled and try to make it look as close as I can.
                          Last edited by Zero-Night; 10-12-2015, 09:56 PM.

                          Comment


                            Originally posted by Daniel.Wenograd View Post
                            Maxwell and Pascal run the tech great, and yes that's what it's built for, but there must be some way to offer a cut down, less accurate version for slower hardware...
                            I reckon that, at the current polycount of today's games, an Nvidia Pascal by the ~end of 2016~, say the GTX 1080 Ti, ~will~ be able to run 4K at 90fps in VR with VXGI. At least the Nvidia Pascal Titan Y, or whatever they might call it, will.

                            So the very high-end Nvidia cards of 2016-2017 I believe will handle 4K VXGI VR at high frame rates, particularly with improvements to UE4, VXGI and the drivers. Not to mention tons of optimisations being done for VR right now including Nvidia's GameWorks VR efforts like "crushing" the edges of a scene (https://www.youtube.com/watch?v=1Dn2JKfje2o).

                            You rightly point out the issue of the next five years for mainstream PC gamers. I'm still learning UE4 so from a layperson PC gamer perspective indeed it's great if developers can "fill in the gaps".

                            What's been brilliant about PC game developers is the scalability they've (almost) always implemented in games. I've always enjoyed how you could play through a game at low, low settings and still get a feel for it, while gamers with the top-notch equipment enjoy all the maximum graphics possible ("really bad console ports" not included, of course).

                            Can you share how you are working with the scenes with VXGI on and off? I think that's a very noble approach to this whole matter. What's the method to toggle "force no precomputed lighting" on and off at runtime? I think that's a better approach than trying to use really low VXGI settings for low-mid GPUs.

                            So to sum up, at the end of the day the effort in making a game that has VXGI and non-VXGI all-in-one can pay off in terms of advertising and marketing opportunities. The game could (rightfully) be shown as, hey, this is how it looks on mainstream graphics cards (and it should still look good), but then, BOOM! with an Nvidia Pascal and better check out full dynamic GI with VXGI. Yes, it goes back to the controversial "The Way It's Meant To Be Played"... but if you're a game developer trying to deliver the best experience for gamers, as long as the experience for mainstream Nvidia and AMD is good and not purposefully crippled, why not? I don't know the industry but I suspect incorporating UE4, VR and VXGI in any game in the next five years will get a lot of support (and let's be honest, free marketing) from Epic, Nvidia and Oculus.

                            So all the best, you're at the very cutting edge and potentially looking at good rewards for your efforts, Lord willing! (That said, there is a very dark side of people being lost in photorealistic VR and not being able to adapt back to the real world... but that is something I suppose an individual developer has to consider themselves).

                            PS. As for VR, Oculus has already specified (IIRC) the Nvidia 970 as the ~minimum~... So for PC VR we're already in high-end Maxwell territory. Obviously at this point VXGI is not suitable for Maxwell VR, but high-end Pascals should be capable, as I estimate above.
                            Last edited by srmojuze; 10-13-2015, 12:14 AM.

                            Comment


                              Originally posted by srmojuze View Post
                              I reckon that, at the current polycount of today's games, an Nvidia Pascal by the ~end of 2016~, say the GTX 1080 Ti, ~will~ be able to run 4K at 90fps in VR with VXGI. At least the Nvidia Pascal Titan Y, or whatever they might call it, will.

                              So the very high-end Nvidia cards of 2016-2017 I believe will handle 4K VXGI VR at high frame rates, particularly with improvements to UE4, VXGI and the drivers. Not to mention tons of optimisations being done for VR right now including Nvidia's GameWorks VR efforts like "crushing" the edges of a scene (https://www.youtube.com/watch?v=1Dn2JKfje2o).
                              Unless they completely break Moore's Law with Pascal, I doubt it'll be more than the standard 25% gap between the 900 and 1,000 series cards. A GTX 1080 Ti (or whatever they choose to call it; I think it's still unannounced what numbering scheme they'll use, and they might reset it like they did after the 9000 series) will not be able to run 4K at 90 fps in VR with VXGI in anything but a test scene. Even the 980 Ti right now struggles to run 4K in a fully populated scene at 60 fps; it'll be a few generations of cards before we reach that level. VXGI might be cheaper on Maxwell and later cards, but even at the lowest settings it certainly isn't free, and a lot of games in VR will likely continue relying on double rendering like 3D Vision does for quite some time, doubling the performance hit.


                              Originally posted by srmojuze View Post
                              Can you share how you are working with the scenes with VXGI on and off? I think that's a very noble approach to this whole matter. What's the method to toggle "force no precomputed lighting" on and off at runtime? I think that's a better approach than trying to use really low VXGI settings for low-mid GPUs.
                              I do double the lighting work for every single scene. I get it looking good with VXGI first, then I turn off VXGI and use as many point lights as I need to simulate the look as closely as I can, completely defeating the point of dynamic lighting. Nothing that I'm doing, and nothing I can do, will let me use baked lighting as a fallback. The only solution I see so far is to make a second set of lights with shadows disabled to manually simulate the light bounces, and at that point, if I'm going through so much effort, there's little reason to continue using VXGI at all. By manually placing lights, you're getting rid of the biggest advantage it offers: the ability to change the content in the scene in a major way without breaking the lighting setup.
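
                              In case it helps, here is a rough UE4 C++ sketch of that "second set of lights" idea: fill lights tagged by hand and shown only when VXGI tracing is off. The function name and the "VXGIFallbackFill" tag are made up for illustration; the console variable is the one mentioned earlier in the thread.

                                  #include "EngineUtils.h"                 // TActorIterator
                                  #include "Engine/Light.h"                // ALight
                                  #include "Components/LightComponent.h"   // ULightComponent
                                  #include "HAL/IConsoleManager.h"         // IConsoleManager

                                  // Show hand-placed fill lights only when VXGI diffuse tracing is off.
                                  // Fill lights are identified by a hypothetical actor tag.
                                  void UpdateVXGIFallbackLights(UWorld* World)
                                  {
                                      IConsoleVariable* DiffuseCVar = IConsoleManager::Get()
                                          .FindConsoleVariable(TEXT("r.VXGI.DiffuseTracingEnable"));
                                      const bool bVXGIOn = DiffuseCVar && DiffuseCVar->GetInt() != 0;

                                      for (TActorIterator<ALight> It(World); It; ++It)
                                      {
                                          if (It->ActorHasTag(FName(TEXT("VXGIFallbackFill"))))
                                          {
                                              It->GetLightComponent()->SetVisibility(!bVXGIOn);
                                          }
                                      }
                                  }

                              You would call something like this on BeginPlay (and whenever the graphics setting changes); it's crude, but it keeps the fallback lighting toggle in one place instead of hand-editing every level.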


                              Originally posted by srmojuze View Post
                              So to sum up, at the end of the day the effort in making a game that has VXGI and non-VXGI all-in-one can pay off in terms of advertising and marketing opportunities.
                              It's debatable whether it is or not without having a lower setting to scale down to. It's a lot of extra work doing the lighting for every single room twice, and you always run the risk of having to make compromises in one lighting setup to better complement the other. It's not like PhysX particles or WaveWorks, where turning it off just turns off a few visual effects; using realtime GI fundamentally changes how an artist will go about lighting levels. It can be a great change too, letting level designers have more freedom in how the environments in the game react to the events going on in the scene. I see a lot of great things coming out of systems like VXGI in the future; it just needs to be a bit more adaptable to current hardware for it to be a viable option to develop a game around today.

                              Comment


                                Originally posted by Daniel.Wenograd View Post
                                Unless they completely break Moore's Law with Pascal, I doubt it'll be more than the standard 25% gap between the 900 and 1,000 series cards...
                                Thanks for the insight into the on-the-ground reality... If it is difficult to "switch" between VXGI and Lightmass, then VXGI and similar tech for the mainstream are still a few years away, but no more than five years, I reckon.

                                One point with Nvidia Pascal is that it's not really following Moore's law anyway because they have a lot of architectural changes and are "jumping" from 28nm to 16nm.

                                Fair enough with Maxwell, and Volta is only scheduled for 2018. But I do believe that by the end of 2017, the Nvidia Pascal "Titan Y" will be able to do 4K VXGI at 90fps in real-world scenes. That's two years of quite a lot of optimisation of UE4, VXGI, GameWorks (including GameWorks VR), etc. Edit: Don't forget DirectX 12; in two years' time the DX12 speed optimisations will be quite significant.

                                So, time will tell, I guess it's good that everyone is putting in the R&D right now!

                                __________________________________________________

                                Edit: Here's a nice (optimistic) post about what Pascal could be:
                                http://forums.anandtech.com/showthread.php?t=2436009

                                "GP100: 550 sq. mm. die, 13.75 billion transistors, 6144 CUDA cores... In terms of performance, we'll be looking at no less than a full doubling of the Titan X's power."


                                I'm personally guessing a 30%-50% improvement of a Titan Y Pascal over the Titan X Maxwell. If Nvidia totally hits it out of the park, then 50%-100%, but that's very optimistic. The 50%-100% improvements would be more in the "advertised" areas like deep learning, etc.
                                Last edited by srmojuze; 10-13-2015, 02:47 AM.

                                Comment
