
NVIDIA GameWorks Integration


    Originally posted by srmojuze View Post
    I'm personally guessing a 30%-50% improvement of Titan Y Pascal over Titan X Maxwell. If Nvidia totally hits it out of the park, then 50%-100%, but that's very optimistic. The 50%-100% improvements would be more in the "advertised" areas like deep learning, etc.
    I'm thinking back to the statements the CEO of Nvidia made: http://www.pcworld.com/article/28981...8-way-sli.html
    Not sure what that means for performance in gaming... they said they're trying to make Pascal ten times more powerful than Maxwell in compute tasks, but I can't translate that into anything useful for me. Maybe because I'm not good at speculating?

    Comment


      Originally posted by Dakraid View Post
      I'm thinking back to the statements the CEO of Nvidia made: http://www.pcworld.com/article/28981...8-way-sli.html
      Not sure what that means for performance in gaming... they said they're trying to make Pascal ten times more powerful than Maxwell in compute tasks, but I can't translate that into anything useful for me. Maybe because I'm not good at speculating?
      Yeah, the 10x speedup is for specific "deep learning" type stuff, which I barely understand. The 30%-50% increase is my personal guess based on the stuff I detailed above... most notably the move from 28nm to 16nm, along with Nvidia's overall architectural track record (post-Fermi). There's also the raw-spec side: if the Pascal Titan Y ships with something like 4000 CUDA cores and a 3072-bit (that's not a typo; AMD is shipping 4096-bit cards now) HBM2 memory implementation, then certainly for gaming there should be big increases.
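
      For a rough sense of what those bus widths mean, here's a back-of-envelope bandwidth calculation (the per-pin data rates are my assumptions: roughly 1 Gbps for AMD's first-gen HBM and around 2 Gbps for HBM2):

          bandwidth = bus width x data rate per pin / 8
          HBM1 (Fury X, 4096-bit):   4096 x 1 Gbps / 8 = 512 GB/s
          HBM2 (3072-bit, rumoured): 3072 x 2 Gbps / 8 = 768 GB/s

      Either way that dwarfs the Titan X Maxwell's 384-bit GDDR5, which works out to roughly 336 GB/s.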

      There's also all the GPU compute stuff, which will impact developers and media producers. Encoding and rendering are being revolutionised by GPU compute. Here's a render from Daz3D with Nvidia Iray on my GTX 660M (Kepler). For argument's sake I set the time limit to 5 minutes at 1920x1080 (the only post-processing was a grey background and a brightness boost via curves):

      [Attached image: CHAR_01.jpg]

      This is of course nothing compared to the stuff coming out of people with Octane and, say, two Titan Xs, let alone a GPU cloud rendering cluster (rendering is considered "realtime" in the latter case).

      Besides hardware, as mentioned there's DX12, GameWorks VR and so on, so yes, I am very optimistic about Nvidia Pascal hardware (and related software). Take Lightmass, for example: sure, the current quality when you crank up the settings is phenomenal, but if you took Nvidia Iray or VXGI or OpenCL in general and made a Lightmass-type baking system, you could get much faster baking.

      To sum up, anything less than a 30% increase of Titan Y over Titan X in gaming (UE4, 4K, VR, DX12, etc.) and mainstream GPU rendering (Iray, Octane) would be an Nvidia misstep, in my opinion.

      Edit: On the AMD side, the dual-Fiji card should be out soon, and that should do 4K VR at 90 FPS, etc. It will have to be liquid-cooled, though, I think: http://wccftech.com/amd-dual-gpu-fij...eon-r9-gemini/
      Last edited by srmojuze; 10-13-2015, 11:18 AM.

      Comment


        Originally posted by srmojuze View Post
        Thanks for the insight into the on-the-ground reality... If it is difficult to "switch" between VXGI and Lightmass, then VXGI and similar tech for the mainstream is still a few years away, but no more than 5 years, I reckon.
        If Lightmass is an option, you'll always get higher quality out of baked lightmaps than realtime GI. I have high hopes for VXGI, but unless you push it up to 64 cones or higher it can't match baked quality, and it's unreasonable to expect it to. VXGI renders once per frame, whereas Lightmass has hours to process a scene at the highest quality possible. Realtime GI only makes sense in a scene that couldn't normally be done with prebaked lighting for some reason, such as placing down entire buildings at runtime, or large maps populated with hundreds of objects (like a forest).
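
        To put rough numbers on that gap (a back-of-envelope illustration; the one-hour bake is just an example figure): at 60 FPS a frame has about 16.7 ms for everything, GI included, while even a one-hour Lightmass bake spends 3,600,000 ms on lighting alone:

            1000 ms / 60 frames ≈ 16.7 ms per frame
            1 hour = 3,600,000 ms ≈ 215,000 whole frame budgets spent purely on lighting

        That's why it's unreasonable to expect a once-per-frame technique to match a bake.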

        Switching between them at runtime is impossible as well, since Lightmass uses static/stationary lights and VXGI only uses dynamic lights. The only real way to switch is to turn all the dynamic lights into static ones and rebake the entire scene, which is not something you can do at runtime.


        LPVs are a good alternative with a lower performance hit right now, but they don't support spotlights or point lights, so they only really work in outdoor scenes.

        If either LPVs added support for the other kinds of lights, or VXGI offered a low-performance-impact option, then dynamic GI would suddenly become viable for full projects overnight. The only reason someone wouldn't use VXGI right now for a dynamically lit scene is that it alienates a lot of the PC gaming audience (and the entire console audience, if you're targeting that).

        Comment


          Hi all, just trying to get familiar with these Nvidia-specific branches of UE4, particularly for ArchViz work and making things that much more realistic.

          Is there a way to combine the features of the various Nvidia branches of UE4, or are we expected to compile each branch as we need those features? Really curious!

          Thanks!

          A.

          Comment


            Originally posted by Daniel.Wenograd View Post
            So I've been working on a small project with VXGI for a while now, and now that I've gotten used to it, it's actually a really nice system. One concern, though, is that lower-end machines don't even have a hope of running the game. Aside from the obvious settings like 4 cones, sparse tracing of 4, and a map size of 32, what kind of things can I do to get it working well on lower-end hardware? It doesn't even necessarily need to look better than manually placed point lights, which is the current alternative: manually placing a set of point lights to fake GI that are only enabled when VXGI is off. It's obviously not an ideal solution, since I essentially need to do the scene's lighting twice, and you lose a lot of the benefits of having a dynamic system in the first place.

            Speaking of which, if that really is the limit of how much you can turn it down for performance, it would be a good feature to add a flag to the lights like I've been doing in my project. Basically, any light that I want on only when VXGI is disabled gets a name ending in _novxgi, essentially making a blacklist of fill lights that turn off when VXGI is on. It works great, except that the editor view always shows both sets of lights even with VXGI enabled. I'm sure a more elegant solution is possible than what I did there.
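
            For anyone wanting to replicate that _novxgi blacklist, here is a minimal C++ sketch of the idea (my own illustration, not something from the branch; the suffix is just the naming convention described above, and where you call this, e.g. whenever your VXGI toggle changes, is up to you):

                #include "EngineUtils.h"
                #include "Engine/Light.h"

                // Enable the hand-placed fake-GI fill lights only while VXGI is off.
                void UpdateFallbackLights(UWorld* World, bool bVXGIEnabled)
                {
                    for (TActorIterator<ALight> It(World); It; ++It)
                    {
                        // Lights named like "FillLight_novxgi" form the blacklist.
                        if (It->GetName().EndsWith(TEXT("_novxgi")))
                        {
                            It->SetEnabled(!bVXGIEnabled);
                        }
                    }
                }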
            I feel like I'm beating a dead horse at this point, but one thing I would like to mention is that a lot of the Maxwell performance improvements are for individual features, not the system itself. There are certain features of VXGI that only run well on Maxwell but don't improve quality much. That being said, if you have already lowered the StackLevels to 3 (less isn't viable), disabled storeEmittanceInHDRFormat, lowered the MapSize, AND set the voxelization to use the lowest LOD on your meshes without reaching acceptable performance, then you need a second solution.

            VXGI is intensive, but there are a lot of optimizations that can make it run faster, including some within your assets, such as disabling it on certain materials or even reworking your meshes a bit to assist the voxelization process. I guess what I'm saying is that taking full advantage of VXGI is like taking full advantage of the PS3. You can do it, but it's a pain in the ***.
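
            For reference, that low-end baseline looks roughly like this as console variables (the exact cvar spellings here are my best recollection and may differ in your build, so verify them against a listing of r.VXGI.* in the console):

                r.VXGI.StackLevels 3
                r.VXGI.MapSize 32
                r.VXGI.StoreEmittanceInHdrFormat 0

            Typing those in the console gives you the floor to measure everything else against.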

            Comment


              Hi,
              at the moment I'm trying to create my first VXGI build after a long testing period, but unfortunately the build always fails. Are there known problems with GalaxyMan's VXGI branch? At the moment I am using Unreal Engine 4.8.

              Best regards, Andreas

              Comment


                Originally posted by AndreElijah View Post
                Hi all, just trying to get familiar with these Nvidia-specific branches of UE4, particularly for ArchViz work and making things that much more realistic.

                Is there a way to combine the features of the various Nvidia branches of UE4, or are we expected to compile each branch as we need those features? Really curious!

                Thanks!

                A.
                Check out: https://github.com/GalaxyMan2015/Unr...1_NVIDIA_Techs. That's all the GameWorks techs compiled into one.

                Originally posted by Maxwell_77 View Post
                Hi,
                at the moment I'm trying to create my first VXGI build after a long testing period, but unfortunately the build always fails. Are there known problems with GalaxyMan's VXGI branch? At the moment I am using Unreal Engine 4.8.

                Best regards, Andreas
                None that I am aware of; I haven't touched the 4.8 build in god knows how long. I would try the 4.9.1 build if I were you. I use that same one and I am unaware of any issues.
                NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
                Feel free to Donate if you wish to support me

                Comment


                  Thank you very much for your answer. I will try the 4.9.1 build. :-)

                  Comment


                    That is exactly what I was looking for.

                    Thanks so much!

                    Comment


                      GalaxyMan, you are awesome!!!! Is there a way to donate to you?

                      Comment


                        Originally posted by iuhiuh View Post
                        GalaxyMan, you are awesome!!!! Is there a way to donate to you?
                        Thanks and yes. Check this post for details: https://forums.unrealengine.com/show...l=1#post391404
                        NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
                        Feel free to Donate if you wish to support me

                        Comment


                          I have tested the 4.9.1 build. Unfortunately I am not able to build my VXGI projects. I always get this error:

                          [Attached image: error.JPG]

                          At the moment I don't know what I can do to solve this problem. I have no experience with the UnrealBuildTool...

                          Comment


                            Is that with NVIDIA's VXGI branch? I have tested packaging with my branch (the all-merged GameWorks branch) and it works fine (I did have to make a couple of tweaks, which I will commit soon, but none for VXGI as far as I can remember). I have been working on a BP-only project all week using the engine, packaging each night just to make sure. No issues so far.
                            NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
                            Feel free to Donate if you wish to support me

                            Comment


                              Strange... I tested version 4.9.1 with two of my VXGI projects and always get this error. I will try to compile the build again. Perhaps that will help...

                              Comment


                                Originally posted by CharlestonS View Post
                                So I have been messing with WaveWorks, and the default material is kind of bland, so I decided to add one of my custom water materials to the WaveWorks setup.

                                My material adds view-distance LOD to the tessellation, subsurface scattering, refraction, and depth-bias visibility. Hopefully there is a 4.8 update to the WaveWorks integration soon so we can have screen-space reflections on transparency, which seems to be the only shading mode WaveWorks supports right now.

                                Anyway, here's a quick video clip - 1080p @60fps

                                Epic, dude! I'm working on a project and I wanted to do something just like this. Mind giving me a few pointers on how you got that to work? (I'm relatively new to game dev, and even more of a noob at UE4.)
                                Hobbyist trying to go pro.

                                Comment
