NVIDIA GameWorks Integration


    Is there a way to boost the indirect lighting that VXGI generates from the Emissive Color of a specific material? A material node, maybe? If I simply multiply the monitor screen texture (from the Doom beta) up, the static effect and the monitor surface (driven by Roughness) get overpowered by the screen texture.

    I would like more light cast onto the ceiling and the floor.


    What I want the monitor to look like. Not enough light on the ceiling and the floor, though.
    [Image: DM 0dot05.jpg]

    What I want the ceiling and the floor to look like. But the monitor is too bright now.
    [Image: DM 0dot25.jpg]

    This is the material I use. If I used an actual light actor, I would also have to implement some kind of flickering. If I could just boost the indirect light emitted by the material, VXGI would do the flickering for me (a rough sketch of the fallback is below the screenshot).
    [Image: Material.jpg]
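
    In case an indirect-only boost turns out not to exist, here is a minimal sketch of the fallback: driving the flicker yourself from C++ through a dynamic material instance, optionally keeping a fill light in sync. Everything named here (AMonitorActor, EmissiveBoost, the member variables) is made up for illustration and is not part of the VXGI branch:

    // Members assumed on the hypothetical AMonitorActor:
    //   UMaterialInstanceDynamic* ScreenMID;
    //   UPointLightComponent* FillLight;   // optional, fakes the extra bounce
    //   float BaseBoost, BaseIntensity;
    #include "GameFramework/Actor.h"
    #include "Components/StaticMeshComponent.h"
    #include "Components/PointLightComponent.h"
    #include "Materials/MaterialInstanceDynamic.h"

    void AMonitorActor::BeginPlay()
    {
        Super::BeginPlay();
        if (UStaticMeshComponent* Mesh = FindComponentByClass<UStaticMeshComponent>())
        {
            // "EmissiveBoost" is a scalar parameter assumed to be multiplied
            // into the material's Emissive Color.
            ScreenMID = Mesh->CreateAndSetMaterialInstanceDynamic(0);
        }
    }

    void AMonitorActor::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        // Cheap CRT-style flicker: two out-of-phase sine waves around 1.0.
        const float Time = GetWorld()->GetTimeSeconds();
        const float Flicker = 1.f + 0.15f * FMath::Sin(Time * 37.f) * FMath::Sin(Time * 13.f);

        if (ScreenMID)
        {
            ScreenMID->SetScalarParameterValue(TEXT("EmissiveBoost"), BaseBoost * Flicker);
        }
        if (FillLight)
        {
            FillLight->SetIntensity(BaseIntensity * Flicker);
        }
    }
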
    Last edited by SouldomainTM; 02-16-2016, 02:52 PM.



      Hi, just a quick question regarding Flex on Mac with Unreal. I can't find any statement on whether it is supported or not, but Xcode fails to build Unreal + Flex.
      I have an NVIDIA graphics card in my MacBook Pro, as well as the latest version of Xcode.



        Hey JackWint3r, I have the same problem building Flex on Mac. I can't find a fix, and I also can't find any statement on whether it is supported or not.

        Instead, I installed Windows to build and use it.
        Now I've run into an issue, and I hope somebody, maybe GalaxyMan2015, can help me out.
        I've ported GalaxyMan2015's repo, updated it to 4.10.4, and exposed more functions to Blueprint.
        So now I'm able to spawn Flex fluid in-game and use it as a projectile. But whenever I spawn the projectile I get a small freeze.
        I thought I could move the work to my own thread, but I can't get it to run. I would be happy if someone had the time to give me a little help (a sketch of the idea follows).
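
        For reference, this is roughly how the expensive preparation could be pushed onto a worker thread with UE4's async task system. FFlexSpawnPrepTask and the position-filling step are hypothetical, the AsyncTask helper assumes a recent Async.h, and anything touching UObjects still has to happen on the game thread:

        #include "Async/AsyncWork.h"
        #include "Async/Async.h"

        // Heavy, thread-safe precomputation runs in the thread pool; the
        // actual spawn is queued back to the game thread afterwards.
        class FFlexSpawnPrepTask : public FNonAbandonableTask
        {
            friend class FAutoDeleteAsyncTask<FFlexSpawnPrepTask>;

            TWeakObjectPtr<AActor> Owner;

            FFlexSpawnPrepTask(TWeakObjectPtr<AActor> InOwner) : Owner(InOwner) {}

            void DoWork()
            {
                // Thread-safe work only: e.g. generate the initial particle positions.
                TArray<FVector> Positions;

                TWeakObjectPtr<AActor> LocalOwner = Owner;
                AsyncTask(ENamedThreads::GameThread, [LocalOwner, Positions]()
                {
                    if (LocalOwner.IsValid())
                    {
                        // Spawn/configure the Flex emitter here, seeded with
                        // Positions, from the game thread.
                    }
                });
            }

            FORCEINLINE TStatId GetStatId() const
            {
                RETURN_QUICK_DECLARE_CYCLE_STAT(FFlexSpawnPrepTask, STATGROUP_ThreadPoolAsyncTasks);
            }
        };

        // Fired from gameplay code, e.g. the projectile's fire handler:
        // (new FAutoDeleteAsyncTask<FFlexSpawnPrepTask>(this))->StartBackgroundTask();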



          Originally posted by JackWint3r View Post
          Hi, just a quick question regarding Flex on Mac with Unreal. I can't find any statement on whether it is supported or not, but Xcode fails to build Unreal + Flex.
          I have an NVIDIA graphics card in my MacBook Pro, as well as the latest version of Xcode.
          Originally posted by sensemilla View Post
          Hey JackWint3r, I have the same problem building Flex on Mac. I can't find a fix, and I also can't find any statement on whether it is supported or not.
          I believe Flex supports Mac, as it has OpenGL support, but the UE4 integration does not at this time. It's something I was intending to look at further down the path, but I can't promise it'll work on Mac as I don't own one; the most I could do is ensure that it works via OpenGL on Windows.

          Originally posted by sensemilla View Post
          Instead, I installed Windows to build and use it.
          Now I've run into an issue, and I hope somebody, maybe GalaxyMan2015, can help me out.
          I've ported GalaxyMan2015's repo, updated it to 4.10.4, and exposed more functions to Blueprint.
          So now I'm able to spawn Flex fluid in-game and use it as a projectile. But whenever I spawn the projectile I get a small freeze.
          I thought I could move the work to my own thread, but I can't get it to run. I would be happy if someone had the time to give me a little help.
          I haven't had much time as of late to get back to my GameWorks branch, so I can't really help with your question, but around March I will be getting back into it and bringing support to 4.11 (skipping 4.10, although I did recently get HairWorks, WaveWorks, VXGI and HBAO+ working with 4.10). At that time, feel free to send me your additional Blueprint functions (if you want) and I will merge them.
          NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
          Feel free to Donate if you wish to support me



            Originally posted by GalaxyMan2015 View Post
            I will probably get blasted by others for this, but what a crock of ... that video has a lot of incorrect information in it. He doesn't understand the tech he is talking about; it's all about the visuals, and even then it's fairly incorrect. And it seems people want to blame NVIDIA for the fact that AMD cards are not as good at tessellation as NVIDIA cards. That's at least the opinion I got from the video's author, and it makes no sense.

            Look, I have no hate for AMD. I love TressFX and the other effects they have released for free (if I didn't, I wouldn't be implementing TressFX into UE4). But I also love GameWorks, and as the owner of both NVIDIA and AMD cards, I have not seen the supposed massive drop in performance that others have seen in GameWorks games using my AMD card. So maybe I'm just one of the lucky few.

            I for one will continue to support NVIDIA and their relationship with Epic, but I think it's best that the GameWorks techs remain standalone from the engine (as installable plugins at best). This way, people who think NVIDIA is the devil and don't want to use the GameWorks techs are not forced into it. PhysX is a different story, since it's the primary physics system for UE4 and runs on the CPU for both NVIDIA and AMD. I know the video mentions this and claims it still tanks on AMD cards, but as I said, I have not seen this myself, so I can't really say anything.
            https://i.gyazo.com/e73c53b1124d30ca...163fa34651.gif

            I have both AMD and NVIDIA cards, and my issue is that even on my 980 Ti it is doing things like this. Maybe I am setting it up wrong, but when I disable VXGI it looks fine; with it enabled, lighting and particles go crazy and the FPS drops to 0. I have to switch the view mode to Unlit to even do anything in the editor, and this happens on my 980 Ti and my AMD 390X. I like NVIDIA and AMD, but when their tech has issues on their own cards, then I have a problem. For my game I want to support NVIDIA tech, but because SO many people are having issues with it, I am forced to expose settings for enabling and disabling it (a sketch of one way to do that follows this post). On my Titan X I am not getting these issues at all, but not everyone has the money to go out and buy a Titan X just to play a video game. What I would like to see is NVIDIA playing a little nicer, like AMD is, and either making things open source or fixing up GameWorks so it can be run on either NVIDIA or AMD cards. Either way they won't be losing a sale from me, as I will continue to buy both cards to ensure my game works well on both, so no matter who plays it, everyone can enjoy it without issues.

            I do disagree with that video trying to make them out to be evil, however. I get that NVIDIA doesn't need to do extra work to make it compatible with AMD graphics cards; they own all the rights to their product and do not have to make it open source at all. However, if they did make it open source, they would get more people using it, and they could do something like Epic Games does (if they aren't already) and take a cut from games/products using their tech. That way, even if it were open source, no rival company would be able to steal their hard work, more people would use their technology and buy their cards (since their cards will always have better performance for it), and other cards just wouldn't be super bottlenecked and become unusable.
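
            The toggle mentioned above can be driven through the integration's console variables from a settings menu. A minimal sketch, assuming the VXGI branch exposes cvars along the lines of r.VXGI.DiffuseTracingEnable and r.VXGI.SpecularTracingEnable (verify the exact names with console auto-complete in your build):

            #include "HAL/IConsoleManager.h"

            // Flip VXGI tracing on or off at runtime. The cvar names are
            // assumptions; check them against your engine branch.
            static void SetVxgiEnabled(bool bEnabled)
            {
                const int32 Value = bEnabled ? 1 : 0;
                static const TCHAR* VxgiCVars[] =
                {
                    TEXT("r.VXGI.DiffuseTracingEnable"),
                    TEXT("r.VXGI.SpecularTracingEnable")
                };
                for (const TCHAR* Name : VxgiCVars)
                {
                    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
                    {
                        CVar->Set(Value);
                    }
                }
            }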



              Originally posted by RynerLuteTLD View Post
              https://i.gyazo.com/e73c53b1124d30ca...163fa34651.gif

              I have both AMD and NVIDIA cards, and my issue is that even on my 980 Ti it is doing things like this. Maybe I am setting it up wrong, but when I disable VXGI it looks fine; with it enabled, lighting and particles go crazy and the FPS drops to 0. I have to switch the view mode to Unlit to even do anything in the editor, and this happens on my 980 Ti and my AMD 390X. I like NVIDIA and AMD, but when their tech has issues on their own cards, then I have a problem. For my game I want to support NVIDIA tech, but because SO many people are having issues with it, I am forced to expose settings for enabling and disabling it. On my Titan X I am not getting these issues at all, but not everyone has the money to go out and buy a Titan X just to play a video game. What I would like to see is NVIDIA playing a little nicer, like AMD is, and either making things open source or fixing up GameWorks so it can be run on either NVIDIA or AMD cards. Either way they won't be losing a sale from me, as I will continue to buy both cards to ensure my game works well on both, so no matter who plays it, everyone can enjoy it without issues.

              I do disagree with that video trying to make them out to be evil, however. I get that NVIDIA doesn't need to do extra work to make it compatible with AMD graphics cards; they own all the rights to their product and do not have to make it open source at all. However, if they did make it open source, they would get more people using it, and they could do something like Epic Games does (if they aren't already) and take a cut from games/products using their tech. That way, even if it were open source, no rival company would be able to steal their hard work, more people would use their technology and buy their cards (since their cards will always have better performance for it), and other cards just wouldn't be super bottlenecked and become unusable.
              I would fully support NVIDIA open sourcing their products for the reasons you have already listed, but I don't see that happening any time soon, unfortunately. As for your particular issue, that is very strange; I have not seen anything like that before.
              NVIDIA GameWorks merged branch (v4.9.2) (v4.12.5) (v4.13 p2)
              Feel free to Donate if you wish to support me



                Originally posted by GalaxyMan2015 View Post
                I will probably get blasted by others for this, but what a crock of ... that video has a lot of incorrect information in it. He doesn't understand the tech he is talking about; it's all about the visuals, and even then it's fairly incorrect. And it seems people want to blame NVIDIA for the fact that AMD cards are not as good at tessellation as NVIDIA cards. That's at least the opinion I got from the video's author, and it makes no sense.

                Look, I have no hate for AMD. I love TressFX and the other effects they have released for free (if I didn't, I wouldn't be implementing TressFX into UE4). But I also love GameWorks, and as the owner of both NVIDIA and AMD cards, I have not seen the supposed massive drop in performance that others have seen in GameWorks games using my AMD card. So maybe I'm just one of the lucky few.

                I for one will continue to support NVIDIA and their relationship with Epic, but I think it's best that the GameWorks techs remain standalone from the engine (as installable plugins at best). This way, people who think NVIDIA is the devil and don't want to use the GameWorks techs are not forced into it. PhysX is a different story, since it's the primary physics system for UE4 and runs on the CPU for both NVIDIA and AMD. I know the video mentions this and claims it still tanks on AMD cards, but as I said, I have not seen this myself, so I can't really say anything.
                Originally posted by RynerLuteTLD View Post
                I have both AMD and NVIDIA cards, and my issue is that even on my 980 Ti it is doing things like this. Maybe I am setting it up wrong, but when I disable VXGI it looks fine; with it enabled, lighting and particles go crazy and the FPS drops to 0. I have to switch the view mode to Unlit to even do anything in the editor, and this happens on my 980 Ti and my AMD 390X. I like NVIDIA and AMD, but when their tech has issues on their own cards, then I have a problem...
                I'm not sure what you mean, but there are no games that use VXGI; it's in beta and fairly new tech from Nvidia. The same goes for Flex and FlameWorks. So when it comes to what in GameWorks works, doesn't work, or performs unusually poorly, VXGI, Flex and FlameWorks are not part of what AdoredTV on YouTube was talking about.

                However, in my opinion AMD is growing into a problem for PC gaming. They should just stick to consoles. I dare say that intelligent people won't fall for the "dark side of a monopoly". So far, Nvidia has always seemed to show the mentality of great inventors: they name their GPU architectures after them, and they keep simply (no, it is actually not that simple) inventing and improving their GPUs and APIs, too.

                For the sake of a scientific psychological experiment, I would love to remove AMD from PC gaming, give Nvidia a total monopoly over the Glorious PC Gaming Master Race, and observe what would happen! For how long could Nvidia resist Big Cat Capitalism? My money is on Nvidia's inventor spirit!

                Originally posted by GalaxyMan2015 View Post
                I would fully support NVIDIA open sourcing their products for the reasons you have already listed, but I don't see that happening any time soon, unfortunately. As for your particular issue, that is very strange; I have not seen anything like that before.
                I wouldn't want Nvidia to open the entire source just like that, though, and hand it to AMD or anybody.

                By buying a GeForce, Nvidia customers don't just pay for the GPU; they also pay for better drivers AND the APIs. So Nvidia customers did contribute to the development of GameWorks. AMD customers didn't.

                It's simple: if AMD is interested in a "better PC gaming world", then they should pay a reasonable amount of money for a license that allows them to modify any code inside GameWorks that could make GameWorks run on an AMD GPU.

                If Nvidia is afraid that AMD might learn too much from GameWorks and could try to make their own API at some point, Nvidia could seek a limited patent on the source (which is probably going to be a pain in the a$$, messing around with the law), or just set up a license term saying that for as long as AMD has access to the GameWorks source, they won't start working on an API of their own.


                Just for the sake of pointing out the obvious: this problem with GameWorks won't go away any time soon. Actually, around the time Pascal comes out and the recent, very powerful additions to GameWorks leave beta and actually get used in games, the outcry from the AMD fangirls will get even stronger!

                I wonder how long it will take until some devs won't even mind that AMD gamers can't haz GameWorkz, and start implementing features that can't be turned off. So far, I'm not aware of a single game that requires Nvidia hardware to run, but depending on how useful GameWorks turns out to be, this may change in the future. Some big studios can afford to make their own tech (their own GameWorks-like thing), but many others can't; they can only do what an engine like UE4 or Unity 5 can do.



                  @SouldomainTM: You're rather naive to believe that nVidia wouldn't do the same as Intel is doing at the moment; without strong competition, nVidia would just update their GPUs as little as needed. And you're insulting people who use hardware from a company you don't favor, so to say it in your own words: you're the fangirl. So far, most nVidia GameWorks stuff is more of a nice-to-have than a must-have, and anyway only for the top high-end hardware. Making games that, as you suggested, would only run with GameWorks would be a bad idea, because if you look at the Steam statistics, most of your potential customers (who would play your game on all-minimum settings at 20-25 fps) have an Intel GPU (another GPU that can't do GameWorks). And don't get me wrong: I love the GameWorks stuff and will definitely use it as an optional feature. Before you accuse me of being an AMD fangirl, I have a 980 Ti and a G-Sync monitor (one downside of nVidia: not supporting the "FreeSync" standard like the rest of the competitors).



                    Would anyone dare to try merging VXGI with 4.11?

                    I really need to make a presentation, and VXGI (an interior with a dynamic sun/sky) would be really helpful. Currently, the lack of VR instancing (VR SLI) in 4.10 makes it unusable even at the lowest settings (50 FPS even with SP 80). Merging the current VXGI 4.10 branch into master has been a ridiculous failure for me...
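
                    In case anyone wants to attempt it, the merge itself would look roughly like the following. The remote URL is NVIDIA's GameWorks fork, but the branch names are assumptions, so check what the repo actually publishes first:

                    # Illustrative merge attempt; branch names are assumptions.
                    git clone -b 4.11 https://github.com/EpicGames/UnrealEngine.git
                    cd UnrealEngine
                    git remote add gameworks https://github.com/NvPhysX/UnrealEngine.git
                    git fetch gameworks
                    git merge gameworks/VXGI-4.10
                    # Expect heavy conflicts in the renderer sources; resolve them,
                    # then rerun Setup and GenerateProjectFiles before building.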



                      I really don't think this thread needs to turn into an AMD vs. NVIDIA debate. It's about UE4 and the GameWorks integrations.



                        Hey GalaxyMan, thanks for your reply, and not only for that: your implementation of Flex has inspired and impressed me. I'm really interested in what you will bring us in March. I will send you the additional functions before or in March, so I have a little time to better understand what I've done. Reading your code teaches me a lot; thanks for sharing.



                          Now that I have some spare time, I tried to fiddle with Flex again.
                          However, it doesn't seem to work: I built the engine from the NVIDIA tree, and when playing the Flex project, nothing happens.
                          Maybe my GTX 480 is too old, but I had hoped the CPU would take over instead.
                          Is there something I have overlooked? The documentation is all about the various attributes of Flex objects, so I presume it is all set to go from the get-go.



                            You need a card with CUDA compute capability 3.0 or higher for Flex 0.8 (http://docs.nvidia.com/gameworks/con...ry/physx/flex/). The GTX 480 only has 2.0 (https://en.wikipedia.org/wiki/CUDA).
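
                            If you want to verify what a card reports, here is a small standalone check against the CUDA runtime API (assumes the CUDA toolkit is installed; build with nvcc):

                            // check_cc.cu -- print each CUDA device's compute capability.
                            // Build: nvcc check_cc.cu -o check_cc
                            #include <cstdio>
                            #include <cuda_runtime.h>

                            int main()
                            {
                                int Count = 0;
                                if (cudaGetDeviceCount(&Count) != cudaSuccess || Count == 0)
                                {
                                    std::printf("No CUDA-capable device found.\n");
                                    return 1;
                                }
                                for (int i = 0; i < Count; ++i)
                                {
                                    cudaDeviceProp Prop;
                                    cudaGetDeviceProperties(&Prop, i);
                                    std::printf("GPU %d: %s, compute capability %d.%d\n",
                                                i, Prop.name, Prop.major, Prop.minor);
                                }
                                return 0;
                            }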





                                Ah, OK. I had hoped that it would move the calculation to the CPU like PhysX does.
                                Thanks for your quick reply.

