
Luoshuang's GPULightmass


    Hello all, I just downloaded this a few days ago and have been grinding at it. I really love it, especially the skylight bounce; it produces very good shading.

    Now, is there any way I can utilize more than one GPU in my computer? I have Octane (parallel speed increase) and Redshift (almost parallel up to two GPUs), and they render really fast by utilizing the four GTX 1070s in my PC. It's a shame that GPU Lightmass uses only one of my cards.

    Luoshuang, could you make an alpha version of GPU Lightmass with multi-GPU support? It really, really would be awesome!


    Anyway, thanks again for this incredible tool. I can't thank you enough.


    Kenneth

    Comment


      Originally posted by Farshid View Post

      the same scene with the same HDRI in a 3ds Max render
      I did a quick test and the result was different from any version you've shown... Will investigate later

      Comment


        Originally posted by KennethC70 View Post
        Hello all, I just downloaded this a few days ago and have been grinding at it. I really love it, especially the skylight bounce; it produces very good shading.

        Now, is there any way I can utilize more than one GPU in my computer? I have Octane (parallel speed increase) and Redshift (almost parallel up to two GPUs), and they render really fast by utilizing the four GTX 1070s in my PC. It's a shame that GPU Lightmass uses only one of my cards.

        Luoshuang, could you make an alpha version of GPU Lightmass with multi-GPU support? It really, really would be awesome!


        Anyway, thanks again for this incredible tool. I can't thank you enough.


        Kenneth
        Biggest problem: I have only one GPU and thus cannot perform any test lol

        Comment


          [Image: error-out-of-memory.PNG]
          [Image: assertion-failed.PNG]


          Has anyone come across these errors when using GPU Lightmass 4.20.2?

          I merged the 4.20.2 unified settings zip into the "Engine" folder, set TdrDelay to 300 as recommended, and updated the NVIDIA driver to a version >= 398.26 for my GeForce GTX 750 Ti graphics card.
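
          In case it helps with troubleshooting, here is a minimal, generic NVML sketch (standard NVML host code, not part of GPU Lightmass) for printing the driver version that is actually loaded, to confirm the update took effect:

          // Query the loaded NVIDIA driver version via NVML.
          // Build as C++ and link against NVML (nvml.lib on Windows).
          #include <nvml.h>
          #include <cstdio>

          int main() {
              // Initialization fails if no NVIDIA driver is loaded.
              if (nvmlInit() != NVML_SUCCESS) {
                  std::printf("NVML init failed; is the NVIDIA driver installed?\n");
                  return 1;
              }
              char version[NVML_SYSTEM_DRIVER_VERSION_BUFFER_SIZE];
              if (nvmlSystemGetDriverVersion(version, sizeof(version)) == NVML_SUCCESS)
                  std::printf("Driver version: %s\n", version);  // e.g. "398.26"
              nvmlShutdown();
              return 0;
          }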


          Here are my settings in the BaseLightmass.ini file:

          [Image: settings.PNG]

          The errors shown above appear when my lighting build is close to finishing, and unfortunately the build fails.


          Is there a workaround for this?

          Comment


            I mean, it looks like you're out of VRAM... the 750 Ti only has 2 GB, I think?
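
            If you want to verify that, here is a minimal, generic CUDA runtime sketch (not part of GPU Lightmass) that reports free vs. total VRAM on the current device; on a 750 Ti the total should come out to about 2 GB:

            // Report free vs. total VRAM on the current CUDA device,
            // to sanity-check whether the card is simply running out.
            #include <cuda_runtime.h>
            #include <cstdio>

            int main() {
                size_t freeBytes = 0, totalBytes = 0;
                if (cudaMemGetInfo(&freeBytes, &totalBytes) != cudaSuccess) {
                    std::printf("cudaMemGetInfo failed.\n");
                    return 1;
                }
                std::printf("VRAM: %.0f MB free of %.0f MB total\n",
                            freeBytes / (1024.0 * 1024.0),
                            totalBytes / (1024.0 * 1024.0));
                return 0;
            }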

            @Luoshuang: As I understand it, this doesn't support multi-GPU baking (on the same machine); is that still correct?

            If I have multiple GPUs on my machine, is it possible to select which one does the baking? (E.g. to use the non-display GPU for baking)

            Comment


              Originally posted by DsyD View Post
              I mean, it looks like you're out of VRAM... the 750 Ti only has 2 GB, I think?

              @Luoshuang: As I understand it, this doesn't support multi-GPU baking (on the same machine); is that still correct?

              If I have multiple GPUs on my machine, is it possible to select which one does the baking? (E.g. to use the non-display GPU for baking)
              It should be possible by setting the environment variable CUDA_VISIBLE_DEVICES. See https://stackoverflow.com/questions/...o-run-a-job-on (not Windows, but it should be similar).

              EDIT: On Windows it looks like we have a more straightforward way: setting it in the NVIDIA Control Panel.
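
              To illustrate why this works, here is a generic CUDA runtime sketch (nothing specific to GPU Lightmass): a process only sees the devices that CUDA_VISIBLE_DEVICES exposes, renumbered from 0, so a tool that always picks device 0 ends up on whichever GPU you listed:

              // Enumerate the CUDA devices visible to this process.
              // Launched with e.g. CUDA_VISIBLE_DEVICES=1, only physical
              // GPU 1 is visible, and it shows up here as device 0.
              #include <cuda_runtime.h>
              #include <cstdio>

              int main() {
                  int count = 0;
                  if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
                      std::printf("No CUDA devices visible.\n");
                      return 1;
                  }
                  for (int i = 0; i < count; ++i) {
                      cudaDeviceProp prop;
                      cudaGetDeviceProperties(&prop, i);
                      std::printf("Device %d: %s (%.1f GB)\n", i, prop.name,
                                  prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
                  }
                  return 0;
              }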

              Comment


                Originally posted by Luoshuang View Post

                Biggest problem: I have only one GPU and thus cannot perform any test lol
                You should start a Patreon and maybe that could be sorted out.

                Comment


                  Originally posted by Luoshuang View Post

                  It should be possible by setting the environment variable CUDA_VISIBLE_DEVICES. See https://stackoverflow.com/questions/...o-run-a-job-on (not Windows, but it should be similar).

                  EDIT: On Windows it looks like we have a more straightforward way: setting it in the NVIDIA Control Panel.
                  I will check it out. Thanks!

                  Comment


                    Originally posted by Luoshuang View Post

                    It should be possible by setting the environment variable CUDA_VISIBLE_DEVICES. See https://stackoverflow.com/questions/...o-run-a-job-on (not Windows, but it should be similar).

                    EDIT: On Windows it looks like we have a more straightforward way: setting it in the NVIDIA Control Panel.
                    @Luoshuang: Is it still possible to bake lighting on my single 750 Ti GPU (2 GB) without running out of VRAM? Or would upgrading my GPU be more beneficial?

                    Comment


                      Originally posted by Luoshuang View Post

                      It should be possible by setting the environment variable CUDA_VISIBLE_DEVICES. See https://stackoverflow.com/questions/...o-run-a-job-on (not Windows, but it should be similar).

                      EDIT: On Windows it looks like we have a more straightforward way: setting it in the NVIDIA Control Panel.
                      Just FYI, it doesn't work. Yes, you can select which cards to use CUDA with (they are all selected by default), but GPU Lightmass will only use one.

                      Comment


                        Originally posted by Norman3D View Post

                        Just FYI, it doesn't work. Yes, you can select which cards to use CUDA with (they are all selected by default), but GPU Lightmass will only use one.
                        Your quote and reply are confusing.

                        He was answering the question "If I have multiple GPUs on my machine, is it possible to select which one does the baking? (E.g. to use the non-display GPU for baking)", and YES, it is possible to select which single one does the baking with either technique, and it DOES work. He has always said that multi-GPU does not work.
                        Technology Officer
                        Magma3D

                        Comment


                          Originally posted by AVLX0510 View Post

                          @Luoshuang: Is it still possible to bake lighting on my single 750 Ti GPU (2 GB) without running out of VRAM? Or would upgrading my GPU be more beneficial?
                          Probably not. I tried baking super simple scenes with a 750 Ti and was not able to; it always runs out of memory. Maybe something else is the culprit, or maybe 2 GB just doesn't cut even the simplest bake?

                          I had no issues with a 970 (4 GB) or a 1050 Ti (4 GB), and of course no issues with a 1080 Ti.
                          Technology Officer
                          Magma3D

                          Comment


                            I get this error even on very simple scenes. This one has only 10 lightmaps. Something seems very wrong with the memory being "allocated"?

                            <None> === Lightmass crashed: ===
                            Fatal error: [File:C:\UE4-GPULightmassIntegration\Engine\Source\Runtime\Core\Private\GenericPlatform\GenericPlatformMemory.cpp] [Line: 193]
                            Ran out of memory allocating 18446744057210768144 bytes with alignment 0


                            0x000007fefd9ea06d KERNELBASE.dll!UnknownFunction []
                            0x000007fed2f044c0 UnrealLightmass-ApplicationCore.dll!UnknownFunction []
                            0x000007feb1c59a5c UnrealLightmass-Core.dll!UnknownFunction []
                            0x000007feb1bec55d UnrealLightmass-Core.dll!UnknownFunction []
                            0x000007feb1adf1b4 UnrealLightmass-Core.dll!UnknownFunction []
                            0x000007feb1b4cbff UnrealLightmass-Core.dll!UnknownFunction []
                            0x000000013f303457 UnrealLightmass.exe!UnknownFunction []
                            0x000000013f2e64f9 UnrealLightmass.exe!UnknownFunction []
                            0x000000013f2ce379 UnrealLightmass.exe!UnknownFunction []
                            0x000000013f2f1622 UnrealLightmass.exe!UnknownFunction []
                            0x000000013f312dc7 UnrealLightmass.exe!UnknownFunction []
                            0x000000013f3aa0a8 UnrealLightmass.exe!UnknownFunction []
                            0x0000000077a059bd kernel32.dll!UnknownFunction []
                            0x0000000077b3a2e1 ntdll.dll!UnknownFunction []
                            This is with a custom build of 4.20.2. I followed the instructions here but maybe I did something wrong: https://forums.unrealengine.com/deve...65#post1460865

                            Do I need to rebuild the entire engine? I only tried rebuilding UnrealLightmass.

                            [Edit] FYI, I have 16 GB of system RAM and a GTX 1080 Ti, so I don't think it's a hardware issue. The message pops up while it says "Building lighting...", but I haven't seen the GPU-specific progress bar at all.
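
                            For what it's worth, that allocation size is suspiciously close to 2^64, which usually points to a negative 64-bit size underflowing into an unsigned value rather than a real request. A generic C++ sketch of the effect (not the actual Lightmass code; the literal value is just taken from the log):

                            // A negative 64-bit size, reinterpreted as unsigned,
                            // reproduces the absurd number from the crash log.
                            #include <cstdint>
                            #include <cstdio>

                            int main() {
                                std::int64_t computed = -16498783472LL;  // e.g. a size calculation gone negative
                                std::uint64_t requested = static_cast<std::uint64_t>(computed);
                                // Prints 18446744057210768144, matching the log.
                                std::printf("Requested allocation: %llu bytes\n",
                                            static_cast<unsigned long long>(requested));
                                return 0;
                            }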

                            Comment


                              Originally posted by Situx View Post

                              Probably not. I tried baking super simple scenes with a 750 Ti and was not able to; it always runs out of memory. Maybe something else is the culprit, or maybe 2 GB just doesn't cut even the simplest bake?

                              I had no issues with a 970 (4 GB) or a 1050 Ti (4 GB), and of course no issues with a 1080 Ti.
                              Hmm... this problem for me seems to happen with GPU Lightmass 4.19+ on Windows 10. But GPU Lightmass runs completely fine on versions before 4.19, on Windows 7, and without modifying TdrDelay; this is with my 750 Ti, so I don't think my GPU card is the problem.

                              Comment


                                Originally posted by AVLX0510 View Post

                                Hmm... this problem for me seems to happen with GPU Lightmass 4.19+ on Windows 10. But GPU Lightmass runs completely fine on versions before 4.19, on Windows 7, and without modifying TdrDelay; this is with my 750 Ti, so I don't think my GPU card is the problem.
                                I wonder where you found versions of GPU Lightmass for anything prior to UE 4.19.
                                Technology Officer
                                Magma3D

                                Comment
