Your thoughts on and comments to Volume Rendering in Unreal Engine 4.

    #16
    Originally posted by RyanB View Post
    Not sure why the performance is so slow with so few layers. What video card do you have? If it's only 10x10 and they are two-sided materials then it should only be 20 layers or so max, but once fullscreen it could cost a bit. Also, what's the instruction count?
    I have a GTX 1080 and the instruction count is 66.
    [Attached screenshot: e6a5b44291.png]

    This was the smallest I could make it to still respond to the cutting effect.



      #17
      Hmmm, that doesn't seem too bad at all. It might be slightly faster to use an additive material instead of a translucent one and multiply the emissive color by the opacity instead, but it probably is not a big difference.

      Have you run 'profile gpu' to verify it was all translucency slowing it down, or checked the same scene without the effect? It may actually be related to the 1,000 draw calls from the 10x10x10 meshes, which get doubled in VR.

      Also, the 'use cutting plane' option could probably be a static switch instead of an IF statement if it's something that doesn't get toggled at runtime.

      BTW, 66 is the vertex shader instruction count, which won't be a big deal for such low-poly geo; 66 is nothing for a vertex shader. Vertex shader cost doesn't depend on how big the objects are on screen, just on how many verts are rendered, whereas pixel shader cost is what scales with screen size and overdraw.

      Your pixel shader cost is the 27 instructions listed. I am surprised this effect is so slow with that video card... is it that slow in the regular editor without VR?

      I noticed your FPS is only 45 even when not looking at the effect which indicates you don't have your scene optimized for VR. It should be hitting 90 no problem when looking away. We are publishing some VR example stuff soon that should help.
      Last edited by RyanB; 08-05-2016, 12:03 AM.
      Ryan Brucks
      Principal Technical Artist, Epic Games



        #18
        For the plane equation, just store the distance from the origin and the normal. Then you can test which side of the plane a point is on a bit more cheaply. Also, pre-normalize the normal on the CPU. Then it's just: dot(normal, point) < distance. Very cheap. You can also get rid of the "use cutting plane" branch by using a zero vector as the plane normal. You can also premultiply color and alpha.
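        A small C++ sketch of what this looks like (my own illustration of the idea; the struct and function names are made up, and in a material you'd build the same dot/compare out of nodes):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Plane in general form: a pre-normalized normal plus a signed distance
// from the origin (4 scalars instead of the 6 of point + normal).
struct Plane {
    Vec3 normal;    // normalized once on the CPU
    float distance; // dot(normal, anyPointOnPlane)
};

// Build the plane once on the CPU from a point and an (unnormalized) normal.
Plane MakePlane(Vec3 pointOnPlane, Vec3 n) {
    float len = std::sqrt(Dot(n, n));
    Vec3 unit = { n.x / len, n.y / len, n.z / len };
    return { unit, Dot(unit, pointOnPlane) };
}

// Per-pixel test: a single dot product and a compare.
bool IsBelowPlane(const Plane& p, Vec3 worldPosition) {
    return Dot(p.normal, worldPosition) < p.distance;
}
```

        For example, a plane through (0,0,5) facing +Z classifies (1,1,4) as below and (1,1,6) as above; the per-pixel work is just the one dot and compare.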



          #19
          RE: Jenny Gore
          I store the normal and the vector from the origin. I do not see how I can pre-normalize the vector I use to calculate the dot product, because I have to generate it for each point in world space on the mesh, which I do not know in advance. Further, if I use just the distance I have no way to determine where the cutting plane is in space; I just know that it is somewhere on a sphere with a radius of that length. However, your last suggestion I can try. It makes sense and is a great idea, I think. If I am missing something or misunderstood you, please correct me.

          Thanks.

          RE: RyanB
          In the editor I get 110 frames when looking at nothing and 45 when looking at the voxels. I get 80-90 in the standalone editor when looking at the voxels. In VR preview I get 90 when looking away and 45 when I look at the voxels. I was recording at the time and the editor was up in the background. The GPU profile shows two major costs in my frame: HZB SetupMips Mips 1 and direct light. The translucency takes about a third of what those two took. I don't know what the former means, though.
          I have no idea how to optimize for VR. I can optimize in general, but I am not sure what to do for VR specifically.

          Thanks.
          Last edited by NoobsDeSroobs; 08-05-2016, 05:27 AM.



            #20
            You are using the plane's standard (point-normal) form. What I suggest is to use the general form. This is usually the superior format in terms of both performance (the test is just a single dot product) and storage (4 scalars vs 6).
            Maybe this video is helpful.
            https://www.youtube.com/watch?v=zA0A4iqUycY



              #21
              Hi all

              Here are my findings on what I've discovered in trying to do volume rendering in UE4, and the approaches I've tried. All use some kind of UV indexing into a 2D texture, exactly how @RyanB describes it. I'd derived all that stuff myself with modulus etc., so I want to check whether Ryan's new nodes in 4.13 (they missed 4.12) are doing the same but faster. I haven't bothered with the two z interpolations but am aware they're needed; it's just been on the back burner. Check out my YouTube videos for all the things I've tried.


              1) Using GPU particles to fill a bounded area and index into the 2D texture.
              Pros: easy to set up and get a lot of particles to fill the space; works with Fourier opacity maps by default to get self- and environment shadowing.
              Cons: particles are placed randomly (good and bad; it breaks up patterns, but placement isn't consistent), and density doesn't adapt to how many particles there are (you could link them, but it's hard to relate density to the spacing between particles).

              https://youtu.be/x_N1MpElRQY

              2) Half-angle slicing. Required a C++ plugin to generate a procedural mesh that renders back to front, to avoid deferred-renderer transparency ordering issues. (@RyanB, someone at Epic has the code, I believe, if you want to look it over.) Could probably be done in BP, but I'm worried about performance.
              Pros: good, even distribution of geometry; you can relate the number of slices to density in the material so changing the count doesn't lower or raise the apparent density.
              Cons: had serious issues with Fourier opacity maps and the bounds of the mesh. @DanielW wasn't sure of the cause but thought it might be a bug. IMHO opacity maps don't give a good look for volumes anyway, even when they did work with particles. It requires inputting a light angle from the scene to change slice direction, and the method is prone to a 'pop' when you go beyond the angle threshold. A large number of these items slows the game thread down because of generating that much vertex data and uploading it to the GPU, but you probably don't want that many anyway.

              https://youtu.be/BgxHYqcoNbI

              3) Ray marching. Nice results.
              Pros: everything @RyanB said.
              Cons: can be expensive with shadows, as said before, but read on below...

              no video for that yet.


              So now my mind has been coming back to this and I want to allocate some time to finishing what I think will be a good solution. Basically the idea is to combine slicing with raymarched shadows. The cost will be, as Ryan said, O(Density Samples + Shadow Samples), as the initial density is taken care of by the layering of the geometry and becomes a simple compositing job for the renderer. Each pixel then does its own shadow samples with raymarching. From my earlier ray marching, the initial density step was pretty fast (on an NVIDIA GTX 770). The shadow samples slowed it right down, so I was limited to ~16, as Ryan also discovered. Breaking it down into two linear-cost steps would really make things doable. I'm kind of excited about it, actually.
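              To spell out the cost argument in code (a trivial sketch of my own; D and S are per-pixel sample counts):

```cpp
#include <cassert>

// Nested ray march: every density step marches its own shadow ray.
long NestedSampleCount(long densitySteps, long shadowSteps) {
    return densitySteps * shadowSteps; // O(D * S)
}

// Sliced density + per-pixel shadow march: the geometry layering pays
// for density, so shadow work adds instead of multiplying.
long SlicedSampleCount(long densitySteps, long shadowSteps) {
    return densitySteps + shadowSteps; // O(D + S)
}
```

              With 64 density steps and 16 shadow steps that's 1024 samples nested versus 80 split, which is why two linear passes feel doable.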

              In terms of content, I was using Maya, rendering Maya fluids through an orthographic camera over 512 frames or so and moving the clipping planes every frame by 1/512 of the width, then storing that in a 2D texture made with TexturePacker. I was even able to pack animation into a texture: I think I fit 32 frames of 128^3 voxel data into an 8k texture. With 3 or 4 channels that could be 96 or 128 frames of monochrome (the indexing math is harder and more expensive, but doable). That was this....
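              The modulus indexing for that kind of packed pseudo-volume looks roughly like this (a C++ sketch of my own; the function and parameter names are made up, but the tiling math matches the 32-frames-of-128^3-in-8k layout above):

```cpp
#include <cassert>

// A 128^3 volume stored as 128 z-slices tiled into one 2D texture.
// An 8k (8192 px) texture holds 64 x 64 tiles of 128 px = 4096 slices,
// i.e. 32 animation frames of 128 slices each per channel.
struct TexelCoord { int u, v; };

TexelCoord VoxelToTexel(int x, int y, int z, int frame,
                        int sliceRes,     // e.g. 128
                        int tilesPerRow)  // e.g. 8192 / 128 = 64
{
    int slice = frame * sliceRes + z;  // global slice index
    int tileX = slice % tilesPerRow;   // column of the tile
    int tileY = slice / tilesPerRow;   // row of the tile
    return { tileX * sliceRes + x, tileY * sliceRes + y };
}
```

              For example, voxel (5, 7, 0) of frame 1 lands at slice 128, which is tile (0, 2), i.e. texel (5, 263).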

              https://youtu.be/O6c5QC0lQuU

              @NoobsDeSroobs, feel free to email me or add me on gchat. I'm planning a livestream soon to revisit this, and we could work on it together.
              Last edited by dokipen; 08-05-2016, 08:44 AM.
              Visual Effects Artist, Weta Digital, Wellington New Zealand
              BLOG www.danielelliott.co.uk
              @danielelliott3d https://twitter.com/danielelliott3d
              Unreal Engine and VFX Tutorials https://www.youtube.com/user/DokipenTechTutorials
              2015 Showreel: https://vimeo.com/116917817



                #22
                Re: NoobsDeSroobs and JennyGore:

                You should also be able to remove the normalizes, since you are just testing the sign of the result and that won't change with vector length (maybe that's the same thing Jenny's link pointed out, but I didn't watch it). Also, you could try moving that calculation to the CustomUVs so it's all done on the vertex shader. It might save something. It might add some distortion to the result, but should be OK.
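                To see why the normalize can go (a tiny C++ sketch of my own): normalizing only rescales a vector by a positive factor, so the sign of a dot product against it never flips.

```cpp
#include <cassert>

// Which-side test without normalizing: only the sign of the dot
// product matters, and scaling the normal by any positive factor
// (which is all normalize does) cannot change that sign.
float Dot3(const float a[3], const float b[3]) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

bool SameSideAsNormal(const float normal[3], const float toPoint[3]) {
    return Dot3(normal, toPoint) >= 0.0f; // no normalize needed
}
```

                The test gives identical answers for a normal and any positive multiple of it, so those instructions are pure savings here.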

                Re: dokipen:

                Everything I have read about half-angle slicing also talks about using geometry slices, but from what I can tell you don't need them; you could treat it just like any other ray marcher as long as you structure the trace in the right order, alternating between lighting and density samples in a way that shares them. Maybe the slices are just there to make it a bit faster to index into the correct locations?

                From my understanding, the biggest drawback to half-angle slicing over regular ray marching is that it doesn't scale super well, since you would need two more buffers for each instance in the world, as each would have a different slice angle from different viewing angles. Technically shadow volumes have the same limitation, where you'd need a different volume for each instance with a different rotation, but you could at least share unrotated instances.

                The nice thing about regular old ray marching is that even though it is slow, it is fairly simple and doesn't have many issues with scaling other than the brute force cost.
                Last edited by RyanB; 08-05-2016, 10:28 AM.
                Ryan Brucks
                Principal Technical Artist, Epic Games



                  #23
                  Originally posted by Jenny Gore View Post
                  What I suggest is to use the plane's general form. This is usually the superior format in terms of performance (the test is just a single dot product).
                  I am unable to understand how I can speed it up using the general form. I have to generate the point vector anyway, so I am not sure what I would save. Could you maybe show me an example? If I had a screenshot or example code I might understand it better. Thanks.

                  Originally posted by RyanB View Post
                  Also, you could try moving that calculation to the CustomUVs so its all done on the vertex shader.
                  Custom UVs? How can I use those? I read this, but I don't see how I can access the custom UV, nor what exactly it does.

                  RE: dokipen
                  I am looking through your stuff now. Thanks a lot. There is a lot to read up on, but so far it is very helpful. As for your offer, I would be honoured to work with you, but there are two small problems. First, I was unable to find your email and I am not sure I know what gchat is. Second, I am quite new to the implementation and this kind of usage of UE4, so I am not sure how much help I can actually be. If you still would not mind, I would love to cooperate; that way I will learn more and be able to create a better system.
                  Last edited by NoobsDeSroobs; 08-06-2016, 11:27 AM.



                    #24
                    Custom UVs means performing operations on the vertex shader instead of the pixel shader. Go through and read that whole page you linked carefully; it explains all of it, including how to use it.
                    Ryan Brucks
                    Principal Technical Artist, Epic Games



                      #25
                      When your plane is stored in general form, the whole test is just:
                      Code:
                      bool whichSide = dot(planeNormal, worldPosition) < planeDistanceFromOrigin;



                        #26
                        Originally posted by RyanB View Post
                        Re: NoobsDeSroobs and JennyGore:


                        Re: dokipen:

                        Everything I have read about half angle slicing also talks about using geometry slices but from what I can tell, you don't need the geometry slices and you could treat it just like any other ray marcher as long as you structure the trace to do it in the right order where it alternates between lighting and density samples in a way that shares them. Maybe its just to make it a bit faster to index into the correct locations?

                        From my understanding, the biggest drawback to half angle slicing over regular ray marching is that it doesn't scale super well since you would need two more buffers for each instance in the world since each would have a different slice angle from different viewing angles. Technically shadow volumes have the same limitation where you'd need a different volume for each instance that has a different rotation but you could at least share un rotated instances.

                        The nice thing about regular old ray marching is that even though it is slow, it is fairly simple and doesn't have many issues with scaling other than the brute force cost.
                        I'd really like to hear your ideas about how to share the samples to mimic the slicing in a raymarch. I haven't seen that done anywhere before, although I can't quite see how it would be done in a pixel shader unless some kind of buffer is used across multiple passes.

                        That's true if you are using the half-angle slicing technique. What I am proposing is to not actually do half-angle slicing for the shadows, but to do slicing for the density pass only (in that case you don't even need half-angle, just slice towards the camera) and then do the shadows by raymarching. If raymarching, you have to do shadow samples in between every density sample. If I do the slicing, the density sampling is done up front, and overdraw and shadow samples are the main cost. In that case each shadow sample would cost about the same as a density sample. This matches the big-O cost like you say.

                        I'm close to testing. I'm actually first re-creating my slicing code in Blueprint (mainly out of curiosity to see how it performs, especially when it comes to nativizing it, and also because I want to be able to share this around without users having to compile plugins/modules etc.). I'm up to the point where I've detected start and end intersection points and have written the box/plane intersection function. Now I just need to build the vertices and indices.
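                        For anyone following along, that box/plane intersection step can be sketched like this (my own C++ illustration of the standard edge-crossing approach, not dokipen's actual Blueprint code):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct V3 { float x, y, z; };

static float Dot(const V3& a, const V3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Signed distance of a point from a plane with unit normal n, offset d.
static float PlaneSide(const V3& p, const V3& n, float d) {
    return Dot(n, p) - d;
}

// Intersect a slice plane with the 12 edges of a unit box.
// Returns the slice polygon's vertices (3 to 6 points, unsorted).
std::vector<V3> SliceBox(const V3& n, float d) {
    static const V3 c[8] = {
        {0,0,0},{1,0,0},{0,1,0},{1,1,0},{0,0,1},{1,0,1},{0,1,1},{1,1,1}
    };
    static const int edges[12][2] = {
        {0,1},{0,2},{1,3},{2,3},  // bottom face
        {4,5},{4,6},{5,7},{6,7},  // top face
        {0,4},{1,5},{2,6},{3,7}   // vertical edges
    };
    std::vector<V3> poly;
    for (auto& e : edges) {
        float s0 = PlaneSide(c[e[0]], n, d);
        float s1 = PlaneSide(c[e[1]], n, d);
        if ((s0 < 0) != (s1 < 0)) {   // edge crosses the plane
            float t = s0 / (s0 - s1); // lerp factor along the edge
            poly.push_back({ c[e[0]].x + t * (c[e[1]].x - c[e[0]].x),
                             c[e[0]].y + t * (c[e[1]].y - c[e[0]].y),
                             c[e[0]].z + t * (c[e[1]].z - c[e[0]].z) });
        }
    }
    return poly; // sort by angle around the centroid before triangulating
}
```

                        The remaining work is exactly the step dokipen mentions: order those points around their centroid and emit a triangle fan's vertices and indices per slice.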

                        [Attached screenshot: UE4Editor_2016-08-07_23-49-26.jpg]

                        You're right in that there is no opportunity for early exit. I'll get this working and see what the difference is.
                        Visual Effects Artist, Weta Digital, Wellington New Zealand
                        BLOG www.danielelliott.co.uk
                        @danielelliott3d https://twitter.com/danielelliott3d
                        Unreal Engine and VFX Tutorials https://www.youtube.com/user/DokipenTechTutorials
                        2015 Showreel: https://vimeo.com/116917817



                          #27
                          That should work. You are basically talking about flipping the shadow volume behavior. The one downside to that method is it will require another large render target to store the density slices. It will have to be the same size as the original volume texture to avoid losing resolution. Also you would require a separate sliced volume texture for each instance in the world so it wouldn't be as flexible as regular ray marching. But the good news is you will be able to compute the shadows in parallel instead of as nested steps and it should avoid the shadow volume edge bleeding artifacts.

                          Layering your slices will bring back some of the cost as overdraw, so it is hard to predict exactly how it will perform compared to regular ray marching. It won't be as fast as half-angle slicing, though, because it still doesn't share shadow samples between slices, which means the deeper samples cost more than the ones near the edge of the volume, and more repeated steps are taken.

                          I think shadow volumes will be a bit easier to start with, so I may try an experiment with them soon. Basically you just precompute the light energy received by each voxel, which is costly once, but from then on it's just a single sample read to get it.
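                          A sketch of that precompute for a single directional light shining down the volume's Z axis (my own C++ illustration; `BakeShadowVolume` and its layout are made up, not Epic's code):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Bake a shadow volume: for each voxel, accumulate the density between
// it and the lit face, then store transmittance exp(-sum * extinction).
// One costly pass; afterwards lighting is a single read per sample.
// Density is laid out as index = (z * res + y) * res + x.
std::vector<float> BakeShadowVolume(const std::vector<float>& density,
                                    int res, float extinction) {
    std::vector<float> shadow(density.size(), 1.0f);
    for (int x = 0; x < res; ++x)
        for (int y = 0; y < res; ++y) {
            float accum = 0.0f;
            // Walk each column from the lit face (z = res - 1) down.
            for (int z = res - 1; z >= 0; --z) {
                int i = (z * res + y) * res + x;
                shadow[i] = std::exp(-accum * extinction);
                accum += density[i]; // occlusion from voxels above
            }
        }
    return shadow;
}
```

                          As Ryan notes, the bake is per light direction and per rotation, so rotated instances each need their own volume while identically-oriented ones can share.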
                          Last edited by RyanB; 08-08-2016, 10:25 AM.
                          Ryan Brucks
                          Principal Technical Artist, Epic Games



                            #28
                            Originally posted by RyanB View Post
                            I think shadow volumes will be a bit easier to start with so I may try an experiment with them soon. Basically you just precompute the light energy received for each voxel which is costly once, but from then on its just single sample reads to get it.
                            Can you compute shadows with this approach? If you precompute the light intensity for each voxel, you still need to compute how much is occluded by other filled voxels. Or are you thinking about just the shading of the voxel itself, viewed in a vacuum?



                              #29
                              Shadow volumes are one of the most basic and common forms of lighting and shadowing volumetric effects.

                              No, it is not shading the voxel in a vacuum; it takes into account the loss of transmission through all other voxels between the current voxel and the light. If you had rotated instances, each rotated instance would have to have its own shadow volume. If they all had the same rotation, they could share one.
                              Ryan Brucks
                              Principal Technical Artist, Epic Games



                                #30
                                I haven't read through all the replies here, but I did implement volume rendering in UE for our VR experience, Allumette. I wrote a shader that ray marched some voxel grids I exported from Houdini. I baked in the lighting, so there are no shadow rays to march, which made it feasible. The cloudscape was really really big so I had to use some tricks with empty space traversal in order to get it running at framerate with a reasonable number of steps. I presented this at DigiPro this year:
                                http://dl.acm.org/citation.cfm?id=2947699

                                If you can't get access to that paper, send me a message, and I can try to get you a preprint of the paper.
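                                The empty-space traversal idea can be sketched like this (a 1D simplification in C++ of my own, not the Allumette implementation; the same per-block skip extends to 3D):

```cpp
#include <cassert>
#include <vector>

// Empty-space skipping: a coarse occupancy grid marks which blocks of
// the volume contain any density. The marcher jumps over an entire
// block when it is empty and only takes fine steps inside occupied
// blocks, cutting the sample count in mostly-empty cloudscapes.
int CountSamples(const std::vector<bool>& blockOccupied, int blockSize) {
    int samples = 0;
    int pos = 0;
    int end = static_cast<int>(blockOccupied.size()) * blockSize;
    while (pos < end) {
        int block = pos / blockSize;
        if (!blockOccupied[block]) {
            pos = (block + 1) * blockSize; // skip empty block, no samples
        } else {
            ++samples; // fine step inside an occupied block
            ++pos;
        }
    }
    return samples;
}
```

                                With half the blocks empty, the marcher takes half the samples of a naive fixed-step march, which is roughly how a big, sparse cloudscape stays at framerate.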

                                Also, if you are interested in volume rendering as a topic, I recommend checking out some of the notes from these SIGGRAPH courses (I helped out the first year in 2010).
                                http://magnuswrenninge.com/productionvolumerendering

                                Also, Magnus Wrenninge (who helped organize those courses) wrote a book on volume rendering that has a ton of good info:
                                https://www.crcpress.com/Production-.../9781568817248
