
Anisotropy material


    #16
    Originally posted by stororokw
    I think you can get anisotropic highlights by writing a custom shading model. You just need to figure out how to fit things in the GBuffer.

    Ward's anisotropic model
    https://en.wikibooks.org/wiki/GLSL_P.../Brushed_Metal

    Far cry 4 - slides 12-21
    https://www.gdcvault.com/play/102223...e-World-of-Far

    These ones are for unreal engine
    http://oliverm-h.blogspot.com/2013/0...materials.html
    https://docs.unrealengine.com/udk/Th...cLighting.html
    Yeah, the problem there is that UE4's cubemaps don't support anisotropic materials. No engine's do right now. So you'd be stuck with entirely real-time lighting.



      #17
      I removed my previous posts because I felt that I was just repeating the same thing over and over again. To simplify things:

      If you are trying to compute the anisotropy inside the material, you'll need the light vector, which you cannot obtain until after the GBuffer phase is over, unless you run a second lighting pass during the GBuffer phase (it doesn't have to be a complete pass; it can be heavily simplified if needed). In that case, you'll hog a lot more cycles.

      If you're trying to compute the anisotropy during the lighting phase, you'll need extra GBuffer information in order to compute it correctly. In that case, you'll hog a lot more bandwidth.

      If you don't want to do either, you'll have to run the shader in forward. In that case, you'll pay a very high cost if your model is lit by many light sources.

      But here's a good read on what I was talking about earlier with the two lighting passes (if you want to stick with deferred and don't want to add additional GBuffer channels):
      http://advances.realtimerendering.co...%20Course).pdf
      Read around page 78.

      This is why I suggested just doing it with an MPC (Material Parameter Collection); it's a half-solution. That way the material would have some of the light's information and could calculate things, without having to run a second lighting pass and without taking the fat GBuffer route. You'd have to run some level Blueprint math to track key lights around the player and average out things like the light vectors, intensities, falloffs, and so on. So you'd be offloading a little bit of work to the CPU, and yes, it can be a little slow at updating the materials this way, but it would probably work.
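      In case it helps, here's a rough sketch of the kind of averaging that level Blueprint would be doing, written in Python just to illustrate the math. The quadratic falloff formula and the light values are made-up assumptions, not engine code:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def average_key_lights(lights):
    # Weight each light's direction by intensity times a simple
    # quadratic distance falloff (an assumption, not UE4's falloff),
    # then renormalize. The result would be written into a Material
    # Parameter Collection from the level Blueprint each tick.
    acc = [0.0, 0.0, 0.0]
    total = 0.0
    for direction, intensity, distance, radius in lights:
        falloff = max(0.0, 1.0 - distance / radius) ** 2
        weight = intensity * falloff
        d = normalize(direction)
        for i in range(3):
            acc[i] += weight * d[i]
        total += weight
    return normalize(acc), total

# Hypothetical scene: a bright light overhead, a dimmer one far to the side.
avg_dir, avg_intensity = average_key_lights([
    ((0.0, 0.0, 1.0), 10.0, 100.0, 1000.0),
    ((1.0, 0.0, 0.0), 5.0, 800.0, 1000.0),
])
```

      You'd push the averaged direction and intensity into the MPC each tick, so every material reading the collection sees the same approximate key light.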



        #18
        Started playing around with implementing anisotropic highlights, varying the roughness.


        Frenetic Pony:
        I am going to try and use a modified reflection direction, a method used in far cry 4.

        Ironic Paradox:
        Thanks for the insight. The two-lighting-pass approach looks interesting, but it looks too complicated for me. Maybe I will try out the material approach later.



          #19
          The problem with the FC4 method is that they use a really unusual format for their normals. They pack the normal, binormal and tangent into a 10:10:10:2 format by converting back and forth through quaternions with some filtering. It eats a ton of extra cycles and bandwidth and is pretty lossy, but it gives them the information they need for the lighting pass (it's method #2 on my list from the last post). It's pretty much as if you had to run three separate GBuffer normal passes, except they pack three sets of normals inside 32 bits to do it all in one normal pass, instead of one set of normals in 24.
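          To get a feel for what that packing costs, here's a small Python sketch of a "smallest three" 10:10:10:2 quaternion encoding. The bit layout here is my own assumption for illustration, not Far Cry 4's exact format:

```python
import math

def pack_quat_10_10_10_2(q):
    # Smallest-three encoding: drop the largest-magnitude component
    # (its sign is forced positive; q and -q are the same rotation),
    # store the other three in 10 bits each over [-1/sqrt(2), 1/sqrt(2)],
    # and use the 2 spare bits to remember which one was dropped.
    idx = max(range(4), key=lambda i: abs(q[i]))
    sign = 1.0 if q[idx] >= 0.0 else -1.0
    q = [sign * c for c in q]
    rest = [q[i] for i in range(4) if i != idx]
    lim = 1.0 / math.sqrt(2.0)
    packed = idx  # low 2 bits: index of the dropped component
    for k, c in enumerate(rest):
        u = round((c / lim * 0.5 + 0.5) * 1023)  # quantize to 10 bits
        packed |= max(0, min(1023, u)) << (2 + 10 * k)
    return packed

def unpack_quat_10_10_10_2(packed):
    idx = packed & 0x3
    lim = 1.0 / math.sqrt(2.0)
    rest = []
    for k in range(3):
        u = (packed >> (2 + 10 * k)) & 0x3FF
        rest.append((u / 1023 * 2.0 - 1.0) * lim)
    # Reconstruct the dropped component from unit length.
    big = math.sqrt(max(0.0, 1.0 - sum(c * c for c in rest)))
    return rest[:idx] + [big] + rest[idx:]
```

          You can see the lossiness directly: each stored component only has 10 bits of precision, and any overall sign flip of the quaternion is discarded.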

          If you're trying to dive into it, I'd suggest checking out the BRDF.ush in the shaders folder. Within it, look for the anisotropic ggx section.

          Code:
          // Anisotropic GGX
          // [Burley 2012, "Physically-Based Shading at Disney"]
          float D_GGXaniso( float RoughnessX, float RoughnessY, float NoH, float3 H, float3 X, float3 Y )
          {
              float ax = RoughnessX * RoughnessX;
              float ay = RoughnessY * RoughnessY;
              float XoH = dot( X, H );
              float YoH = dot( Y, H );
              float d = XoH*XoH / (ax*ax) + YoH*YoH / (ay*ay) + NoH*NoH;
              return 1 / ( PI * ax*ay * d*d );
          }
          You'd have to do some digging around to verify, but:
          H is the half vector between the light and the view, so it should be something like H = normalize(L + V), or whatever the variables are labelled as
          N would be your normal
          NoH would be saturate(dot(N, H))
          RoughnessX/Y would be your surface roughness values along the tangent/binormal
          X/Y would be your tangent/binormal vectors
          That should be the basic gist.

          Something along those lines. You'd have to edit the material so that its roughness input takes a vector2 (making sure to assign the channels to the right variables), as well as deal with the normal nonsense.
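          If you want to sanity-check the distribution outside the engine, here is a direct Python transcription of the HLSL above, plus the isotropic GGX it should collapse to when both roughness values are equal (the test values are hypothetical):

```python
import math

def d_ggx_aniso(roughness_x, roughness_y, n_o_h, h, x, y):
    # Straight port of the HLSL D_GGXaniso. x/y are the tangent and
    # binormal vectors; roughness_x/y are the roughness values along
    # them (Disney's alpha = roughness^2 remapping).
    ax = roughness_x * roughness_x
    ay = roughness_y * roughness_y
    dot3 = lambda a, b: sum(p * q for p, q in zip(a, b))
    xoh = dot3(x, h)
    yoh = dot3(y, h)
    d = xoh * xoh / (ax * ax) + yoh * yoh / (ay * ay) + n_o_h * n_o_h
    return 1.0 / (math.pi * ax * ay * d * d)

def d_ggx_iso(roughness, n_o_h):
    # Standard isotropic GGX/Trowbridge-Reitz, for comparison: with
    # RoughnessX == RoughnessY the anisotropic form reduces to this.
    a = roughness * roughness
    a2 = a * a
    d = n_o_h * n_o_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)
```

          Feeding both functions the same roughness and a unit half vector expressed in the tangent frame should give matching results, which is a handy correctness check after any GBuffer repacking.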

          And one more thing: the main reason you need all three basis vectors is that a regular normal just points outward, and its tangent/binormal would be undefined. Even if you had two of the three, the third could exist in two positions (left or right, 90 degrees from its counterpart). That's why you need all three: so the frame can "spin" on the normal and give angular directionality to the anisotropy. Deferred rendering only sees the pixels in the buffers and no longer has access to the models to read their tangents/binormals, hence the three sets of normals. This is why anisotropy is usually just done in forward rendering.
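          The "two positions" ambiguity is easy to see with a cross product: given only the normal and tangent, the binormal is cross(N, T) up to sign, which is why compact tangent formats usually carry a handedness bit. A tiny Python illustration with made-up axis-aligned vectors:

```python
def cross(a, b):
    # Standard 3D cross product.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

n = (0.0, 0.0, 1.0)  # normal
t = (1.0, 0.0, 0.0)  # tangent
b_right = cross(n, t)                 # one candidate binormal
b_left = tuple(-c for c in b_right)   # the other, also 90 degrees from both
```

          Both candidates are perpendicular to N and T, so without extra information (a third vector or a sign bit) the shader cannot tell which one the mesh intended.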
          Last edited by IronicParadox; 01-17-2018, 03:52 AM.



            #20
            Ironic Paradox:
            Thank you for the information. I was using Ward's model in the first image of my previous post. I am curious where they use this D_GGXaniso function.

            As far as parameterization goes, I see that some offline renderers use an anisotropy value plus an 'angle/rotation' map. Do you know anything about that? I think it would be easier to create grayscale maps.
            http://support.nextlimit.com/display...ace+Properties
            https://blenderartists.org/forum/sho...for-anisotropy

            I switched to the GGX anisotropic function.
            Last edited by stororokw; 01-17-2018, 06:02 AM.



              #21
              About the first part, my guess is they just keep a reference compendium of all the major shading models so that they can switch between them whenever they want.

              I'm going to take a stab at this, even though I don't 100% know the specifics, so bear with me... Yeah, they are technically grayscale maps, but you need two channels, just like a flow map. You still need a channel for both the tangent and the binormal because, like I said earlier, if you only had one, the other could be 90 degrees to the left or the right. As far as I can tell, most anisotropic maps just put the tangent/binormal into XY UV space, rather than baking out three normal maps, one per basis vector.

              I'm assuming a value of 128/128 would be your neutral, zero would be -180 degrees, and one would be +180 degrees. If you shift one channel, you need to shift the second by the same amount (picture rotating a 3-axis gizmo around the Z axis: X/Y have to stay 90 degrees apart from each other and from the Z axis). It might be possible to simplify the second channel into a 0/1 sign channel saying whether the binormal is to the left or right of the tangent, but then you'd still need some math to take the tangent channel's value and apply the +/- 90 degree offset.
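              Decoding a single-channel angle map along those lines might look like the Python sketch below. The 0.5-is-neutral mapping is the assumption from above, and the rotation is plain Rodrigues math rather than anything engine-specific:

```python
import math

def rotate_about_axis(v, axis, angle):
    # Rodrigues' rotation of v around a unit axis.
    c, s = math.cos(angle), math.sin(angle)
    dot = sum(a * b for a, b in zip(axis, v))
    crs = (axis[1] * v[2] - axis[2] * v[1],
           axis[2] * v[0] - axis[0] * v[2],
           axis[0] * v[1] - axis[1] * v[0])
    return tuple(v[i] * c + crs[i] * s + axis[i] * dot * (1.0 - c)
                 for i in range(3))

def decode_aniso_angle(texel, normal, tangent):
    # Map texel 0..1 to -180..+180 degrees (0.5 = neutral) and spin
    # the mesh tangent around the normal; the binormal follows at
    # 90 degrees, so a single grayscale channel is enough.
    angle = (texel - 0.5) * 2.0 * math.pi
    t = rotate_about_axis(tangent, normal, angle)
    b = rotate_about_axis(t, normal, 0.5 * math.pi)
    return t, b

# Texel 0.75 = +90 degrees: the tangent swings from +X to +Y.
t, b = decode_aniso_angle(0.75, (0.0, 0.0, 1.0), (1.0, 0.0, 0.0))
```

              Because the binormal is derived from the rotated tangent, the two channels can never drift out of their 90-degree relationship, which is exactly the constraint described above.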

              Edit: Oh, and I'm not sure whether you can have non-90-degree tangents/binormals in the anisotropy texture, but you might be able to get some effects out of it. The shader will evaluate both anyway and might produce something interesting. Again, I've never really played around with them.
              Last edited by IronicParadox; 01-17-2018, 05:06 PM.



                #22
                Thanks, IronicParadox. It seems to be working. I changed it so that black is 0 degrees and white is 360; that probably makes more sense.



                  #23
                  So I have come across a problem with the tangents at UV seams. They don't appear to match up even though they share the same edge.
                  I am using the static sphere mesh provided by the engine. They seem to diverge towards the poles (red = tangent, blue = bitangent, green = normal).
                  This causes a discontinuity in the shading when using the tangents.
                  Can anyone explain why this is happening, and a possible fix?



                    #24
                    That could mean the UVs are not properly done. You might want to create the mesh and unwrap the UVs yourself to confirm that this is the issue.
                    Nilson Lima
                    Technical Director @ Rigel Studios Ltda - twitter: @RigelStudios



                      #25
                      That is mesh-specific.



                        #26
                        Thank you both for the quick replies. That seems to be the case; I wasted hours thinking it was on my end. The fix is to change the tangent smoothing angle to 180.
                        I will post a quote from Maya's somewhat obscure docs:
                        Tangent Smoothing Angle
                        Specifies an angle below which tangents are smoothed. The default setting of 0 leaves the tangent space along UV borders and mirrored edges unsmoothed. Increasing this value smooths those regions which may help to remove artifacts in bump maps, normal maps and other advanced lighting models caused by tangent space seams.
                        https://knowledge.autodesk.com/suppo...98BC2-htm.html



                          #27
                          I made some progress. It now uses the modified reflection direction for the reflections, and I changed to Disney's mapping for anisotropy.
                          https://seblagarde.wordpress.com/201...-where-are-we/
                          https://rmanwiki.pixar.com/display/REN/PxrTangentField
                          Some tests:



                            #28
                            Looks interesting. Any working results so far?



                              #29
                              Nice progress, keep up the good work!



                                #30
                                This is awesome! Subbed to this thread, looking forward to seeing how this progresses!
