
Averaging Normals Along Intersection of Two Meshes


    How's it going, UE community? I am currently developing on 4.18 and trying to take advantage of the "DistanceToNearestSurface" node. I have been referencing two attempts that I found through internet searches.
    http://polycount.com/discussion/1811...rs-battlefront
    https://www.artstation.com/artwork/9zRna

    I have been trying to average either the pixels or the pixel normals along the intersection of the meshes. Unfortunately, I have no idea what formulas to even start putting together to achieve this.
    Some ideas I have for achieving this, but don't know how to implement, are as follows...
    1. Get the pixels along the intersection of both meshes and average the color information and/or the pixel normal information.
    2. Tessellate the mesh and move the vertices along the hypotenuse between both meshes.
    3. Use some other method that achieves this without making the material Translucent.
    It seems that this is something a lot of people want, so I figured I'd make a topic on it so everyone could contribute to a solution...
    Thanks in advance for your contributions, and hopefully some folks from Epic might even chime in!
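    Idea 1 above, averaging the normals near the seam, can be sketched outside the material editor. Here is a minimal Python sketch of the core math, assuming we already have each surface's world-space unit normal and a blend weight that ramps toward 0.5 near the intersection (all names are illustrative, not UE4 API):

    ```python
    import math

    def normalize(v):
        """Return the unit-length version of a 3-vector."""
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def blend_normals(n_mesh, n_other, weight):
        """Linearly interpolate two unit normals and renormalize.

        weight = 0 keeps the mesh's own normal; weight = 1 fully
        adopts the other surface's normal. Near the intersection
        the weight should approach 0.5 so both surfaces shade alike.
        """
        mixed = tuple(a * (1.0 - weight) + b * weight
                      for a, b in zip(n_mesh, n_other))
        return normalize(mixed)

    # Example: a rock normal leaning sideways, blended halfway
    # toward the ground's straight-up normal.
    rock = normalize((1.0, 0.0, 1.0))
    ground = (0.0, 0.0, 1.0)
    print(blend_normals(rock, ground, 0.5))
    ```

    In a material this lerp-and-renormalize would happen per pixel; the interesting part, as the replies below discuss, is where the weight comes from.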

    #2
    Hello,

    It's normal that the blend is not perfect. First you need to use triplanar UVs (WorldAlignedTexture) in both materials (floor and object) so they have the same "texture placement".
    Getting a perfect blend of the normals is trickier; I'll need to take a look at DistanceToNearestSurface to give you a better answer.
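    For reference, the "same texture placement" trick works because world-aligned (triplanar) UVs derive texture coordinates from world position instead of each mesh's own UV layout, so two overlapping meshes sample the texture identically. A rough Python sketch of the projection math (texture sampling itself is stubbed out; these are illustrative helpers, not UE4's implementation):

    ```python
    def triplanar_weights(normal, sharpness=4.0):
        """Blend weights for the X, Y and Z planar projections.

        Faces pointing along an axis take most of their texture
        from the projection along that axis; a higher sharpness
        tightens the transition between projections.
        """
        ax = [abs(c) ** sharpness for c in normal]
        total = sum(ax)
        return [a / total for a in ax]

    def triplanar_uvs(world_pos, tiling=0.01):
        """UVs for the three world-space projections.

        Because they depend only on world position, two different
        meshes get identical texture placement where they meet.
        """
        x, y, z = (c * tiling for c in world_pos)
        return (y, z), (x, z), (x, y)  # project along X, Y, Z
    ```

    The final color is the three planar samples mixed by these weights; UE4's WorldAlignedTexture material function packages the same idea as a node.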



      #3
      The material used on the ArtStation page that you linked was posted and can be found here.



        #4
        Thanks for contributing!
        @ausernottaken~I downloaded it before, and it didn't work as I intended. What I was aiming for was to avoid depth fade and opacity altogether; if I remember correctly, that material uses both.
        @erodann~I currently have a WorldAlignedTexture going, so I have a great transition, but it's the normals that give a "hard edge" where the meshes intersect. I'm in the middle of moving; when I get settled over the weekend, I'll look into the meta-ball project to possibly get that "merge effect" at the intersection...



          #5
          So, I guess the polycount thread and the ArtStation page are using two different methods. The ArtStation one is using transparency and TemporalAADither, and the one in the polycount thread is probably using DistanceFieldGradient. He never explains how he does the normal blending, though...

          The basics are quite simple: you normalize a DistanceFieldGradient, feed in a WorldPosition, and you get the distance field normal at that location. You also use the DistanceToNearestSurface node to create a mask at the intersections so you can decide where to use these blended normals. (This is great for all kinds of effects.)
          Distance field voxels are quite low-res, so you get a soft blend between the meshes. Note that I disabled distance field generation on a few of those meshes. That also works: they still receive normal influence from surrounding objects that have distance fields enabled, but they won't affect anything themselves. If you only want the ground to affect the rocks, for example, then disabling distance field generation on the rocks might be a good idea.
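          The graph described here amounts to: take the gradient of the global signed distance field at the pixel's world position, normalize it to get a "field normal", and lerp the mesh normal toward it using a mask built from the distance to the nearest surface. A Python sketch of that pipeline, with a toy SDF (a flat ground plane at z = 0) standing in for UE4's global distance field; all names and the 50-unit blend range are illustrative assumptions:

          ```python
          import math

          def ground_sdf(p):
              """Toy signed distance field: flat ground at z = 0."""
              return p[2]

          def sdf_gradient(sdf, p, eps=0.01):
              """Central-difference gradient of the distance field;
              normalized, it points away from the nearest surface,
              i.e. it is that surface's normal."""
              g = []
              for i in range(3):
                  hi = list(p); lo = list(p)
                  hi[i] += eps; lo[i] -= eps
                  g.append((sdf(hi) - sdf(lo)) / (2.0 * eps))
              length = math.sqrt(sum(c * c for c in g))
              return tuple(c / length for c in g)

          def intersection_mask(distance, blend_range=50.0):
              """1 right at the other surface, fading to 0 over
              blend_range (the DistanceToNearestSurface mask)."""
              t = max(0.0, min(1.0, distance / blend_range))
              return 1.0 - t

          def blended_normal(mesh_normal, p, sdf=ground_sdf):
              """Lerp the mesh normal toward the distance field
              normal near the surface, then renormalize."""
              field_n = sdf_gradient(sdf, p)
              w = intersection_mask(abs(sdf(p)))
              mixed = [a * (1.0 - w) + b * w
                       for a, b in zip(mesh_normal, field_n)]
              length = math.sqrt(sum(c * c for c in mixed))
              return tuple(c / length for c in mixed)

          # A rock pixel high above the ground keeps its own normal;
          # one touching the ground adopts the field normal.
          print(blended_normal((1.0, 0.0, 0.0), (0.0, 0.0, 100.0)))
          print(blended_normal((1.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
          ```

          In the actual material, DistanceFieldGradient and DistanceToNearestSurface replace the finite differences and the toy SDF; only the mask-and-lerp logic carries over directly.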
          I made a basic test. Here is a video.
          http://take.ms/v0nUZL

          It takes some work to set this up in a production environment and get it to work well in all cases. And depending on the meshes and how they are positioned, you might not always get as smooth a blend as you see in the video. But I guess it's a starting point for you.
          Here you have the graph.
          Attached Files
          Last edited by kjetilhj; 02-28-2018, 04:38 AM.



            #6
            @kjetilhj~That's awesome! So, a few questions...
            1. If I choose to use WorldAlignedTexture, would that help further blend the intersection?
            2. I was thinking of trying the UE4 bake-to-texture method, so all of this could just be a static texture replacing all the math currently involved... do you believe this will work?
            3. I do intend to use this to blend rocks into the landscape... do you think it's possible to make this a Blueprint and blend two assets and their own material instances together?

            Thanks for taking the time to break this down. I think I learn more from going through an existing material/Blueprint and reverse engineering it. I'll definitely use this as the foundation of the environment in an RPG title I'm developing. Thanks again for your help; there's a checklist of aesthetics I'm trying to accomplish with UE4, and the next feature is interactive(ish) fog!



              #7
              1. Using world-aligned UVs on the intersection texture is required; if you don't, you will definitely have a seam.

              2. Baking the terrain to a world-space normal map, using the world-space position to look up that texture, and sampling it the same way at the intersection would probably work, but you would have to rebake after each terrain adjustment. Also, you would still need some form of automatic seam detection. Using DistanceToNearestSurface is quite easy, so I would keep that part. Using the gradient, you can also get normal adjustment against any mesh, not just a terrain.
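              For what it's worth, the lookup half of the bake idea boils down to mapping the pixel's world XY into the baked texture's 0..1 UV range. A small Python sketch, assuming the terrain's normal map was baked top-down over a known world-space rectangle (the bounds here are hypothetical):

              ```python
              def world_to_bake_uv(world_xy, bake_min, bake_size):
                  """Map a world-space XY position into the 0..1 UV
                  range of a normal map baked top-down over the
                  rectangle [bake_min, bake_min + bake_size]. Any
                  terrain edit means rebaking, as noted above."""
                  return tuple((w - mn) / sz
                               for w, mn, sz in zip(world_xy, bake_min, bake_size))

              # A point in the middle of a 4096x4096-unit bake region
              # starting at the origin lands at UV (0.5, 0.5).
              print(world_to_bake_uv((2048.0, 2048.0),
                                     (0.0, 0.0),
                                     (4096.0, 4096.0)))  # → (0.5, 0.5)
              ```

              Both the terrain material and the rock material would run this same mapping, which is why the sampled normals match across the seam.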

              3. There is the issue that you can't sample the landscape weights (as far as I know), so you can't know exactly which material is painted on the terrain. Most likely you will have to choose that "this rock" blends against sand, "this rock" blends against grass, and so on. Maybe there is an automatic way; I haven't really looked into that. Then the UVs will have to match, and I would do what some others do: use a material function or a material parameter collection to control the tiling. That way you have just one input which affects all blendable materials.
              If your ground uses 4 or fewer materials, you could use vertex colors as a "material selector" on the rocks, but it wouldn't make the most efficient material.
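              The vertex-color selector mentioned here can be sketched as a four-way weighted mix: R, G and B each weight one ground material, and whatever weight is left over picks the fourth. A Python sketch with plain numbers standing in for texture samples (illustrative only, not a UE4 node):

              ```python
              def select_material(vertex_color, materials):
                  """Blend up to four materials by vertex color.

                  materials is a list of four values (here plain
                  numbers; in a real material they would be texture
                  samples). R, G, B weight the first three; the
                  remaining weight goes to the fourth.
                  """
                  r, g, b = vertex_color
                  w4 = max(0.0, 1.0 - (r + g + b))
                  weights = (r, g, b, w4)
                  total = sum(weights)
                  return sum(m * w for m, w in zip(materials, weights)) / total

              # A pure red vertex color selects the first material
              # outright; black falls through to the fourth.
              print(select_material((1.0, 0.0, 0.0), [10.0, 20.0, 30.0, 40.0]))
              ```

              The inefficiency the poster mentions comes from every pixel sampling all four materials before the mix, regardless of the selected one.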

              Good luck!



                #8
                @kjetilhj~
                1. I'm using 4.18, so I guess your MF WorldPosition node IS the WorldAlignedTexture...
                2. My workflow would be...
                2a. Set up the environment
                2b. Finalize all assets and terrain (possibly merging groups of meshes together to help with culling and draw calls)
                2c. Bake all materials to textures
                2d. Credit you guys for helping me out...
                3. I might try to detect which distance field the mesh is intersecting/colliding with and reference the material that way.
