    [FEATURE REQUEST] Dynamic Decal Blending + Mesh Integration Tools

    It seems CryEngine is releasing some really powerful improvements to its engine tools that would go a LONG way if implemented nicely in UE4.

    A good overview of these features can be seen in this video: https://youtu.be/53LG-63uZ00

    These features include decal-based blending: as seen in the video above, a single decal projects snow onto any objects under it, with control over the angle and thickness of the snow. That's both easier than a per-material setup and looks much more consistent across many objects (whatever their projection method is doing, I see no seams).

    They also seem to have really nice features that allow material sampling from a given object (e.g. the terrain) to provide seamless transitions with custom geometry. That's perfect for adding detailed shapes like jagged cliffs to the soft terrain while using only the terrain material, so the cliff appears to be part of the landscape and not a unique mesh with an obvious seam.

    I know there are ways to get almost similar results currently in UE4, but they are usually either too expensive or have glaring issues. A properly implemented approach from Epic for really powerful, performant and easy-to-use blending is definitely the number one feature request on my list, by far.

    Just think how much impact this change could have if implemented so that almost every piece of geometry in the scene could blend nicely with terrain and other objects: level design would instantly become a lot more fun and powerful, since you could move things around and have everything update seamlessly instead of tediously painting or placing objects to hide seams. It would instantly give a big layer of polish to a level for "free".

    Please consider adding some nice native support for this. All the new VR features are great, but I really don't think the number of devs that benefit from those platform-specific features comes close to the number that would benefit from this global change.

    DICE have also had this tech in their engine since Battlefront, and you can see how useful and amazing this feature is here: https://youtu.be/wvWAz_mEy8Q

    #2
    It just looks like a glorified and tooled version of triplanar mapping. Do you have any performance stats with it on and off? I'm pretty sure it wouldn't be too hard to make a mockup in UE4 and compare the two.
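
    For reference, the triplanar part on its own is easy to prototype. Here's a minimal HLSL-style sketch of the idea (all names are placeholders, not engine inputs; it's roughly the logic a material-graph or Custom node setup would reproduce):

    Code:
    // Minimal triplanar sampling sketch - names are made up for illustration.
    float3 TriplanarSample(Texture2D Tex, SamplerState Samp,
                           float3 WorldPos, float3 WorldNormal,
                           float Tiling, float Sharpness)
    {
        // Per-axis blend weights derived from the surface normal.
        float3 W = pow(abs(WorldNormal), Sharpness);
        W /= (W.x + W.y + W.z);

        // Three planar projections along the world axes, blended by the weights.
        float3 X = Tex.Sample(Samp, WorldPos.zy * Tiling).rgb;
        float3 Y = Tex.Sample(Samp, WorldPos.xz * Tiling).rgb;
        float3 Z = Tex.Sample(Samp, WorldPos.xy * Tiling).rgb;
        return X * W.x + Y * W.y + Z * W.z;
    }

    The angle/thickness controls and cross-object consistency shown in the video are the tooling layered on top of this kind of projection.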



      #3
      Originally posted by IronicParadox:
      It just looks like a glorified and tooled version of triplanar mapping. Do you have any performance stats with it on and off? I'm pretty sure it wouldn't be too hard to make a mockup in UE4 and compare the two.
      However it's done, it's still a great thing to have, especially if it can match the alignment of the textures and actually make a smooth transition with polygons instead of a hard edge. That's above anything you can get with triplanar alone.

      It would be a glorious day if I never had to make a skirt for an object again when trying to place it on the terrain.



        #4
        Oh you can do that with distance fields and there are quite a few guides on it. Here's one of the first ones that I could find:
        https://forums.unrealengine.com/show...s-Battlefront)

        Using triplanar mapping and distance field contact blending, you'll pretty much have all of your bases covered.
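
        As a rough sketch of the distance-field contact blend those guides describe: feed the output of UE4's DistanceToNearestSurface material node into something like the function below and use the result as the alpha between the mesh material and the terrain-matching material (the function and parameter names are made up for illustration):

        Code:
        // Hypothetical contact-blend alpha. SceneDistance is assumed to come from
        // the DistanceToNearestSurface node; BlendDistance is how far up the mesh
        // the terrain-matching material should creep.
        float ContactBlendAlpha(float SceneDistance, float BlendDistance)
        {
            // 1 right at the intersection, fading to 0 at BlendDistance.
            float Alpha = saturate(1.0 - SceneDistance / BlendDistance);
            // Smoothstep-shaped falloff so the blend line isn't harsh.
            return Alpha * Alpha * (3.0 - 2.0 * Alpha);
        }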



          #5
          Originally posted by IronicParadox:
          Oh you can do that with distance fields and there are quite a few guides on it. Here's one of the first ones that I could find:
          https://forums.unrealengine.com/show...s-Battlefront)
          You seem to have little to no idea of what is being talked about in this thread. No, you can't do that with distance fields. Distance fields give you intersection. That is all. The thread starter and the other poster here seem to be well aware of the ability to recreate the same effect on a case-by-case basis. The thread is not about that.

          I know there are ways to get almost similar results currently in UE4 but they are usually either too expensive
          A proper tool like this requires sampling the terrain's normal map, height map, weightmaps and the textures used in the landscape material.
          It would be cool to be able to enable this feature for any material and have the relevant maps passed in automatically depending on which landscape component / world composition level the mesh is in.
          Such an implementation is generally possible and at first glance requires quite a bit of work, but it should be doable.

          But...

          The underlying problem is that if you try to sample all of that in the material of the mesh you want to blend, on top of the complexity of the mesh material itself... things go grim.
          Imagine the case of having to blend 5-6 landscape layers on the mesh.
          Remember that which textures the mesh would need to sample is determined entirely by the landscape components it sits in.
          Now think of a case where the mesh overlaps 4 landscape components simultaneously.
          Each component has 3-4 moderately complex layers.
          Your mesh material would need to sample 12-16 sets of textures (a normal map and base color at least per layer), in addition to whatever you already have in the mesh's material.
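
          To make that arithmetic concrete, here is a hypothetical worst-case sketch (none of these bindings exist in UE4; it's just the fetch count spelled out): every candidate layer needs its own base color and normal sample before the mesh's own material has done anything.

          Code:
          // Hypothetical sketch of blending N landscape layers in a mesh material
          // without virtual texturing. All names are made up for illustration.
          Texture2DArray LayerBaseColor;  // one slice per landscape layer
          Texture2DArray LayerNormal;
          SamplerState   LayerSampler;

          struct BlendedLayers { float3 BaseColor; float3 Normal; };

          BlendedLayers BlendLandscapeLayers(float2 TerrainUV, float LayerWeights[16], int NumLayers)
          {
              BlendedLayers Out;
              Out.BaseColor = 0;
              Out.Normal    = 0;
              // 12-16 layers -> 24-32 texture fetches, and in practice the weights
              // themselves would also have to come from each component's weightmap.
              for (int i = 0; i < NumLayers; ++i)
              {
                  float3 UVW = float3(TerrainUV, i);
                  Out.BaseColor += LayerWeights[i] * LayerBaseColor.SampleLevel(LayerSampler, UVW, 0).rgb;
                  Out.Normal    += LayerWeights[i] * (LayerNormal.SampleLevel(LayerSampler, UVW, 0).rgb * 2 - 1);
              }
              Out.Normal = normalize(Out.Normal);
              return Out;
          }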

          That is simply unrealistic and will remain so for years.

          I don't have definite information about how such a tool is implemented in the engines you've mentioned, but I'm almost certain it relies on virtual texturing.
          Without it, it seems impractical.
          Virtual texturing is on the roadmap and hopefully we will see it working in the foreseeable future.

          I agree that a terrain/mesh blending tool is a widely welcomed feature.


            #6
            also sampling the distance field is quite expensive. I find it really hard to justify almost doubling your instruction count just to remove a seam


              #7
              Originally posted by Deathrey:
              You seem to have little to no idea of what is being talked about in this thread. No, you can't do that with distance fields. Distance fields give you intersection. That is all. The thread starter and the other poster here seem to be well aware of the ability to recreate the same effect on a case-by-case basis. The thread is not about that.
              Oh boy, here comes Deathrey with his little personal grudge lol... Yes, I know plenty about what I'm talking about, and you are apparently very under-informed about what you can do with that intersection... Yes, you can do exactly what Daniel described in saying "especially if it can match the alignment of the textures and actually make a smooth transition with polygons instead of a hard edge. That's above anything you can get with triplanar alone." With the area of intersection, you can then apply any sort of material or effect and get rid of the harshly joined edge, like adding in world position offset (WPO) and modifying normals.

              A proper tool like this requires sampling the terrain's normal map, height map, weightmaps and the textures used in the landscape material.
              It would be cool to be able to enable this feature for any material and have the relevant maps passed in automatically depending on which landscape component / world composition level the mesh is in.
              Such an implementation is generally possible and at first glance requires quite a bit of work, but it should be doable.
              It's still just as easy to make the effect from within the terrain material. If you have a snowy mountain area, your snow material is already inside the terrain material, so that's where you'd perform that kind of task, because you'd want to add snow to the rocks that are intersecting the landscape. You can even throw in a vertex paint channel to add or remove the effect in the spots where you do or don't want it.


              But...

              The underlying problem is that if you try to sample all of that in the material of the mesh you want to blend, on top of the complexity of the mesh material itself... things go grim.
              Imagine the case of having to blend 5-6 landscape layers on the mesh.
              Remember that which textures the mesh would need to sample is determined entirely by the landscape components it sits in.
              Now think of a case where the mesh overlaps 4 landscape components simultaneously.
              Each component has 3-4 moderately complex layers.
              Your mesh material would need to sample 12-16 sets of textures (a normal map and base color at least per layer), in addition to whatever you already have in the mesh's material.

              That is simply unrealistic and will remain so for years.

              I don't have definite information about how such a tool is implemented in the engines you've mentioned, but I'm almost certain it relies on virtual texturing.
              Without it, it seems impractical.
              Virtual texturing is on the roadmap and hopefully we will see it working in the foreseeable future.

              I agree that a terrain/mesh blending tool is a widely welcomed feature.
              Here we go again with the complaining about landscape problems... It's like you find ANY reason at all to hijack threads and bring this up lol. I used to have a lot of problems with tons of layers, but that was when I was working with 6 GB of RAM. Since then I've upgraded to 16 GB of RAM and haven't had an issue. Hell, the level I'm working on has nine pretty complex layers and I'm not having any issues at all. If I wanted to, I could easily take it up to 16 layers. Upgrade your RAM, optimize your materials, don't use 4K textures for every single channel and you'll live. Also, once it's all compiled for standalone/packaged, it runs way faster.

              There are a lot of ways to tweak the blending and save a lot of performance. A dithering approach would work well for games that have a render target of 1080p or higher.
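
              For what it's worth, the kind of dithered blend I mean looks roughly like this (a sketch with a hard-coded 4x4 Bayer matrix; in UE4 you'd more likely reach for the DitherTemporalAA material function, this is just the idea spelled out):

              Code:
              // Turn a soft blend alpha into a binary per-pixel mask so each pixel
              // only has to evaluate one of the two materials.
              float DitheredMask(float BlendAlpha, float2 SvPosition)
              {
                  // 4x4 Bayer thresholds in [0,1).
                  static const float Bayer[16] =
                  {
                       0.0/16,  8.0/16,  2.0/16, 10.0/16,
                      12.0/16,  4.0/16, 14.0/16,  6.0/16,
                       3.0/16, 11.0/16,  1.0/16,  9.0/16,
                      15.0/16,  7.0/16, 13.0/16,  5.0/16
                  };
                  uint2 P = uint2(SvPosition) % 4;
                  float Threshold = Bayer[P.y * 4 + P.x];
                  // 1 where the blended-in material wins, 0 where the base material stays.
                  return BlendAlpha > Threshold ? 1.0 : 0.0;
              }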

              Originally posted by Chosker:
              also sampling the distance field is quite expensive. I find it really hard to justify almost doubling your instruction count just to remove a seam
              *Gasp* It's almost like you have to render an extra model!

              Don't get me wrong though, it would definitely be awesome for them to expand upon the decal system, but that could throw a rack's worth of spanners into the mix. Projecting a decal is one thing; projecting a decal that can access the meshes, their materials, blend between them and even offset their geometry is a pretty big beast to squeeze into the rendering pipeline. I'd still like to see some benchmark stats of the CryEngine implementation with that effect on and off.



                #8
                Originally posted by IronicParadox:
                *Gasp* It's almost like you have to render an extra model!
                an extra model would just take over, so at any given pixel you either render one material or the other. a more expensive pixel shader means you're rendering all pixels more expensively.
                start doing more complex things, like blending the same textures on your mesh as the landscape below (which btw might be blending several layers at the same point) and adding more fancy effects, and things get out of hand quickly.
                there's a reason the engine provides a material complexity view. you mock me when I mention shader complexity, but then you yourself suggest to "optimize your materials"?

                sampling the distance field is not prohibitively expensive, it's just a really big chunk of a feature (that stacks on top of others) that's really hard to justify when you get to the point where you need to sacrifice features for the sake of performance.
                just because it can already be done in a certain way doesn't mean it's the only acceptable way to do it. what's being suggested here is to do things in a different (more powerful and efficient) way, taking advantage of the deferred renderer. unless you like to settle for less, there's nothing wrong with suggesting such things. especially if it's something that can be done like... *Gasp* other engines do!


                  #9
                  Deathrey, Chosker, just don't read those posts and don't reply. You already know what happens in the end.


                  Back on topic, this feature is currently not supported in UE4 and there's nothing you can do about it. It requires lots of hard work that has to be done by the engine development team. After Frostbite 3, CryEngine is the second engine to have this feature built in. But if I've learned one thing from comparing other engines with UE4, it's that you shouldn't expect to get the cool stuff at the same low cost in UE4 as in the other ones. We're still struggling with performance issues in almost anything related to landscapes (tessellation, performance per layer, grass tool, etc.), so it'd be very unrealistic to ask for new landscape-specific features while the basics are almost broken and not production ready.


                    #10
                    Originally posted by IronicParadox:
                    It just looks like a glorified and tooled version of triplanar mapping. Do you have any performance stats with it on and off? I'm pretty sure it wouldn't be too hard to make a mockup in UE4 and compare the two.
                    It is. A bunch of people have been fooling around with it for over a year now, after DICE showed off the same thing for Star Wars Battlefront. So far nothing's come of it, and all attempts have ballooned into far too much of a performance hit, even if the visual results are nice.



                      #11
                      Originally posted by Frenetic Pony:
                      It is. A bunch of people have been fooling around with it for over a year now, after DICE showed off the same thing for Star Wars Battlefront. So far nothing's come of it, and all attempts have ballooned into far too much of a performance hit, even if the visual results are nice.
                      What people are doing isn't quite what DICE did.
                      People are using distance fields to blend between two materials (the second material on the mesh being the same material as on the landscape). But what DICE did is have their meshes blend into any layer of the landscape they're placed on: the landscape layer underneath the mesh is sampled and bleeds into the mesh, which is not something anyone has been able to do in UE4.
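
                      Roughly, that would mean something like the sketch below, which is exactly what UE4 doesn't expose to mesh materials today (all bindings here are hypothetical): map the mesh pixel's world position into the landscape's UV space, fetch the layer weights there, and blend those same layers up into the mesh.

                      Code:
                      // Hypothetical only - mesh materials can't bind the landscape
                      // weightmap like this in UE4; that's the missing piece.
                      Texture2D    LandscapeWeightmap;  // R/G/B/A = weights of up to 4 layers
                      SamplerState LinearSampler;

                      float4 SampleLayerWeightsUnderMesh(float3 MeshWorldPos,
                                                         float2 LandscapeOrigin, float2 LandscapeSize)
                      {
                          // Project the mesh pixel straight down onto the landscape and
                          // convert its world XY into the landscape's 0-1 UV space.
                          float2 LandscapeUV = (MeshWorldPos.xy - LandscapeOrigin) / LandscapeSize;
                          // These weights say which landscape layers sit underneath this pixel,
                          // so the mesh can bleed toward what the terrain is actually showing.
                          return LandscapeWeightmap.SampleLevel(LinearSampler, saturate(LandscapeUV), 0);
                      }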


                        #12
                        Originally posted by Chosker:
                        an extra model would just take over, so at any given pixel you either render one material or the other. a more expensive pixel shader means you're rendering all pixels more expensively.
                        I didn't mean literally... I meant figuratively. You're trying to create something similar to a skirt model out of a shader, in order to bypass having to model and place the skirt.

                        start doing more complex things, like blending the same textures on your mesh as the landscape below (which btw might be blending several layers at the same point) and adding more fancy effects, and things get out of hand quickly.
                        there's a reason the engine provides a material complexity view. you mock me when I mention shader complexity, but then you yourself suggest to "optimize your materials"?
                        You're asking for a next-generation, "icing on the cake" type of feature and you're not willing to pay the cost for it. Make up your mind. As for optimizing materials, yes, you should always optimize them, but you need to make sure you optimize them properly. You don't need mesh-terrain blending on models that are 3000 units away, so you'd put a range on it within the material.

                        sampling the distance field is not prohibitively expensive, it's just a really big chunk of a feature (that stacks on top of others) that's really hard to justify when you get to the point where you need to sacrifice features for the sake of performance.
                        just because it can already be done in a certain way doesn't mean it's the only acceptable way to do it. what's being suggested here is to do things in a different (more powerful and efficient) way, taking advantage of the deferred renderer. unless you like to settle for less, there's nothing wrong with suggesting such things. especially if it's something that can be done like... *Gasp* other engines do!
                        Like I've said many times, different engines are designed differently. Picture an inverted pyramid. Now picture having to make all sorts of changes toward the bottom point. That's what would need to be done in order to implement a lot of these features (better landscape layering, "3D" decals, etc.) in a more performant way. It would likely require rewriting half of the rendering pipeline, and that's probably not going to happen any time soon.

                        Originally posted by Maximum-Dev:
                        Back on topic, this feature is currently not supported in UE4 and there's nothing you can do about it.
                        We have been on topic but you're just cherry picking what you want to see.

                        It requires lots of hard work that has to be done by the engine development team. After Frostbite 3, CryEngine is the second engine to have this feature built in. But if I've learned one thing from comparing other engines with UE4, it's that you shouldn't expect to get the cool stuff at the same low cost in UE4 as in the other ones. We're still struggling with performance issues in almost anything related to landscapes (tessellation, performance per layer, grass tool, etc.), so it'd be very unrealistic to ask for new landscape-specific features while the basics are almost broken and not production ready.
                        Absolutely. The FB3 engine, if it were licensed, would probably cost studios around 5 million at the least. It's a super AAA engine that's pretty much only used on games that are raking in tens of millions, if not hundreds of millions. CryEngine, on the other hand, has always been a leader and "trendsetter" in graphical fidelity. A platform only has so many CPU/GPU FLOPS to work with though, so where an engine is strong in some areas it will be weak in others, and it definitely shows. I'd rather have an engine that is a jack of all trades, master of none, than an engine that is strong in several areas and weak in many. That's exactly why I've chosen UE4 over other engines.


                          #13
                          your skirt analogy doesn't hold up at all. we're trying to create a skirt on top of the body that doesn't also have to recreate all the fancy features of the body behind it.

                          how do you put a range on something "within the material" if dynamic shader branching isn't even there?

                          also the point of using deferred decals is to greatly reduce the cost of such icing on the cake. they are deferred, they have access to all the buffers that are already written in that frame. you know what that means right? you sample what's already there instead of having to duplicate the same functionality in a different material
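
                          to illustrate (purely a sketch, the buffer names are hypothetical and not UE4's actual decal inputs): a deferred snow decal only has to read what the terrain and meshes already wrote into the GBuffer, instead of re-evaluating any of their materials:

                          Code:
                          // Sketch of a deferred decal reading the existing GBuffer.
                          // GBufferNormal/GBufferBaseColor stand in for whatever buffers
                          // the renderer exposes to decals - hypothetical names.
                          Texture2D    GBufferNormal;
                          Texture2D    GBufferBaseColor;
                          SamplerState PointSampler;

                          float3 SnowDecal(float2 ScreenUV, float3 SnowColor, float AngleThreshold)
                          {
                              // Surface data already written by whatever material shaded this pixel.
                              float3 WorldNormal = normalize(GBufferNormal.SampleLevel(PointSampler, ScreenUV, 0).xyz * 2 - 1);
                              float3 BaseColor   = GBufferBaseColor.SampleLevel(PointSampler, ScreenUV, 0).rgb;

                              // Snow on upward-facing surfaces, regardless of which mesh or
                              // material produced them - no per-material setup needed.
                              float Coverage = saturate((WorldNormal.z - AngleThreshold) / (1.0 - AngleThreshold));
                              return lerp(BaseColor, SnowColor, Coverage);
                          }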


                            #14
                            Originally posted by Chosker:
                            your skirt analogy doesn't hold up at all. we're trying to create a skirt on top of the body that doesn't also have to recreate all the fancy features of the body behind it.
                            Alright, tablecloth ghost costume then... Unless you're using transparency on the cut-up tablecloth, it shouldn't have to sample the pixels of the "rock" underneath it. Dithering is your friend. If there is a dithered mask over top of the rock, then it should only have to pull information for the rock material on the pixels that the cut-up tablecloth ISN'T covering. Now if there is transparency going on, then yeah, it would have to sample the rock and do a blend. That's why you'd definitely want to avoid transparency like the plague on an effect such as this.

                            how do you put a range on something "within the material" if dynamic shader branching isn't even there?
                            There is an awesome node called pixel depth. You can compare values and determine what you want to do from there. It's commonly used to fade things out like normal maps to a flat normal at a distance. This is one big way to help optimize materials.
                            https://docs.unrealengine.com/latest...ference/Depth/
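
                            The depth-based fade I mean is roughly this (a sketch; PixelDepth stands for the output of the PixelDepth material node and the distances are arbitrary):

                            Code:
                            // Blend a detail normal toward a flat normal as the pixel gets
                            // further from the camera. PixelDepth = scene depth of this pixel.
                            float3 FadeNormalByDepth(float3 DetailNormal, float PixelDepth,
                                                     float FadeStart, float FadeEnd)
                            {
                                const float3 FlatNormal = float3(0, 0, 1); // tangent-space "up"
                                float Fade = saturate((PixelDepth - FadeStart) / (FadeEnd - FadeStart));
                                return normalize(lerp(DetailNormal, FlatNormal, Fade));
                            }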

                            also the point of using deferred decals is to greatly reduce the cost of such icing on the cake. they are deferred, they have access to all the buffers that are already written in that frame. you know what that means right? you sample what's already there instead of having to duplicate the same functionality in a different material
                            Oh, I'm aware of how they work and the beauty of them (still waiting on deferred spline decals). Trying to add in a feature like this CE decal system requires a lot of work though. UE4 has a pretty simple version of deferred decals and doesn't allow WPO/displacement because of the stage at which they are rendered. After digging around, I found a quote from Andrew Hurley, on an AnswerHub post asking about WPO/displacement on decals, saying:

                            "World Position Offset and Displacement both rely on manipulating the vertices of the mesh utilizing that specific material. Despite how it may work in other engines, Unreal handles decals differently by writing to the Gbuffer and storing those properties to be manipulated by the decals themselves. It does not manipulate the underlying vertices of the mesh it is applied. This would more than likely require a refactor of how we handle decals in the engine, which might also make them more costly to use."



                              #15
                              Originally posted by IronicParadox:
                              Dithering is your friend. If there is a dithered mask,
                              dithering is not my friend. for quick LOD transitions it's good, but for permanent effects up close it's ugly. the only way to mitigate its ugliness is to use TemporalAA, which in UE4 gives you horrible smearing of moving objects (or the entire screen if the camera moves) and blurs the entire scene as well.

                              Originally posted by IronicParadox:
                              There is an awesome node called pixel depth. You can compare values and determine what you want to do from there. It's commonly used to fade things out like normal maps to a flat normal at a distance. This is one big way to help optimize materials.
                              https://docs.unrealengine.com/latest...ference/Depth/
                              no. there isn't an awesome node that allows you to optimize materials on the fly based on an evaluated condition.
                              you obviously lack this very simple shader knowledge so I'll make it clear for you: Lerp samples both inputs first and then blends between them. all the time. for all pixels. always. lerp doesn't do dynamic branching so when you "compare values and determine what to do" and "fade things over a distance", both versions that go into the lerp are contributing to the shader complexity, always.
                              fading things over a distance is used for cosmetic purposes (fading normalmaps into a flat normal, fading textures to a less tiled version to avoid repetition), but whether you know it or not, it causes extra overhead which I'm sure you'll agree means it's the opposite of optimization. pretending it optimizes your material is simply lying to yourself out of ignorance.
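
                              in shader terms the difference is roughly this (a sketch; the two helper functions are placeholders for an expensive and a cheap input, not real nodes):

                              Code:
                              // Placeholders standing in for an expensive and a cheap input.
                              float3 ExpensiveDetail(float2 UV) { /* stands in for several texture samples and math */ return float3(UV, 0); }
                              float3 CheapFlat()                { return float3(0.5, 0.5, 1.0); }

                              float3 FadeWithLerp(float2 UV, float DistanceFade)
                              {
                                  // Both inputs are evaluated for every pixel; lerp only mixes the results.
                                  return lerp(ExpensiveDetail(UV), CheapFlat(), DistanceFade);
                              }

                              float3 FadeWithBranch(float2 UV, float DistanceFade)
                              {
                                  // Actually skipping the expensive path needs an explicit dynamic
                                  // branch, which the material graph's Lerp node does not emit.
                                  [branch]
                                  if (DistanceFade >= 1.0)
                                      return CheapFlat();
                                  return lerp(ExpensiveDetail(UV), CheapFlat(), DistanceFade);
                              }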

                              "optimize your materials". sure.
