I was just testing out some ideas to create a material that would allow me to blend two sets of normals together from separate UV channels on a single asset. The goal was to see if I could achieve something like the little edge chips/cracks seen in the P.T. Silent Hills demo (screenshot below).
What I was attempting to do is create one UV channel with all my surfaces unwrapped normally, with standard UV seam placement, for use with a tiling wall texture. On a second UV channel, I would unwrap all the chamfered edge polys and align them in a single direction so they could take a tiling "worn edges" normal map. I can assign the different UV channels with TextureCoordinate (TexCoord) nodes in the material editor, but the way the normals plugged into the second UV channel react to lighting is wrong. UE3 had issues with correctly handling lighting direction from normal maps assigned to multiple UV channels, and I was hoping this might have changed in UE4, but that doesn't appear to be the case.
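For reference, the setup I'm describing boils down to something like this rough HLSL equivalent of the node graph (texture names, the sampler, and the mask are just placeholders, not my actual assets):

```hlsl
// Rough HLSL sketch of the node setup: one normal map sampled with TexCoord 0,
// the other with TexCoord 1, then blended. Names are placeholders.
float3 BlendTwoUVChannelNormals(Texture2D WallNormalMap, Texture2D EdgeNormalMap,
                                SamplerState Samp, float2 UV0, float2 UV1,
                                float EdgeMask)
{
    // Unpack both normal maps from [0,1] into [-1,1] tangent space.
    float3 WallN = WallNormalMap.Sample(Samp, UV0).xyz * 2.0 - 1.0;
    float3 EdgeN = EdgeNormalMap.Sample(Samp, UV1).xyz * 2.0 - 1.0;

    // Straight lerp for simplicity; BlendAngleCorrectedNormals would be
    // the nicer option in the material editor.
    return normalize(lerp(WallN, EdgeN, EdgeMask));
}
```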
Attached are a few images illustrating my test setup. Note that I made the bricks fairly extreme just to see what was going on. Also, it's always the second UV channel that experiences the bad lighting flips. If I swap UV channels so the bricks are on UV0 and the tiled rough edges are on UV1, then the rough edges are reversed in odd directions (it's just harder to see in a still image).
If this issue with the way normals are handled on a second UV channel isn't something that can be resolved, is there a particularly good way to go about adding a tiling strip of edge decals to corners like the P.T. image below? It's definitely a tiled effect that gets repeated elsewhere, so they're not just using high-res custom models everywhere. Also note the rounded corner in particular, which isn't conducive to using the standard decal sticker projectors. I'm guessing they have a setup similar to CryEngine, which allows deferred decal materials to be applied to any arbitrary geometry (it just writes whatever material inputs are set in the decal material on top of the rest of the deferred render pass), but that's currently not something UE4 allows (I might make a request post later, since it's an insanely useful tool for level art using tiling textures).
Yep, you can definitely set a different UV channel for different normal maps and then combine them. It's one way of combining a general normal map with small surface detail.
…Do you care to elaborate as to how, without getting normal/lighting errors? In case it wasn't clear from my first post, here's an animated gif showing how changing UV channel 0 affects the lighting/normals of UV channel 1 in an undesirable way. The second UV channel (the one the obvious brick normals are applied to) doesn't change at all, so regardless of what I do to the first UV channel, the lighting on the bricks shouldn't change at all if this system works the way you're saying it does.
This obviously isn't the case, however, since the shading and highlights on the bricks shift whenever I alter the first UV channel - even making the bricks look inverted at times when I move the other set of UVs around.
UV0 - UVs shown off to the right moving around
Arrow texture - mapped to UV0 (the one moving around)
UV1 - Never changes
Brick normals - mapped to UV1 (lighting should be static, not moving around)
Your UV shells should be arranged exactly the same in both of your UV channels. You answered your own question with your gif - rotating the first UV channel affects how the surface is rendered in engine, and will cause shading errors if your 2nd channel doesn't have the faces oriented in the same direction.
EDIT: When I say "arranged exactly the same," I mean that you shouldn't rotate them in one channel and leave them unrotated in the other. If you rotate a face in one UV channel, you must rotate it the same way in the 2nd UV channel.
No, you should be able to have completely different UVs in each channel and use them the way you want.
Maybe it's my screen, but it looks like the normal map isn't the right color - are you sure it's using normal map compression?
What's happening there is that the normals of the surface are getting turned around, and that doesn't make much sense. Changing the first UV channel should not affect the normals in any way.
Actually, you're wrong. UV shell direction in different UV channels does affect how normal maps render, in the same way mirrored normal maps do. The engine struggles because you are giving each vertex conflicting normal information depending on the UV shell direction.
OP, a simple test I would suggest - map your normals to UV set 0, lightmaps to UV set 1, and your diffuse to UV set 2. That should solve the issue, because the normals will be controlled by set 0 and won't conflict. Hope that makes sense; this worked in UDK but it might not work in UE4.
No, UV direction should not in any way change the effect of a normal map or the normals of the mesh; there's nothing that's tied into that. And there certainly shouldn't be an issue where the first UV channel is affecting what happens on the second.
Mirrored normal maps have an issue because to mirror the UVs you're flipping the mesh; if you're only rotating a normal map, then it's not a problem.
You're right! UV direction in different sets shouldn't affect the normal map, but for some reason in Unreal it does. OP even notes that in his initial post. This was a known problem in UE3 and it seems to have carried over, sadly. The best way to solve it is to use your unique normal map in UV set 0.
This is just unequivocally 100% wrong, though - I'm not sure why this misinformation keeps getting posted when a 5-minute test in the engine itself will show it's factually incorrect. It's the exact issue I posted in the animated GIF above, where I put the arrows in UV0 and the brick normals in UV1 and rotated only UV0, which resulted in the highlights/shadows shifting around on the unchanged brick normals.
The whole reason I can't simply map all the normals to the first UV channel is the issue shown in my first post. For large level architecture, there's a need for heavily repeated tiling normals using one UV set, and then another set of normals applied to the corner edges of the geometry that need to be mapped differently on a separate UV set. If these edge textures were only albedo/roughness/metallic/etc. it would work, but as soon as there's a normal in there the whole thing breaks because the lighting is incorrect. Ideally we could just handle the second set of normals with manually created decals like CryEngine allows, and as I ask for in this post (apparently to no avail :().
I'm going to do some tests and try getting something working with this technique tonight - hopefully something quick to set up that has good results. None of the edges where anything is painted look hard; I wish I had P.T. so I could try to break it down more.
I'm definitely interested if you're able to get something working, but I don't have high hopes without the lighting/rendering/vertex math in the core engine being adjusted in some way.
Regarding the P.T. example - that's a pretty subtle use of the technique. Check out the images in my post here (the text is basically irrelevant for what we're discussing here) - the really obvious normal-mapped edge cracks are probably a better test scenario.
Alternatively, try to replicate the stonework edges from Z-enzyme's post in the same thread (except with normal maps). The only thing I'd keep in mind here is that while the stonework in that post could likely survive as a floating set of 1-bit alpha-masked cards (which would then light correctly), the CryEngine shots in the previous example show the kind of smooth alpha blending that I'd really like to see (which, as it stands in UE4 right now, won't light the same as the rest of the opaque geometry).
Nick! is right, the tangent basis is determined by the first UV channel. That is the basis used when transforming from tangent into world space, which is a required step when using tangent space normal maps. The engine actually only stores a single tangent vector. By doing a cross product of the tangent vector and the vertex normal, the other vector (some call it the bi-normal) can be derived without storing anything.
So if your normal map UVs disagree with the direction of the base UVs, they will be transformed in the wrong direction (i.e. if it's upside down, the G channel will be flipped).
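In shader terms it looks roughly like this (illustrative names only, not the engine's actual code): the binormal is rebuilt with a cross product and the sampled normal is pushed through the resulting basis.

```hlsl
// Sketch of the tangent->world transform for a tangent space normal map.
float3 TangentToWorld(float3 TangentNormal,  // normal map sample, unpacked to [-1,1]
                      float3 VertexNormal,   // interpolated vertex normal (world space)
                      float4 VertexTangent)  // xyz = tangent (world space), w = handedness
{
    // The binormal is not stored; it is derived with a cross product.
    float3 Binormal = cross(VertexNormal, VertexTangent.xyz) * VertexTangent.w;

    // Transform the tangent space normal into world space through the TBN basis.
    return normalize(TangentNormal.x * VertexTangent.xyz +
                     TangentNormal.y * Binormal +
                     TangentNormal.z * VertexNormal);
}
```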
That said, you could in theory attempt to bypass the final tangent->world transform and transform into a custom space for another normal channel, as long as you impose a rigid structure for the orientation. But that isn't possible for a curve… You may even need to encode the tangent vector into another UV channel using a script for that to work like you want in the 1st arch image.
For the non-curve case, you should be able to derive a stable tangent basis using the vertex normal and the material function "Create Third Orthogonal Vector" combined with "Matrix 3x3 Transform" or possibly "Inverse3x3Transform". But it won't know about the curve, so it will gradually be wrong until it is 90 degrees wrong at the top of the arch.
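Very roughly, chaining those functions amounts to something like this (the reference direction and all names here are placeholders of mine, not the engine functions themselves):

```hlsl
// Rough sketch of the "rigid orientation" idea: build a basis from the vertex
// normal and a fixed reference direction (which must not be parallel to the
// normal), then transform the second normal map with it.
float3 RigidBasisNormal(float3 TangentNormal, float3 VertexNormal, float3 ReferenceDir)
{
    // Third orthogonal vector, perpendicular to both the normal and the reference.
    float3 Binormal = normalize(cross(VertexNormal, ReferenceDir));
    // Re-derive the tangent so the basis is exactly orthonormal.
    float3 Tangent = cross(Binormal, VertexNormal);

    return normalize(TangentNormal.x * Tangent +
                     TangentNormal.y * Binormal +
                     TangentNormal.z * VertexNormal);
}
```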
If you are wondering why tangent space normal maps are the standard, there is a pretty big reason: Compression!!
Local or world space normal maps can be used, but then they are storing the full curvature of the vertex normals as well, which tends to compress poorly. With tangent space, only the difference of the surface from the vertex normals is stored, which compresses much better when combined with uncompressed vertex normals.
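As a concrete illustration of the compression win (my sketch, not engine code): because tangent space normals point mostly along +Z, only two channels need to be stored (BC5-style normal map compression) and Z can be rebuilt in the shader, which is roughly what a DeriveNormalZ-style node does.

```hlsl
// Rebuild the Z component of a two-channel tangent space normal.
float3 ReconstructNormalZ(float2 NormalXY)
{
    // Recover Z from the unit-length constraint x^2 + y^2 + z^2 = 1.
    float Z = sqrt(saturate(1.0 - dot(NormalXY, NormalXY)));
    return float3(NormalXY, Z);
}
```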
FWIW I have been wanting to do this for years as well. For the most part I worked around it by using floating shell geometry that has the UVs as UV0 etc. But that is way more work than it would be if it just worked with multiple UVs. I just ran into your other post about decal geometry and I am getting some conversations going among the rendering guys. It turns out that idea has been bounced around already and the time may be soon to get something like that scheduled. Seems related to this.
Great write-up RyanB! I noticed this little hitch a while ago and just figured it was a limitation of tangent space normal maps, but it sounds like it's actually more of an optimization thing.
You mentioned that you yourself hoped for this feature in the past… I suppose that means there's a strong reason (performance?) that calculating the tangent for UV1 as well is not in the engine and most likely won't be in the future?
Great to hear about the geometry decals possibly making their way into the engine someday, though.
Turns out it is possible to derive a tangent basis by using DDX and DDY of the UV channel in question. I haven't figured out the details yet, but I imagine you just get the ratio of the X and Y UV slopes, which gives you a screen space vector, and then you need to convert that into world space.
Then you have the tangent vector. Coming up with the binormal should just be a cross product with that tangent vector. Then you have all 3: normal (which is the vertex normal), tangent (our first derivation) and binormal (derived with the cross product).
Then you simply need to transform the normal map into this new world basis. I may have some time tomorrow to experiment with this.
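For anyone who wants to try it in the meantime, here is a hedged sketch of the general ddx/ddy approach - the standard "cotangent frame" construction, taking derivatives of world position directly so the view->world step is folded in. This is usable in a Custom node; it is not necessarily what the final material graph will do, and the names are placeholders. The frame is constant per triangle, which is where the faceting mentioned below comes from.

```hlsl
// Derive a tangent frame for an arbitrary UV channel from screen space
// derivatives of world position and that UV set.
float3x3 CotangentFrame(float3 VertexNormal, float3 WorldPos, float2 UV)
{
    float3 dp1  = ddx(WorldPos);
    float3 dp2  = ddy(WorldPos);
    float2 duv1 = ddx(UV);
    float2 duv2 = ddy(UV);

    // Solve for the world space directions in which U and V increase.
    float3 dp2perp = cross(dp2, VertexNormal);
    float3 dp1perp = cross(VertexNormal, dp1);
    float3 Tangent  = dp2perp * duv1.x + dp1perp * duv2.x;
    float3 Binormal = dp2perp * duv1.y + dp1perp * duv2.y;

    // Normalize to a scale-invariant frame.
    float InvMax = rsqrt(max(dot(Tangent, Tangent), dot(Binormal, Binormal)));
    return float3x3(Tangent * InvMax, Binormal * InvMax, VertexNormal);
}

// Usage: mul(TangentSpaceNormal, CotangentFrame(N, WorldPos, UV1)) gives a
// world space normal for the second UV channel.
```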
It lets you apply it, but it will not be a deferred decal material. I just tried this and it looks like the material is basically compiling as an ultra-basic fallback standard material.
This is a "normal" deferred decal. On the left it's on a mesh, and you can see it basically gets a black background.
On the right is what it looks like as an actual deferred decal, rendering only the normals to the underlying ground material.
Hey guys,
I did a quick test deriving the second tangent basis. It works, but unfortunately the result is faceted. In some cases it may not matter. I'll show some images so you can see what I mean.
Test mesh setup:
Derived Tangent Basis (arch trim only):
Here is how the tangent vector is derived:
And here is how the normal map is transformed using this vector and vertex normal:
Note: The material must have "Tangent Space Normal" = False. This means your other regular normal maps need their own tangent->world transform to work alongside this.
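In rough HLSL terms, the combination step looks something like this (illustrative only): with "Tangent Space Normal" unchecked, the Normal input expects world space vectors, so the regular normal map goes through an explicit tangent->world transform before the two sets are combined.

```hlsl
// Combine a regular tangent space normal map with the derived-basis normal.
float3 CombineNormals(float3 WallNormalTS,  // regular normal map, tangent space
                      float3 EdgeNormalWS,  // second-channel normal, already world space
                      float3x3 TBN,         // rows: tangent, binormal, vertex normal
                      float EdgeMask)
{
    float3 WallNormalWS = normalize(mul(WallNormalTS, TBN)); // tangent -> world
    return normalize(lerp(WallNormalWS, EdgeNormalWS, EdgeMask));
}
```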
The result:
This is the faceting in the tangent basis; each triangle gets a solid color without interpolation:
With lighting on this shows up as minor seams:
And for comparison, here is what the normals would look like using the default tangent basis (using tangent space normal = true):
Turns out that with some textures, the seams are really hard to see. With destruction stuff I think this may work pretty well. For super clean stuff, not so much.
Really curious to see how this looks with some better test content like Nick!'s arch stuff. I won't have time to try better art with this for a while.
(Never mind the black on the flat wall; that's an unrelated bug in my material.)
There is also something a bit wrong with how I am doing the transform from view to world space.