Bug on object normal when scaling along X, Y or Z

I discovered a bug in Unreal Engine 4.22: when I scale a static mesh along X only, Y only or Z only with a scale of 0.01, the object doesn't look like a plane because the normals are not transformed.

Let me share the code I wrote myself; I hope it helps you fix the bug.

matrix Scale(float3 v) {
    return matrix(v.x, 0,   0,   0,
                  0,   v.y, 0,   0,
                  0,   0,   v.z, 0,
                  0,   0,   0,   1);
}

void Obj::CalculateTransformMatrices() {
    if (m_isTransformCalculated == false) {
        m_isTransformCalculated = true;
        // R = RotateX * RotateY * RotateZ
        matrix R = IdentityMatrix();
        R = mul(R, Rotate(float3(1, 0, 0), m_rotation.x));
        R = mul(R, Rotate(float3(0, 1, 0), m_rotation.y));
        R = mul(R, Rotate(float3(0, 0, 1), m_rotation.z));
        // M = Scale * R * Translate
        matrix M = IdentityMatrix();
        M = mul(M, Scale(m_scale));
        M = mul(M, R);
        M = mul(M, Translate(m_location));
        m_transform = M;
        // M^-1
        m_invertTransform = M.Invert();
        // Transform for normals: Scale^-1 * R
        float3x3 S_1 = (float3x3)Scale(float3(
            1.0f / m_scale.x,
            1.0f / m_scale.y,
            1.0f / m_scale.z
        ));
        m_transformForNormals = mul(S_1, (float3x3)R);
    }
}

And here is the Vertex Shader (HLSL):

void main(ShaderVertex input, out PS_Input output) {
    output.position = mul(float4(input.position, 1), g_worldViewProjectionMatrix);
    output.normal = mul(input.normal, (float3x3)g_transformForNormals);
    output.texCoord = input.texCoord;
}

Scaling doesn't affect the vertex normals. This is correct behaviour.

IMO, Blender's behaviour is not expected. I don't want my normal map to become sharper when I scale my plane, for example.

@Raildex_, when you scale a plane, for example, you have to scale along all 3 axes (X, Y and Z) instead of along X and Y only. That's just the math, but it's not obvious, which is why many programs never encourage scaling along only some axes: in Blender, pressing S scales along all 3 axes, and in 3ds Max the default scale mode is "Uniform Scale", which is along all 3 axes.

For example: when you scale a Landscape along Z only (ScaleZ = 0.1) in order to flatten it, the normals should become almost uniform. That exact situation happened to me, but the normals weren't transformed and I got a weird-looking flat landscape.

An opposite example: suppose you have an almost flat landscape with almost uniform normals and you scale it along Z only (ScaleZ = 10). You get a tall landscape that still has almost uniform normals, and that looks very weird.

Please don’t tell me that Blender, Maya and 3ds Max behaviours are not expected.

It is not that they are not expected.

In realtime rendering, both approaches are feasible and in use: keeping the original normals, or transforming the normals by the scale. But the latter not only adds extra cost and tangent-space complications, it is also undesirable in the majority of cases from a usage standpoint. So UE4's default behavior here is expected and correct.

So nope, that is not a bug.

But it's not that complicated to implement the normal transformation in code; I wrote mine in my first post above. You just multiply the object-space normal by the matrix g_transformForNormals, which is easy thanks to that matrix. For example, in the Material the output normal is in tangent space, so it shouldn't be transformed directly; only the object-space normal should be transformed (artists don't care about the math, Unreal Engine should calculate it automatically internally). Inside the engine, I believe the tangent-space normal is transformed into an object-space normal, and then the matrix can be applied to it.

I scaled the sphere with ScaleY = 0.01

I insist that it should not. Understanding why it should not comes only after meshing together an environment or two.

Hi guys, I'm not comfortable with this bug, for example on scaled rocks, so here is more help with fixing it.
The code I shared in my first post is too complicated because I used a matrix, so here is a much easier method: