In case you were curious about the precision of vertex normals in Unreal Engine 5, here is a visual demonstration showing a high-poly flat-shaded sphere, with colors representing the XYZ precision loss for normals output by the VertexNormalWS node. The darker the color of a quad, the better the precision of its normal. I really didn’t expect it to be this bad, and it’s actually a deal breaker that forces me to move to another engine. BTW, PixelNormalWS is bad too, and the value it gives seems to be the same as the one in the GBuffer.
And for comparison here is how it looks in Unity. You can actually see the noise produced by derivatives, while in UE it’s masked by the much bigger difference between derived and vertex normals.
High Precision Tangent Basis doesn’t help much and shouldn’t be a solution anyway. Nothing other than proper full precision for vertex normals is acceptable for me.
So, how deep in the engine is this fundamental flaw?
well… you can read about octahedron normal vector encoding on google. Krzysztof Narkowicz himself implemented it back in 2014, and it was tweaked a little for more speed.
the normals are compressed into 2 bytes. that’s the thing. full precision float3 normals would take 12 bytes, which is 6x the storage, pci-e transfers and video memory needed to keep the data local for compute. that applies to mesh data as well as the limited gbuffer channels. you wanna buy more vram to run high-res meshes? the compression is memory efficient and allegedly decodes faster than other compression schemes or plain lower-bit formats.
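if you want a feel for what the 2 byte version actually costs, here’s a rough standalone c++ sketch of octahedral encode/decode quantized to 8 bits per component (my own toy code with made-up names, not the engine’s implementation) that prints the worst angular error it happens to find:

```cpp
// Toy octahedral normal encoding: unit vector -> 2D point in [-1,1]^2 -> 2 bytes.
// Roughly the scheme from the 2014 blog post, written from memory, not UE source.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

static Vec2 OctEncode(Vec3 n)
{
    // Project onto the octahedron (L1 normalize), then onto the z = 0 plane.
    float invL1 = 1.0f / (std::fabs(n.x) + std::fabs(n.y) + std::fabs(n.z));
    Vec2 p = { n.x * invL1, n.y * invL1 };
    if (n.z < 0.0f) {
        // Fold the lower hemisphere over the diagonals.
        float px = (1.0f - std::fabs(p.y)) * (p.x >= 0.0f ? 1.0f : -1.0f);
        float py = (1.0f - std::fabs(p.x)) * (p.y >= 0.0f ? 1.0f : -1.0f);
        p = { px, py };
    }
    return p; // each component in [-1, 1]
}

static Vec3 OctDecode(Vec2 p)
{
    Vec3 n = { p.x, p.y, 1.0f - std::fabs(p.x) - std::fabs(p.y) };
    if (n.z < 0.0f) {
        float nx = (1.0f - std::fabs(n.y)) * (n.x >= 0.0f ? 1.0f : -1.0f);
        float ny = (1.0f - std::fabs(n.x)) * (n.y >= 0.0f ? 1.0f : -1.0f);
        n.x = nx; n.y = ny;
    }
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return { n.x / len, n.y / len, n.z / len };
}

// 8 bits per component -> 2 bytes per normal, versus 12 bytes for a raw float3.
static uint8_t Quantize8(float v)     { return (uint8_t)std::lround((v * 0.5f + 0.5f) * 255.0f); }
static float   Dequantize8(uint8_t q) { return (q / 255.0f) * 2.0f - 1.0f; }

int main()
{
    // Sweep a bunch of directions and report the worst round-trip error seen.
    float worstDeg = 0.0f;
    for (int i = 0; i < 100000; ++i) {
        float a = 0.001f * i, b = 0.0007f * i;
        Vec3 n = { std::sin(a) * std::cos(b), std::sin(a) * std::sin(b), std::cos(a) };
        Vec2 e = OctEncode(n);
        Vec2 q = { Dequantize8(Quantize8(e.x)), Dequantize8(Quantize8(e.y)) };
        Vec3 d = OctDecode(q);
        float dot = std::min(1.0f, n.x * d.x + n.y * d.y + n.z * d.z);
        worstDeg = std::max(worstDeg, std::acos(dot) * 57.29578f);
    }
    std::printf("worst angular error seen: %.3f degrees\n", worstDeg);
}
```

the exact number depends on how you round and quantize, but it gives a sense of the scale of the loss you’re trading for 6x less memory.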
imho… once you factor in normal maps, this is a negligible precision loss on a flat shaded bowling ball.
doing some rough bit math, it should still give you at least ~128 distinct directions around a circle. reasonable precision for a video game.
there’s a trail of pdfs that leads to other engines using similar stuff. cryengine, for example, has a “quality” compression mode that may look nicer but doesn’t perform as well. if you know what i mean.
low poly is no argument either. what precision do you need for low poly, where the polygon surface angles are even coarser? this makes no sense.
ohh… you can see it on raytraced spheres, but only if you’re really up close. basically the same visual as in the 2014 blog post. i mean… it decompresses to something that looks similar to rgb8 world normals, which is the gbuffer limit anyway. not much of a deal breaker at a normal view distance.
I might be wrong, and I probably am, since I’ve never used Unity and don’t know what precision it uses, but doesn’t this setup show that Unreal Engine’s normals are more accurate? The direct normal node always gives a more accurate result than the derivative calculation, in either Unreal Engine or Unity. Assuming the derived calculations are equally accurate in both engines, and given that DDX and DDY are screen-space derivatives, which are very inaccurate to say the least since they are computed per 2x2 quad of 4 pixels, doesn’t that mean that more precise stored normals would result in a more noticeable difference? Again, I could be wrong, just a thought.
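To make the quad point concrete, here is a rough CPU-side sketch of what a derivative-reconstructed normal amounts to (the positions and names are made up for illustration; nothing here is engine code):

```cpp
// ddx/ddy on a GPU are finite differences of interpolated values across a
// 2x2 pixel quad, so a normal rebuilt as cross(ddx(worldPos), ddy(worldPos))
// is effectively one value per quad, and it inherits float32 rounding of the
// world positions. Toy CPU illustration with made-up numbers.
#include <cmath>
#include <cstdio>

struct V3 { float x, y, z; };

static V3 Sub(V3 a, V3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static V3 Cross(V3 a, V3 b) { return { a.y * b.z - a.z * b.y,
                                        a.z * b.x - a.x * b.z,
                                        a.x * b.y - a.y * b.x }; }
static V3 Normalize(V3 v)
{
    float l = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / l, v.y / l, v.z / l };
}

int main()
{
    // Interpolated world positions for three pixels of a quad, placed far
    // from the origin so float32 spacing is already coarse (the 0.1 step in
    // z does not even survive rounding exactly at this magnitude).
    V3 p00 = { 100000.0f, 200000.0f, 50000.0f };
    V3 p10 = { 100000.5f, 200000.0f, 50000.0f }; // one pixel to the right
    V3 p01 = { 100000.0f, 200000.5f, 50000.1f }; // one pixel down

    V3 dpdx = Sub(p10, p00); // what "ddx" would hand you for this quad
    V3 dpdy = Sub(p01, p00); // what "ddy" would hand you for this quad

    V3 derived = Normalize(Cross(dpdx, dpdy));
    std::printf("derived quad normal: %.6f %.6f %.6f\n",
                derived.x, derived.y, derived.z);
}
```

So the derivative path is coarse in both engines; the question raised in the original post is only about how much extra error the stored vertex normal adds on top of that.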
It is possible that you are not wrong. Also, what exactly are you trying to do that requires such high precision? Have you tried implementing what you want in Unreal Engine first, to see if it is actually that bad?
I’m doing something unique that requires pixel-perfect accuracy. Yes, I tried this in 3 different engines; only Unreal gives an incorrect result, Godot and Unity don’t have this problem.
There is no other solution than manual, time-consuming labor. The way I do it works perfectly for my workflow, fits the art style like nothing else, and saves a lot of time in asset creation. Also, if other engines don’t have this issue, then I’d rather use them than scrap the idea only because it doesn’t fit the limitations of Unreal Engine. There is nothing extraordinary about having full precision normals; I’m just amused that the game industry constantly makes such ugly sacrifices in pursuit of some questionable goal.
I made this post for awareness; I don’t expect anything to be done about it. After all, Unreal Engine is notorious for its quirks, as I learned the hard way and have observed in various posts on this forum.
Unreal is sh|t and has been sh|t for ages, probably since 4.16 or so and arguably well before that.
And the industry is full of more political activists intent on making double sure their games fail than actual game makers.
But this doesn’t mean it’s an industry-wide thing - aside from failing companies, no one is really keen on using Unreal. And those who do will often pick a stable version and customize it deeply before they release anything.
Don’t mind the fanboys who scream “unreal is gr8”; honestly, there are far too many of those who don’t even know what a normal map is, thanks to sh*t like chatgpt making them think they can code…
Deep. You would need to create your own geometry cache handling for a custom shader to make it work.
You can, but why bother? Use a better engine to begin with and live happy instead.
One thing worth noting: I think the precision issue you see stems more from floating point limitations than from the original value of VertexNormalWS or PixelNormalWS.
To check it: define a point on the mesh to sample the value at and output it to a DebugFloat3 node (UV-mapped to the surface of the object and scaled up for visibility).
Moving the box around, you should see how much detail the initial value at the picked point actually has.
The engine butchers the math afterwards, so again: better engine = live happy…
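For what it’s worth, the floating point side of that claim is easy to sanity check outside the engine; here’s a tiny standalone sketch (plain C++, nothing UE-specific) that just prints how coarse float32 gets as magnitudes grow:

```cpp
// The gap between adjacent float32 values grows with magnitude, so any
// world-space math done far from the origin is inherently coarse no matter
// how precisely the normal itself was stored.
#include <cmath>
#include <cstdio>

int main()
{
    const float magnitudes[] = { 1.0f, 100.0f, 10000.0f, 1000000.0f };
    for (float m : magnitudes) {
        // Distance from m to the next representable float above it.
        float step = std::nextafter(m, 2.0f * m) - m;
        std::printf("around %10.1f the float32 step is %g\n", m, step);
    }
}
```

Around 1.0 the step is roughly 1e-7; around a million it is roughly 0.06 of a unit.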
I remember some tutorials on YouTube dedicated to cars and nice reflections, and they mentioned that unless you flip some setting in the preferences, the reflections will be distorted because of the imperfect normals, or something like that. Doesn’t that mean there is a solution? Or at least one that improves the current bad quality?
I can’t imagine a better example to showcase this than high-poly, smooth, nicely reflective cars.
I think you mean the “High Precision Normals” setting:
" High Quality Reflections
Default reflection quality settings strike a good balance between performance and visual quality. However, for projects that are less concerned about performance and want to push quality of reflections even higher, you can use the High Precision Normals GBuffer to do so."
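If I remember correctly, that option lives under Project Settings > Rendering as the GBuffer format, and it maps to the r.GBufferFormat console variable; something like the following in DefaultEngine.ini should switch it (the value meanings below are from memory, so double-check them against the cvar’s help text):

```ini
[/Script/Engine.RendererSettings]
; 1 = default, 3 = high precision normals encoding, 5 = high precision (as I recall the cvar help text)
r.GBufferFormat=3
```

That only raises the GBuffer precision, though; as far as I understand it, the 2-byte vertex normal storage discussed above is a separate thing.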