About disastrously low precision of vertex normals

I might be wrong, and probably am, since I’ve never used Unity and don’t know what precision it uses, but doesn’t this setup show that Unreal Engine’s normals are more accurate? The direct normal node always gives a more accurate result than the derivative calculations in either Unreal Engine or Unity. Assuming the derivative calculations are equally accurate in both engines, and given that DDX and DDY are screen-space derivatives, which are quite inaccurate since they are computed from a 2×2 quad of pixels, doesn’t that mean that more precisely stored normals would produce a more noticeable difference? Again, I could be wrong, it’s just a thought.
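To make the point above concrete, here is a rough Python sketch (not engine or shader code) of why derivative-based normals are approximate. It simulates DDX/DDY as forward differences of surface position across a hypothetical pixel-sized step on a unit sphere, where the exact normal is known analytically, and measures the angular error of the reconstructed normal. The parametrization, the step size, and all helper names are my own illustration, not anything from either engine.

```python
import math

def sphere_point(u, v):
    # Point on a unit sphere; its analytic normal equals the position itself.
    return (math.cos(u) * math.cos(v), math.sin(u) * math.cos(v), math.sin(v))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# A coarse step mimicking the one-pixel spacing that ddx/ddy use inside a
# 2x2 pixel quad; the reconstructed normal is constant across that quad.
step = 0.05  # hypothetical screen-space step in parameter units
u, v = 0.3, 0.4

p = sphere_point(u, v)
ddx = sub(sphere_point(u + step, v), p)  # horizontal finite difference
ddy = sub(sphere_point(u, v + step), p)  # vertical finite difference

derived = normalize(cross(ddx, ddy))  # normal from derivatives
analytic = normalize(p)               # exact normal of the unit sphere

cos_err = max(-1.0, min(1.0, dot(derived, analytic)))
error_deg = math.degrees(math.acos(cos_err))
print(f"angular error of derivative-based normal: {error_deg:.3f} degrees")
```

Shrinking `step` shrinks the error, but in a real shader the step is fixed at one pixel, so the error grows as a surface curves more within a single quad, which is one reason stored per-vertex normals can be visibly better.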

It is possible that you are not wrong. But what exactly are you trying to do that requires such high precision? Have you tried implementing what you want in Unreal Engine first, to see whether the precision is actually that bad?