I am a relative beginner working solo on a spaceflight simulator, and I chose Unreal Engine 5 largely because of its support for double-precision world coordinates. I have a generic Blueprint class for planets that scales itself given an equatorial and a polar radius. It also includes a ProceduralMeshComponent that generates geometry for rings, which are two-dimensional and use a two-sided material.
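For reference, the scaling logic in the Blueprint boils down to something like this rough C++ equivalent (the class, function, and component names, and the 50 cm base radius of the unscaled sphere mesh, are just illustrative stand-ins, not my actual setup):

```cpp
// Rough C++ sketch of the Blueprint scaling logic (names and the base
// sphere radius are illustrative; the real thing is done in Blueprint).
// Assumes 1 Unreal unit = 1 cm and a base sphere static mesh of 50 cm radius.
void APlanetActor::ApplyPlanetScale(double EquatorialRadiusKm, double PolarRadiusKm)
{
    const double KmToUnits = 100000.0;          // 1 km = 100,000 cm = 100,000 units
    const double BaseSphereRadiusUnits = 50.0;  // unscaled radius of the sphere mesh

    const double EquatorialScale = (EquatorialRadiusKm * KmToUnits) / BaseSphereRadiusUnits;
    const double PolarScale      = (PolarRadiusKm * KmToUnits) / BaseSphereRadiusUnits;

    // Non-uniform scale: X/Y follow the equatorial radius, Z the polar radius.
    SphereMesh->SetWorldScale3D(FVector(EquatorialScale, EquatorialScale, PolarScale));
}
```

With Saturn's equatorial radius plugged in, that works out to the 12,053,600,000-unit diameter mentioned below.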
I was originally testing this ring system with the planet Saturn, which has an equatorial radius of about 60,268 kilometers. Assuming one Unreal Engine unit equals 1 cm, the actor is scaled until the sphere is 12,053,600,000 units across (the entire actor, rings included, is 36,004,103,200 units wide). The engine handles this with ease, but when I add a directional light to represent the sun, it looks like this:
The sphere static mesh does not cast a shadow on the ring mesh, or on any mesh for that matter. However, if I scale it down to the size of Phobos (roughly 24.2 kilometers, or 2,420,000 units, across), a shadow is cast, but it looks like this:
This confirms to me that it’s a scale issue: if I move the camera any farther away than this, the shadow disappears. As for the missing chunks of the shadow, I suspect that’s because the static mesh sphere I am using for the planets only has 80 edges around its equator.
I tried enabling “Far Shadows”, but it has no effect no matter how high I set the numbers. Is there any other way to significantly raise the distance at which dynamic shadows are rendered? Also, if my theory that the low poly count is causing the blocky holes in the shadows that do render sounds reasonable, is there a way to avoid this without excessive subdivision?
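In case it helps, this is roughly what I have been changing on the sun's directional light and on the planet mesh, shown as a C++ sketch of the details-panel settings (the component pointer names and the exact values are just examples of the ranges I tried, not my real numbers):

```cpp
// Sketch of the shadow-related settings I've been experimenting with
// (normally set in the details panel; values here are only examples).
SphereMesh->bCastFarShadow = true;                           // "Far Shadow" on the planet mesh
SunLight->SetDynamicShadowDistanceMovableLight(2000000.0f);  // cascaded shadow map distance
SunLight->FarShadowDistance = 2000000000.0f;                 // far shadow cascade distance
SunLight->FarShadowCascadeCount = 4;
SunLight->MarkRenderStateDirty();
```

No combination of these has made the shadow show up at Saturn scale so far.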
Please let me know if you have any suggestions or require more information.