I’ve created a sphere in Unreal Engine 5 using its modeling tools. The sphere is roughly a quarter of Earth’s radius, approximately 1500 kilometers. However, when I stand on top of the sphere, I’ve noticed that the vertices start behaving unpredictably, almost as if there’s a precision issue. My suspicion is that Unreal Engine 5 might still be using single-precision floats for its models instead of double precision. Could this be the root cause of the problem, and if so, is there a way to address it within UE5? I’ve also recorded a video that illustrates the issue, and any insights or guidance would be greatly appreciated.
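To illustrate why I suspect precision: Unreal units are centimeters, so a 1500 km radius puts surface vertices around 150,000,000 units from the mesh origin. As far as I understand, UE5’s Large World Coordinates apply to actor transforms, while static-mesh vertex buffers are still 32-bit floats, but I have not confirmed that, so treat it as an assumption. A minimal plain-C++ check of what a float can represent at that magnitude:

```cpp
#include <cstdio>

int main()
{
    // 1500 km expressed in Unreal units (centimeters).
    const float Surface = 150000000.0f;

    // Try to offset a vertex near the surface by 5 cm.
    const float Nudged = Surface + 5.0f;

    // At this magnitude the gap between representable floats is 16 cm,
    // so the 5 cm offset is rounded away entirely.
    std::printf("offset survived as: %.2f cm\n", Nudged - Surface); // prints 0.00

    return 0;
}
```

A 16 cm quantization step would be more than enough to explain visibly jittering vertices.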
why did you create a 1500 km sphere? this is not the way to build levels. you gotta build the floor out of flat geometry or cubes.
i reckon this visual glitch is a depth buffer issue. depth buffer precision is limited, and there’s no way to compute it at higher precision. double precision is super slow on modern gpus.
I’m currently working on testing a custom gravity implementation for my game, which features procedurally generated worlds within a solar system. Unlike typical games that rely on flat terrains, my game requires spherical worlds. The issue I’m encountering is not related to depth buffer problems, as the objects in question are positioned close to the camera. Rather, the geometry itself—specifically the individual vertices—is causing erratic player movement. The player appears to move upwards, then falls, only to repeat this cycle over and over again.
I suspect the root of the problem lies in how the vertex data is stored, potentially in floating-point format, though I’m not entirely certain. If anyone has encountered this issue before and has a workaround, I would greatly appreciate any advice while waiting for Epic Games to resolve the issue.
As an interim solution, I plan to dynamically generate the world from patches of geometry. This approach will let me determine the dimensions of each mesh within the sphere at runtime and cap each patch at a more manageable size, such as 200 m².
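To put rough numbers on that plan (the 2 km streaming radius below is just an assumed figure for illustration, not something taken from the engine):

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double Pi             = 3.14159265358979323846;
    const double PlanetRadiusM  = 1500000.0; // 1500 km
    const double MaxPatchAreaM2 = 200.0;     // the cap mentioned above
    const double StreamRadiusM  = 2000.0;    // assumed radius of loaded patches around the player

    const double PatchSideM    = std::sqrt(MaxPatchAreaM2);                    // ~14.1 m
    const double SphereAreaM2  = 4.0 * Pi * PlanetRadiusM * PlanetRadiusM;
    const double TotalPatches  = SphereAreaM2 / MaxPatchAreaM2;                // ~1.4e11
    const double LoadedPatches = (Pi * StreamRadiusM * StreamRadiusM) / MaxPatchAreaM2; // ~6.3e4

    std::printf("patch side length:        %.1f m\n", PatchSideM);
    std::printf("patches for whole planet: %.2e\n", TotalPatches);
    std::printf("patches near the player:  %.0f\n", LoadedPatches);
    return 0;
}
```

The useful side effect is that each patch’s vertices stay within a few meters of that patch’s own origin, so 32-bit vertex floats regain plenty of precision regardless of where the patch sits on the planet.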
okay, i tried it. collision complexity is what makes the player jump. by default both simple and complex collision are used; you can change that in the project settings. the simple shape is only an approximation of the sphere, and it is not used once you’re close enough for the complex one to take over. if you keep crossing that threshold i reckon you’ll get weird results. to get proper collision at the polygon level you should use complex collision only. you need proper collision meshes for that, of course.
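something like this should do it from c++ (untested sketch; PlanetMesh is just a placeholder for however you get hold of your sphere’s static mesh asset, and you can set the same flag in the static mesh editor under Collision Complexity):

```cpp
#include "Engine/StaticMesh.h"
#include "PhysicsEngine/BodySetup.h"

// Force per-polygon collision so the player never swaps between the simple
// hull and the complex mesh.
void UsePolygonCollision(UStaticMesh* PlanetMesh)
{
    UBodySetup* BodySetup = PlanetMesh ? PlanetMesh->GetBodySetup() : nullptr;
    if (BodySetup)
    {
        // Always trace against the render geometry instead of the simple hull.
        BodySetup->CollisionTraceFlag = ECollisionTraceFlag::CTF_UseComplexAsSimple;
        BodySetup->InvalidatePhysicsData();  // drop the previously cooked shapes
        BodySetup->CreatePhysicsMeshes();    // re-cook with the new trace flag
    }
}
```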
i can’t reproduce the graphics artefact, so i can’t tell exactly what’s wrong there.
as for the world design approach: i dunno how the popular sci-fi explorers do it. imo spheres should only be used for planetary visuals in space. when you get on the ground you basically unwrap the planet and morph the visual into a planar world. it’s visual cheating. note: if you want to go all-in on exploring 1500 km planets, consider a cubic sphere unwrap for the world gen, so you don’t end up with the pole pinching that straight uv spheres have. even wrap-around worlds fail to get the polar direction right. that wrap-around approach is what i’d propose for an endless planetary surface: you won’t get a proper pole region, so better not to give the player a compass.
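for reference, the usual cube-to-sphere mapping looks like this (plain c++ sketch, nothing engine specific; you build a regular grid on each of the six cube faces and push every grid point onto the sphere):

```cpp
#include <cmath>

struct Vec3 { double X, Y, Z; };

// Map a point on the surface of the unit cube (each component in [-1, 1],
// at least one of them at +/-1) onto the unit sphere. This is the standard
// analytic cube-sphere mapping; it keeps the six face grids reasonably even
// and avoids the pinched poles of a plain uv sphere.
Vec3 CubeToSphere(const Vec3& P)
{
    const double X2 = P.X * P.X;
    const double Y2 = P.Y * P.Y;
    const double Z2 = P.Z * P.Z;
    return {
        P.X * std::sqrt(1.0 - Y2 * 0.5 - Z2 * 0.5 + Y2 * Z2 / 3.0),
        P.Y * std::sqrt(1.0 - Z2 * 0.5 - X2 * 0.5 + Z2 * X2 / 3.0),
        P.Z * std::sqrt(1.0 - X2 * 0.5 - Y2 * 0.5 + X2 * Y2 / 3.0),
    };
}
```

scale the result by the planet radius afterwards and you get your 1500 km surface without any polar seam.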
Ever seen this thread below… Is there any connection?
Thanks, yes, I have seen that thread. I believe it’s the same thing I’m experiencing. It’s a known bug. Hope it gets fixed soon.