Rendering a really big planet?

Hi,
I'm working on a small space level just for fun, set in orbit around Saturn. Something like this (http://vignette2.wikia.nocookie.net/xenoblade/images/f/fb/XBC_-_Saturn.png/revision/latest?cb=20131028135829) or this (http://mygaming.co.za/news/wp-content/uploads/2015/08/Destiny-The-Taken-King-Dreadnought.jpg).

  • spacesphere and distant sun are placed
  • the player can move freely inside a 5 km × 5 km × 5 km volume at the map center
  • the Saturn-and-rings mesh? Here comes the problem: I have to scale it to an extreme to get the right effect (I tried with just a 2D texture, but it doesn't look good in VR, and it's also hard to get everything right because of the rings and moving asteroids)

I have searched the whole day; there must be another solution, like the one I've seen in the EVERSPACE video (https://www.youtube.com/watch?v=pJiq3wg1LVE).
So maybe someone can point me in the right direction?

Someone showed me how they do this in Source (https://www.youtube.com/watch?v=jm6VOaY1G2k&feature=youtu.be&t=33s). Is this the solution? Does anyone have a clue how to do this?

Thanks
YAMI

(PS: sorry for my bad English)

Unreal is not double-precision (for a multitude of reasons), and therefore there is a limited range of positions you can reach, which frankly just doesn't scale to space environments. To successfully pull off an Earth orbit, we had to use a scale of 1 cm : 1 km. Going much higher than that resulted in huge precision errors for our orbiting movement system.

If I could go back, I would have avoided that route altogether, because it is completely impossible to get the sense of scale. You have to make your Pawn/Satellite actor so small that it's kissing the near clipping plane of the camera, but doing so causes numerous issues with lighting and shadowing, and when you get close to the object you're still half the size of Africa, so the illusion breaks down quickly.

The best way, IMO, is to keep your 'moving' object or satellite centered at the world origin and move the environment around it, completely faking the 'sphere'. I would use an extremely large plane/sphere segment and pan the texture across it in such a way that it appears as though you're moving around it. Or swap the meshes out now and again.

This solution would work fine for a fairly close orbit, and if you wanted to go really far away, you could swap to a sphere over time and try to make the transition as seamless as possible. Unfortunately, it is generally difficult to pull off the scale of space! (Hell I’ve been doing it the ‘logical’ sphere way for over a year now).
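Roughly, the "keep the ship at the origin, move the world" idea looks like this. This is a minimal sketch with invented names (AMyOrbitPawn, and an assumed EnvironmentRoot AActor* member on the pawn that the planet, rings and asteroid props are all attached to), not production code:

```cpp
// Minimal sketch, assumed class/member names: the pawn stays pinned at the
// world origin, and the distant environment is shifted by the opposite of
// whatever the pawn moved this frame.
void AMyOrbitPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    const FVector Drift = GetActorLocation(); // how far we've drifted off the origin
    if (EnvironmentRoot && !Drift.IsNearlyZero())
    {
        // Shift the whole environment the other way...
        EnvironmentRoot->AddActorWorldOffset(-Drift);

        // ...and snap the pawn back to the origin so float precision never degrades.
        SetActorLocation(FVector::ZeroVector, /*bSweep=*/false,
                         /*OutSweepHitResult=*/nullptr, ETeleportType::TeleportPhysics);
    }
}
```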

I don't need the player going around (right now); the player can only move inside a smaller space.

But even at this scale, Saturn isn't looking right…

I would really like to know how the EVERSPACE team did that… (EVERSPACE™ Greenlight UE4 Gameplay Trailer - YouTube) I mean, look at it, the scale looks perfect…

Seems like it will take a good while to figure out a good solution.

I don't know if you could put them in a skybox? (Well, that's if you're not going to do a transition to approach and land on them…)
https://youtube.com/watch?v=JSRsQpRfDlk
https://www.youtube.com/watch?v=0YpZlUzkQVk

Go crazy with the scale of the planet, since you aren't planning to get close to it or land on it anyway. You also need more stuff around you as reference, like in EVERSPACE; the asteroids, spacecraft and moons help give you a sense of speed and scale. And of course, you'll need a proper amount of detail on the planet's surface.

@tyoc213 — it is already in my skybox :wink:

@jacky — when I scale it bigger than in my screenshot, I get massive material / lighting / shadow problems.
How big can something be in UE4 without causing problems? Is there a limit (a safe limit to go with)?

Someone mentioned a "2nd render pass on top", but I don't know what to do with that :smiley:

Unreal uses single precision, so regardless of your unit size, you'll be locked into roughly a ±5–10 km cube if you want to maintain 1 cm level of detail relative to the size of a humanoid character.

However, the range of floating point is ±3.4E+38 (3.4E+38 centimeters - Wolfram|Alpha); that is about 3 billion times bigger than the observable universe. The problem is that the farther you move away from the origin, the less precision you have. Single-precision floats (used by UE4) have 6–9 significant decimal digits, which means that one million kilometers away from the origin, the smallest detail you'll be able to represent is roughly a kilometer in size.
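If you want to see the fall-off yourself, here's a quick standalone check in plain C++ (nothing UE-specific) that prints the smallest representable step of a 32-bit float at a few distances from the origin, in centimeters:

```cpp
// Demo of single-precision fall-off: the gap to the next representable
// float (one ULP) grows with distance from the origin.
#include <cmath>
#include <cstdio>

int main()
{
    const double distances_cm[] = {1e2, 1e5, 1e8, 1e11}; // 1 m, 1 km, 1000 km, 1e6 km
    for (double d : distances_cm)
    {
        const float f   = static_cast<float>(d);
        const float ulp = std::nextafterf(f, INFINITY) - f; // gap to the next float
        std::printf("at %.0e cm from origin, smallest step = %g cm\n", d, ulp);
    }
    return 0;
}
// Prints roughly: ~1e-5 cm at 1 m, ~0.008 cm at 1 km, ~8 cm at 1000 km,
// and ~8000 cm at one million km, i.e. the detail you can represent shrinks
// to the hundreds-of-meters range that far out.
```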

Another issue is that beyond a certain camera range all geometry will be clipped, but that's configurable.

So the OP should be able to build his planet IF the walkable area is located at the origin and the distant objects do not move.
Also, I'd guess there will be significant problems with modeling packages when you try to work at this kind of scale.


Speaking of space sims, if I were making a space sim in UE4, I'd probably go with custom megacoordinates that have higher precision but can be converted into smaller single-precision coordinates for a given sector. That would require a lot of voodoo, though, because you'd need to render objects in several passes, from the largest scale to the smallest, and then composite the renders on top of each other.
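The splitting part is the easy bit; something along these lines, as a plain C++ sketch with invented names (MegaPosition, SectorLocal, a 10 km sector size), would do it:

```cpp
// Sketch of "megacoordinates": store absolute positions in double precision,
// split them into a coarse sector index plus a small single-precision local
// offset, and hand only the local offset to the renderer so it stays near
// the origin where float precision is good.
#include <cmath>

struct MegaPosition
{
    double X, Y, Z; // absolute position in cm, double precision
};

struct SectorLocal
{
    int   SectorX, SectorY, SectorZ; // which sector the position falls in
    float LocalX, LocalY, LocalZ;    // offset inside that sector, float-safe
};

constexpr double SectorSizeCm = 1.0e6; // 10 km sectors, well inside the float-safe range

SectorLocal ToSectorLocal(const MegaPosition& P)
{
    SectorLocal Out;
    Out.SectorX = static_cast<int>(std::floor(P.X / SectorSizeCm));
    Out.SectorY = static_cast<int>(std::floor(P.Y / SectorSizeCm));
    Out.SectorZ = static_cast<int>(std::floor(P.Z / SectorSizeCm));
    Out.LocalX  = static_cast<float>(P.X - Out.SectorX * SectorSizeCm);
    Out.LocalY  = static_cast<float>(P.Y - Out.SectorY * SectorSizeCm);
    Out.LocalZ  = static_cast<float>(P.Z - Out.SectorZ * SectorSizeCm);
    return Out;
}
```

The hard part is everything after that: rendering each scale in its own pass and compositing them, which is the engine-level voodoo mentioned above.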

^ Exactly. Smoke and mirrors is the quicker and easier answer here.

Even scaling an object too large will give you problems. I don't know off-hand what the far clipping plane of the default camera is in Unreal, or even whether it dynamically scales to the furthest pixel (unlikely), but the greater the possible range for the Z-depth of the scene, the less precise the depth becomes. Very strange rendering artifacts can happen at that point. The same is true for making the near clipping plane too close: distant objects can distort heavily.

Rendering objects with different cameras and different clipping planes, then compositing the scene afterwards, is an approach we looked at early on. But that's quite a lot of overhead and engine-level work to implement, and not something we felt confident doing at the time, so we decided on a scale and just had to live with the sense of scale being lost the closer you got to the planet. In your case, I would have the planet as part of the skybox and intelligently rotate/offset the texture(s) inside the material to give the illusion of rotation / moving around the object. Saturn is a lot bigger than Earth, after all, and the rings make things complicated.
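For the texture offsetting, something like this could drive it from gameplay code. UMaterialInstanceDynamic and SetVectorParameterValue are standard UE4 calls, but the "PlanetUVOffset" parameter name and the movement-to-UV mapping here are made up; the sky material would have to add this offset to the planet texture's UVs itself:

```cpp
// Sketch with assumed names: feed the ship's faux position into a UV offset
// on the sky material so the planet appears to rotate as you move, without
// the mesh ever actually being planet-sized.
#include "Materials/MaterialInstanceDynamic.h"

void UpdatePlanetUV(UMaterialInstanceDynamic* SkyMID,
                    const FVector& FauxShipPositionKm,
                    float KmPerFullUVWrap) // distance that maps to one full texture wrap
{
    if (!SkyMID || KmPerFullUVWrap <= 0.f)
    {
        return;
    }

    // Map lateral movement to a small UV shift; taking the fractional part keeps it tiling.
    const float U = FMath::Frac(FauxShipPositionKm.X / KmPerFullUVWrap);
    const float V = FMath::Frac(FauxShipPositionKm.Y / KmPerFullUVWrap);

    // The material is expected to add this parameter to the planet texture's UVs.
    SkyMID->SetVectorParameterValue(TEXT("PlanetUVOffset"), FLinearColor(U, V, 0.f, 0.f));
}
```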

In an ideal world, we'd be able to swap the precision of the engine with a switch and you could go about this the logical way, but that's monumentally easier said than done, and double/higher precision comes with increased costs and requirements of its own.

EDIT: Additionally, you’d be amazed at what a clever particle system that simulates dust flying past the spacecraft can do to help with the scale. Look at Elite: Dangerous for an idea of what I’m talking about.

I just want to point out that humans do not use binocular vision to detect distance for objects that are further away than about 35 feet. This would suggest that for things as distant as planets, you should be able to use some kind of hybrid skybox solution even in a VR application.

Wow, this is like what I'm doing, rendering outer space. :slight_smile: