So my latest venture is rendering hundreds of tiny star-like objects on screen that appear to be infinitely far away, similar to the way the sun disc is rendered.
Obviously it’s impractical to calculate hundreds of sun-style positions directly in one pixel shader. A better method would be to render a texture map that way, but does anybody know how to go about it? The problem is that as soon as you have a ‘moon disc’, the trick of the UV-mapped skydome with stars breaks, because the stars appear to be in front of the moon (which, technically, they are).
Well, if you placed your starfield texture on a huge sphere with the normals facing inwards, that’d give a good result (assuming your shader is up to par).
As for the moon, you could potentially also build it as a piece of geometry, instead of using a moon disc. That way, it could be inside the radius of your star-sphere, meaning it’d occlude the stars correctly.
This is how I have my day/night sky set up, and it seems to work pretty well. (Pics TBA)
While possible, it does break the illusion of the massive distances between objects, especially if you’re moving fast. I’m prototyping what is basically a space simulator, so doing it that way isn’t really an option.
My initial thoughts are to map the star texture to screen space, but have it related to the world-space camera vector somehow, which could also eliminate UV-mapping issues on a sphere mesh. I need some idea of how to go about it, though; the stars really need to be infinitely far away. It’s also a huge problem if you want to render clouds on the sphere, because then your clouds are also insanely far away, or it gives the illusion that the stars are at the same level as the clouds.
What you could do is move the skybox based on the player location so that they never get closer to it, this would give the effect of infinite distance.
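A minimal sketch of that idea, assuming some per-frame update hook; the names here are illustrative, not engine API:

```python
# Re-centre the sky mesh on the camera every frame (position only,
# never rotation), so the player can never close the distance to it.
def follow_camera(sky, camera_position):
    """Copy the camera's translation onto the sky; leave rotation alone."""
    sky["position"] = tuple(camera_position)
    return sky

# Hypothetical frame update: the sky snaps to wherever the camera is.
sky = {"position": (0.0, 0.0, 0.0), "rotation": (0.0, 0.0, 0.0)}
follow_camera(sky, (1200.0, -340.0, 55.0))
```

Because only translation is copied, the sky still rotates past you normally as you look around; it just never gets any nearer.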
Thanks, though I saw this already. The trouble is that the method is a very big workaround for something that could be just some simple UV math in the material editor.
This could possibly work, until you get to large distances. Not a bad idea, though. I’d have to lock it to the camera position but not rotation, and per player as well; the multiplayer aspect (between 4 and 32 players at once) could instantly make that quite… difficult.
Yeah, my stars are just UV-mapped to the sky mesh; that’s it.
I was also going to suggest attaching the sky to the player, so the sky/stars move with the player and you can thus never get close. I guess you may then get a problem with distant planets or suchlike disappearing behind the background mesh that has now moved forward, unless the mesh is insanely large.
Random idea -> use a post-process blendable: mask out the depth and map a star texture only to the furthest-away points, then somehow make the star texture pan when the camera is moved. Then you don’t even need a sky/star mesh; the stars would automatically appear behind everything in the distance. It’s really complex to set up, though, in particular moving the stars based on the camera angle. Or perhaps use a cubemap?
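A rough sketch of the depth-mask part of that idea (the threshold value and names are assumptions, not engine values):

```python
def star_mask(scene_depth, sky_threshold=100000.0):
    """1.0 where the depth buffer is effectively 'sky', 0.0 elsewhere."""
    return 1.0 if scene_depth >= sky_threshold else 0.0

def composite(scene_color, star_color, mask):
    """Lerp per channel: stars only show where nothing rendered in front."""
    return tuple(s * (1.0 - mask) + t * mask
                 for s, t in zip(scene_color, star_color))
```

Anything written to the depth buffer (the moon included) would then occlude the stars for free, which is exactly the ordering problem from the original post.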
The plan is to use a skydome as before, but distort the UVs based on some camera/position math, which I figure can be done. It seems like the most efficient way to do it; it’s just a matter of figuring out the math to distort the image so the stars appear to be miles away.
I’ll take a look at the atmosphere actor and see how it renders the sun; it manages to give the right illusion.
I tried my idea for rendering stuff very far away. This is a simple setup that relies on a cubemap. You could take this concept further than what I did here.
Scene transitions -> all volumes are to share the same blendable; the blend distance may also allow for a smooth blend (and if not, it should be fixed by Epic, as it blocks other uses of blendables as well).
Layered animation and interaction -> you can build a skybox UE1/UE2 style, then render a real-time cubemap from it to get all the animations and skydome interactions you want. The capture camera would need a near clip plane set so it only captures the skybox. The performance impact would likely be manageable at such a small scale. This would give you the same power and workflow as UE1 and UE2, including all kinds of cloud, moon, etc. effects.
PP priority -> you may have a problem with depth of field, but most other things should render on top, as if the cubemap were a mesh in the world.
I agree with Jbaldwin. I think you will get poor-resolution stars using a cubemap, but if that doesn’t seem to be a problem, do whatever is easiest.
I did the camera-following thing once for a light beam effect in the underwater Gears 3 level. I ended up doing it in the vertex shader: basically positioning the effect at 0,0,0 and adding the camera position to the WorldPositionOffset. Very simple, no tick cost, and you can still tweak the star size/tiling independently. I also had to make the bounds huge.
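The vertex-shader trick above can be sketched as follows, with plain Python standing in for the WorldPositionOffset math:

```python
def world_position_offset(vertex_pos, camera_pos):
    """Mesh is authored around the origin; adding the camera position in
    the vertex shader re-centres it on the camera with no tick cost."""
    return [v + c for v, c in zip(vertex_pos, camera_pos)]
```

A nice side effect for the multiplayer case: since this runs per vertex on each client, every client’s own camera position feeds its own shader, and no replication or per-player actor movement is involved.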
I am not saying my method is best, but the question of how to do UE1-style skies has come up in a couple of threads. I am just pointing out that there are multiple ways to go about it, as a means of exploring what can be done, not just for stars but also for normal skies.
I suppose the use of some kind of mesh to map the stars to is inevitable, by the sounds of it, rather than somehow trying to create a ‘virtual’ sphere by mapping the texture to screen space and manipulating the UVs based on camera direction/location.
Perhaps I can find a way to factor the world position and forward vector of the camera into the math of the skydome, so that I don’t have to move a mesh to stay with the camera and the mesh can stay static. This would be better, since my game is for 32 tablet-based players, all clients of a PC server.
Nice one! What’s ideal about this solution is that I don’t have to mess with any networking stuff; it’s all done in the shader, which is exactly what I wanted.
I’m still confused as to what actual values the vector nodes output at times. Absolute World Position is just that, right? The position of the pixel in world space? Screen Position still confuses me somewhat; that’s a (presumably normalized) float2 from 0–1 in X and Y, corresponding to the pixel’s location on the screen, right?
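If Screen Position really is a normalized 0–1 float2 (my assumption here, worth verifying against the docs), then converting it to the -1..1 range used in clip/NDC space is just a remap:

```python
def screen_to_ndc(u, v):
    """Normalized screen coords (0..1, origin top-left) to NDC (-1..1, y up)."""
    return 2.0 * u - 1.0, 1.0 - 2.0 * v
```

The centre of the screen then lands at (0, 0), which is usually what you want before applying any camera-direction math.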
I hope you don’t mind me adding to this thread. I’m trying to do the same thing with a textured star sphere (actually a cube) that I usually use only in 3ds Max. It’s a reproduction of the actual night sky from a star database. In Max, I link the stars to the camera and set them to inherit position only. As long as the star mesh doesn’t intersect other objects in the scene as the camera moves around, I’m fine, and the stars maintain their apparent size.
How can I do the same thing in Unreal? I’ve imported the stars but I can’t find a simple way to link them to the camera or feed the camera position to the star mesh.
I’ve just started with Unreal and I’d really appreciate some help.
Thanks.