
Method for rendering lots of stars/points that are infinitely far away.

So my latest venture is rendering hundreds of tiny star-like objects on screen that appear to be infinitely far away, similar to the way the sun disc is rendered.

Obviously it’s impractical to calculate hundreds of sun-style positions directly in one material’s pixel shader. A better method would be to render a texture map that way, but does anybody know quite how to go about it? The problem is that as soon as you have a ‘moon disc’, the trick of the UV-mapped skydome with stars breaks, as the stars appear to be in front of the moon (which, technically, they are).

So… ideas anybody?

Well, if you placed your starfield texture on a huge sphere, with the normals facing inwards, that’d give a good result (assuming your shader is up to par).

As for the moon, you could potentially also build it as a piece of geometry, instead of using a moon disc. That way, it could be inside the radius of your star-sphere, meaning it’d occlude the stars correctly.

This is how I have my day/night sky set up, and it seems to work pretty well. (Pics TBA)

While possible, it does break the illusion of the massive distances between objects, especially if you’re moving fast. I’m prototyping what is basically a space-simulator, so doing it that way isn’t really an option.

My initial thought is to map the star texture to screen space, but have it related to the world-space camera vector somehow, which could also eliminate UV-mapping issues on a sphere mesh. But I need some idea of how to go about it; the stars really need to be infinitely far away. It’s also a huge problem if you want to render clouds on the same sphere, because then either your clouds are also insanely far away, or it gives the illusion that the stars are at the same level as the clouds.
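Roughly the kind of math I have in mind, sketched in plain C++ rather than material nodes (the equirectangular direction-to-UV mapping here is just one assumed way it could work):

```cpp
// Sketch only: derive star-texture UVs purely from the normalized view
// direction, so camera position is ignored and the stars read as infinitely
// far away. (Equirectangular mapping is one assumed choice, not gospel.)
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 Normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Longitude/latitude mapping of a direction to UV space; only the direction
// matters, never the camera's location.
void DirectionToUV(Vec3 dir, float& u, float& v)
{
    dir = Normalize(dir);
    const float pi = 3.14159265f;
    u = 0.5f + std::atan2(dir.y, dir.x) / (2.0f * pi); // longitude -> U
    v = 0.5f - std::asin(dir.z) / pi;                  // latitude  -> V
}

int main()
{
    // In a material this direction would be the per-pixel camera vector;
    // here we just test one direction.
    float u, v;
    DirectionToUV({ 1.0f, 0.2f, 0.3f }, u, v);
    std::printf("UV = (%f, %f)\n", u, v);
    return 0;
}
```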

Take a look at this video series. He talks about how he handled stars and moons in his game.

What you could do is move the skybox based on the player location so that they never get closer to it; this would give the effect of infinite distance.
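Conceptually it’s just this (plain C++ with made-up Skybox/Camera types, not engine code):

```cpp
// Conceptual sketch with made-up types: re-centre the skybox on the camera
// every tick so the player can never approach it.
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Camera { Vec3 position; };
struct Skybox { Vec3 position; };

// Copy only the translation, never the rotation, so the stars stay fixed
// in the world as the player turns.
void UpdateSkybox(Skybox& sky, const Camera& cam)
{
    sky.position = cam.position;
}

int main()
{
    Camera cam{ { 1200.0f, -340.0f, 90.0f } };
    Skybox sky{ { 0.0f, 0.0f, 0.0f } };
    UpdateSkybox(sky, cam);
    std::printf("skybox now at %f %f %f\n",
                sky.position.x, sky.position.y, sky.position.z);
    return 0;
}
```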

Thanks, though I saw this already. The trouble is that the method is a very big workaround for something that could just be some simple UV math in the material editor.

This could possibly work, until you get to large distances. Not a bad idea though. I would have to lock it to the camera position but not rotation, and per-player as well. The multiplayer aspect (between 4 and 32 players at once) could instantly make that quite… difficult.

In the case of multiplayer, I think you would simply not replicate the skybox and handle it all clientside.

Yeah, my stars are just UV-mapped to the sky mesh, that’s it.

I was also going to suggest attaching the sky to the player, so the sky/stars move with the player and you can thus never get close. I guess you may then get a problem with distant planets or such disappearing into the background mesh that has now moved forward, though, unless the mesh is insanely large.

Random idea -> use a post-process blendable, mask out the depth, and map a star texture only to the furthest-away points. Then somehow make the star texture pan when the camera is moved. Then you wouldn’t even need a sky/star mesh; the stars would automatically appear behind everything in the distance. It would be really complex to set up though, in particular moving the stars based on the camera angle. Or perhaps use a cubemap?
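Per pixel, the logic would be roughly this (plain C++ with made-up helpers, just to show the idea, not an actual blendable):

```cpp
// Rough sketch: only where the scene depth reads as "empty" do we swap the
// scene colour for a star colour looked up purely from the view direction,
// so the stars land behind real geometry and pan with camera rotation for free.
#include <cmath>
#include <cstdio>

struct Color { float r, g, b; };

// Placeholder for a star cubemap/texture lookup, indexed by direction only.
Color SampleStars(float dx, float dy, float dz)
{
    // Cheap direction hash standing in for a real star texture.
    float n = std::sin(dx * 127.1f + dy * 311.7f + dz * 74.7f) * 43758.5453f;
    float star = (n - std::floor(n)) > 0.995f ? 1.0f : 0.0f;
    return { star, star, star };
}

Color ShadePixel(Color sceneColor, float sceneDepth,
                 float viewDirX, float viewDirY, float viewDirZ)
{
    const float farThreshold = 100000.0f; // beyond this we call the pixel "sky"
    if (sceneDepth >= farThreshold)
        return SampleStars(viewDirX, viewDirY, viewDirZ);
    return sceneColor;
}

int main()
{
    Color c = ShadePixel({ 0.0f, 0.0f, 0.0f }, 1.0e9f, 0.3f, 0.7f, 0.64f);
    std::printf("sky pixel: %f %f %f\n", c.r, c.g, c.b);
    return 0;
}
```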

Hello,

I may be saying something stupid, but maybe you could do a few star instances with random locations and variations behind a transparent static texture.

The plan is to use a skydome as before, but distort the UVs based on some camera/position math, which I figure can be done. It seems like the most efficient way to do it; it’s just a matter of figuring out the math to use to actually distort the images so they “appear” to be miles away.

I’ll take a look at the atmosphere actor and see how it renders the sun; it manages to give the right illusion.

My initial thought:

In regard to “Generating your own stars”: this can be done using a noise equation in world coordinates. It is a bit tricky to get the right pattern so it has no tiling artifacts at such a large scale. I have successfully done this, though, using absolutely no texture information, just the math nodes.

Here is an example of one I built for a very specific effect. So this won’t work for stars, but it shows a type of noise generation.
Noise Example: http://eat3d.com/files/mat_objectstaticfx.jpg

If you generated the right noise function, you could then align the size to the camera aspect ratio and run an equation that constrains the values, so that even if you got extremely close to the sky dome, it would still look like the stars were far away. This would be done by multiplying the noise resolution function by the distance of your ship/camera, generating more noise based on distance. In effect, the closer you get, the more noise is generated; the farther you get, the less noise is generated. This effectively creates an optical illusion that no matter how close or how far you are, the star count remains the same, the size remains the same, and so on.
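Very loosely, in plain C++ terms (the hash and the constants here are just placeholders I picked for illustration, not the actual node network):

```cpp
// Loose sketch: a procedural star mask whose tiling is scaled by the
// camera-to-dome distance, so the apparent star density and size stay
// constant no matter how close you get to the dome.
#include <cmath>
#include <cstdio>

// Cheap 2D hash in [0,1), standing in for the math-node noise.
float Hash2D(float x, float y)
{
    float n = std::sin(x * 127.1f + y * 311.7f) * 43758.5453f;
    return n - std::floor(n);
}

// u, v           : sky dome UVs at the shaded point
// distanceToDome : ship/camera distance to the dome surface
float StarMask(float u, float v, float distanceToDome)
{
    const float kBaseFrequency = 512.0f;    // grid density at the reference distance
    const float kReferenceDist = 100000.0f; // e.g. the dome radius (placeholder)

    // Closer camera -> higher frequency (more noise), farther -> lower, so the
    // visible patch of dome always holds roughly the same star count and size.
    float frequency = kBaseFrequency * (kReferenceDist / std::fmax(distanceToDome, 1.0f));

    float cellU = std::floor(u * frequency);
    float cellV = std::floor(v * frequency);

    // Only a small fraction of cells contain a star.
    return Hash2D(cellU, cellV) > 0.999f ? 1.0f : 0.0f;
}

int main()
{
    std::printf("star? %f\n", StarMask(0.25f, 0.6f, 100000.0f));
    return 0;
}
```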

This is one of those situations, though, where there are multiple ways to accomplish the same thing. One of the other things I noticed is that you were having trouble with the moon rendering behind the stars? Have you thought about using the moon radius as a mask to block out the stars? Something else you could do is put the stars on their own geometry, disable certain rendering passes, and set it to render in the background.

I tried my idea for rendering stuff very far away. This is a simple setup that relies on a cubemap. You could take this concept further than what I did here.


https://dl.dropboxusercontent.com/u/2300830/cubemapsky.jpg

Interesting, Hourences; this definitely seems like it could have some good applications.

At the same time though, it also seems it could be an issue for things such as:

  • Scene transitions - Transitioning from one PP Volume to another would require rigorous setup to maintain and blend between PP sky materials and other PP properties. This would be tedious and time-consuming.
  • Layered Animation - Integrating 3D objects and elements that are supposed to be married to the sky (such as mesh clouds and objects behind the clouds) is limited by the PP material, and they must also transition with each PP Volume change (which would be even more time-consuming).
  • Sky Dome Interactions - Dynamic interactions with the sky would be extremely limited due to it being created solely in a PP effect.
  • Sky issues could also show up if you program other PP effects to take place. You would then have issues if their priority were set higher, in which case you would have to set the sky up in every effect that is driven with a higher PP priority.

Of course, if none of those things are ever going to be implemented, then this shouldn’t pose a problem really.

Scene transitions -> All volumes are to share the same blendable; the blend distance may also allow for a smooth blend (and if not, it should be fixed by Epic, as it blocks other uses of blendables as well).
Layered Animation and interaction -> You can build a skybox UE1/UE2 style, and then render a real-time cubemap from that to get all the animations and sky dome interactions you want. The capture camera would need a near clip plane set so it only captures the skybox. The performance impact of that would likely be manageable at such a small scale. This would give you the same power and workflow as UE1 and UE2, including all kinds of clouds, moons, and similar effects.
PP priority -> You may have a problem with depth of field, but most other things should render on top as if the cubemap were a mesh in the world.

While I don’t necessarily disagree with the proposed solutions, I am simply stating that this method, pipeline, and workload might be more cumbersome than other solutions.

It is very interesting though. I personally would rather lock the skydome to the player. That would only take a few seconds, and it would solve almost every problem without creating more assets and parameters to manage throughout a product.

Another simple thing you can do is set the skydome to the bounding edge. I did this in UE3 for “Orbit”, a project where a petrophysicist (programmer) and I built a real-world-scale model of the solar system and the planets to simulate Kepler’s laws.

There are simple solutions; you just need to experiment with what works best for your situation, James.

I agree with Jbaldwin. I think you will get poor-resolution stars using a cubemap, but if that doesn’t seem to be a problem, do whatever is easiest.

I did the camera-following thing once for a light beam effect in the underwater Gears 3 level. I ended up doing it in the vertex shader, basically positioning the effect at 0,0,0 and adding the camera position to the worldpositionoffset. Very simple, no tick cost, and you can still tweak the star size/tiling independently. I also had to make the bounds huge.
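In plain C++ terms (made-up types, not the actual shader), the whole trick is just:

```cpp
// Sketch of the vertex trick described above: the mesh sits at the world
// origin, and each vertex gets the camera position added as a world position
// offset, so the effect follows the camera with no per-tick game code.
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 Add(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }

// Roughly what the material's world position offset ends up doing per vertex.
Vec3 OffsetVertex(Vec3 vertexAtOrigin, Vec3 cameraWorldPosition)
{
    return Add(vertexAtOrigin, cameraWorldPosition);
}

int main()
{
    Vec3 v = OffsetVertex({ 0.0f, 0.0f, 100.0f }, { 5000.0f, -2000.0f, 300.0f });
    std::printf("offset vertex: %f %f %f\n", v.x, v.y, v.z);
    return 0;
}
```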

You could also use this trick for controlled cinematic scenes.
For example:

  1. Lock the camera to a boat.
  2. Lock a high-resolution plane (for the ocean surface) to the boat as well.
  3. Build your ocean in world coordinates.
  4. Bake a texture alpha of the boat hull into the vertex information (via 3ds Max or Maya).
  5. Blend the wave animation with no animation (via the vertex information).
  6. Drive the mask rotation based on the camera rotation.
What this effectively does is give you a small ocean plane that moves with the boat, but whose movement is completely hidden because the animation is driven in world coordinates. It also allows the boat to be hollow, with the waves crashing around it, thanks to the baked vertex mask being driven via a vertex channel (R, G, or B).
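Steps 3 and 5 in rough C++ form (placeholder wave function and names, purely to illustrate the blend, not the actual material):

```cpp
// Sketch of blending a world-coordinate wave offset against "no animation"
// using the mask baked into a vertex colour channel, so vertices inside the
// hull stay still while open water animates.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Placeholder world-space wave displacement for a vertex.
Vec3 WaveOffset(Vec3 worldPos, float time)
{
    float height = std::sin(worldPos.x * 0.01f + time) * 50.0f;
    return { 0.0f, 0.0f, height };
}

// hullMask comes from the baked vertex colour: 0 inside the hull, 1 on open
// water. Scaling the offset by the mask hides the waves inside the boat.
Vec3 FinalOffset(Vec3 worldPos, float time, float hullMask)
{
    Vec3 wave = WaveOffset(worldPos, time);
    return { wave.x * hullMask, wave.y * hullMask, wave.z * hullMask };
}

int main()
{
    Vec3 inside = FinalOffset({ 1200.0f, 0.0f, 0.0f }, 2.0f, 0.0f); // hull: stays put
    Vec3 water  = FinalOffset({ 1200.0f, 0.0f, 0.0f }, 2.0f, 1.0f); // open water: waves
    std::printf("hull z = %f, water z = %f\n", inside.z, water.z);
    return 0;
}
```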

Locking things to the player is sometimes the easiest and least performance-hungry option. :)

I am not saying my method is best, but the question of how to do UE1-style skies came up in a couple of threads. I am just pointing out that there are multiple ways to go about it, as a means of exploring what can be done, not just for stars but also for normal skies.

I suppose the use of some kind of mesh to map the stars to is inevitable by the sound of it, rather than trying to create a ‘virtual’ sphere by mapping the texture to screen space and manipulating the UVs based on camera direction/location.

Perhaps I can find a way to factor the world position and forward vector of the camera into the math of the skydome, so that I don’t have to move a mesh to stay with the camera and the mesh can stay static. This would be better, since my game is for 32 tablet-based players who are all clients of a PC server.

Just read this. Gimme a sec, I’m going to post up a method for doing this, James.