How to align the light function with the skydome

So I have a skydome (some of the code comes from the Kite demo).

Here is the UV part of my shader:

I would like to make a light function that projects clouds to the landscape and that is aligned with the clouds in the sky.

Any ideas or tips for getting started? I don't even know how light function coordinates and alignment work.

Thanks in advance guys :slight_smile:

Basically you want a ray vs. sphere intersection. The starting point of the ray is the world position inside the light function, and the ray direction is the directional light's direction. You can probably assume that the starting location is inside the sphere and that the dynamic/static shadow system takes care of other intersections. Then you just calculate the point on the hemisphere where the ray intersects the sky sphere, and evaluate your sky material shader using this intersection point.
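To make the math concrete, here is a CPU-side sketch of that intersection in Python (not engine code — function and variable names are mine, and the sky sphere is assumed to be centered at the world origin):

```python
import math

def ray_sphere_intersect(ray_origin, ray_dir, sphere_radius):
    """Intersect a ray with a sphere centered at the origin.
    Assumes ray_origin is inside the sphere and ray_dir is normalized,
    so there is always exactly one hit in the ray direction (positive root)."""
    ox, oy, oz = ray_origin
    dx, dy, dz = ray_dir
    # Quadratic |origin + t*dir|^2 = r^2; with dir normalized, a = 1
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = (ox * ox + oy * oy + oz * oz) - sphere_radius * sphere_radius
    disc = b * b - 4.0 * c  # always positive when the origin is inside
    t = (-b + math.sqrt(disc)) / 2.0  # positive root: where the ray exits
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# Pixel world position inside the sky sphere, light direction pointing up
point = ray_sphere_intersect((100.0, 0.0, 0.0), (0.0, 0.0, 1.0), 1000.0)
```

The returned point always lies on the sphere, so you can feed it straight into the sky material evaluation.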

To approximate all this, you could make things a bit simpler and faster: just render your sky to a cube render target and then use parallax-corrected cubemapping. The sample direction is then: sampleDir = intersectPoint - rayOrigin

Edit: Custom function that can be used for intersection.

Thanks! What is rayOrigin? Is it the Absolute World Position? Also, is dot(rayOrigin, rayOrigin) "normal"? (I mean, the dot product of two identical vectors is pointless, isn't it?)

RayOrigin is the Absolute World Position. Dot with itself is not pointless: it gives you the squared magnitude of the vector, x*x + y*y + z*z. This is the same as the squared distance from the origin (which is where I assumed your sky sphere is placed).
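A quick sanity check of that identity in Python (names are illustrative):

```python
def dot(a, b):
    """Plain dot product of two same-length vectors."""
    return sum(x * y for x, y in zip(a, b))

v = (3.0, 4.0, 12.0)
# dot(v, v) = 3*3 + 4*4 + 12*12 = 169, the squared distance from the origin
squared_len = dot(v, v)
length = squared_len ** 0.5  # 13.0, the actual distance
```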

FWIW, after 4.9 there is a node to do this.

"raytracedsphere" looks to be the same thing Jenny posted, but it assumes the ray origin is the camera position (which is valid for 99% of cases).

But in this special case the origin is the pixel world position, and there are a few optimizations that can be done because of the assumptions. The origin is always inside the sphere, which means you always get an intersection and it is never on the back side. We can also simplify the position shifting if we assume the sphere is located at the origin.
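Folding those assumptions in gives a shorter closed form; a Python sketch (my naming, sphere centered at the origin, direction normalized):

```python
import math

def sky_intersect(origin, direction, radius):
    """Simplified ray-sphere hit for an origin known to be inside the sphere.
    No miss check needed: the discriminant is always positive and the
    positive root is always the right one (the hit is never on the back side)."""
    od = sum(o * d for o, d in zip(origin, direction))  # dot(origin, dir)
    oo = sum(o * o for o in origin)                     # dot(origin, origin)
    t = -od + math.sqrt(od * od - oo + radius * radius)
    return tuple(o + t * d for o, d in zip(origin, direction))
```

Compared to the general quadratic, this drops the discriminant branch and the root selection, which is exactly the kind of thing worth saving in a per-pixel light function.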

Good point. Maybe we should expose the ray origin on that function, plus a boolean to bypass the ray miss checks etc.

Also there is another function, "sphere gradient 3d", that has some of this simplified math inside but for some reason does not expose it as outputs. I will see about exposing some of that stuff.

Can I multiply the result of the function with the custom UVs to get the final UVs? I tried it, but it didn't work.


here is a screenshot of my blueprint:

The final UVs should be a vector 3 to look up into a cubemap. If you want to use this with a non-cubemap texture, there is another function I will dig up from another thread; it remaps from equirectangular to XY.

For a cubemap you should just use that multiply by the light vector just outside the center comment box. Technically it should be normalized, but it won't matter. I'm curious how different radii affect the appearance.

This was meant to go out with the 4.9 engine release but it looks like some files were missed in the merges. I will make sure it comes out in 4.10 though.

This material code converts from spherical coordinates to a 2d UV value:

float2((1 + atan2(Vector.x, - Vector.y) / PI) / 2, acos(Vector.z) / PI);
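The same mapping, sketched in Python so you can check a few directions on the CPU (Python's math.atan2 takes its arguments in the same order as the HLSL intrinsic):

```python
import math

def sphere_dir_to_uv(v):
    """Convert a unit direction vector to equirectangular UVs, mirroring
    float2((1 + atan2(x, -y) / PI) / 2, acos(z) / PI)."""
    u = (1.0 + math.atan2(v[0], -v[1]) / math.pi) / 2.0
    w = math.acos(v[2]) / math.pi
    return (u, w)

# The horizon direction (0, -1, 0) maps to the center of the texture: (0.5, 0.5)
uv = sphere_dir_to_uv((0.0, -1.0, 0.0))
```

U wraps the full 360 degrees of azimuth across [0, 1], and V runs from the zenith (z = +1, V = 0) down to the nadir (z = -1, V = 1).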

And to use custom UVs I would perform the entire calculation in custom UVs then you only have a single custom UV channel to read in the end. If you replace CUV0 it will just work with default texture samplers.

For cubemap the direction is: sampleDir = intersectPoint - rayOrigin
The cubemap lookup coordinate does not need to be normalized. It's implicitly normalized by the hardware (or by special emitted shader code on newer GPUs like GCN).

For a 2D texture you need to use exactly the same projection from 3D position to 2D UV coordinates that you use for the skydome material.

Thanks, I already use this function though; I copied the code from the sky in the Kite demo :slight_smile: