Generate fake shadows from Projected Texture?

I’m trying to work out a really cheap and easy way to generate shadows from a texture, which is mapped to an object based on World-Space Coordinates.

The texture is a simple greyscale map of tiling clouds, mapped based on world coordinates (the WorldAlignedTexture material function) so that it’s completely seamless around the Earth sphere. The idea is that as the camera gets closer to the Earth, the main cloud layer doesn’t hold enough detail, so I fade to the tiling clouds instead, which can therefore be much more detailed. Since you’re so close at that point, you don’t see the tiling either.
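For anyone unfamiliar with world-aligned mapping, the blend it performs can be sketched roughly like this. This is a hypothetical helper, not the actual engine function; it just shows how each axis projection is weighted by how much the surface faces that axis:

```python
def triplanar_weights(normal):
    """Blend weights for a world-aligned (triplanar) projection, similar in
    spirit to WorldAlignedTexture: each axis's projected texture contributes
    in proportion to how much the surface normal faces that axis.
    `normal` is a world-space surface normal (x, y, z)."""
    ax, ay, az = (abs(c) for c in normal)
    s = ax + ay + az
    # Normalise so the three projections blend to full weight.
    return (ax / s, ay / s, az / s)
```

A surface facing straight up, for example, gets all its contribution from the top/bottom projection, which is why the tiling stays seamless across the sphere.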

What I now want to do is add shadows to those clouds, which is easier said than done. The idea is to use the same texture (or a slightly blurred version), invert it to create black areas where the clouds are, and multiply that by the albedo/diffuse channel to create faked shadows. Of course, all this would do right now is multiply straight underneath the clouds and you wouldn’t see anything, so instead I need to offset and distort the UVs slightly to create ‘projected’ shadows.
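The invert-and-multiply step can be sketched like this (illustrative names, not actual material nodes; the key is that the cloud value is sampled at the *offset* UVs, so the darkening lands beside the cloud rather than directly under it):

```python
def cloud_shadow(albedo, cloud_value, shadow_strength=0.6):
    """Darken albedo where the offset cloud texture is bright.

    albedo:         surface colour value in [0, 1]
    cloud_value:    greyscale cloud sample in [0, 1], taken at the
                    offset/distorted UVs
    shadow_strength: hypothetical tuning parameter, how dark the
                    shadow gets at full cloud coverage
    """
    shadow = 1.0 - shadow_strength * cloud_value  # invert: cloud -> dark
    return albedo * shadow
```

A pixel fully inside the projected cloud (cloud_value = 1) is darkened by `shadow_strength`; a clear pixel is untouched.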

This is where I’m at so far. All this does is offset the Side and Top layers of the World-Aligned Texture to create offset shadows. This works fine for the top and sides of the Earth, but doesn’t work for the area facing the sun. (The reason for the Multiply by -1 is that my light vector faces the negative X axis, so I have to flip the direction.)

f57e2b150e876c7af413716ed0882961b654e2af.jpeg

Now, I had a version which DID distort the UVs facing the sun, which multiplied the UVs of the front-facing texture so that it ‘skewed’ outwards. This works, but because the texture at the top and sides isn’t also being skewed, I get seams where the top & sides and the front don’t line up.

I’ve attached a hastily drawn image of what I’m trying to achieve with the UVs, which essentially skews them over the Earth from the light vector’s direction. The trickiest part is mixing that with the seamless tiling, which is where I’m getting lost.

6064180e5e6b0a0b5679880143e7c305a6da320e.jpeg

Inevitably somebody is going to suggest just layering the clouds as a translucent sphere and turning on shadow casting instead, but that would be horrifically expensive since I’d be overdrawing practically every pixel on screen at that point, and the detail isn’t high enough anyway.

Aren’t you basically looking for a line-sphere intersection? How is the light vector calculated, is it uniform or WorldPosition - LightPosition?
If it’s not uniform, the adjusted UVs should be the first intersection between the cloud sphere and the light vector.
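For reference, that first intersection can be computed by solving the ray-sphere quadratic. This is an illustrative sketch (the thread’s Earth sits at the origin, per the posts below); `ray_sphere_first_hit` is a hypothetical helper:

```python
import math

def ray_sphere_first_hit(origin, direction, radius):
    """First intersection of a ray with a sphere centred at the origin.

    Solves |origin + t*direction|^2 = radius^2 for the smallest t >= 0.
    Returns None when the ray misses. Assumes `direction` is normalised,
    so the quadratic's leading coefficient is 1.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0     # nearer root first
    if t < 0.0:
        t = (-b + math.sqrt(disc)) / 2.0
    return t if t >= 0.0 else None
```

The adjusted UV would then come from the hit point `origin + t * direction` projected back into the cloud sphere’s mapping.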

If you just want to make the cloud projection “larger”, you could adjust the clouds instead of their shadow and multiply the cloud alpha by something like 0.9 to reduce their size a bit (or subtract 0.1, you’d have to experiment to find the right value), making the shadows comparatively bigger and skewed outwards.


Light Vector is simply (-1, 0, 0), since I always know where my Sun is going to be in relation to the Earth (which will always be at (0, 0, 0)).

I’m not sure I understand what you mean by the line-sphere intersection? You mean, work out how far through the sphere a line intersects and taper a value based on that? There’s no cloud ‘alpha’ as such; because it’s a black and white texture, it’s just ‘added’ to the final diffuse. What I’ve been doing so far is altering the UV position, but it’s not good enough, and difficult because of the projected UVs.

This is what I’m doing right now, if it makes it easier to visualize:

You don’t really need the whole sphere intersection, but it’s basically a ray intersection problem.

You need to ‘project’ the length such that when the surface normal is perpendicular to the light, the offset approaches infinity. To do that, you could transform the light vector into tangent space first, then take the length of its XY times the ‘height’ (the distance from your cloud bottom to the planet surface, in units), divide by the light vector’s Z, and multiply by the normalized XY. That should be the XY offset amount. Should be something similar to this, but I will play with the nodes later if you are still messing with it.
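That recipe can be sketched in code. `shadow_uv_offset` is a hypothetical helper, assuming the light vector has already been transformed into tangent space (so Z is along the surface normal):

```python
import math

def shadow_uv_offset(light_ts, cloud_height):
    """UV offset for the projected cloud shadow.

    light_ts:     light vector in tangent space (x, y, z), normalised
    cloud_height: distance from the cloud layer down to the surface,
                  in the same units as the UV tiling
    """
    lx, ly, lz = light_ts
    xy_len = math.hypot(lx, ly)
    if xy_len == 0.0:
        return (0.0, 0.0)  # light straight down: shadow directly below
    # length(L.xy) * height / L.z, pushed along the normalised XY direction.
    # As lz -> 0 (surface normal perpendicular to the light) the offset
    # approaches infinity, so a real material would clamp this.
    scale = xy_len * cloud_height / lz
    return (lx / xy_len * scale, ly / xy_len * scale)
```

At a 45-degree sun angle the offset equals the cloud height, which matches the intuition of the shadow being cast sideways by that amount.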

I would try isolating this to a cylinder without any world-aligned stuff first. Maybe even use a simple debug texture that has a black circle on a mostly white background. With world aligned it will probably offset the opposite direction on -x and -y or something, so you will have to use the dot of those angles to reverse your offset.

Thanks Ryan!

I see how it works now, so here’s the calculated offset for the texture, which definitely seems to give the correct result. I’ve clamped it a bit, but might have to clamp it further since I suspect I’ll see some odd distortion at the edges with such a huge offset.

cac32237d9c3ac40330d435e4c74225b166fda8d.jpeg

You’re right about the -X and -Y flipping coordinates, I’m doing a similar thing above in my first attempt at doing this, where I have to take the absolute value in order to make both sides move ‘backwards’ rather than just rotate around the middle.

EDIT: Just noticed I’m doing a pointless ‘Add’ and ‘Subtract’ with World Position there too… that can be taken out. Also switched to using the normalized World Position instead of the Pixel Normal to calculate the cloud altitude. It’s 3 instructions more (for the normalization, I guess), but since I can’t guarantee the Pixel Normal will be directly up after the Normal Map is applied, this makes more sense. Then again, the Pixel Normal ‘would’ have some effect on the shadow projection. Whether it’s the right effect or not using it this way… not sure.

EDIT #2: I’m derping hard here… this is all I really need to do.

70bdcd88edd91ad9616fa7437b169e35c00d6c54.jpeg

EDIT #3: Have now successfully transformed this for the front-facing Projected clouds (so for +X in World Space). I have to subtract rather than add to get the correct directionality (that or get the inverse light vector, which might be the better option for the other two). Now the hard part… transforming for the Top and Sides.

bf3f15bb0fa1145584861c1a5679a98310b48867.jpeg

Yup… transforming to the projected UVs is much harder. I think I can recycle all the information I already have if I can just ‘rotate’ the offset values around an axis.

For example, I could create my top-down offset by rotating the ‘R’ channel clockwise around (1, 0, 0), and using the existing R as Green (see image). Very close now! It’s not really possible to ‘rotate’ these values though, is it… since they’re not vectors or anything, they’re just values.

44442201ac30d3169ef1c9e7ff72cecbb0261e09.jpeg

EDIT:

The X-Transform is still wrong; I have to use an ABS before the Multiply in the Red channel. Of course, this will stop it working correctly with any light vector other than (1, 0, 0). Fine for my implementation, but not what I’d like to do. A bit of help with the dot-product side of things, to make sure the +/- is correct for the top and left, would be great!

Bump @RyanB - bit of a resurrection I know, but I was wondering if you ever managed to play around with this. I’ve come back to it now and want to get it working. I’m trying to get this working with a panning World-Aligned Texture. The light vector can be anything really, so it makes the problem a bit trickier.

This is how I’m generating my UVs. I transform Absolute World Position to local space because the planet rotates the mesh itself, so not doing this results in the textures ‘swimming’ across the surface.

Trouble is, I can’t now figure out how to use the Ray intersection result with the world-aligned coordinates to have this working. Any ideas or tips?

The top part looks more like a plane intersection than a sphere intersection but maybe I am missing something.

The piece you are missing is that you need to transform the offset ray into each of the 3 world-aligned spaces. Since these are all just 90-degree rotations, you should be able to do it with swizzling. For the bottom texture projection you have RG, so you probably don’t need to do anything, but for the others you would need to switch the vector around to match them.
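The swizzling idea can be sketched like this. The axis pairs below are one plausible convention for a triplanar setup (XY for the top/bottom projection, YZ for the X-facing faces, XZ for the Y-facing faces); an actual material may need signs flipped to match WorldAlignedTexture’s own orientation, as discussed earlier in the thread:

```python
def swizzle_offset(offset_ws):
    """Map a world-space XYZ shadow offset into UV offsets for the three
    world-aligned projections. Purely a reshuffle of components -- the
    '90-degree rotations' mentioned above cost no real math.
    """
    x, y, z = offset_ws
    return {
        "top_bottom": (x, y),   # Z-facing projection uses world XY
        "front_back": (y, z),   # X-facing projection uses world YZ
        "left_right": (x, z),   # Y-facing projection uses world XZ
    }
```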

@RyanB - You’re probably right about the line-sphere intersection… though my coordinates are definitely reaching infinity at the edges away from the light vector. I think I have it working for the front (GB) coordinates. This seems to work for that section regardless of the light vector direction and the object’s orientation, so that’s fine. The key was to also multiply the offset UVs by the dot of the light vector and surface normal, which prevented it going crazy near the edges and tiling way too much.
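That dot-product damping can be sketched as follows (vectors assumed normalised; names are illustrative, not the actual nodes):

```python
def damped_offset(offset_uv, light_ws, normal_ws):
    """Scale the UV offset by dot(light, normal) so it dies off toward the
    terminator instead of blowing up to infinity at grazing angles.
    """
    d = sum(l * n for l, n in zip(light_ws, normal_ws))
    d = max(d, 0.0)  # clamp: no offset on the night side
    return (offset_uv[0] * d, offset_uv[1] * d)
```

At the sub-solar point the offset passes through unchanged; at the terminator it falls to zero, which is what stops the excessive tiling near the edges.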

The problem I’m having now seems to be the swizzling to make it work with RG (top/bottom) and RB (left/right), which seems to be skewing the UVs as well as pushing them back, so I’m unsure how to counteract that. This is looking at the sphere from the bottom up, with the light vector being (1, 0, 0). This also doesn’t work for anything other than a (1, 0, 0) light vector.

Somewhat annoyingly, the shadows on the top move toward the light vector, whereas underneath they move away. That part shouldn’t be hard to fix, though.

Okay ignore that post, what I had there doesn’t work either in different orientations. Ugh!

any progress?

I get that it’s not exactly what you’re asking about, but it would be a lot easier to accomplish this in a light function added to your directional/sun light. All the necessary functionality is already built into the engine.