Trying to get particle shadows to apply to other particles.


I would appreciate any help I can get on this.

So my buddy and I are trying to recreate the smoke particles shown here.

Right now, our systems are shading based on the camera angle to the individual particles rather than using the actual light in the environment.
We can get the particles casting shadows, as seen here, by following those directions:

But we can’t get it so that lighting brightens up one side and darkens the other.

So, first things first: we want to get this effect working on a simple particle system.
Can anyone shed some light (pun not intended) on why it’s not working on our systems, shown in this image?

Secondly, we want to extend this to a particle system with half a million particles.
That system currently has all sorts of artifacts; we’re not sure why, but figure it’s all related.

Basically we’re looking to get particles to shade other particles correctly.
The ultimate goal is something close to this:

which for reference was done with offline rendering and found here.

I know I will hit some setbacks with the technology, but we’re trying to get as far as we possibly can. We’ve got movement down, just not the rendering.

It might help if you can post a screenshot of your material setup and properties as well.

Whoops! thanks! somehow I forgot the most important part.

So let’s start with the smoke.
I copied it from here for reference:

I’ve possibly modified it from its original state, so here are screencaps of what I currently have.

First, the material, M_Cloud_lit. This is shared by all 4 test clouds.

Next, the clouds. I have 4 versions with the same world properties; you can see that below.

As far as the emitter differences go, here they are:

P_Cloud: unmodified from content demo.
P_CloudSmoke1: added initial velocity (min: -100, 25, 100) (max: -100, -25, 80) to make it go up and spread out a little. Also increased the particle spawn count from 3 to 5.
P_CloudSmoke2: same as above, but also turned into a GPU particle system. Initial velocity is similar, just upwards only.
P_CloudSmoke3: same as above except I also shrunk the particle sprites considerably and am spawning at a rate of 100.

Am I missing anything?

First thing you’ll want to do is get rid of ‘two sided’. Two sided does nothing for sprites because they always face the camera, but it doubles your draw-calls for each particle emitter. In other words, it’s insanely expensive for no reason. I still to this day have no idea why it was enabled for EVERY material in Gears of War 1…

The particles appear to be shadowing correctly, but Lit Particles are only an approximation and you won’t get those results you posted there from using lots of small sprites together. To get that kind of effect, you might want to use fewer larger sprites or even a mesh, similarly to the way it’s achieved in the Elemental demo. Those offline shadows are done with Raytracing, whereas our realtime shadows use volume texture samples placed all over the level when lighting is built.

What you’ll need to do is play with the settings under ‘Translucency Self-Shadowing’ to get the desired results. I find the ‘2’ value in the top-most box is usually a bit extreme; I tone it down to around 0.75-1.0 normally. Also, the more particles you emit, the less shadow ‘detail’ you’ll have, because they’ll keep layering shadows on top of one another. The colour value also has a large effect; just play with those until you get it close :slight_smile:

Like Jamsh said. Also, for small particles you should turn on Responsive AA.

As for the rest, there is no real one-size-fits-all solution. You need to play with self-shadowing like Jamsh said and dial in the desired effect per material.

Thank you for the quick replies!

OK, so I turned off two-sided and enabled Responsive AA. I previously had those settings right, but I guess I forgot to fix them in my haste to post. I’m not so much worried about optimization at this point as simply getting this to work. I’m kind of at a point where I’m clicking things and seeing if anything sticks lol. Good call though!

Since you mention volume texture samples, I need to mention that I’m trying to do this only with dynamic lighting (if possible).
According to this link: Lit Translucency in Unreal Engine | Unreal Engine 5.0 Documentation
Directional lights do translucent self-shadowing per pixel, so the light I have set up in my test scene is a directional light. The page doesn’t mention any settings, though. I simply turned on all shadow settings on my directional light in the hope any of them would kick in and work. I’ve tried all 3 lighting mobility types as well and get the same results with each as far as I can tell. You’ll also notice I have a post process volume in the scene; I have that disabled currently. Please let me know if there is a relevant setting I should mess with.

OK, so I’ve been messing around with the self-shadowing settings all weekend with no luck; it’s what led me here. I was hoping there was a magic checkbox I overlooked somewhere. The reason I arrived at those material settings above is that they didn’t distort the shadowing based on the camera angle. The image below illustrates my point using the default self-shadowing settings. I know that the normal of the billboard is used to calculate lighting against the direction of the light source, but I have no idea why this color parameter has an effect on this when all I did was change it to a light blue tint. I’m honestly not sure how that works under the hood.

Below is the GPU small-particles emitter from 2 camera angles. The same happens on all of the particle systems, including the blue half-million-particle system in the original post. It seems to me now that the smaller the particles and the more I have, the worse it gets.

You mention the Elemental demo. I just watched it; which effect are you referring to? I didn’t notice any particles self-shadowing in it. On the contrary, everything seems to be glowy pretty stuff.

I had tried mesh particles over the weekend, a series of tiny low-poly spheres. I overcame the 1000-particle count limitation by changing the settings. They seemed to behave the same as these cloud billboards in a sense. I’ll see if I can set some up again so I can properly document what I see.

First thing, check out the volcano smoke in Elemental

OK, I took apart everything I could see up by the volcano and unfortunately I couldn’t find any particle systems generating dynamic shadows.
I systematically deleted things one at a time out of the scene and couldn’t find anything similar to what I’m trying to accomplish.

The big puffs of smoke are P_EdgeSmoke and P_EdgeSmoke_L, and those are unlit; the cast shadow option is grayed out. Just a few dozen particles.
The smoke of the volcano itself is a mesh.
Everything else I’ve seen in that project is glowy particles.

I’m sorry, but am I missing something?

I put together a (programmer-art-horrible-looking-but-good-enough-to-get-the-point-across) system that shows what I get with the mesh particles. Basically, it appears that each particle samples the lighting around it in the world and does not take the others into account. The same effect is seen with the billboard particles. I thought it was a limitation of them being billboards, but now I realize that this is a limitation of the particle system. Keeping in mind my world only has 1 directional light, you can really see this in the image below. The billboard ones darken as I rotate the camera, and the mesh ones do too, but not as much. The key thing to notice is that the ones in the front do not block light and darken the ones in the back.
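In case it helps anyone else describe the same problem, here is a plain-Python sketch (not UE code; the function and names are my own) of what "each particle samples the lighting independently" means: every particle just evaluates a Lambert N·L term against the light direction, with no occlusion term from the particles in front of it.

```python
def lambert(normal, light_dir):
    """Diffuse term for one particle, ignoring every other particle.

    Both vectors are assumed to be unit length.
    """
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    return max(n_dot_l, 0.0)

# Two particles stacked along the light direction come out equally bright,
# even though the front one should be shadowing the back one.
front = lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
back = lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
```

That missing occlusion term is exactly what the screenshots show: rotating the camera changes the billboard normal (and thus N·L), but particles never darken each other.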


After really taking that all apart and after trying out mesh particles I’m starting to come to a realization. The technology to really do the things I want to do just isn’t there. It looks like I’ll have to get creative and fake things and just accept I’ve hit limitations of the engine I guess. I don’t think I’ll see the shadow resolution I want for a half million particle system any time soon. The approximation is not good enough.

Please enlighten me if I’m wrong, perhaps I’m approaching the problem at the wrong angle?

Hmm… thinking aloud here, but perhaps you could use Macro-UVs to scale a normal map across the whole particle system, and use that for the lighting? It’ll be tricky to get it looking right, and whenever you make a change you’ll likely have to adjust the Macro-UVs to compensate, or perhaps even the normal map itself. If it’s for a specific particle and you always know how it’s going to look, i.e. not too random, you might be able to pull it off.

I think the problem is that you’re getting the faked ‘volumetric’ shadowing and lighting, but it’s added to the individual sprites’ shadowing and lighting and not really working; you want the individual particles to have less shadowing and more shadowing coming from the volume part of things. Having a large-scale normal map that aligns across the whole particle system might give nicer shadowing.
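To make the Macro-UV idea concrete, here’s a rough sketch (plain Python, not the engine’s implementation; the function and bounds parameters are my own assumptions) of how each particle’s world position could be mapped into a single 0..1 UV stretched across the whole system, so every particle samples one shared large-scale normal map:

```python
def macro_uv(pos_xy, bounds_min, bounds_max):
    """Map a particle's world-space (x, y) into a 0..1 UV spanning the
    whole particle system, so all particles sample one shared normal map."""
    u = (pos_xy[0] - bounds_min[0]) / (bounds_max[0] - bounds_min[0])
    v = (pos_xy[1] - bounds_min[1]) / (bounds_max[1] - bounds_min[1])
    # Clamp so particles that drift outside the bounds sample the edge texel.
    clamp = lambda t: min(max(t, 0.0), 1.0)
    return (clamp(u), clamp(v))

# A particle part-way through a 0..1000 system lands at the matching
# spot in the shared map.
uv = macro_uv((500.0, 250.0), (0.0, 0.0), (1000.0, 1000.0))  # (0.5, 0.25)
```

The catch Jamsh mentions follows directly from this: if the system’s bounds or shape change, the mapping (or the map itself) has to be re-tuned.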

Thanks for all of the help so far!
I’m not familiar with how the Macro-UVs work in this case, so I’ll run that by my artist and see what he knows. Maybe we can find some luck in that direction, but based on your description I’m not too sure.

To give you some background (which will probably help), I’m making a VR media player. What makes this difficult for me is that I have 2 main requirements I need to fulfill.

1st is that the many objects in the environment we’re ultimately making are going to move to music dynamically, using some real-time audio analysis tools I’ve got. We will impose behavior limitations on things to give them a range of actions and motions, but for the most part, the only time things have a chance of behaving identically a second time is if the user plays the same song again (assuming we don’t use any randomizing on the visuals). So a visual like this ultimately needs to be fully dynamic.

2nd is that this is a VR project. Users will not only get a full 360 view, but depth as well, and I know how important shadowing is for depth perception. The effects look very pretty while glowy, but in VR land it looks a bit flat currently (especially in a mostly empty world). The reason is that there are 3 main factors the brain uses as depth cues: first is size to indicate distance (things don’t tend to scale in real life), second is movement over distance (parallax), and third is shadowing. 2 of these are usually enough to determine depth, but any conflicts the brain receives *can* lead to sickness. I also read somewhere that women prioritize shadow cues over parallax in visual processing. So yeah… without particles casting shadows on each other, I run the risk of some users getting sick more easily. I guess I can’t use particles as a main attraction like I currently am.

Since the project is still in its early stages, we wanted to capture the richness behind a song (pitch, onsets, and the timing of the rhythm) to create some organic movement with a particle system, since it seemed like an easy place to start and it’s fairly abstract. Something like the simple and yet rich movement seen in the YouTube video in the original post (copy-paste of link: Entering The Stronghold | Audio Visual Animation HD! - YouTube), with the added bonus that it isn’t limited to a fixed perspective. The rest is history. It seems I’ve hit a brick wall and lost quite a bit of time. If there is a solution to this I would love to find it, but I’m going to have to pursue other options in the meantime.

How did your artist render the normal maps? Or did he not render them, and just created one from a texture?
From my experience, things don’t work that way if he just created a normal from a 2D image.

Shoot, he’s offline right now. I’ll forward this to him and see if he can chime in. Sorry, but you spoke alien to me. I only know what is currently working or isn’t. It’s entirely possible I am lacking some key knowledge here.

As far as I know, the only normals we’re using are on the material, and that depends on which of our scrap test blueprints we’re talking about.

If we’re talking about the 4 (or now 5, with the mesh version) clouds, then they all use the same material. It’s animated and literally copy-pasted from the content demo effects map, if you want to see for yourself.
The normals and current params look like the following image:

If we’re talking about the big fugly blue GPU system we ultimately want shadows on, then here are some pics with everything I’ve dealt with regarding normals (also on materials).
First, the settings showing the normal sample image for the blue particles:

Next, here is what no normals vs. hooking up the texture vs. generating them via the checkbox gives.
… and now that I’ve done it and made these screen caps, you can really see how each particle seems to sample the world and doesn’t consider the ones around it.
These were all taken from slightly different positions, although relatively the same viewing angle, plus or minus a few degrees, as I flew around and changed things for each cap just now.


Hopefully some of that will give you the info you need.

Forgot to mention: normally I keep the opacity of those blue particles at like 5-10%, but I raised it to 100% for the screen caps. It’s basically the same effect, just harder to see the issue.

You’ll also notice that there is another particle system behind it. It’s actually just another emitter in the same system (which has a total of 7). I wasn’t even gonna bring that issue up yet, but during this process we realized that the left-most emitters in Cascade always seem to render on top of the ones to the right. So I guess sorting by distance is done per emitter first, and then per particle within each emitter? At some point I would like to know if I could actually mix the 2 sets of particles… fun times… maybe later.
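For reference, the per-particle half of that sorting behaves like a plain back-to-front sort by distance to the camera. A minimal sketch (plain Python, names mine; the per-emitter draw order described above is an assumption about the engine, not something this code controls):

```python
import math

def sort_back_to_front(particles, camera_pos):
    """Painter's-algorithm sort for translucent sprites: farthest from the
    camera draws first. If the engine draws whole emitters in a fixed
    order, this only resolves ordering WITHIN one emitter."""
    return sorted(particles,
                  key=lambda p: math.dist(p, camera_pos),
                  reverse=True)

camera = (0.0, 0.0, 0.0)
draw_order = sort_back_to_front(
    [(1.0, 0.0, 0.0), (3.0, 0.0, 0.0), (2.0, 0.0, 0.0)], camera)
# Farthest first: (3, 0, 0), then (2, 0, 0), then (1, 0, 0)
```

That two-level scheme (fixed emitter order, distance sort inside each emitter) would explain why the left-most emitter always lands on top regardless of where the camera is.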

OK, if you are expecting to render proper normals, they need to be made right. Even in FumeFX and other tools we have ways to extract RG channels and pull out a normal map.

Are you talking about the same normal map applied as a texture in a material? or are you referring to applying a normal map somewhere else?
If you’re referring to the normal texture in materials, shouldn’t the “generate spherical normals” checkbox give me the correct result when my particles are literally circles?

The normal map that is applied inside the material has to be created in a specific manner when we (VFX artists) are looking to create realistic reflections or shadowing.

In any case those are things that your FX artist should worry about.

Contact me over Skype and I will link you to the best source/site that explains the whole idea once I’m back.

Alright, let’s try this again from a fresh perspective.

So I decided that I’ve been tackling WAY TOO many things on that map to consider that testing environment clean any more. Instead, I spent a good hour making a brand-new clean map, set up some simple demos of what I see, and made a short video showing things one at a time.

After sorting all of my information, I realize that I’m actually facing several issues that come and go depending on the settings used and where they are used. Some of the issues result from the type of lighting used, others just don’t work right under certain settings.

I know this is all a game of smoke and mirrors and that there will be limitations, but I’ll pose the big question: am I missing something? Or am I fighting the limits of this system? Or heck, is this broken?

Here is the video, followed by a quick breakdown:

0:00 - The original cloud from the content demo, used as a baseline. I used this in my other map as well to determine how settings actually affect things, since I know what it’s supposed to look like. The only noticeable artifact I get with this system is from moving along the relative path of the directional light. Very forgivable IMO.

0:23 - Cloud with velocity to mimic smoke; basically the same thing. Just shows how movement affects it, for a baseline comparison. I circle around it to show that it works fine from any angle. Keep that in mind…

0:45 - The above, modified to be a GPU emitter instead. It behaves roughly the same way, so I don’t spend much time on it. Proof of concept; it works fine.

1:00 - GPU cloud with tiny particles. I shrunk the particles quite a bit and made it so this system has roughly 1k particles at all times. What’s noticeable about this system is that all particles in the spotlight beam react the same way to the light, so even the ones in the back light up, so to speak. This is one of the issues I kept noticing and thinking was a problem with normals or something. It turns out that as soon as I flip over to directional lighting that goes away, but then we have other issues. As I turn my camera view roughly 90 degrees back and forth, the color of the particles gets altered. I figure it’s expected behavior considering these are billboards, but there seems to be a fairly big difference between observing it with spotlights vs. directional. I guess it’s just the way the textures blend with directional light.

1:53 - 10k GPU particles with spherical normals (and a sharp circle texture). Everything looks nice at first; you can see that all of the particles within the cone of light are illuminated properly. The only downside is that with spotlights they each sample the light and don’t occlude each other from it. I switch over to a directional light, and this is where things get really… interesting? While I understand that there may be some loss of precision as the systems get bigger and more complex, what I’m seeing is just not right. First, there are shadows in the front where there should be none. This is a short video, but I’ve looked at individual particles before and the shadow positions are wrong. Second, you can see here that the shadows are moving up on the particles themselves. The colors are wrong; it’s no longer spherical, but flat! It also seems like there is some issue with depth, though that is not quite apparent in this one… Also, just to add an extra note: I had the velocity on these at uniform 80-100. I just made it an even 100 while writing this, and the shadow-moving issue is still there.

2:45 - 100k GPU particles. The same as above, just more of it. (The only tiny difference is that I accidentally left the translucent multiple scattering extinction color at white instead of the default, but that only changes the color of the shadow, not the issue.) Because everything is tightly packed, you can see the errors clearly standing out now. Mostly the same as above; stuff pops in and out in places. One thing I do want to point out while looking at the back of the system (time index 3:28 is a good spot): I notice that the shadows on the shadowed particles… are on top?

3:35 - Mesh particles. I wasn’t going to do this one originally, but I remembered seeing something very odd… you can see it right at the start. With this new minimalist setup, the shadows appear to be rendered right over the particle system. I think there might also be a depth problem, but whatever is happening is messing with my brain and I can’t stare at it for long enough to figure it out. It’s apparent when I rotate the camera. I just triple-checked this, and PSORTMODE is “Distance to view”. My instinct tells me the shadow issue shown here is related to the other particles, since they seem to show the same behavior across directional and spot lighting changes.

4:17 - I finally found the cause of one of my headaches!! Read the note… watch the issue.


Also, while I was waiting for the video to upload, I added a 10k GPU particle emitter to the scene with a material that has NO spherical normals, just to make sure that option had no effect on what I’m seeing. Sure enough, it’s just there to fancy up each particle.
Here you can see the same issues as in the video with directional lighting.


Hey xdbxdbx -

There is an important tooltip that often gets overlooked when using Cast Volumetric Translucent Shadow (VTS, from now on), since it is only shown when hovering over the option. It reads: “[VTS]s are useful for primitives with smoothly changing opacity like particles representing a volume, but have artifacts when used with highly opaque surfaces.” A lot of what you are experiencing in your video are these artifacts in your particles.

First, I want to explain the “brown” color which you see in your systems (notably the screenshot just above). This color can be set in the material for your particle under Translucency Self-Shadowing >> Translucent Multiple Scattering Extinction. This setting is meant to smooth the transitions and to help sell the volumetric nature of your particle system.

Now to the heart of the issue: you have translucent particles, but the opacity in Cascade is set to 1, which the engine reads as opaque. Because you are still using VTS, that blending is still attempted, and it generates the artifacts. The alpha setup in your particle system is very important in an effect like the one you are trying to achieve. With a large number of particles (small or large) you are getting a lot of overdraw (particles rendering one on top of another), and, ignoring the performance issue for now, the opacity accumulates: at just 0.5 (50%) opacity, just 2 overlapping particles would reach an opacity of 1 and start to generate artifacts from blending issues. Using some of the information above and in your video, I recreated your effect and had to go to an alpha value of 0.0125 (1.25%) before I removed the majority of the artifacts. You would have to adjust this value based on the size of the particles and the amount of overdraw. For this reason it may be better to go for a masked or an opaque effect, without the use of VTS but still casting shadows. With a smooth spherical normal in your material you should get some very good shadowing, and the gaps between your particles would still allow for a kind of transparency.
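To put a rough number on the overdraw point, here is a quick sketch (plain Python; standard ‘over’ alpha blending is my assumption here, and the engine’s exact accumulation may differ) of how quickly stacked translucent sprites saturate:

```python
def accumulated_opacity(alpha, layers):
    """Resulting opacity after `layers` overlapping sprites under standard
    'over' blending: each layer lets (1 - alpha) of the light through."""
    return 1.0 - (1.0 - alpha) ** layers

# Two 50%-alpha sprites are already 75% opaque; at 1.25% alpha the same
# two sprites leave plenty of headroom before the stack saturates.
dense = accumulated_opacity(0.5, 2)      # 0.75
sparse = accumulated_opacity(0.0125, 2)  # ~0.025
```

This is why the workable alpha drops so fast as particle count (and therefore overdraw) goes up: the value has to shrink roughly in proportion to how many sprites overlap along a ray.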

This gets rid of the artifact issue, but we still have a light calculation issue to consider. We have to break this down a little more, because the light from the level is actually being calculated correctly. You can see this in your video, as the light always dims on the back side and is bright on the front side. The issue comes from the particles wanting to be treated as individual camera-facing elements, when in actuality that is not what you want them to be. They are a small part of a larger volume, and while the screen alignment adjusts as we pan around the effect, the normal of each individual particle does not adjust; it remains calculated based on being camera-facing, which causes issues as your camera pans around the effect while your light remains stationary. (This is why your moving spotlights look better than the directional light; there are fewer obvious normal calculation errors between the camera and the light.) So, the solution, here we go: in the Required module, way down at the bottom, there is a Normal section. Adjust it to Cylindrical and the emitter will now calculate the normal based on being a cylindrical volume and not based on being a camera-facing sprite, even though your sprite is still camera-facing. (For your original effect it would probably be best to use a Spherical calculation, also an option.)
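As an aside, the per-sprite spherical normal mentioned above boils down to simple disc math. A sketch (plain Python, not the engine’s shader; a hardness-0 sphere mask is approximated here by simply discarding points outside the disc):

```python
import math

def sphere_normal(u, v):
    """Fake a sphere normal on a camera-facing sprite: treat the 0..1 UV
    as a unit disc and bulge z toward the viewer."""
    x = u * 2.0 - 1.0
    y = v * 2.0 - 1.0
    r2 = x * x + y * y
    if r2 >= 1.0:
        return None  # outside the disc; the sprite would be transparent here
    return (x, y, math.sqrt(1.0 - r2))

center = sphere_normal(0.5, 0.5)  # (0.0, 0.0, 1.0): points straight at the camera
```

Since this normal is defined in the sprite’s own camera-facing frame, it rotates with the camera rather than the volume, which is exactly the mismatch the Cylindrical/Spherical emitter normal modes are meant to fix.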

In summary, adjust your alpha WAY down based on your size and spawn amount OR use opaque particles without VTS and adjust the Normals to better match the overall shape of your emitted effect.

Here are examples of the opaque version, VTS with 25% opacity, and VTS with 1.25% opacity:


Here is my calculated normal for an opaque particle (it looks a little better than the texture-based sphere normal, in my opinion); the hardness of the normal’s sphere mask is set to 0:

Thank You

Eric Ketchum

Great post Eric :slight_smile: That’s certainly cleared up a few things for me too, glad you looked into it!

Indeed, Thank you!

BTW: for everyone following along, the discussion has sort of also moved here: