I’m working on a Niagara system for rain effects, including droplet impact splashes, and I want these splashes to appear on the surfaces of objects.
To achieve this, my idea was to point a SceneCapture2DComponent straight down at the ground and feed its SceneDepth output into a RenderTarget. The Niagara system then samples this RenderTarget to set each particle’s Z coordinate, so the particles spawn on top of objects.
However, the SceneDepth value written by the SceneCapture2DComponent isn’t normalized — it’s the raw distance from the camera in world units — so anything further than 1 unit from the camera reads as pure white when the RenderTarget is treated as a color.
To fix this, I need to remap that value into a 0–1 range, and this is where I’m stuck: I don’t know how to perform that remap within the context of Niagara.
Is there a way to perform this transformation inside Niagara? Simply taking the float value from the red channel of the sampled texture and dividing it by the SceneCapture2DComponent’s max draw distance would do the trick, but I don’t know how to break a color into individual floats inside Niagara.
Any help or pointers towards different/better techniques to get particles to spawn on top of objects in the world would be greatly appreciated.