Trying something crazy - pixeldepth and blurring

Hi. Noob here.
I am trying to use two different features together.

One is:

The other is:

I am trying to have the texture blur based on the pixel's distance from the camera. So the walls, floor, and boulders right up close to the camera will be clear, but starting just a meter away the texture begins to blur.

Using these two methods, the best I managed was making the texture fade to transparent the further it was from the camera. It was pretty neat watching things fade into existence as I moved forward, but it isn’t the effect I want. I need blur, not fade.

Does anyone know how to implement this?

Sounds like you want to use Depth Of Field, with the camera focused on a near point so everything far is blurry. Use the CineCamera, and look into tutorials on Depth Of Field, and you should be set.

Not exactly. I’ve been trying to use DoF but it hasn’t done what I am looking for it to do.

It isn’t as simple as everything far away being blurry. The blurriness needs to be distance based: things up close aren’t blurry, things a medium distance away are blurred a bit, things far away are heavily blurred, and things very far away are completely blurred. And I need to be able to fine-tune it depending on the scene’s water content, as in dirtier water is blurrier, fresh water is clearer, etc.

Kinda funny, just yesterday I fired up Dark Souls 3 and wondered this exact thing: how they managed to get what looks like a stepped depth-of-field for distant groups of objects…

Might take a whack at it. I’ll keep an eye on this thread, and if I have any insight I’ll share. I have some ideas but can’t dig into it now; later tonight, possibly…


My ideal would be to use different convolution blooms (different kernels) based on depth. Like 0 to 1 m away gets one kernel, 1 to 4 m away gets another kernel, etc. But I haven’t been able to find a way to wire that up.

What I have gotten to work for now is this: I created a blueprint with a camera and a transparent plane 11.5 cm in front of it. I created a new material for the plane which uses the SpiralBlur - SceneTexture node. I then take the SceneDepth node, divide and clamp it, and multiply it into the opacity coming off the spiral blur.
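In plain math, the chain I built works out to something like this C sketch of the node logic (the fade distance here is just an example value, not what's in my actual material):

```c
/* HLSL-style saturate(): clamp to [0, 1] */
static float saturate_f(float x)
{
    return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x);
}

/* SceneDepth divided by a fade distance, clamped, then multiplied into
   the alpha coming off the SpiralBlur - SceneTexture node. */
static float plane_opacity(float scene_depth, float spiral_blur_alpha)
{
    const float fade_distance = 1000.0f; /* illustrative: fully faded by this depth */
    return spiral_blur_alpha * saturate_f(scene_depth / fade_distance);
}
```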

My issue with this method, besides not being kernel based, is the drastic transition from no blur to full blur. I need a much softer, more subtle gradient.


This is how you might distance-blend at least:

Plug whatever you like into the a/b inputs of the LERP. In your case it could be the strength value of the blur: set one value at the nearest distance, a different value at the mid, and so on. The LERP is a ‘linear interpolate’; it moves between the a and b inputs/values based on the alpha input. At 0 alpha you get 100% a and 0% b, at .5 alpha it’s a 50/50 mix, etc…

As you divide the scene depth by your specified blend distances you generate a number from 0 to x, but saturate clamps this between 0 and 1 (the range we want). Feed that to the various daisy-chained LERPs and you can create multiple ‘steps’ and set values per step.

The buffer value is added in as a negative number (versus a subtract node; add is technically faster and/or more packable) so as to create a dead zone around the player, an area where the blending math won’t (effectively) start, so alpha will always be 0. Presumably you want a clear area around the player before the fog stuff sets in and you start blending with the background (or whatever you use, if not fog).
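The whole chain can be written out as C functions mirroring the HLSL node math. The specific distances and strength values below are made up for illustration; swap in whatever fits your scene:

```c
/* HLSL-style helpers: saturate() clamps to [0, 1], lerp() blends a..b */
static float saturate_f(float x)
{
    return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x);
}
static float lerp_f(float a, float b, float alpha)
{
    return a + (b - a) * alpha;
}

/* Daisy-chained distance blend: clear near the camera, then two blur bands.
   All distances are in scene units and purely illustrative. */
static float blur_strength(float scene_depth)
{
    const float buffer_dist = 100.0f;  /* dead zone around the camera */
    const float band1_dist  = 400.0f;  /* first blend range past the buffer */
    const float band2_dist  = 4000.0f; /* second blend range */

    /* Add the buffer as a negative number; inside the dead zone d <= 0,
       so both alphas saturate to 0 and the result stays unblurred. */
    float d  = scene_depth + (-buffer_dist);
    float a1 = saturate_f(d / band1_dist);
    float a2 = saturate_f((d - band1_dist) / band2_dist);

    /* Two daisy-chained LERPs: 0 (sharp) -> 0.1 (mild) -> 0.5 (heavy) */
    float s = lerp_f(0.0f, 0.1f, a1);
    return lerp_f(s, 0.5f, a2);
}
```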

If you do it this way, you can keep just the one blur function and feed it different values over distance, time, or whatever you tend to use. More efficient.

Hope this helps.

Hi! Thank you for this, but I need some clarification. I assume this is a material blueprint, but I do not know where to plug this in. Which node on the material should the last Lerp go to?

Yes, this is a material blueprint. You would use this to blend different values across scene depths. You could use it in any material on a static mesh or whatever, but if you are like me, you are likely using it as a post-process effect to blur the whole screen rather than one particular thing (although why not, if you want to).

That last LERP goes into whatever you were blending between up front. In this case, you might use the strength value of the blur, so that from 0-1000 meters it’s .1 blurred, then at 5000 it’s .5 blurred, out to whatever. The idea is that you can LERP between any value(s) over scene depth, whatever those values need to be for you.

If you need a visual, plug different colors into the a/b inputs of the LERPs, with the last one going to BaseColor.
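In C terms, that debug version looks something like this (the colors and the blend distances are made up; the point is just that each band's alpha drives its own LERP and the final LERP feeds BaseColor):

```c
typedef struct { float r, g, b; } Color;

/* HLSL-style saturate(): clamp to [0, 1] */
static float saturate_f(float x)
{
    return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x);
}

/* Component-wise LERP between two colors */
static Color lerp_color(Color a, Color b, float alpha)
{
    Color c = { a.r + (b.r - a.r) * alpha,
                a.g + (b.g - a.g) * alpha,
                a.b + (b.b - a.b) * alpha };
    return c;
}

/* Debug visualization: red near, green mid, blue far.
   The last LERP's output is what you would wire into BaseColor. */
static Color debug_base_color(float scene_depth)
{
    const Color red   = { 1.0f, 0.0f, 0.0f };  /* near */
    const Color green = { 0.0f, 1.0f, 0.0f };  /* mid  */
    const Color blue  = { 0.0f, 0.0f, 1.0f };  /* far  */

    float a1 = saturate_f(scene_depth / 1000.0f);             /* first band */
    float a2 = saturate_f((scene_depth - 1000.0f) / 4000.0f); /* second band */

    Color mid = lerp_color(red, green, a1);
    return lerp_color(mid, blue, a2);
}
```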

This does not seem to be working. For instance, if I try it with colors and connect to BaseColor, I only get the color connected to the third/final/lowest Lerp node. Nothing connected to the other Lerp nodes shows up.