Real-time Texture Blur

Hello,

We have looked everywhere for a way to achieve a real-time blur effect through a material on a texture (not a Texture Object); specifically, we are trying to blur a render-target input inside the material.

What's the best and fastest method for this? We tried a few custom code snippets, but most of them are either for post-process materials or only work with Texture Objects, which is what we want to avoid.

We also looked into offsetting the texture but haven't gone ahead with it yet; I thought I'd ask here first.

Your thoughts, please?

I know nothing about graphics programming, but the first thing I would try is rendering the capture to a very small copy, stretch-resizing that copy, and applying it to the material… somehow.

Thanks, but that would pixelate the image; I'm trying to preserve the detail and just blur it.

There is the process-intensive way, which is both overkill and unintuitive: project the render target onto a plane in the scene, apply Gaussian blur on a post-process camera, and feed that capture back into the material. But that would mean having two scene captures to do one job, and Gaussian-blur DOF is, I'd bet, more expensive than doing something simple in the material editor.

Any other ideas?

The difference between a TextureSample and a TextureObject is that a TextureSample has already been looked up exactly once for each pixel on the screen. That position is defined by the input UVs at each pixel. At that point, when you are reading the output of the texture sample, it is no longer even a texture, but a simple color value for each screen pixel. Each pixel can be thought of as its own little processor that runs independently of the pixels next to it (for the most part, anyway). All reference to the texture is completely lost after it has been sampled. There is no way to blur it like that, unless you manually place a bunch of sample nodes with offset coordinates, which plenty of people do. The material function "Blur Sample Offsets" provides the vectors to do so.

A TextureObject, on the other hand, is an actual reference to the texture itself, and it allows multiple samples to be taken from different UV coordinates, using either code or material functions.
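To make that concrete, a minimal custom node sketch of the "manually offset samples" idea might look like the following. The pin names Tex (a Texture Object input, whose sampler is auto-named TexSampler), UV, and Dist are placeholders I'm assuming here, not names from any engine function:

// A handful of manual offset samples averaged together; equivalent to
// placing several TextureSample nodes with offset UVs in the graph.
// Assumed pins: Tex (Texture Object), UV (float2), Dist (scalar offset in UV space).
float3 c = Texture2DSample(Tex, TexSampler, UV).rgb;
c += Texture2DSample(Tex, TexSampler, UV + float2( Dist, 0)).rgb;
c += Texture2DSample(Tex, TexSampler, UV + float2(-Dist, 0)).rgb;
c += Texture2DSample(Tex, TexSampler, UV + float2(0,  Dist)).rgb;
c += Texture2DSample(Tex, TexSampler, UV + float2(0, -Dist)).rgb;
return c / 5.0;

The "Blur Sample Offsets" function mentioned above just provides a better-distributed set of these offset vectors for the node-graph version of the same thing.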

I think you mean that doing a simple downsample will result in visible bilinear filtering artifacts rather than a smooth blur. That is true, but for extreme blurs, downsampling is still the first step of the process, depending on the level of blur desired. It takes fewer samples to remove the bilinear artifacts than it does to grow the blur radius at full resolution. Still, for many applications, you can simply use Mip Bias to look up the mip maps, which would be similar to what BrUnO suggested.
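If the mip-bias route is good enough, it barely needs code at all: a TextureSample node with its MipValueMode set to MipBias does it. A hedged custom-node equivalent, again assuming Tex, UV, and MipBias pins, is roughly:

// Sample a higher (blurrier) mip level directly.
// Assumed pins: Tex (Texture Object), UV (float2), MipBias (scalar).
return Tex.SampleBias(TexSampler, UV, MipBias).rgb;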

If you want a higher quality blur than the simple mip bias downsample, you need to use multiple samples, either manually or by passing a Texture Object to a function that will do it for you. There are methods to cut down on the number of samples, such as looking up at UV coordinates with a 1/2 texel offset to get one level of blur for free, or by passing your blur iteratively through a few render target passes.
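The half-texel trick relies on bilinear filtering averaging a 2x2 block of texels when a sample lands between their centers, so four taps can cover up to sixteen texels. A hedged sketch, with the texel size passed in as a pin since it depends on the texture resolution (exact offsets also depend on where your UV lands relative to texel centers):

// Four taps placed half a texel off-center so bilinear filtering does
// part of the averaging for free.
// Assumed pins: Tex (Texture Object), UV (float2), TexelSize (float2 = 1 / resolution).
float2 h = TexelSize * 0.5;
float3 c = 0;
c += Texture2DSample(Tex, TexSampler, UV + float2(-h.x, -h.y)).rgb;
c += Texture2DSample(Tex, TexSampler, UV + float2( h.x, -h.y)).rgb;
c += Texture2DSample(Tex, TexSampler, UV + float2(-h.x,  h.y)).rgb;
c += Texture2DSample(Tex, TexSampler, UV + float2( h.x,  h.y)).rgb;
return c / 4.0;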

Using a SceneCapture just to get the post process is indeed a pretty complicated setup and probably has a relatively bulky overhead cost, but it will take all of the above methods to roll your own blur that is faster than the Gaussian post process in UE4. At what point that tips in favor of the fixed overhead mentioned is a bit of an unknown, as it could vary widely on different hardware.

So this begs the question: why exactly are you so averse to Texture Objects?

Edit: a hacky method you could try is adding a small fraction of "Dither Temporal AA" to your texture UVs. By default that will cause a bit of a directional streak, but if you rotate the values over time it can spread out. It will look smeary and messy, though, and objects moving over it will leave trails. It only works well when Temporal AA is in use.
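A very rough sketch of that hack, not the actual Dither Temporal AA function but just a small UV jitter rotated over time so Temporal AA smears the taps together (Tex, UV, JitterAmount, and Time are assumed pins, with Time fed by a Time node):

// Hacky TAA-smeared blur: jitter the UV by a small offset that rotates
// over time, so Temporal AA averages the different taps across frames.
// Assumed pins: Tex (Texture Object), UV (float2), JitterAmount (scalar), Time (scalar).
float angle = frac(Time * 7.31) * 6.2831853; // pseudo-random rotation per frame
float2 jitter = float2(cos(angle), sin(angle)) * JitterAmount;
return Texture2DSample(Tex, TexSampler, UV + jitter).rgb;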

Hello all,

Same question here, but I'm trying to figure out specifically whether there is a way to blur a dynamic depth-map texture that is being captured by a SceneCapture2D.

I already have a working depth map, using a custom blendable post-process material added to the SceneCapture2D's post-processing list. It reads from a SceneTexture:SceneDepth node in the material.

I also have a blur working, using another blendable post-process material in the SceneCapture2D's post-processing list. This one uses a custom node with shader code that reads from SceneTexture:PostProcessInput0.

But I can't seem to combine/chain them. If I feed the output of the depth-map capture into the custom blur node, it blurs the original capture, not the depth map.

I'm using SceneTextureLookup here. I also tried a variant with Texture2DSample in the shader code, but that only works with (and requires) a static Texture Object, not a dynamic source like a frame capture (there is no TexSampler I can use here).
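For context, the SceneTextureLookup version I'm describing is shaped roughly like this (simplified; 14 is the PostProcessInput0 scene-texture index in the UE4 versions I've checked, but it's safest to confirm it against the HLSL generated for a SceneTexture node, and BlurOffset is just an input pin I added):

// Post-process material custom node: blur PostProcessInput0 via SceneTextureLookup.
// Assumed pins: UV (float2 screen UV), BlurOffset (scalar).
// 14 = PostProcessInput0 in UE4; verify against your engine version's generated HLSL.
float3 blurred = 0;
for (int i = -3; i <= 3; i++)
{
    float2 offsetUV = UV + float2(i * BlurOffset, 0);
    blurred += SceneTextureLookup(offsetUV, 14, false).rgb;
}
return blurred / 7.0;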

(blur shader reference: https://docs.unrealengine.com/latest/INT/Engine/Rendering/Materials/ExpressionReference/Custom/index.html)

Here is the final shader code and the relevant nodes I mentioned:

[screenshot: custom node shader code and material node setup]

Update:

OK, I got the blending to work by moving the blur shader node out of the post-process material, leaving only the original SceneTexture:SceneDepth node there.

I put the blur shader code/node in a normal material that takes the render-target output of the SceneCapture as input via a "Texture Object Parameter".

Then I applied the blur shader to that, after switching the sampling in the code from SceneTextureLookup back to Texture2DSample, and voilà… it's working.
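Roughly, the shape of the custom node in that working material is the following, structured like the blur in the docs link above as two back-to-back loops (the pin names are mine; the SceneCapture's render target is assigned to the Texture Object Parameter feeding the Tex pin):

// Blur a render target inside a normal (non-post-process) material.
// Assumed pins: Tex (Texture Object Parameter holding the render target),
// UV (float2), Dist (scalar UV offset per tap), Samples (scalar taps per axis).
float3 blur = Texture2DSample(Tex, TexSampler, UV).rgb;
int taps = 1;
// horizontal taps
for (int i = 1; i <= Samples; i++)
{
    blur += Texture2DSample(Tex, TexSampler, UV + float2( i * Dist, 0)).rgb;
    blur += Texture2DSample(Tex, TexSampler, UV + float2(-i * Dist, 0)).rgb;
    taps += 2;
}
// vertical taps
for (int j = 1; j <= Samples; j++)
{
    blur += Texture2DSample(Tex, TexSampler, UV + float2(0,  j * Dist)).rgb;
    blur += Texture2DSample(Tex, TexSampler, UV + float2(0, -j * Dist)).rgb;
    taps += 2;
}
return blur / taps;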


That is because the “SceneTexture” input you hooked up is not being used at all in your first code example. That input is also a color value, not a texture value, meaning it could not be used to look up offset samples for blurring like you seem to be trying to do there. The only way to access the depth is directly using CalcSceneDepth. Hook up SceneDepth by itself and then view the HLSL code to get the exact call.
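As a sketch of what that ends up looking like (the pin names and kernel here are placeholders, and the exact CalcSceneDepth call is best confirmed from the generated HLSL as described):

// Post-process material custom node: blur the depth itself instead of the color.
// Assumed pins: UV (float2 screen UV), BlurOffset (scalar), MaxDepth (scalar).
// Custom node output type: CMOT Float1.
float depth = 0;
for (int i = -2; i <= 2; i++)
{
    for (int j = -2; j <= 2; j++)
    {
        float2 offsetUV = UV + float2(i, j) * BlurOffset;
        depth += CalcSceneDepth(offsetUV);
    }
}
depth /= 25.0;
return saturate(depth / MaxDepth); // remap world-space depth to a 0-1 grayscale value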

SceneTextureLookup does a regular scene texture lookup, and has no connection to your input pin named SceneTexture.

Your bottom example uses a texture object which is why it works.

By the way, I wrote that blur example code a while back, and it was made as separate X and Y passes purely for performance reasons, so it wouldn't tank performance even with small values. But that is why the blur looks like a + sign. To remove that, you'd want to nest the inner loop inside the outer loop instead of having them one after the other as they are now.
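In other words, a hedged sketch of the restructured version, using the same placeholder pins as above, would be:

// True 2D kernel: nest the vertical loop inside the horizontal one so the
// taps cover a square area instead of a + shaped cross.
// Note: the sample count grows quadratically, so keep Samples small.
int s = (int)Samples;
float3 blur = 0;
int taps = 0;
for (int i = -s; i <= s; i++)
{
    for (int j = -s; j <= s; j++)
    {
        blur += Texture2DSample(Tex, TexSampler, UV + float2(i, j) * Dist).rgb;
        taps++;
    }
}
return blur / taps;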

The info is really helpful, thanks.

CalcSceneDepth is interesting, so I put together this new custom shader node to read the depth directly and "sample" it into a color blur based on the previous shader.

It won't compile yet, though. It tells me that uv is undefined, that CalcSceneDepth has no matching one-parameter overload, and that there is a mismatched parenthesis somewhere.

I'm a noob with shaders, but I've checked it a few times and I don't see what the problem is. I did create a new node input variable called maxdepth.

Could the problem be that I'm not allowed to declare a function inside the custom node's shader code?

I don't see the parenthesis error, but in your depth = 1.0… line you use a lowercase uv, while everywhere else it is an uppercase UV.

Uppercase UV is set up as an input variable for the custom node and is fed a TextureCoordinate.

Lowercase uv is just a parameter of the Proxy function, with local scope.

I think I can't use function declarations in the shader code.

To test that, I simply replaced everything with a dummy shader containing one function, and it gave the same errors about undefined parameters and mismatched parentheses.

Oh, I see. I am so used to not writing the function declaration line that I skimmed past it unconsciously :slight_smile:

Yes, you cannot use a declaration like that in the custom node; the compiler creates the enclosing function for you. The only inputs you can define are via pins.

But even if you could write it like that, you would be sampling lowercase uv as an uninitialized parameter since you never actually set it.
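To make the "compiler creates it for you" part concrete: the text typed into a custom node gets pasted into a generated function shaped roughly like this (illustrative only; the exact signature varies by engine version, with each input pin becoming a parameter):

// Illustrative shape of what the material compiler generates for a custom
// node (simplified; check your own material's generated HLSL for specifics).
// Because your text becomes this function's body, declaring another
// function inside that text fails to compile.
MaterialFloat CustomExpression0(FMaterialPixelParameters Parameters, MaterialFloat2 UV, MaterialFloat MaxDepth)
{
    // ...the text typed into the custom node is pasted here verbatim...
    return 0;
}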

I just read somewhere that I would have to modify some Unreal Engine source files to add my own custom functions to the list of intrinsic functions.

Since I won't be doing that (I don't want to get in over my head), what I basically have to do next is get rid of the function definition and write the shader with those lines of code expanded inline. Correct?

That's horrible, hehe… my shader code will hate me for how it will look once it's no longer modular :stuck_out_tongue:

Isn't there any other way to get (or fake) reusable function behavior in shaders without modifying engine source files to extend the list of intrinsic functions?

Thanks in advance.

Adding your own intrinsic functions directly to the engine is super easy if you just add them to something like Common.usf. There is also a way to do it using plugins, but I am not sure what boilerplate code is needed for that.

Just make Common.usf writable and you can inject your function as it was, but you need to include all of the function's inputs in its declaration, since there will be no pins. Then you can call it in the custom node like this:

return saturate(x);

where saturate is replaced by your function's name.

Edit: you can technically call other custom nodes from a custom node, but you need to chain them together to respect the naming order, which is assigned automatically, i.e. the first one will be CustomExpression0, then 1, then 2. It's actually WAY easier to add your function to Common.usf than it is to goof around with nested custom node calls. It gets ugly quickly.
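As a hedged example of the Common.usf route (the helper's name and body here are made up, not an existing engine function):

// Added to Common.usf (illustrative helper; the name MyBoxBlur is made up):
float3 MyBoxBlur(Texture2D Tex, SamplerState TexSampler, float2 UV, float Dist)
{
    float3 c = 0;
    for (int i = -1; i <= 1; i++)
    {
        for (int j = -1; j <= 1; j++)
        {
            c += Tex.Sample(TexSampler, UV + float2(i, j) * Dist).rgb;
        }
    }
    return c / 9.0;
}

// Then, in a custom node with Tex (Texture Object), UV, and Dist pins:
// return MyBoxBlur(Tex, TexSampler, UV, Dist);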

Yeah, it's true; technically I agree 100%, you end up with ugly stuff.

But then again, I would have to share the updated Common.usf with the team, and I'm too lazy to do that :stuck_out_tongue:

That said, I got it to work just now!

A blurred depth map on a custom post-processing shader :slight_smile:

And speaking of ugly stuff, here is the final (but working) code I ended up with :stuck_out_tongue:

Thanks a lot Ryan, your input was a lot of help.

Using that CalcSceneDepth HLSL thing you mentioned sorted this out :slight_smile:

I'll leave you with one final tip. When you want to see how a node translates to code, the fast way to find it is to hook it up to Emissive Color (ideally in an empty material with no other pins connected), then view the HLSL and search for "Local0". That will always jump you right to where your node's code starts.
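The spot that search lands on looks something like this (illustrative only; the exact call and surrounding code differ between engine versions, so always read your own generated HLSL):

// Roughly what the generated HLSL looks like for a SceneDepth node hooked
// straight to Emissive Color (simplified from memory of UE4-style output).
MaterialFloat Local0 = CalcSceneDepth(ScreenAlignedPosition(GetScreenPosition(Parameters)));
MaterialFloat3 Local1 = MaterialFloat3(Local0, Local0, Local0);
PixelMaterialInputs.EmissiveColor = Local1;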

Thanks, Ryan, for the help. redphoenix2k is a teammate who is helping us out, which is why I didn't pitch in further.

While the blur worked out, it is unfortunately still low-res relative to a decal, and increasing the sample count kills performance.

So we are thinking of a hybrid approach that combines the two methods, giving us more control and good performance while keeping things looking good.

Thanks again for your help.