Sample SceneDepth mipmap via custom node? (raymarching)

Hey there,

So I was reading a paper about screen space reflections in the Frostbite engine, which I'd like to try to replicate in UE4 (also for more accurate refraction).

A critical part of their technique involves sampling the scene depth at a lower mip level… so my question is: can this be done in UE4 via a custom material node?

Alternatively, is it possible to call a function (such as RayCast from ScreenSpaceRayCast.usf) from a custom node?

Yep, both sampling at a specified mip and calling a function from a USF file are possible in a custom node.

Hmm, I figured it should be possible, but for some reason I can't get it to work (the material doesn't compile). Do I need to add an include or something like that?
Otherwise I must have done something wrong.

It will need to be a translucent material for it to compile. Only translucent materials can read the scene texture.

If you are using an ordinary material, you can paste your function into, or add an include to, MaterialTemplate.usf;
for a PostProcess material it is PostProcessCommon.usf.
Both worked fine for me in the past. In any case, you can always do Window → HLSL Code in the material editor to see which includes are relevant for your shader.
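For example, a minimal sketch of the workflow (the function name here is made up; `Texture2DSample` and `ConvertFromDeviceZ` are engine helpers, but double-check their availability in your engine version and material domain):

```hlsl
// Sketch only: a helper pasted into MaterialTemplate.usf (or pulled in via an include).
// MyLinearSceneDepth is a hypothetical name; SceneDepthTexture/SceneDepthTextureSampler
// are assumed to be bound, which (per this thread) requires a translucent material.
float MyLinearSceneDepth(float2 ScreenUV)
{
    float DeviceZ = Texture2DSample(SceneDepthTexture, SceneDepthTextureSampler, ScreenUV).r;
    return ConvertFromDeviceZ(DeviceZ);
}
```

Then the custom node body can be as small as `return MyLinearSceneDepth(ScreenUV);`, with `ScreenUV` wired in as an input.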

Please note that after editing those USF files you will need to recompile the shaders, as described here

I already am using a translucent material :stuck_out_tongue:

Btw, how is SSR done for opaque materials then? If they are using the scene textures from the previous frame, can't that be accessed instead?

So after trying everything that I can possibly think of…

  1. Adding includes doesn't seem possible via the custom node (which makes sense: the node body is compiled as a function, and you can't put includes inside a function, right?)

  2. There seems to be no way of sampling a lower mip of the scene depth.

That kinda sucks :stuck_out_tongue: Are you sure it's possible?

Thanks for your help so far though!

  1. Adding includes is possible if you add the code to the USF file and just call the function from the custom node.

  2. I kinda missed the part where you want to sample a lower mip of the depth texture; I read it as a regular texture for some reason, my bad. I believe the depth texture has no mips stored, otherwise this:


    return ConvertFromDeviceZ(Texture2DSampleLevel(SceneDepthTexture, SceneDepthTextureSampler, ScreenUV, MipLevel).r);

would have worked perfectly.

I might be wrong, but I guess you would have to create a separate render target to store the depth mips.

Internally the engine uses something called HZB occlusion, which generates depth mips; the relevant code is the HZB setup-mips pass. I am not sure how you would go about reading those mips, but if you delve around the HZB code maybe you will find something.
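For reference, an HZB-style mip chain is typically built by reducing each 2×2 texel group of the parent mip down to one value per level. This is only an illustrative sketch of one downsample step, not the engine's actual HZB shader; the texture/sampler names are made up, and the min/max choice depends on the depth convention (UE uses reversed-Z, where smaller device Z means further away):

```hlsl
// Hypothetical one-level HZB downsample pixel shader (illustration only).
Texture2D<float> ParentDepthMip;   // previous (larger) mip level
SamplerState     PointClampSampler;

float HZBDownsamplePS(float2 UV : TEXCOORD0) : SV_Target0
{
    // Fetch the 2x2 footprint from the parent mip in one Gather.
    float4 Depths = ParentDepthMip.Gather(PointClampSampler, UV);

    // Keep the furthest depth so occlusion tests stay conservative.
    // With reversed-Z device depth, "furthest" is the minimum value.
    return min(min(Depths.x, Depths.y), min(Depths.z, Depths.w));
}
```

Each pass renders into the next mip of a separate render target, halving the resolution each time, which is roughly the "separate render target to store depth mips" idea mentioned above.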

Also, re: why can opaque materials read from scene color? They can't, individually. All materials in the scene do it at once (thanks to deferred shading), and there is some cost to making the scene texture available for that. When you read scene color in a translucent material, the engine actually stores a separate copy for each mesh in the world that uses it, so it's pretty inefficient. It can't be accessed during the base pass without engine code changes, unfortunately. I am not sure what the performance implications would be, but they would not be good.

Right that makes sense, thanks for the info! :slight_smile:

I took a brief look at that, but as far as I can tell it would require engine-side changes to expose it to materials (which is not an option; I want to be able to share this as a material or a plugin, not an engine branch :p).
Is there any chance we could add a feature request for this, maybe :p? Probably not, eh.