Custom Raymarch Material

I am testing moving an object from our engine to Unreal, and I am trying to see how to create it using the existing systems without messing around with how the engine works. It uses a simple (1-4 step) raymarch on a mesh to create a procedural volumetric effect. In order to do this we need to know the entry and exit point of the ray, which we get by reading from a depth buffer. This depth buffer has the backfaces of all the objects with this material rendered into it.
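To make the setup concrete, here is a minimal sketch of the march described above, written as it might look in a material Custom node. All identifiers (`BackfaceDepth`, `ReconstructWorldPos`, `SampleVolume`, `NumSteps`) are placeholders for this illustration, not engine names:

```hlsl
// Entry point: the rasterized front-face pixel position.
float3 EntryWS = FrontFaceWorldPos;

// Exit point: unproject the backface depth stored in the custom depth texture.
float ExitDepth = BackfaceDepth.Sample(BackfaceDepthSampler, ScreenUV).r;
float3 ExitWS = ReconstructWorldPos(ScreenUV, ExitDepth);

// March 1-4 steps between entry and exit, accumulating the procedural volume.
float3 Step = (ExitWS - EntryWS) / NumSteps;
float4 Result = 0;
for (int i = 0; i < NumSteps; ++i)
{
    float3 P = EntryWS + Step * (i + 0.5); // sample at the middle of each step
    Result += SampleVolume(P);             // procedural density/colour lookup
}
return Result;
```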

I have 90% of this working in Unreal, but the issue I have run into is that I can’t achieve both things I want. The main issue is that I used custom depth to render the backfaces, but I can’t read from the custom depth buffer in a masked material. I set CustomDepth.Order to 0, which should render it first, but it still doesn’t let me assign it to the material. Is there a way to override this? Or is there an easy way to create my own depth render texture pass to render just these objects into?

The other option is to use a Translucent material, which would probably be fine, but I can’t get the lighting to work on that. Pixel Depth Offset was useful on the Opaque material for making sure lighting and shadows affect the material correctly, but I don’t know how else to adjust the position at which the object receives lighting in a translucent material.

The other issue I am having is that the march algorithm outputs a world-space normal, and I don’t know if there is any way of using that for rendering. I was wondering if there is a way of overriding the normal in a material. I can potentially see a way of reverse-engineering that world-space normal into a normal map value, but even then that seems like it has a number of issues.
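For what it’s worth, the reverse-engineering route is just an inverse basis transform: if you can feed the mesh’s world-space tangent, bitangent, and normal into a Custom node (e.g. from the VertexNormalWS node and a tangent input), rotating the marched normal into tangent space looks roughly like this. A hedged sketch, assuming an orthonormal basis and placeholder input names:

```hlsl
// World-space -> tangent-space, given the mesh's TBN vectors as node inputs.
// TangentWS, BitangentWS, NormalWS_Vertex: interpolated mesh basis (world space).
// NormalWS_March: the normal produced by the raymarch.
float3x3 TBN = float3x3(TangentWS, BitangentWS, NormalWS_Vertex);

// For an orthonormal basis, multiplying by the matrix whose rows are the basis
// vectors projects onto that basis, i.e. rotates world -> tangent space.
float3 NormalTS = mul(TBN, NormalWS_March);
return NormalTS;
```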

If there are other suggestions for how to do this, I am open to them. I don’t mind writing code, but I don’t want to mess around with the engine for this, and given my limited knowledge of Unreal atm, I am not sure where I would start with creating my own material at a code level. Any resources would be appreciated.

You can add an extra depth pass, but that will involve quite a bit of coding, and tbh only an engine edit will be clean enough.

Regarding a translucent object receiving lighting: for translucency, all lighting, including fog, is evaluated at the rasterized fragment position. You can use one of the unused inputs (e.g. Opacity), together with a shader edit and a preprocessor directive injected through a custom material expression, to offset that position so that all lighting features work correctly.
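The approach above might look something like the following, though this is only a hypothetical sketch: the macro name, the pin used to smuggle the distance through, and the exact place in the engine's translucency lighting code are all illustrative, not real identifiers:

```hlsl
// -- In a Custom material expression (its output routed to the unused pin,
// e.g. Opacity), define a marker macro and pass the march distance through:
#define OFFSET_TRANSLUCENT_LIGHTING_POS 1
return RaymarchHitDistance; // distance along the view ray to the marched surface

// -- In the engine shader file where the translucent lighting position is
// computed, guarded by the injected macro:
#if OFFSET_TRANSLUCENT_LIGHTING_POS
    // Push the shading position to the marched hit point, mirroring what
    // Pixel Depth Offset does for opaque materials.
    LightingPositionWS += CameraVector * MarchDistanceFromOpacityPin;
#endif
```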

Regarding normals, a material can be toggled to work with world-space normals directly (uncheck Tangent Space Normal in the material properties).

I found the toggle for world-space normals, so that worked, thanks.

This seems like the best solution. Do you have any examples or resources that would advise how to go about doing this?