POM material

Been thinking about this technique a bit.

I have been using the code that reads a 2d texture of volume slices and converts it to '3d coordinates' (for the volume raytraced clouds I was doing). It's perfectly possible to use that to store a distance field in a texture that could then be raytraced into from a plane that faces the camera (gpu sprite or some kind of 'imposter-like' card). The distance field can easily be created in Houdini. You can pack a 256x256x256 3d voxel grid into a single channel of a 4096 map, or a 128-cubed grid in the same size map but with 32 frames of animation (eight frames per channel across RGBA).
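
For reference, the flat lookup I mean is roughly this (a minimal sketch assuming the 256-cubed case with the slices laid out as a 16x16 grid of tiles; the names are placeholders, not engine functions):

// Minimal sketch of a pseudo-volume lookup: maps a 0-1 3d coordinate
// to a 2d UV inside a 4096 atlas laid out as a 16x16 grid of 256x256
// Z slices (single channel, nearest slice, no Z interpolation).
float PseudoVolumeSample(Texture2D Tex, SamplerState TexSampler, float3 UVW)
{
    const float NumSlices = 256.0;    // Z slices stored in the atlas
    const float SlicesPerRow = 16.0;  // 16x16 tiles fill the 4096 map

    float Slice = min(floor(UVW.z * NumSlices), NumSlices - 1.0);
    float Row = floor(Slice / SlicesPerRow);
    float Col = Slice - Row * SlicesPerRow;

    // Squeeze the in-slice UV into one tile, then offset to that tile.
    float2 UV = (saturate(UVW.xy) + float2(Col, Row)) / SlicesPerRow;
    return Tex.SampleLevel(TexSampler, UV, 0).r;
}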

To output a float4 you could pack in the pixel depth offset, and for normals you could use two channels for just the normal offsets (since you already have the render plane normal). That leaves you an extra single channel for outputting whatever you want (a single channel colour mask, or a ramp that was stored in the texture).
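
Roughly like this (just a sketch of the packing idea, with placeholder names):

// Sketch of the packing described above: R = pixel depth offset,
// GB = the two normal offsets (Z comes from the render plane normal),
// A = a spare single-channel mask or ramp.
float4 PackResult(float DepthOffset, float2 NormalOffsetXY, float Mask)
{
    return float4(DepthOffset,
                  NormalOffsetXY.x * 0.5 + 0.5,  // remap -1..1 to 0..1
                  NormalOffsetXY.y * 0.5 + 0.5,
                  Mask);
}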

Brian riffing here. The first thing I would like to experiment with is baking out a distance field of a grass patch from Houdini and raytracing that on a little plane on the floor. That way you can have blades of grass that intersect each other.

Also, the normal could be precomputed in Houdini and stored in the texture as RGB with distance as A.

Yes, you could for sure. At some point you are re-creating lots of code that already exists for the global distance fields, but there is currently no way to raytrace only a single distance field in a material unless you define the distance field yourself (or use the Volumetric Decals, which I have now messed with and added some basic shapes and combiner functions for).

FWIW it's fairly cheap to compute the normal using a gradient lookup at runtime, so you don't necessarily need to prebake the normal. You just do a lookup with an offset in X, Y and Z, take the difference from the current sample, and normalize the result.
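
Something along these lines (a sketch; SampleField is a placeholder for whatever distance lookup you are using, and Eps is the lookup offset):

// Sketch of a forward-difference normal from a distance field:
// offset lookups in X, Y and Z, differenced against the current
// sample and normalized.
float3 FieldNormal(float3 P, float Eps)
{
    float Here = SampleField(P);
    float DX = SampleField(P + float3(Eps, 0, 0)) - Here;
    float DY = SampleField(P + float3(0, Eps, 0)) - Here;
    float DZ = SampleField(P + float3(0, 0, Eps)) - Here;
    return normalize(float3(DX, DY, DZ));
}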

All the ray tracing examples people are posting are really neat. It has been a great learning experience for me as well. The CAT scan data is a really cool use of it. I was testing some cloud rendering stuff that is kind of similar (as well as a ray tracer with subsurface and transmission), but it's all kind of in limbo right now as my various experiments are left in half-completed states.

I am thinking at some point of making kind of an animated example material that shows how the ray tracing stuff works for people who are still figuring it out.

If anyone else is on (or 's branch), it took me a little while to figure out how to set up the distance field volumetric decals:

Just create a material and set the Decal Blend Mode to Volumetric Decal. That will make Light Vector return a 0.0 to 1.0 volume coordinate that you can evaluate your distance field function at and feed into Opacity Mask.
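
For example, a sphere distance function for the Opacity Mask could look roughly like this (a sketch; I'm assuming values at or below zero count as inside, and Sphere is just a placeholder name):

// Sketch of a distance field function for the Opacity Mask input.
// LightVector is the 0-1 volume coordinate described above.
float Sphere(float3 LightVector)
{
    // Sphere of radius 0.4 centered in the decal volume
    return length(LightVector - 0.5) - 0.4;
}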

So am I right in thinking that the volumetric decals are an isosurface where you can feed any distance field into the opacity slot? Do you still apply it to a mesh?

It's not applied to a mesh but as a decal, and instead of projecting onto a surface, it uses the volume area of the decal to generate the data.

Yeah, the main thing I was wondering was how it defines the bounds. That sorts that out then. Does it raytrace the field? I'm guessing it does the pixel depth and normal under the hood, as there's nothing plugged into the material slots. If the distance primitives are implicit, is there a way to generate volumetric UV coordinates for volumetric decal texture mapping? Sphere, cube, plane should be simple, right? A distance field ground plane with noise would be a nice little floor heightmap detailing system.

I'm thinking of wrapping up my 2d volume slice texture atlas as a function and doing a tutorial on it. I just need to finish up the interpolation between Z slices. It would be nice to have a way to fill 3d textures on the GPU with data from the atlas; that would save on the fake uv code and the Z slice interpolation.
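
The slice interpolation itself is just two lookups and a lerp, roughly (a sketch; SampleSlice is a placeholder for a single-tile atlas read like the one sketched earlier):

// Sketch of interpolating between Z slices: read the slice below and
// above the current Z and blend by the fractional slice position.
float SampleVolume(float2 UV, float Z, float NumSlices)
{
    float SliceZ = clamp(Z * NumSlices - 0.5, 0.0, NumSlices - 1.0);
    float Below = SampleSlice(UV, floor(SliceZ));
    float Above = SampleSlice(UV, min(floor(SliceZ) + 1.0, NumSlices - 1.0));
    return lerp(Below, Above, frac(SliceZ));
}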

Heya, instead of using the world position for the start position of your ray, you can use the bounds0-1UVW node as an already normalized 3d texture coordinate. I haven't tried your code, but are you able to rotate the cube?

I meant to reply mentioning the same thing on your last post, where you were talking about getting the bounds of the backfacing geometry:

https://forums.unrealengine.com/showthread.php?49169-POM-material&p=315946&viewfull=1#post315946

I am not sure why you would ever need to define the bounds using a separate pass like that. Usually you know the bounds using the bounds_0-1_UVW node like you mentioned, and you know when you've exited the bounds since you increment the current position by a ray step that is divided to fit your bounds. So you just know when you are at the other side that way (the POM material is actually doing the same thing, but using the camera vector math). Maybe I misunderstood the original question, but I think you can get by without it.
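
In other words the march can just bail once it leaves the 0-1 box, roughly like this (a sketch; EntryUVW, RayDirUVW and NumSteps would be custom node inputs):

// Sketch of marching in bounds_0-1_UVW space: start at the entry
// coordinate and step along the bounds-scaled ray direction, bailing
// once any component leaves the 0-1 box.
float3 Pos = EntryUVW;               // from the bounds_0-1_UVW node
float3 Step = RayDirUVW / NumSteps;  // ray direction prescaled to bounds
float Result = 0;
for (int i = 0; i < NumSteps; i++)
{
    Pos += Step;
    if (any(Pos < 0.0) || any(Pos > 1.0))
        break;                       // reached the other side
    // ... sample the volume at Pos and accumulate into Result ...
}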

I have been trying this method for a video but it doesn't work. Can someone update it for 4.8 please? W9xj_iNF posted by anonymous | blueprintUE | PasteBin For Unreal Engine

At the time I couldn’t figure out the math to find out when I had exited the bounds. I ended up using a box intersection test in the HLSL shader to get my start and end positions.
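
A standard slab test against the 0-1 bounds looks roughly like this (a sketch, not necessarily the exact code used here):

// Sketch of a slab-test ray/box intersection against the 0-1 bounds,
// returning the entry and exit distances along the ray.
float2 IntersectBox01(float3 RayOrigin, float3 RayDir)
{
    float3 InvDir = 1.0 / RayDir;
    float3 T0 = (0.0 - RayOrigin) * InvDir;  // hit times for min planes
    float3 T1 = (1.0 - RayOrigin) * InvDir;  // hit times for max planes
    float3 TMin = min(T0, T1);
    float3 TMax = max(T0, T1);
    float Near = max(max(TMin.x, TMin.y), TMin.z);  // latest entry
    float Far = min(min(TMax.x, TMax.y), TMax.z);   // earliest exit
    return float2(Near, Far);  // no hit if Near > Far
}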

Hmm, while I have been following the thread and think that everything presented here is great, I do wonder what use cases there are for the volumetric decals?
Could somebody point out a few?

Greetings,
Dakraid

Wait wait wait… What's going on here and how do I do this!?

I started another thread to talk about the volumetric decals:

https://forums.unrealengine.com/showthread.php?75333-Experimentation-Volumetric-Decals&p=325739&viewfull=1#post325739

Hi, I was wondering what the best method would be to mix different POM materials.
I am trying to create a landscape material with different layers that I mix based on a splatmap.
The ideal way would be to mix the height, color and bump layers first and then apply the parallax mapping, but I think that isn't possible.
When I use a POM function for each layer, I have trouble mixing the resulting layers. How should I apply the splatmap mask to each layer?
When I mix the layers now, each layer ends up at a different height and I can't seem to match the heights of the layers with each other.

If you want multiple POM layers, you need to do the parallax for each layer using its own heightmap. Technically even the splat alpha would need parallax to be fully correct, but that is impossible for landscape layers. In general, as long as the base alpha is somewhat blurry, that part should not be noticeable.
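
i.e. something roughly like this per pair of layers (a sketch; ParallaxUV stands in for whatever POM function you use, and the texture and sampler names are placeholders):

// Sketch of mixing two landscape layers, each parallaxed with its own
// heightmap. The splat mask is sampled with the plain, un-offset UVs
// (hence keeping it blurry).
float2 UVA = ParallaxUV(BaseUV, HeightmapA, HeightSampler);
float2 UVB = ParallaxUV(BaseUV, HeightmapB, HeightSampler);
float Mask = SplatTex.Sample(SplatSampler, BaseUV).r;
float3 Color = lerp(ColorTexA.Sample(ColorSampler, UVA).rgb,
                    ColorTexB.Sample(ColorSampler, UVB).rgb,
                    Mask);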

The custom nodes on the left are simple “SampleGrad” nodes that fix some bad mips from sampling with parallax UVs.

The code for that node is:


// Sample with explicit gradients so mip selection ignores the parallax offset
return Tex.SampleGrad(TexSampler, UV, InDDX, InDDY);
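
If I read the setup right, the InDDX/InDDY inputs are just DDX and DDY nodes fed with the original, pre-parallax UVs, so mip selection behaves as if the UVs were never offset.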

So I have the shader working without the silhouette, but when I apply it to terrain it totally flips out.
Similarly, if I apply it to an object with pelted uv coordinates, it crashes the editor or may even crash the machine.

Is there any place that discusses the different rendering constraints for decals and terrain? When I apply my parallax shader to terrain it goes crazy.

I was able to put a POM material on terrain with no issues. Are you using UVs or WorldCoords for the UV input? If you are using WorldCoords, most likely what is happening is that the landscape coords are not in alignment with the worldposition axes… try swapping to UVs instead of worldposition, or try rotating your terrain 90 degrees.

The crash with pelt mapping is strange. I have no idea what might cause that. I have tested on a variety of meshes and never seen different behavior on different meshes. If you can send me the mesh (just upload it somewhere), I can try to reproduce it.

I don't seem to get it to work properly in 4.9 Preview. Does anyone maybe have an example material?