Distance Field Flow Map

I’d love to have that too.

Hey Roel!
Thanks for this nice work.
Unfortunately I couldn’t get this to work properly. This is all I’m getting:

I had to remove the last vector normalize from the DirectionToNearestSurface function, otherwise it wouldn't work. There seem to be some problems with DistanceFieldGradient: the water doesn't react to obstacles and it isn't producing any foam either. DistanceFieldGradient just outputs blue (the RG mask outputs zero), so it's not surprising that it isn't working as it should.

Did you guys enable 'Generate Mesh Distance Fields' in your project settings under Rendering? Also, use of the distance field nodes requires DX11 hardware (GeForce 4xx or better, Radeon 6xxx or better), and it only works on PC / PS4. Basically, the same requirements as for all the distance field features.

That's it, thanks. I forgot to read the 'The state of Distance Field GI in 4.8' thread.

This is simply amazing! I’m trying to recreate a scene from one of the cards of Magic Origins where there is a lot of water and using this technique might make the scene more organic!

Amazing, I need this in my life! ^^
Any hints as to how you warped the UVs? I've tried doing similar things but can't quite get it to work.

Yeah, I'd also love to see a tutorial on how you get the normals warping.

Also, in 4.10 I find my distance field sits pretty far out from the 3D meshes… how do I tighten it up closer to the objects?

OK, so I am getting back to this (see here: https://forums.unrealengine.com/showthread.php?108052-Automatic-flow-map-material-for-water). I'm currently tidying up the material, but the basic idea is this. I have a plane with UV coords, and I can sample V to get a direction vector for the main flow. I then do a 'tangent' operation between the input flow vector and the distance field gradient (which is effectively the normal to the surface, i.e. the direction to the nearest surface). This tangent is defined as cross(cross(A, B), B). You then subtract that vector (masked to RG to make it a 2D vector) from the texture UVs of the water. All it does is offset the UVs of the water texture based on that warp tangent.
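To make the math above concrete, here's a minimal CPU-side sketch of that tangent warp in plain Python. The function names and the `strength` scalar are my own for illustration; in the actual material this is a couple of cross-product and subtract nodes wired between the flow vector, the DistanceFieldGradient output, and the water texture UVs:

```python
import math

def normalize(v):
    """Scale v to unit length (a zero vector is returned unchanged)."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v) if length > 0 else v

def cross(a, b):
    """3D cross product."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def tangent_warp(flow, gradient):
    """The 'tangent' operation from the post: cross(cross(A, B), B).
    Removes the component of the flow pointing into the surface, so the
    warp runs along the obstacle described by the DF gradient."""
    return cross(cross(flow, gradient), gradient)

def warp_uv(uv, flow, gradient, strength=1.0):
    """Subtract the RG mask of the warp tangent from the water UVs."""
    t = tangent_warp(flow, gradient)
    return (uv[0] - t[0] * strength, uv[1] - t[1] * strength)
```

With a unit gradient, flow parallel to the gradient (straight into a surface) warps to zero, while flow along a surface is passed through (sign-flipped by the cross-product order, which the UV subtraction absorbs).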

The next step is that I've got a spline mesh river setup which I need to put this on, and then I'll work on the actual water surface quality itself, as this is just a placeholder to get the math right.

Here’s an update to my attempt at this…

Can't seem to view your video, @dokipen? The shots look superb though! Mind sharing how you've achieved this? I did try the original as shared by the OP but didn't get your results, or even close :cool:

Hi @Cheshire. The video was uploading. Should be good now to watch. Post #48 was the gist of it but I hope to write up a tutorial soon.

pretty cool!

A tip that might make similar experimentation a bit easier for others to reproduce: you can try the above technique with the FlowMaps node; you just need to change the input flow direction. Simply lerp between the tangent Y vector and the DF gradient, using a divided distance field lookup as the alpha (i.e. clamp(DistanceToNearestSurface / 100) gives a gradient going from 0-1 over the first 100 units from meshes). Some interesting things are possible for sure.
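A sketch of that blend, assuming the intent is that the flow follows the DF gradient right at a mesh and fades back to the plain tangent-Y flow by 100 units out (the lerp order may need flipping depending on your setup; `fade_distance` and the function names are illustrative, with `df_gradient` standing in for a DistanceFieldGradient lookup):

```python
def lerp(a, b, t):
    """Component-wise linear interpolation between vectors a and b."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def clamp01(x):
    """Clamp a scalar to the 0-1 range."""
    return max(0.0, min(1.0, x))

def blended_flow(tangent_y, df_gradient, distance, fade_distance=100.0):
    """Flow direction fed to the FlowMaps node: at distance 0 use the
    DF gradient (hug the obstacle), at fade_distance use the base flow."""
    t = clamp01(distance / fade_distance)
    return lerp(df_gradient, tangent_y, t)
```

Feeding this blended vector in as the flow direction is what lets the stock flowmap setup bend around obstacles without any other changes.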

I'm also wondering: could these distance nodes be used to create procedural dirt effects? Like, for example, two rocks intersecting and some moss/old stuff/wetness creeping up where they meet? Something similar to the AO mask you can bake out via Lightmass. If I'm totally out of it, just ignore this please :smiley:

In some cases, but the challenge there is that if a mesh is contributing to the distance fields, it always returns 0 for closest distance, since it is the closest thing to itself. That means if you have two rocks intersecting and you want a gradient, one of them needs to opt out of the distance fields, so you can never get two-way gradients with it.

Ah hmm, pity though. Would have been so nice. I’m a huge fan of these little tricks that end up saving so much time (aren’t we all :rolleyes:). Thanks for the education!

Would something like a “distance field channel” setting be doable :wink:

Hmm, I doubt it, since the lookups are done inside of the global distance field, which is a composite of all the DFs. Doing channels would essentially require multiple global distance fields to be constructed, and the global lookup is already a fairly high-overhead system, since it has to build the field three times at cascading resolutions for the clipmaps. That is why, when you add 'Get distance to nearest surface' to a material, you will see a pretty huge increase in shader instructions. Supposedly the rendering cost isn't quite as high as the instruction count suggests, since the branching means the GPU chooses between the three clipmaps rather than evaluating all of them, but it sees all of the instructions nonetheless.

Of course, maybe there is some way to add that as a new feature that looks up individual distance fields rather than the global one, but that would also be much more expensive.

Here's something I just came up with which might be a solution to that. You can sample the gradient, then do another sample of the gradient at a position offset by the first. Then you do a dot product on the two gradient vectors, and either remap (I made a material function for that) or clamp that to 0-1 to get a 'closeness mask'. You can multiply the first gradient to get a larger 'sample width', which will look further away for the next gradient. It's not a true distance value; it's more like an ambient occlusion against other things. You can actually calculate the curvature of all objects globally (the Laplacian, which is the divergence of the gradient) to do a global curvature/ambient occlusion mask.
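A small sketch of that two-sample trick, assuming my reading of it: in open space the two gradients agree (dot ≈ 1), while between two facing surfaces they oppose each other (dot → -1), so remapping and inverting gives a mask that lights up where geometry is close. Here `gradient_at` stands in for a DistanceFieldGradient lookup, and the names are mine:

```python
def closeness_mask(gradient_at, p, sample_width=1.0):
    """Sample the DF gradient at p, step along it by sample_width,
    sample the gradient again, and dot the two vectors.
    Facing surfaces give opposing gradients, so the dot drops toward -1."""
    g1 = gradient_at(p)
    p2 = tuple(c + g * sample_width for c, g in zip(p, g1))
    g2 = gradient_at(p2)
    d = sum(a * b for a, b in zip(g1, g2))
    # remap the dot from [-1, 1] to [0, 1], then invert so 1 = close
    return 1.0 - max(0.0, min(1.0, (d + 1.0) * 0.5))
```

Scaling `sample_width` up is the 'larger sample width' idea from the post: the second sample lands further away, so broader features show up in the mask.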

Excuse the monitor cameraphone shots…

final mask…

dot product result…

my working out…

I should add that this method (doing a dot product) is almost the same as the divergence operation. The standard way of calculating these things is just to offset along the x, y, and z axes to get the sample offsets. The difference in what I've done is that I am using the gradient itself as the sample offset.

Here is an example of the 'curvature' Laplacian operation used to pick out the general nooks and crannies…

small sample offset width = smaller details

larger sample offset width = bigger features.

Remember to clamp those before using them as well, because there are negative values, which are the 'open' areas. You could reverse those values to get an 'exposed' mask.
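For reference, here's a small sketch of that standard axis-offset Laplacian on a scalar field, with the clamp applied so the negative 'open' areas drop to zero. `field` stands in for a distance-field lookup and the names are illustrative:

```python
def laplacian(field, p, h=1.0):
    """Finite-difference Laplacian of a scalar field at point p:
    the sum of second differences along the x, y, and z axes.
    h is the sample offset width (small h = small details)."""
    total = 0.0
    for axis in range(3):
        plus = list(p); plus[axis] += h
        minus = list(p); minus[axis] -= h
        total += field(tuple(plus)) + field(tuple(minus)) - 2.0 * field(p)
    return total / (h * h)

def curvature_mask(field, p, h=1.0):
    """Clamp to 0-1 so only concave 'nooks' survive; negating the
    Laplacian before the clamp would give the 'exposed' mask instead."""
    return max(0.0, min(1.0, laplacian(field, p, h)))
```

Swapping `h` is exactly the sample-offset-width knob described above: a small offset picks up fine crevices, a large one broad cavities.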

That looks awesome! So unfair that you can do this :frowning: