Feeding material with world position offset data

Looking for a solution to achieve the following effect:

In motion: [animation no longer available]

I’d need multiple spheres displacing a single plane mesh underneath in real time, think gravity well:

  • The size of the plane and the spheres (and their number) is dynamic and will affect the strength of the displacement
  • The distortions will overlap but will not be cumulative on the Z axis (ideally, but not really a must have)
  • The effect is purely visual, no collision, shadowcasting or additional player interaction is necessary
  • Does not need to be particularly performant
  • Does not need to be very accurate
  • I was hoping to achieve it via materials

I’ve been experimenting with *WorldPositionOffset* and SphereGradient-2D, and it’s easy enough to pull it off for a single instance.
Multiple displacement effects are, of course, doable if I simply duplicate the nodes and apply unique coordinates, but that’s cumbersome,
and the final number of displacements is not fixed. And it does not feel like the right way to do it anyway.

How do I send a bunch of coordinates to the material in real time? Is a dynamic texture implemented through C++ the only way forward?

Out of the box ideas are welcome, too!
Thanks

To get the sphere location and radius into the material you can use Material Parameter Collections (see: Material Parameter Collections | Unreal Engine Documentation).

With this approach you will have to do the deformation for each sphere, though. You could also look into using SphereMasks for the deformation so the masks work in 3D space.
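For reference, the per-sphere SphereMask-style deformation can live in a Custom node. A minimal sketch, assuming SphereLoc, Radius and Depth are node inputs fed from CollectionParameter nodes and WorldPos is Absolute World Position wired into the node:

    // One sphere's contribution to World Position Offset.
    // SphereLoc / Radius / Depth are assumed Custom node inputs
    // fed from the parameter collection; WorldPos is Absolute
    // World Position.
    float fall = saturate(distance(SphereLoc, WorldPos) / Radius); // 0 at the center, 1 at the rim
    return lerp(float3(0, 0, -Depth), float3(0, 0, 0), fall);      // full push at the center, none outside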

Another approach could be to write a displacement texture via Blueprint, render it to a render target, and use that for the deformation.

I have no issues with location and radius, my beef is sending information from multiple actors to the same material.

I’ll be dealing with 30 actors at most. So you’re suggesting a MaterialParamCollection with 30 location vectors, 30 radius scalars and 30 whichever-param-I-need and looping through the objects while setting the collection’s parameters accordingly. OK, I can see it work, as awkward as it sounds.

I used a 3D sphere mask previously but ran into the same issues; I can look into it again, though.

That sounds promising, I’ll do some extra digging, thank you.

If anyone has any other ideas, please do let me know.

Another idea is that you could have a scene capture actor set to Orthographic and set it to a top-down angle. Then in your spheres, you could use the vertex shader to make the spheres larger if the camera is the top-down camera. You can just do a dot of the camera vector with the known top-down vector of (0,0,-1) to see if it’s true. Then you can use the depth of this expanded sphere texture on your ground to displace things.

This is how I solve certain billboard related shadowing problems when distance fields are not an option.
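In Custom node HLSL, the sphere’s vertex-shader part could look roughly like this (a sketch; CamDir is assumed to be the capture camera’s forward vector wired in as an input, VertexNormal the usual vertex normal input, and InflateAmount an assumed scalar parameter):

    // World Position Offset for the sphere mesh: inflate along the
    // vertex normals only when the view looks straight down, i.e.
    // only for the top-down ortho capture.
    float isTopDown = step(0.99, dot(CamDir, float3(0, 0, -1))); // ~1 only for the top-down view
    return VertexNormal * InflateAmount * isTopDown;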

Yeah, a bunch of separate parameters will kill your performance. It’s faster to write the locations to a texture and loop through it in the material. This should get you started; it performs decently with 100 spheres:

This material doesn’t include a variable radius, but you should be able to write that to the alpha channel of your texture. You can probably find a better offset function than I used.

[screenshot of the material setup, no longer available]


  // Custom node body: Tex is a 100x1 data texture holding one sphere
  // location per texel; wpos is Absolute World Position wired in.
  float3 offset = float3(0, 0, 0);
  const int numstep = 100;

  for (int i = 0; i < numstep; i++)
  {
      // Sample the i-th texel center: texel width is 1/100 = 0.01,
      // so u = i * 0.01 plus half a texel (0.005).
      float3 sphereloc = Texture2DSampleLevel(Tex, TexSampler, float2(float(i) * 0.01 + 0.005, 0.5), 0.0).xyz;
      // Push down by up to 500 units, fading out over a 500-unit radius.
      offset += lerp(float3(0, 0, -500), float3(0, 0, 0), saturate(distance(sphereloc, wpos) / 500));
  }
  return offset;




Create a CanvasRenderTarget2D blueprint.

[screenshot of the CanvasRenderTarget2D event graph, no longer available]

This actor creates a bunch of spheres above the plane and adds their locations to an array. It creates an instance of the CanvasRenderTarget2D and passes the location array to it. It creates a dynamic material instance, uses the CRT2D as a texture parameter, and sets the material on the plane.

[screenshot of the sphere-spawner actor Blueprint, no longer available]

Here’s what it looks like:


Hello, the 1st, 2nd and 3rd screenshots aren’t accessible for me for some reason (the server says access denied). Could you please reupload them if you still have that solution? Or are there better ways nowadays, like Runtime Virtual Textures?

You can thank the eggheads behind the forum update for the missing stuff. Let it rip in the feedback section. Who knows, maybe after a total of 300 posts they’ll do something about it.

As far as your issue goes: just use a render target.

A SceneCapture2D can capture the spheres and write the scene depth into a texture.
You can then use that texture to displace the landscape.

Make sure you only capture the spheres, as you won’t really need much else.
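The ground material then maps its world position into the capture and pushes vertices down where the captured depth is close. A sketch, assuming CaptureOrigin, OrthoWidth, MaxDepth and Displacement are parameters matching the SceneCapture2D setup, and DepthRT is its render target wired in as a Texture Object:

    // Turn the captured scene depth into a downward offset.
    // WorldPos is Absolute World Position; all other names are
    // assumed parameters matching the capture actor's settings.
    float2 uv = (WorldPos.xy - CaptureOrigin.xy) / OrthoWidth + 0.5;      // world XY -> capture UV
    float depth = Texture2DSampleLevel(DepthRT, DepthRTSampler, uv, 0).r; // captured depth
    float mask = 1.0 - saturate(depth / MaxDepth);                        // nearer sphere = stronger push
    return float3(0, 0, -mask * Displacement);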


Hi,
The data I need to pass to the material (and use in a custom expression) is an array of structs with 7 floats each. I’m putting it in a Texture2D and doing WavesData.mips[0][pos][jTex] to access the data, but it doesn’t work properly.
Should I use a RenderTarget? What’s the difference? Is it something to do with compression, or with how I access the data in HLSL?

Passing data and displacing around spheres are completely different matters.

To pass data you can pack out-of-range values into a texture’s pixels. You can check how Pivot Painter does it and derive something similar.

In a standard material you can’t do loops, access arrays, or do anything really complex that would require coding.

Using a Custom node, you can do for loops. However, there is no real way to pass in a variable amount of data; you are limited to the texture again, or to a maximum number of variables from a material parameter collection.
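The usual workaround for a variable amount is a fixed maximum plus a live count. A sketch, assuming NumSpheres is a scalar parameter updated from Blueprint and Tex is a data texture like the one in the earlier snippet:

    // Loop to a fixed upper bound and skip the dead entries.
    const int MAX_SPHERES = 30;
    float3 offset = float3(0, 0, 0);
    for (int i = 0; i < MAX_SPHERES; i++)
    {
        if (i >= (int)NumSpheres) break; // only live entries are processed
        float3 loc = Texture2DSampleLevel(Tex, TexSampler, float2((float(i) + 0.5) / MAX_SPHERES, 0.5), 0.0).xyz;
        offset += lerp(float3(0, 0, -500), float3(0, 0, 0), saturate(distance(loc, WorldPos) / 500));
    }
    return offset;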

If all you need is to displace around objects, a render target is a much cleaner overall solution.

No, I’m not displacing. I need the data.
I have a custom node with working HLSL because I need the loops, so that’s not the issue.
The data array I pass does not have a fixed number of elements, so I need to put it in a buffer (which would be a texture here).
So I should look at Pivot Painter? I don’t mind using a texture, that’s what I am doing right now. But something is wrong with the configuration, because the data I read in the HLSL code is not the data I put into the mipmap.

Not sure what you are actually doing.

If you need to use a texture, analyze how Pivot Painter works.
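If the values coming back don’t match what was written, the usual suspects are compression, sRGB and filtering rather than the indexing itself. A minimal exact-fetch sketch (the elem/field layout — one struct per column, one float4 of fields per row — is an assumption):

    // Exact texel fetch in a Custom node; no sampler, no filtering.
    // WavesData is the Texture Object input from the question.
    // The asset must be uncompressed (e.g. the HDR compression
    // setting), sRGB off, Filter = Nearest and without generated
    // mips, or the stored floats won't round-trip.
    float4 texel = WavesData.Load(int3(elem, field, 0)); // int3(x, y, mip)
    return texel;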