It would be nicer/more flexible if one could use a material for this. Being able to programmatically define a deformation pattern via a shader is more flexible, and if you're using similar maths/materials on other things, it helps create meshes that blend more easily with them. E.g. I can use the same maths to create meshes that have the same procedurally-mixed shapes my landscape spits out, vs a tiling texture.
Many games use a set of meshes pasted around/overlapping with one another, but they all have a "built" feel. Elden Ring and Metal Gear Solid V: The Phantom Pain are good examples of this. Being able to use procedural means to define things can help mix this up (aesthetic appeal).
BTW - the tooltip says first channel but the dialog actually lets you pick any channel…
The challenge is that deformation happens in the vertex shader, but the material evaluation happens in pixel shaders.
What the GPU would have to do is render the entire material to a texture, and then sample that texture in the vertex shader. If that's what you want to do, you can set that up yourself with a render target as texture.
(I understand that while "modeling" manually in the editor, maybe some of these constraints could be worked around, but that's the basic technical limitation)
You are already using materials on the landscape which can do this, but it doesn't alter the original mesh data (collision, vertex count, etc.). What you are looking for is probably not a shader effect but a more advanced sculpting tool that lets you edit the mesh, or automates that for you (say, automated water erosion on rocks if condition A or condition B).
If you go the procedural route you might skip the landscape editor for the most part. The diamond-square algorithm works well for large terrains with lakes and mountains.
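For reference, here's a minimal standalone sketch of diamond-square; the corner seeds, roughness, and falloff factor are arbitrary illustration values, not anything the landscape system prescribes:

```cpp
#include <cassert>
#include <cmath>
#include <random>
#include <vector>

// Generates a (2^n + 1)-sized square heightfield with the diamond-square
// algorithm. Corners are seeded flat; jitter shrinks by half each pass.
std::vector<float> DiamondSquare(int Size, float Roughness, unsigned Seed)
{
    assert(((Size - 1) & (Size - 2)) == 0 && "Size must be 2^n + 1");
    std::vector<float> H(Size * Size, 0.0f);
    std::mt19937 Rng(Seed);
    std::uniform_real_distribution<float> Dist(-1.0f, 1.0f);
    auto At = [&](int X, int Y) -> float& { return H[Y * Size + X]; };

    float Scale = Roughness;
    for (int Step = Size - 1; Step > 1; Step /= 2, Scale *= 0.5f)
    {
        const int Half = Step / 2;

        // Diamond step: center of each square = average of its 4 corners + jitter.
        for (int Y = Half; Y < Size; Y += Step)
            for (int X = Half; X < Size; X += Step)
                At(X, Y) = 0.25f * (At(X - Half, Y - Half) + At(X + Half, Y - Half) +
                                    At(X - Half, Y + Half) + At(X + Half, Y + Half)) +
                           Dist(Rng) * Scale;

        // Square step: center of each diamond = average of up to 4 neighbours + jitter.
        for (int Y = 0; Y < Size; Y += Half)
            for (int X = ((Y / Half) % 2 == 0) ? Half : 0; X < Size; X += Step)
            {
                float Sum = 0.0f; int Count = 0;
                if (X - Half >= 0)   { Sum += At(X - Half, Y); ++Count; }
                if (X + Half < Size) { Sum += At(X + Half, Y); ++Count; }
                if (Y - Half >= 0)   { Sum += At(X, Y - Half); ++Count; }
                if (Y + Half < Size) { Sum += At(X, Y + Half); ++Count; }
                At(X, Y) = Sum / Count + Dist(Rng) * Scale;
            }
    }
    return H;
}
```

The returned heights would then feed whatever mesh-building step you use (so you get real vertices, and therefore collision).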
Those are the two algorithms you need (both also solved in the posts). There are more algorithms, of course, that generate things the landscape system might not even support; I believe that is just a flat plane? I use marching algorithms for most things (caves, indoor areas, some organic shapes).
Collision is exactly why I am here. I have a great sampling algorithm that produces nicely mixed-up results on the landscape/heightfield mesh, but as noted, no collision.
Hence the need to create some kind of actual mesh I can tie collision to. No biggie.
When I go to create a mesh, I can use a texture or noise but not both. Any mesh I create with noise won't match the patterns/textures I am using, so it won't blend. If I use a texture, it tiles, just simply tiles, which is precisely why I went the shader route on the landscape…
Conceptually, I need to be able to use that texture deformation, but like with a texture bomb, where the texture is the basis of the noise. As it is right now, I can only get a regularly-repeating type of mesh, as the texture simply tiles.
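Texture bombing in that sense can be sketched outside the material editor too: hash each grid cell to its own UV offset, so the same tileable height lookup stops visibly repeating. The hash constants here are arbitrary, and `SampleHeight` is a stand-in for a wrapped texture fetch, not a real API:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Cheap integer hash -> [0,1). Constants are arbitrary illustration values.
float Hash01(int32_t X, int32_t Y, uint32_t Salt)
{
    uint32_t H = (uint32_t)X * 374761393u + (uint32_t)Y * 668265263u + Salt;
    H = (H ^ (H >> 13)) * 1274126177u;
    return (H ^ (H >> 16)) / 4294967296.0f;
}

// "Texture bomb" a height lookup: each integer grid cell gets its own random
// UV offset, so a tileable height function no longer repeats cell to cell.
float BombedHeight(float U, float V, float (*SampleHeight)(float, float))
{
    const int32_t CellX = (int32_t)std::floor(U);
    const int32_t CellY = (int32_t)std::floor(V);
    const float OffU = Hash01(CellX, CellY, 1u);
    const float OffV = Hash01(CellX, CellY, 2u);
    // Wrap the offset coordinates back into [0,1) before sampling.
    const float SU = (U + OffU) - std::floor(U + OffU);
    const float SV = (V + OffV) - std::floor(V + OffV);
    return SampleHeight(SU, SV);
}
```

Run the same function over your vertex grid at mesh-build time and the mesh deformation matches what the equivalent shader maths would give, which is the blending you're after.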
I get that the specific use case I am bringing to the table is mine-all-mine, but more generally, the ask is to be able to procedurally deform a mesh, period. And if not with a material, then with what?
Thanks for the links. IIRC diamond-square is being used as the in-engine water solution, no?
I'm thinking you could create a C++ class that processes a tileable texture into a heightmap of vertices, before you even create the mesh. You will get collision and can configure anything the way you like. Patterns/textures only look stretched or deformed if you deform the mesh after it has been UV-mapped. You can also set up a shader that maps a texture by XYZ location on the mesh instead of by UV; then it should never deform at all.
// Map of point (vertex) ID to point location (from a texture heightmap, or your landscape data plus tiled-texture offsets):
TMap<int32, FVector> Points = YourTextureProcessor::Process(YourTexture);
// The Delaunay implementation I linked earlier:
FDelaunay Delaunay = FDelaunay();
// Array of triangles, each an FIntVector of three linked point IDs:
const TArray<FIntVector> Triangles = Delaunay.Triangulate2D(Points);
// Then feed the triangle data into a UProceduralMeshComponent, or perhaps somehow into the landscape itself.
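If the points come from a regular grid (as with a processed tileable texture), you don't even need a Delaunay pass, since the connectivity is known up front. A standalone sketch of that step, with plain structs standing in for FVector/FIntVector:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

struct Vec3 { float X, Y, Z; };
struct Tri  { int32_t A, B, C; };

// Build a W x H vertex grid from any height function and triangulate it:
// two triangles per cell, wound consistently.
void BuildGridMesh(int W, int H, float (*Height)(int, int),
                   std::vector<Vec3>& OutVerts, std::vector<Tri>& OutTris)
{
    OutVerts.clear();
    OutTris.clear();
    for (int Y = 0; Y < H; ++Y)
        for (int X = 0; X < W; ++X)
            OutVerts.push_back({(float)X, (float)Y, Height(X, Y)});

    for (int Y = 0; Y + 1 < H; ++Y)
        for (int X = 0; X + 1 < W; ++X)
        {
            const int32_t I = Y * W + X;
            OutTris.push_back({I, I + W, I + 1});          // lower-left triangle
            OutTris.push_back({I + 1, I + W, I + W + 1});  // upper-right triangle
        }
}
```

The vertex and index arrays map straight onto what UProceduralMeshComponent expects, and building real geometry this way is what gets you collision.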
Don't know about that. For some basic good-looking waves, people used to animate multiple noise algorithms on top of each other; don't know what they use these days.
I've done that; I usually use panning normal maps to WPO a flat, tessellated mesh. Use large-scaled ones for big waves, and move smaller waves through them, faster. Looks quite nice from a distance, and the normals help add a lot of detail lighting.
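That layering can be sketched as a plain height function. The frequencies, directions, speeds, and amplitudes below are made-up illustration values standing in for the material's panner inputs:

```cpp
#include <cassert>
#include <cmath>

// Two layers panning at different scales and speeds, summed, as a height
// function of position and time. In a material these would be panner nodes
// driving WPO / normal maps rather than plain sines.
float WaveHeight(float X, float Y, float Time)
{
    // Large, slow swell.
    const float Big = 1.0f * std::sin(0.05f * X + 0.03f * Y - 0.40f * Time);
    // Smaller, faster chop moving through the swell in another direction.
    const float Small = 0.25f * std::sin(0.30f * X - 0.22f * Y - 1.70f * Time);
    return Big + Small;
}
```

Real implementations usually swap the sines for noise or Gerstner-style waves, but the structure (big slow layer plus small fast layer) is the same.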
Collision can always be added after the fact in the static-mesh editor. For me, it's most important that I can procedurally deform a mesh without having to go into another app. If we can use a texture, then a shader would IMHO be seen as more an extension of that than something novel…
I COULD try having the shader hit a render target and just use that, but I'd rather be able to use a world-aligned texture on the thing and just drag the edited mesh about in the world ("yeah, it looks funky enough right there, bam!"), instantiate the mesh, and then go from there.