Create a material with a double layer and an in-game editable mask

Hi, the effect I’m trying to achieve is some sort of unpainted/painted effect.
What I thought of was using a multi-layer material with a mask; the issue is that the mask has to be editable live, as painting should be a mechanic of the game itself.

Is there any way to update a texture at runtime to edit the material?
I also thought I could use the vertex color instead of a texture, but I couldn’t find any way of changing it at runtime.

Do you guys have any suggestions on how to achieve this effect/material?
Thanks!

In 4.13 there are both a trace that returns mesh UVs and Blueprint render-target utilities, so you could use those to paint on a chalkboard, for example, as a game mechanic.

Yep, I read the changelog today and downloaded 4.13 immediately. It will take a while to figure out how the new features work, but at least I have something to work on now!

I would not suggest using the ray trace to get the UV for painting, since it will not handle painting across UV seams. Instead, the approach you should take is to prepare an unwrapped layout of your paint mesh using a unique lightmap-style UV layout, and render either local or world positions into this texture. Then, for each brush-stroke application, you perform a spheremask against the positional texture data, and it will paint nicely across seams no matter how the mesh is unwrapped.
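The spheremask-against-position-texture idea above can be sketched in plain Python (a minimal sketch: the falloff formula is a common approximation, not the exact math of the material node):

```python
import math

def sphere_mask(point, brush_center, radius, hardness):
    """Approximate spheremask: 1 at the brush center, falling to 0
    at the radius. hardness in [0, 1) controls edge sharpness."""
    dist = math.dist(point, brush_center)
    falloff = max(radius * (1.0 - hardness), 1e-6)
    return max(0.0, min(1.0, (radius - dist) / falloff))

def apply_stroke(mask, position_texture, brush_center, radius, hardness):
    """Accumulate one brush stroke into the mask texture.

    Each texel of position_texture stores the mesh position that texel
    was unwrapped from, so texels that are far apart in UV space but
    close together on the mesh surface get painted together -- which is
    why seams don't matter with this approach."""
    for y, row in enumerate(position_texture):
        for x, pos in enumerate(row):
            mask[y][x] = max(mask[y][x],
                             sphere_mask(pos, brush_center, radius, hardness))
    return mask
```

For example, a texel whose stored position sits at the brush center gets fully painted, while one storing a far-away surface position stays untouched, regardless of where either texel sits in the UV layout.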

Thanks a lot for the suggestion; your solution would be the best, as I would have to paint the object all around.
Could you tell me if this process is what you mean? I think I misunderstood something about rendering the world position into the texture.

- Raycast on the object and get the world position of the “stroke”
- Perform a spheremask from that world point
- Draw the spheremask onto the texture

The UV map is not an issue; I’m just trying to understand the various steps to solve the problem as close to “code level” as possible, so I can focus on solving them one by one while learning.
Thanks again!

This is actually pretty simple. The unwrap is just a vertex shader using the lightmap UVs. You can visualize it by animating a parameter to disable the unwrap:

unwrapanim.gif
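The unwrap itself is just a World Position Offset that moves each vertex from its world position toward a point derived from its lightmap UV, blended by an animatable parameter. A sketch of that vertex math (the plane placement and scale here are assumptions for illustration):

```python
def lerp(a, b, t):
    """Component-wise linear interpolation between two 3D points."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def unwrap_offset(world_pos, lightmap_uv, plane_origin, plane_size, unwrap_amount):
    """World Position Offset for the unwrap.

    The target position lies on a flat plane placed at plane_origin,
    with the 0-1 lightmap UV mapped across plane_size units.
    unwrap_amount = 0 leaves the mesh intact; 1 flattens it fully."""
    u, v = lightmap_uv
    target = (plane_origin[0] + u * plane_size,
              plane_origin[1] + v * plane_size,
              plane_origin[2])
    flattened = lerp(world_pos, target, unwrap_amount)
    # WPO is an offset added to the vertex, not an absolute position
    return tuple(f - w for f, w in zip(flattened, world_pos))
```

Animating `unwrap_amount` between 0 and 1 gives exactly the morph shown in the gif.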

Then for the emissive color you can use either world position or local position. World position is easier, since you won’t have to transform your brush location, but it requires an HDR texture format and takes more memory. In this example I used local 0-1 UVW, which can be stored in regular texture formats; you should be able to transform into local space by doing an “Inverse Transform Location” in Blueprint. Make a simple test case first to see if they match up. Maybe just start with world position instead of local, since it’s easier:

To save the unwrap you can just place a SceneCapture2D, assign a render target, and tell it to capture emissive in RGB. Then go to the render target in the content browser, right-click, and select “Create Static Texture”. Done.

Now to see why this works, here is a spheremask applied to the local positions while unwrapping the mesh:

unwrapanim2.gif

First of all, I really want to thank you for the effort you put into helping the community; I didn’t expect an answer that exhaustive, and it really helped me.
Is there a way to render the unwrapped material without a SceneCapture2D? I’m trying to do something very similar to the heightfield content example: I have a dynamic material instance that acts as the brush (the material I use for the spheremask), and another MID that renders the surface of the material based on the brush mask. The issue is not how to pass the texture (I managed to do that through a render target); it’s how to get the brush mask rendered according to the UV unwrap without using a scene capture.

Not that I am aware of. You could also do it in 3ds Max or something, if that is easier to make part of your workflow.