Training Stream - Blueprint Drawing to Render Targets Overview - Sept 20 - Live from Epic HQ

Is it possible to use this concept on a landscape?

Here’s an update to my previous video, showing how to paint deformations in local space and convert them into world space to feed into the World Position Offset.

I think I know how you got that negative value. Is it possible that one of the ricochet hits is read as negative since it didn’t come from the gun’s direction? For example: the ball falls down after hitting the wall and its back side hits the cube, causing a negative hit.

Perhaps that’s the cause?

Any ideas on how to get the material to lerp back to its original state over time?

Here is a view of my whiteboard using the HTC Vive:

Can someone explain how exactly we are drawing on the Render Target? In the stream they only say: “yeah, here is the SplatMaterial and you can use every other brush you want”. Nice jokes about drawing unicorns and stuff, but it feels like you also have no idea how exactly to accomplish this. The SplatMaterial especially was poorly covered. Can we get a full explanation? Saying “we raycast on the plane, get the UV coordinates and paint on it” doesn’t help much. What is all the math involved in the SplatMaterial, and how do you need to change the material to make those lovely unicorns you were talking about? In the stream you were only reading off the nodes, no explanation at all, only stuff like “basically, we need this to do stuff”. <3

All you have to do is change the Texture that is in the Splat Material and you will have a new effect when you use it.

See, that’s what I mean. I don’t think you understand what the Splat Material is doing, because otherwise you would know that there is no texture you can easily change. We give the hit location to the material, offset the texture coordinates by it, and then draw a circle around this location. But the math behind it is what makes my head fuzzy, and yours obviously too. The only thing you said is where we could change the ForceStrength or the size of the brush, but all the other important stuff is completely missing.

Please see post below for better information.

It’s not like the emissive output of the material is a brush like you know from Photoshop. Your adjusted material, for example, acts like a global stencil now; it’s not like you press anywhere on the texture and get your square shape. To make this work like a normal brush you need way more than just multiplying by a texture, and it’s not only “adjust the texture”, because this is way more tricky. The material is not as easy as you think, and your explanations are just reading off the nodes. You don’t say anything about why we need the vector length or why we subtract this length from 1. All you state are the obvious adjustable parameters. In the stream it felt like you wanted to explain why we are using this math, but hesitated, just said what it does, and skipped all the math logic. If you reply with “I’m not here to teach you math”, then you miss the point of teaching people how things work. In the end, I want to see you build this material from scratch without looking at the solution. And no, this is not specific to my needs; it’s just a plain simple brush we all know from any other paint application. If you can do this, I’ll take everything back <3.
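For reference, the circular-brush math being argued about here (offset the UVs by the hit location, take the vector length, subtract it from 1) can be sketched outside the material graph. This is a minimal illustration of the idea, not the actual node names from the stream; `brush_size` and `force_strength` are illustrative parameters.

```python
def splat_brush(pixel_uv, hit_uv, brush_size, force_strength):
    """Brush intensity at pixel_uv for a hit at hit_uv.

    Offsetting the texture coordinates by the hit location centers the
    brush on the hit; the vector length is the distance from that center;
    subtracting the normalized length from 1 turns "at the center" into 1
    and "far away" into 0, giving a soft circular falloff.
    """
    dx = pixel_uv[0] - hit_uv[0]   # TexCoord - HitLocation (the UV offset)
    dy = pixel_uv[1] - hit_uv[1]
    dist = (dx * dx + dy * dy) ** 0.5            # vector length
    falloff = max(0.0, 1.0 - dist / brush_size)  # 1 - length, clamped at 0
    return falloff * force_strength
```

Evaluated per pixel, this is exactly why the vector length is needed: it is the radial distance that makes the brush round instead of a global stencil.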

Hey Ninjin,

I made a blueprint function, which should be in 4.13, that makes it easier to control brush size and optimizes RTs by only drawing to a specified area. It is called “Set Canvas Material Scale and Position”. It will scale the RT material to fit the desired size. Here is how to use it:

CanvasSize: hook up the “Size” pin from “Begin Draw Canvas to Render Target”.
Position: the 0-1 position for the center of the material. 0.5 would be the center of the RT.
Scale: if it’s set to 1, it will fit the material to the whole render target. If it’s 0.1, it will take up 10%.

If you set it up this way, you don’t need a position at all inside of your brush material, and then you can use a simple texture. The position would just be decided in the BP and used to position the material itself.

You can also do all of this in the material as well, but you will be paying to render all pixels even if your mask ends up being small, and you will have to use UV math to control the positioning of the textures. I also have some examples of how to do that, but I suggest people try the BP macro mentioned above first, since it’s a bit easier.

(Edit: I just looked at the function in 4.13, and for some reason it was marked as non-pure. It shouldn’t affect anything, but just know that it was meant to be a pure function without execution pins. This is a known engine bug.)
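The bookkeeping that a scale/position helper like this performs can be sketched as plain math: map a 0-1 center position and a scale factor to the rectangle the material quad should cover on the render target. A minimal sketch, with illustrative names; this is the idea, not the engine’s actual implementation.

```python
def canvas_rect(canvas_size, position, scale):
    """Rectangle (x, y, w, h) for a scaled material quad on the canvas.

    canvas_size: (w, h), e.g. the Size pin from Begin Draw Canvas to Render Target.
    position:    0-1 center of the material; (0.5, 0.5) is the center of the RT.
    scale:       1.0 fills the whole RT, 0.1 covers 10% of each dimension.
    """
    w, h = canvas_size
    rect_w, rect_h = w * scale, h * scale
    x = position[0] * w - rect_w / 2.0  # top-left corner so the quad is centered
    y = position[1] * h - rect_h / 2.0
    return (x, y, rect_w, rect_h)
```

Drawing only this rectangle instead of the full canvas is where the optimization comes from: pixels outside the brush quad are never rendered.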

I realized that I was going about this in the wrong manner after talking with B. Here is a better way to use a Texture instead of the Proc brush that I was using in the example.

Here is what the results of using the above Material look like. Do not forget to set your brush Textures to clamped.

In your Heightfield Painter Blueprint, make sure to increase the Brush Size variable. For the above image, I used a setting of 5.0.

How did you manage to do this? D: ! I’m struggling on my end, trying to apply the idea to a format that doesn’t conform to a heightmap.

I’m having the issue of the size not getting, well… smaller. I’ve made the ForceSize as small as I could, but no luck. But that’s only part of the problem.

When trying to paint, the paint only goes in one area. The mesh I have is unwrapped, and when I debug it, it stays at x 0 and y 0.

Here’s the paintbrush Material

http://imgur.com/X12QnGM

Trace Event

http://imgur.com/VUOOtbM

and the Drawing to Render target event

http://imgur.com/NHW3S2R

EDIT:

I’ve done a test to see if it’s even retrieving UV information… seems like it isn’t D: . It’s returning false.

http://imgur.com/fBCyWU6

Checking Trace Complex seemed to work for returning the coordinates… however, it’s still just painting at 0,0 and still not adjusting size or position.

http://imgur.com/LQ0cDub

It looks like this could be happening because getting texture coordinates from line traces is not enabled in your project settings. Make sure you have that enabled by doing the following.

Under Project Settings > Engine > Physics > Optimization, enable “Support UV From Hit Results”.
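The same checkbox can also be set directly in your project’s `Config/DefaultEngine.ini`:

```ini
[/Script/Engine.PhysicsSettings]
bSupportUVFromHitResults=True
```

Note this stores extra cooked physics data, so it costs some memory; it is off by default for that reason.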

Hi!
I have a question about comparing an RT against a template texture mask, to get an event when the RT is filled. Maybe someone has faced this and has a solution.

For example, I give some texture mask to the player as a hint, and I want to know whether the player has filled this image as expected.
I can’t see any way to compare textures in BP. It’s possible to do in a material, but materials don’t provide events.
So the only solution I can see here is to place an array of collision primitives matching the hint mask along the drawable surface, and check whether each primitive has been hit at least once; if so, the image is filled correctly.
But it’s a little bit complicated…

Maybe you can advise?
Thanks!
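If render-target pixel readback is available on your platform (e.g. a ReadRenderTarget-style utility), the comparison itself reduces to counting how many of the template’s marked pixels have been painted. A minimal sketch of that check, assuming both images have been read back and binarized into equal-length coverage lists; the `threshold` name is illustrative.

```python
def mask_filled(painted, template, threshold=0.95):
    """True when at least `threshold` of the template's marked pixels are painted.

    painted, template: equal-length lists of 0/1 coverage values,
    e.g. thresholded pixel reads from the RT and the hint mask.
    """
    marked = [p for p, t in zip(painted, template) if t]
    if not marked:
        return True  # empty template: nothing to fill
    return sum(marked) / len(marked) >= threshold
```

The hard part in practice is the readback cost, not the comparison; the collision-primitive approach avoids readback entirely at the price of setup complexity.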

One more question, about the FindCollisionUV function.
The documentation says: “If true, store extra information to allow FindCollisionUV to derive UV info from line trace hit result, using the FindCollisionUV utility”.
Am I right that this works only with line traces?
What about a mobile touch hit result under the finger? Isn’t that the same as a line trace?

Thanks!

Sadly, I do have this enabled in my project, and it’s still not working properly. I am using eye-tracking software in VR to line trace, but I don’t believe this should matter much, considering it’s only producing one trace. I mean, it DOES get the coordinate info in the debug message from my last image ( http://imgur.com/LQ0cDub ), but like I said before, it’s not painting the texture anywhere else but one spot. I’ve even tried many different UV-unwrapped models. Still the exact same :confused:

http://imgur.com/gIQMbzn

The little donut is the line trace location.

The opacity mask doesn’t have any other info; it goes directly into opacity.

http://imgur.com/62Wf7aV

But even when I disconnect the opacity, the moment the line trace points at the mesh: instant red color.

Also, I am using 4.14.3.

Hello again!
I hope this thread is not completely dead.

So…

  1. Is it possible to use the DrawMaterialToRenderTarget function with a splat material that uses AbsoluteWorldPosition? For example, when Location is set as the world location for a SphereMask?
    For me it is not working.

  2. I tried to use this functionality on mobile. On one Android device it behaves as if the splat coordinates in UV space have an inverted Y coordinate.
    I will check more devices later, but it will be really bad if this varies from device to device and I need some device-specific setup…

  3. Also on mobile, when using a texture mask in the splat material, I get a constant app crash when I start to draw. I tried it with and without customized UVs for mobile.

Oh, I get it.
The mesh has UVs where 0 is the lower-left corner,

while the splat material uses UV locations anchored differently (upper-left corner).

I think the render target texture also uses upper-left-corner coordinates.

Anyway, on PC it works fine somehow, but devices always have some sh*t…
So I just inverted the Y coordinate in the blueprint.