Is it possible to use this concept on a landscape?
Here's an update to my previous video, showing how to paint deformations in local space and get them converted into world space to feed into the World Position Offset.
I think I know how you got that negative value. Is it possible that one of the ricochet hits is read as negative since it didn't come from the gun's direction? For example: the ball falls down after hitting the wall and its backside hits the cube, causing a negative hit.
Perhaps that's the cause?
Any ideas on how to get the material to lerp back to its original state over time?
Here is a view of my whiteboard using the HTC Vive:
Can someone explain how exactly we are drawing on the Render Target? In the stream they only say: "yeah, here is the SplatMaterial and you can use every other brush you want". Nice jokes that you can draw unicorns and stuff, but it feels like you also have no idea how exactly to accomplish this. The SplatMaterial in particular was poorly covered. Can we get a full explanation? Saying "we raycast on the plane, get the UV coordinates and paint on it" doesn't help much. What is all the math involved in the SplatMaterial, and how do you need to change the material to make those lovely unicorns you were talking about? In the stream you were only reading off the nodes, no explanation at all, only stuff like "basically, we need this to do stuff". <3
All you have to do is change the Texture that is in the Splat Material and you will have a new effect when you use it.
See, that's what I mean. I don't think you understand what the Splat Material is doing, because then you would know that there is no Texture you can easily change. We give the hit location to the material, offset the TextureCoordinates, and then draw a circle around this location. But the math behind it is what makes my head fuzzy, and obviously yours, too. The only thing you said is where we could change the ForceStrength or the size of the brush; all the other important stuff is completely missing.
Please see post below for better information.
It's not like the emissive output of the material is a brush like you know from Photoshop. Your adjusted material, for example, acts like a global stencil now; it's not like you press anywhere on the texture and get your square shape. To make this work like a normal brush you need way more than just multiplying by a texture, and it's not only "adjust the texture", because this is way more tricky. The material is not as easy as you think, and your explanation is just reading off the nodes. You don't say anything about why we need the vector length or why we subtract this length from 1. All you state are the obvious adjustable parameters. In the stream it feels like you wanted to explain why we are using this math, but hesitated and just said what it does and skipped all the math logic. If you reply with "I'm not here to teach you math", then you miss the point of teaching people how things work. In the end, I want to see you build this material from scratch without looking at the solution. And no, this is not specific to my needs; it's just a plain simple brush we all know from any other paint application. If you can do this, I'll take everything back <3.
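For what it's worth, a typical radial splat brush boils down to a distance falloff, which is probably what the vector length and the "subtract from 1" in the stream's material are doing. Here is a minimal sketch of that math in plain C++; this is not the stream's actual material, and the names here are my own assumptions:

```cpp
#include <algorithm>
#include <cmath>

// Returns brush intensity in [0, 1] for a pixel at (u, v), given the hit
// location (hitU, hitV) (all in 0-1 texture space) and a brush radius.
float RadialBrush(float u, float v, float hitU, float hitV, float radius)
{
    // Length of the vector from the current pixel to the painted hit location.
    float du = u - hitU;
    float dv = v - hitV;
    float len = std::sqrt(du * du + dv * dv);

    // 1 - (distance / radius): 1 at the centre, fading to 0 at the rim.
    // This is the role the "subtract the length from 1" step plays.
    float falloff = 1.0f - len / radius;

    // Clamp (saturate) so pixels outside the radius contribute nothing.
    return std::clamp(falloff, 0.0f, 1.0f);
}
```

Replacing the radial falloff with a texture sample centred on the hit UV is what turns this into an arbitrary-shape (unicorn) brush.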
Hey Ninjin,
I made a blueprint function, which should be in 4.13, that makes it easier to control brush size and optimize RTs by only drawing to the specified area. It is called "Set Canvas Material Scale and Position". It will scale the RT material to fit the desired size. Here is how to use it:
CanvasSize: hook up the "Size" pin from "Begin Draw Canvas to Render Target".
Position: This is the 0-1 position for the center of the material. 0.5 would be at the center of the RT.
Scale: If it's set to 1, it will fit the material to the whole render target. If it's 0.1, it will take up 10%.
If you set it up this way, you don't need a position at all inside of your brush material, and then you can use a simple texture. The position would just be decided in the BP and used to position the material itself.
You can also do all of this in the material, but then you will be paying to render all pixels even if your mask ends up being small, and you will have to use UV math to control the positioning of the textures. I also have some examples of how to do that, but I suggest people try the BP function mentioned above first, since it's a bit easier.
(Edit: I just looked at the function in 4.13 and for some reason it was marked as non-pure. It shouldn't affect anything, but just know that it was meant to be a Pure function without execution pins. This is a known engine bug.)
I realized that I was going about this in the wrong manner after talking with B. Here is a better way to use a Texture instead of the Proc brush that I was using in the example.
Here is what the results of using the above Material look like. Do not forget to set your brush Textures to clamped.
In your Heightfield Painter Blueprint, make sure to increase the Brush Size variable. For the above image, I used a setting of 5.0.
How did you manage to do this? D: ! I'm having struggles on my end trying to apply the idea to a format that isn't conforming to a heightmap.
I'm having the issue of the size not being, well… smaller. I've made the ForceSize as small as I could, but no such luck. But that's only part of the problem.
When trying to paint, the paint only goes in one area. The mesh I have is unwrapped, and when I debug it, it stays at x 0 and y 0.
Here's the paintbrush Material
Trace Event
and the Drawing to Render target event
EDIT:
I've done a test to see if it's even retrieving UV information… seems like it isn't D: . It's returning false.
Checking Trace Complex seemed to work with returning the coordinates… however, it's still just painting at 0,0 and still not adjusting size or position.
It looks like this could be happening because getting texture coordinates from line traces is not enabled in your project settings. Make sure you have that enabled by doing the following:
Under Project Settings > Engine > Physics > Optimization, enable "Support UV From Hit Results".
Hi!
I have a question about comparing a RT with a template texture mask, to get an event when the RT is filled. Maybe someone has faced this and has a solution.
For example, I give some texture mask to a player as a tip, and I want to know whether the player filled this image as expected.
I can't see any ability to compare textures in BP. It's possible to do in a material, but materials don't provide events.
So the only solution I can see here is to place an array of collision primitives matching the tip mask along the drawable surface, and check that each primitive is hit at least once; then the image is filled correctly.
But it's a little bit complicated…
Maybe you can advise?
Thanks!
One more question about the FindCollisionUV function.
The documentation says: "If true, store extra information to allow FindCollisionUV to derive UV info from line trace hit result, using the FindCollisionUV utility".
Am I right that it works only with line trace?
What about a mobile touch hit result under the finger? Is that not the same as a line trace?
Thanks!
I do have this enabled in my project, sadly. It's still not working properly. I am using eye-tracking software in VR to line trace, but I don't believe this should matter much, considering it's only producing one trace. I mean, it DOES get the coordinate info in the debug message from my last image ( http://imgur.com/LQ0cDub ), but like I said before, it's not painting the texture anywhere but one spot. I've even tried many different UV-unwrapped models; still the exact same.
The little donut is the line trace location.
The opacity mask doesn't have any other info except going directly to opacity.
But even when I disconnect the opacity, right when the line trace is pointed at the mesh, there's an instant red color.
Also, I am using 4.14.3.
Hello again!
I hope this thread is not completely dead.
So…
- Is it possible to use the DrawMaterialToRenderTarget function with a splat material that uses AbsoluteWorldPosition, for example when Location is set as the world location for a SphereMask? For me it is not working.
- I tried to use this functionality on mobile. On one Android device it behaves as if the splat coordinates in UV space have an inverted Y coordinate. I will check more devices later, but it will be really bad if this changes from device to device and I need some device-specific setup…
- Also, on mobile, using a texture mask in the splat material, I get a constant app crash when I start to draw. I tried it with and without CustomizedUVs for mobile.
Oh, I get it.
The mesh has UVs where 0 is the lower-left corner,
while the splat material uses UV locations in a different manner (upper-left corner).
I think the render target texture also uses upper-left-corner coordinates.
Anyway, on PC it works fine somehow, but devices always have some sh*t…
So I just inverted the Y coordinate in the blueprint.
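For anyone hitting the same mismatch, the blueprint fix above amounts to flipping V before feeding the hit UV into the splat material. A minimal sketch in C++, assuming 0-1 UVs (the function name is made up):

```cpp
#include <utility>

// Convert a UV whose origin is the lower-left corner (the mesh's
// convention here) to one whose origin is the upper-left corner (the
// splat material / render target convention): U is unchanged, V flips.
std::pair<float, float> ToTopLeftUV(float u, float v)
{
    return { u, 1.0f - v };
}
```

Applying it twice returns the original UV, so it also works in the other direction if a device reports the opposite convention.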