How do you convert 3D coordinates to 2D/UV coordinates?

I have some code that allows me to paint on meshes by unwrapping their UVs and using them to read coordinates.


My goal is to go beyond painting a flat color in the shape of a sphere or box mask: I want to be able to use this to put stickers (like decals) on the skeletal mesh. These could be anything from bullet holes and blood to tattoos and scars.

As you can see in the above video, I have code that’s working to paint on the mesh from 3D coordinates.

The problem, then, is that to apply a texture instead of a sphere mask I need to scale that texture down so it isn't the size of the entire mesh, and then position it based on the 3D coordinates.

I already know how to scale the texture (just multiply the TexCoord) and how to move it around (subtract from the TexCoord), but I cannot figure out how to get the texture to sit at the exact coordinates I want.

To do that, in theory, I would have to convert those 3D coordinates to 2D UV coordinates, and I don't know how.
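To make those TexCoord tricks concrete, here is roughly what the Multiply and Subtract nodes compute, written as a C++ sketch (the names `Scale` and `Offset` are placeholders, not actual material parameters):

```cpp
// Rough C++ equivalent of the TexCoord math (placeholder names, not engine code).
#include "Math/Vector2D.h"

// Multiplying the TexCoord scales the texture down on the mesh
// (with a Wrap sampler it tiles; with Clamp it shrinks toward one corner).
FVector2D ScaleUV(const FVector2D& MeshUV, float Scale)
{
    return MeshUV * Scale;
}

// Subtracting an offset slides the texture across the UV layout.
FVector2D OffsetUV(const FVector2D& MeshUV, const FVector2D& Offset)
{
    return MeshUV - Offset;
}
```

The missing piece is choosing `Offset` from the 3D hit position, which is the conversion in question.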

Code

Generate paint mask based on position:
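(Screenshot missing. As a stand-in, here is a simplified C++ sketch of what a position-based sphere-mask paint material computes; `PaintLocation`, `Radius`, and `Hardness` are assumed parameter names, and the falloff is only an approximation of the SphereMask node.)

```cpp
#include "Math/Vector.h"
#include "Math/UnrealMathUtility.h"

// Returns 1 inside the brush, fading to 0 at its edge.
// WorldPos      : world position of the pixel being painted.
// PaintLocation : the 3D hit location passed in from the Blueprint.
// Hardness      : 0..1, fraction of the radius that stays fully opaque.
float SpherePaintMask(const FVector& WorldPos, const FVector& PaintLocation,
                      float Radius, float Hardness)
{
    const float Dist = FVector::Dist(WorldPos, PaintLocation);
    // Fully opaque up to Hardness * Radius, then a smooth fade out to Radius.
    return 1.0f - FMath::SmoothStep(Hardness * Radius, Radius, Dist);
}
```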


Send the position to the above material, then draw it to a render target in order to paint:
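(Screenshot missing. The Blueprint side of this step boils down to the equivalent engine calls below; `HitMaskMID`, `PaintRT`, and the `PaintLocation` parameter name are assumptions.)

```cpp
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/TextureRenderTarget2D.h"

// Called with the 3D hit location whenever the mesh gets painted.
void PaintAtLocation(UObject* WorldContext,
                     UMaterialInstanceDynamic* HitMaskMID, // instance of the paint-mask material
                     UTextureRenderTarget2D* PaintRT,      // accumulates the painted mask
                     const FVector& HitLocation)
{
    // Pass the 3D position into the material
    // (equivalent to a Set Vector Parameter Value node).
    HitMaskMID->SetVectorParameterValue(TEXT("PaintLocation"),
        FLinearColor(HitLocation.X, HitLocation.Y, HitLocation.Z));

    // Equivalent to the Draw Material to Render Target node: renders the mask
    // material into the unwrapped-UV render target.
    UKismetRenderingLibrary::DrawMaterialToRenderTarget(WorldContext, PaintRT, HitMaskMID);
}
```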

The mesh preview material, which receives the above render target:

The tutorial I followed:

https://www.youtube.com/watch?v=Gybd0456wvc

My most recent (failed) attempt to convert 3D coords to 2D ones:

How can I do this?

Enable “Support UV From Hit Results”

Then you can access it via a BP node.
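Roughly, the C++ equivalent of that route is sketched below; the important detail is that the trace must be complex and return the face index, or Find Collision UV has nothing to work with. (The trace setup here is an assumption; `FindCollisionUV` itself is the engine call behind the node.)

```cpp
#include "Kismet/GameplayStatics.h"
#include "Engine/World.h"

// Trace against the mesh and read the UV at the hit point.
// Requires "Support UV From Hit Results" in the Physics project settings.
bool GetHitUV(UWorld* World, const FVector& Start, const FVector& End, FVector2D& OutUV)
{
    FCollisionQueryParams Params;
    Params.bTraceComplex = true;     // hit the actual triangles, not the simple collision
    Params.bReturnFaceIndex = true;  // FindCollisionUV needs the face index

    FHitResult Hit;
    if (!World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        return false;
    }

    // Equivalent of the "Find Collision UV" Blueprint node (UV channel 0).
    return UGameplayStatics::FindCollisionUV(Hit, 0, OutUV);
}
```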

Doesn't seem to do the trick, unless I'm just using it wrong.

Edit: The problem seems to be that Find Collision UV is just returning (0,0); the node seems to just not work.

Edit 2: I got it to work, but the node does not work with skeletal meshes, so I can't use it.

Summary

Hit mask material:


Feeding the data into the hit mask material:

Final material:

Edit 3: I've sort of gotten somewhere. I found this video, which is the origin of the technique I'm using:

In it I found this formula; it uses the technique I had found of dividing X and Y by Z to convert 3D coords to 2D ones:

And I tried to imitate it in my context:
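(Screenshot missing. The divide-by-Z idea, applied to a relative hit location, is roughly the sketch below. This is only the formula as described in the video; `DistanceToUVScale` is a made-up stand-in for the FOV/ortho term discussed further down, and which axis counts as "depth" depends on how the capture was oriented.)

```cpp
#include "Math/Vector.h"
#include "Math/Vector2D.h"

// Perspective-style projection of a 3D point onto a 2D plane:
// divide the two "screen" axes by the depth axis, then remap to 0..1 UV space.
// Z is assumed to be the depth axis here.
FVector2D ProjectToUV(const FVector& RelativePos, float DistanceToUVScale)
{
    const float U = (RelativePos.X / RelativePos.Z) * DistanceToUVScale + 0.5f;
    const float V = (RelativePos.Y / RelativePos.Z) * DistanceToUVScale + 0.5f;
    return FVector2D(U, V);
}
// Note: an orthographic capture would skip the divide entirely and just scale
// X and Y by the capture's ortho width.
```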

It brought me a lot closer but it’s not quite there yet.

It’s not quite centered and it stretches and skews a lot.

The reason I'm not using an inverse transform matrix is that I'm already using relative location data in the first place.
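(For reference, the inverse-transform step being skipped would just be the conversion below, which turns a world-space hit into the component's local space, presumably the same thing the relative location data already represents:)

```cpp
#include "Components/SkeletalMeshComponent.h"

// World-space hit location -> component-local (relative) location.
// Blueprint equivalent: Inverse Transform Location on the component's transform.
FVector WorldHitToLocal(const USkeletalMeshComponent* Mesh, const FVector& WorldHitLocation)
{
    return Mesh->GetComponentTransform().InverseTransformPosition(WorldHitLocation);
}
```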

You won't get perfect coverage without distortion. These aren't world-space UVs, so you will get stretching.

The outcome depends on how the UV map distributes the texture across the polygons. Where it needs less detail, it will cover larger areas; where it needs more detail, it will be smaller. You can only change this by modifying the UV map in a 3D program, by splitting the UV islands up better and using better algorithms to reduce distortion / keep proportions.

Well, the distortion is, as you say, inevitable, but it's also not really the main problem right now (not yet at least); I'm just trying to center the texture on the correct coordinates.

Edit: After studying it a bit more, I think it's getting the coordinate right, it's just not doing it the way I want. I decided to set the texture sampler to Wrap instead of Clamp, and now it looks like this:


So it's fairly clear there that the pattern is centered around the stomach area where it's supposed to be; however, the texture seems to distort itself away from that point rather than centering on it like I want.

In other words, I'm getting the coordinates right, but I am not using them correctly.

If the texture sampler is set to Clamp it still repeats itself (just less), which a clamped texture shouldn't do…
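One common way to actually use a UV coordinate to center a sticker, together with an explicit footprint mask so nothing can repeat or smear regardless of the sampler setting, is sketched below (this is a generic approach, not necessarily what the material in the screenshots does; `StickerSize` is an assumed parameter):

```cpp
#include "Math/Vector2D.h"

// MeshUV      : the mesh's UV at the pixel being shaded.
// StickerUV   : where the sticker should be centered (the converted hit coordinate).
// StickerSize : sticker footprint as a fraction of UV space.
// Outputs the UV to sample the sticker with, plus a mask that is 0 outside the
// footprint so the sticker cannot repeat or smear.
void StickerSample(const FVector2D& MeshUV, const FVector2D& StickerUV, float StickerSize,
                   FVector2D& OutSampleUV, float& OutInsideMask)
{
    // Re-center on the sticker position, scale so the sticker spans 0..1,
    // then shift so the sticker's center maps to UV (0.5, 0.5).
    OutSampleUV = (MeshUV - StickerUV) / StickerSize + FVector2D(0.5f, 0.5f);

    // Explicitly mask out everything outside the 0..1 footprint.
    const bool bInside = OutSampleUV.X >= 0.0f && OutSampleUV.X <= 1.0f &&
                         OutSampleUV.Y >= 0.0f && OutSampleUV.Y <= 1.0f;
    OutInsideMask = bInside ? 1.0f : 0.0f;
}
```

Multiplying the sticker's alpha by the inside mask gives the same effect as a clamped sampler, but without the edge pixels smearing.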

I got a little bit closer

I got this by adding the two divide by 10 nodes…

But I have basically no idea what I’m doing lol.

Edit: I just realized I saw a chart the other day with some numbers for that bottom Z-multiply math; it's there to account for the perspective. If I recall correctly those numbers were for 'perspective', but maybe they need to be 'ortho', since the UV unwrap was done with an orthographic scene capture camera? :thinking: But I forgot where I saw those numbers, so I don't know what numbers to try :man_facepalming:

The 57.3 has something to do with FOV; it happens to be the exact same number used in a convert-FOV-to-angle-in-degrees node… (57.3 is 180/π, the radians-to-degrees factor, which is why it shows up in that conversion.)
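If it helps, in a standard perspective projection the FOV usually enters through a tangent term applied after the divide-by-Z step, roughly like this sketch (assumed names, not the video's exact graph):

```cpp
#include "Math/UnrealMathUtility.h"

// How a field of view typically turns into a projection scale factor.
float ProjectionScaleFromFOV(float HorizontalFOVDegrees)
{
    // DegreesToRadians divides by the same 57.2958 (180 / PI) constant.
    const float HalfFOVRadians = FMath::DegreesToRadians(HorizontalFOVDegrees) * 0.5f;
    // Points are scaled by 1 / tan(FOV / 2) after being divided by Z.
    return 1.0f / FMath::Tan(HalfFOVRadians);
}

// An orthographic capture has no FOV term at all: X and Y are simply divided
// by the capture's OrthoWidth instead of by Z.
```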