Baking Decals to UV texture maps?

Is there a method in Unreal of baking decals down to UV textures?
It would be nice to be able to do this for, say, damage effects: baking down a black-and-white channel from a dynamically placed decal and then having that texture wired into the material tree to make an incremental damage effect. Is this possible to do?

Cheers, Fred

I was just thinking about how useful this would be. I couldn’t think of a way, so if someone has an idea I’d love to know as well.

One could set something up that uses, for example, a Hit result’s UV coordinate, and then use that to place some stamp objects to be 2D-captured into a texture.
The problem is that you don’t necessarily want the stamp to spread to nearby UV shells.
If there were a good way to generate a mesh that looked like the UV layout, we could probably use that to mask the custom stamps in or out, say by raising black polygons on top of unwanted areas.

You can do this. There are a few steps involved though.

  1. Add a function to the vertex shader of all materials to unwrap the mesh facing a camera.
  2. Create a virtual projection onto the unwrapped mesh, but use “World Position (Excluding Offsets)” as the projection target. That allows you to project as if the mesh were not unwrapped.
  3. Apply your pseudo-decal material.
  4. Use either a Render Target hooked to a scene capture, or HighResScreenshot, to render out the various buffers.
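To picture what step 1 does, the unwrap works by adding a World Position Offset that moves each vertex from where it sits in the world to a point laid out flat by its UV coordinates. Here is a toy Python sketch of that offset math; all names are illustrative, nothing here is an actual Unreal node:

```python
def unwrap_offset(world_pos, uv, plane_origin, plane_size):
    """World Position Offset that moves a vertex from its world position
    onto a flat plane laid out by its UV coordinates (illustrative only)."""
    u, v = uv
    target = (plane_origin[0] + u * plane_size,
              plane_origin[1] + v * plane_size,
              plane_origin[2])
    # WPO is additive: final position = world_pos + offset = target
    return tuple(t - p for t, p in zip(target, world_pos))

# A vertex at (100, 50, 20) with UV (0.25, 0.5), unwrapped onto a
# 512-unit plane at the world origin:
print(unwrap_offset((100.0, 50.0, 20.0), (0.25, 0.5), (0.0, 0.0, 0.0), 512.0))
# -> (28.0, 206.0, -20.0)
```

The key point for step 2 is that because the unwrap is purely a World Position Offset, “World Position (Excluding Offsets)” still returns the original `world_pos`, so the projection can act as if the mesh were never flattened.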

If you use a Scene Capture, you would basically isolate each of the material pins into emissive manually, one at a time. Then right-click the RT in the content browser and click “Create Static Texture”. Then you’d go to the next material pin and repeat until you capture them all.

You can also try to use the high-res screenshot tool to export them all at once, but there are some gotchas with regard to sRGB. So always render out a value you know to be 0.5 and make sure it reads as 127 in Photoshop. Otherwise you need to apply inverse gamma in the material.
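That gamma check is easy to reason about numerically: a linear 0.5 written straight to 8 bits lands at 127–128, but if the sRGB transfer function gets applied somewhere on export, the same value reads as roughly 188. A small Python sketch of the sanity check, using the standard sRGB encode formula (nothing Unreal-specific):

```python
def srgb_encode(c):
    """Standard sRGB transfer function (IEC 61966-2-1)."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1.0 / 2.4) - 0.055

def to_8bit(c):
    """Quantize a 0..1 value to an 8-bit channel with round-to-nearest."""
    return int(c * 255.0 + 0.5)

print(to_8bit(0.5))               # 128: linear export, the expected ~127
print(to_8bit(srgb_encode(0.5)))  # 188: sRGB snuck in, apply inverse gamma
```

So if the eyedropper reads ~188 instead of ~127, the fix is to run the material output through the inverse of `srgb_encode` before export.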

Step 1 is outlined here:

It might be something you could do with a Substance material as well

Ha, thanks, that is very cool. I would have to actually try it to understand it though. :slight_smile: But it’s a great-looking effect too!

I found a way to pre-unwrap the mesh to match UVs in Maya. Then, if I get time, I want to do the thing where I modify that mesh to mask out unwanted mesh/UV shells, since when I project/place my decal, pixels might fall on polygons which are near each other in UV space but not near each other on the actual model. I was thinking of throwing a cluster of rays to record a bunch of polygons and using that poly selection as a mask for the decal. Come to think of it, this might be pretty prone to failure though; I might easily get holes.
Or maybe I could use the method in your blog instead somehow, but I’d have to understand it properly first, by just doing it.
Another way might be to use my Maya unwrap method and then bake those unwrapped vertex coordinates down to per-vertex data for the real mesh that can be used in the Unreal shader somehow.
If I knew C++ and procedural mesh programming, then I could probably make something that selected UV shells and manipulated polygon data to help with the masking of unwanted overlaps.

EDIT: Maybe one way I could do the UV shell masking would be to either generate a texture or bake vertex data where I give each shell an ID color/value. Then, as I sample the hit point, I look that value up, and maybe I can mask by checking whether a pixel equals (or is near enough to) that value, if I can keep the values from getting noisy through the texture pipeline. Vertex colors might be better, then?

EDIT2: Oh no, the “get UV on Hit” function is broken for Skeletal Meshes, so now I can’t do this, as I wanted to place hit decals based on collision Hits. Please follow the link and vote if you’d like this fixed!

I’m not sure if I will have time to do this right now; if I do, it’ll probably be something that ignores decal projection spill to start with.

There would be no such problem. You would be projecting onto the world space of the mesh, so things being close in the UV layout have no effect. I will make an example, since a bunch of people have been asking about the UV projection part by itself.

Just reading this thread again, and I just want to clarify that I am trying to do this at runtime.
I.e., continually throughout gameplay, add to this ‘damage mask’ as the Pawn is getting hit, making it increasingly more damaged as the game goes on.

To do that, what you would want to do is pre-bake a positional texture map using the UV layout of the mesh. Then you could dynamically add new projections onto the mesh just by changing material parameters, without needing to use a scene capture to re-bake the mesh.

The position map would have to be in local space and you would then need to convert any projection origins and vectors into local space. It would always have to project using the neutral pose of the mesh though.
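To illustrate the idea, here is a toy Python sketch of stamping a hit using a pre-baked local-space position map: each texel stores the local position it was baked from, and the mask is just a distance falloff against the hit location converted into the same local space. The simplified linear sphere mask and translation-only world-to-local conversion are stand-ins for the real material nodes and transforms; all names are made up:

```python
def world_to_local(p, component_location):
    """Translation-only stand-in for an inverse component transform."""
    return tuple(a - b for a, b in zip(p, component_location))

def sphere_mask(texel_local_pos, hit_local_pos, radius):
    """Simplified sphere mask: 1 at the hit, fading linearly to 0 at the radius."""
    d = sum((a - b) ** 2 for a, b in zip(texel_local_pos, hit_local_pos)) ** 0.5
    return max(0.0, 1.0 - d / radius)

# Component sits at world (100, 0, 0); a hit lands at world (110, 0, 0).
hit_local = world_to_local((110.0, 0.0, 0.0), (100.0, 0.0, 0.0))  # (10, 0, 0)

# A texel whose baked local position is 5 units from the hit, radius 20:
print(sphere_mask((15.0, 0.0, 0.0), hit_local, 20.0))  # 0.75
```

Since the position map was baked from the neutral pose, this only lines up with the bind pose, which is exactly the limitation noted above.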

EDIT: Promise I haven’t forgotten about this thread. I have the projection example made, but things are pretty hectic with GDC prep, so I will try to post it back later tonight.

You mean bake out the neutral/bind-pose XYZ position coordinates to a UV map?
I started doing something like that, and I wanted to use the Unreal functionality of getting UV coords from a hit to then apply my procedural hit ‘decal’ at that UV location.
But it turns out this function does not work at all on a Skeletal Mesh, just on a Static Mesh. (I found bugs in the Static Mesh version as well, which I have submitted.)

I’m not sure how else I could do a UV placement from a hit on an animated mesh. The neutral pose could be far from the pose of the character being hit in action.

EDIT: Maybe I could collect all up-hierarchy bone rotations and then feed those into the shader to apply a reverse/back-again transform with the whole chain to get back to the neutral coordinate? :stuck_out_tongue: Sounds fiddly, but maybe doable if pressed?

EDIT2: Probably better to do the chained inverse transforms in BP to avoid per-pixel calculations? And I could do a loop there, as I don’t think loops are available in Materials?


That is true. If you want to do hit projections using the actual animation pose, I am not aware of any method that lets you do that currently, other than manually evaluating the transform chain and applying all the transforms for the bones above in the chain. It would be pretty involved to set that up.
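A hedged sketch of what that transform chain might look like, reduced to 2D for readability. It assumes the hit point is rigidly attached to one bone: inverting the bone’s current world transform takes the point into bone space, and the bind-pose world transform puts it back where it sat in the neutral pose. `compose` is the loop that would chain the parent transforms in BP. This is illustrative math only, not Unreal API:

```python
import math

# A transform is (angle_radians, (tx, ty)): rotate, then translate.

def apply(t, p):
    """Apply transform t to point p."""
    a, (tx, ty) = t
    c, s = math.cos(a), math.sin(a)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

def inverse(t):
    """Invert: T(p) = R(a)p + t  ->  T^-1(p) = R(-a)p - R(-a)t."""
    a, (tx, ty) = t
    c, s = math.cos(-a), math.sin(-a)
    return (-a, (-(c * tx - s * ty), -(s * tx + c * ty)))

def compose(parent, child):
    """parent ∘ child: the per-bone step of walking down the hierarchy."""
    pa, (ptx, pty) = parent
    ca, (ctx, cty) = child
    c, s = math.cos(pa), math.sin(pa)
    return (pa + ca, (c * ctx - s * cty + ptx, s * ctx + c * cty + pty))

def posed_to_neutral(hit, current_world, bind_world):
    """inverse(current) takes the hit into bone space; bind_world puts it
    back where that bone-space point sat in the neutral pose."""
    return apply(bind_world, apply(inverse(current_world), hit))
```

For example, a bone at (10, 0) rotated 90° from its bind pose puts its local point (5, 0) at world (10, 5); `posed_to_neutral` maps that hit back to (15, 0), where it sits in the neutral pose.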

Ha, glad you came to the same conclusion! :slight_smile: Also glad you think it might be possible.
I won’t do this soon but if I do I’ll post back here.

Hey, talked with DanielW and he reminded me that you actually can do this without having a separate copy of your character.

It is possible because of a new feature that lets you modify the scene and use render targets in between visible rendered frames.

The concept works like this:

  1. Have a SceneCapture2D follow your skelmesh. Set it to not capture every frame.

  2. After damage, change the material of your skelmesh to use a UV unwrap material. If set to render out the world position excluding offsets, it will actually include the skinned positions, since it’s your actual gameplay mesh.

  3. Send an event to the SceneCapture2D to Capture. This will push the current world positions into the UV layout. Then you can project your decal or blood splat/whatever using either a sphere mask or a projection matrix. You can accumulate these into the same RT so the decals build up, and they should match the skinned positions.

  4. Set your character material back to its default. Since all the above happened in one tick, nothing changes on screen other than how you choose to use the accumulated render target.
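The accumulation in step 3 can be sketched in a few lines of toy Python. Each capture writes the current skinned world position of every texel; stamping then compares those positions against the hit location, and combining with `max` means damage builds up and never fades. All names here are made up for illustration:

```python
def accumulate_hit(damage_rt, position_map, hit_pos, radius):
    """Stamp one hit into the persistent damage 'render target'.

    damage_rt    -- list of accumulated mask values, one per texel
    position_map -- skinned world position per texel, from the unwrap capture
    """
    for i, texel_pos in enumerate(position_map):
        d = sum((a - b) ** 2 for a, b in zip(texel_pos, hit_pos)) ** 0.5
        stamp = max(0.0, 1.0 - d / radius)       # simple linear falloff
        damage_rt[i] = max(damage_rt[i], stamp)  # accumulate: only grows
    return damage_rt

# Two texels; hits land on each in turn, and the mask keeps both marks:
positions = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0)]
rt = accumulate_hit([0.0, 0.0], positions, (0.0, 0.0, 0.0), 10.0)
rt = accumulate_hit(rt, positions, (100.0, 0.0, 0.0), 10.0)
print(rt)  # [1.0, 1.0]
```

In the real setup this per-texel loop is of course the material running over the render target, not CPU code; the sketch is just the arithmetic.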

Cool, thanks so much for investigating! :slight_smile: That sounds exciting to try!
At the moment I am very focused on a gameplay-improvement stretch, and this would be a bit of a luxury to pursue right now, as it sounds like it would take a little figuring out for me how to technically implement and check each step.
I definitely am going to try though, as this would be a pretty important feature for my game.

I have a few questions on what you just said,

  • For the UV unwrap material, could I just copy yours from further down, “The Unwrap Material”?

  • How would I actually trigger the SceneCapture2D? In my experience, when you turn off every-frame capturing you’d have to keep it on ‘trigger on movement’ and then ‘jiggle’ it with an actual transform in the BP to force it to take a picture. Would I still do that, and then turn everything back to normal? Would it know to apply the new materials before taking the picture?

Cheers, Fredrik

There are actually two functions in the engine that are very similar to the unwrap function from my blog. Both would require just a little bit of tweaking to work.


That one unwraps the mesh facing the camera (in any orientation), but it positions it centered on the bottom edge. You would probably want to tweak it to center on the center of the UVs to make it align with the render target more easily.


That one unwraps only facing up, but it was meant to be used with the “Render to Texture Blueprint toolset”, and it has some scalar parameters inside the function that are not exposed as pins. “Unwrap” is a scalar that is meant to be either 0 or 1 to toggle the unwrap, and “Render Location” allows setting the center to a specific value. So you could start with either of these functions with just a few tweaks. It should be easy to add pins for the second one, or just create an MID and set the parameters using a BP, and it should work OK.

Re: triggering, there is an actual event you can call on them, named “Capture”, that triggers a single frame to capture. Using movement will most likely not work for sub-frame changes like this.

You would apply the unwrap material just before doing Capture, and then re-apply the original material right after.
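The ordering is the whole trick, so here is a schematic Python mock of that swap-capture-restore sequence. The classes are hypothetical stand-ins for the skeletal mesh and the SceneCaptureComponent2D, not real engine types; the point is just that the unwrap material is only ever assigned between the swap and the restore, all within one tick:

```python
class Mesh:
    """Stand-in for the skeletal mesh component."""
    def __init__(self, material):
        self.material = material

class Capture:
    """Stand-in for a SceneCaptureComponent2D with per-frame capture off."""
    def __init__(self, mesh):
        self.mesh = mesh
        self.last_capture = None

    def capture(self):
        # Mimics the single-shot Capture event: records whatever
        # material the mesh is wearing at this instant.
        self.last_capture = self.mesh.material

def capture_unwrap(mesh, capture, unwrap_material):
    original = mesh.material
    mesh.material = unwrap_material  # swap just before capturing
    capture.capture()                # single-shot capture, not per-frame
    mesh.material = original         # restore right after, same tick
```

After `capture_unwrap` runs, the mesh is back on its default material, yet the capture saw the unwrap material, which is why nothing visibly changes on screen.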

Sweet, thanks a lot! I’ll definitely try this as soon as the little producer on my shoulder tells me it’s OK! :slight_smile:
I’ll post here if I manage to come up with something.

Cheers, Fred

I have slowly been starting to try this out in between other tasks, but so far I have had no luck getting it to work.
I’m probably doing something wrong, but I can’t get the ‘hit mark’ to land in the right place; it’s like the shader is not using the pre-unwrapped surface when calculating the distance gradient.
If anyone can see what I am missing, please holler.





Have you tested this material without using events, just by placing it in the world and using a precomputed positional mask? It’s easy to make little errors that throw these setups off, so I suggest breaking out each step.

Hi, I’m not exactly sure what you mean by the positional mask?
But here are pics of the material applied to the sphere, both without the unwrap function active, and with it active and a cube positioned at the hit location.
If you notice, in the shader I am feeding in a constant position which represents the position of the hit, as I printed it out and wrote it down. (You can see the numbers in turquoise in the last pic.)
These too seem to indicate that the distance calculation is happening in the post-unwrap surface positions, I think.


UNWRAP ENABLED: the coord-sys locator is the SceneCaptureComponent2D, and the little cube is positioned at the hit location.


Ah, I think you need to select the WorldPosition node and set it to ‘Excluding Offsets’ mode. It looks like it’s including the unwrap offsets right now, which is why it is not matching. You want to perform the mask using the position the mesh was in prior to the unwrap shader.

The positional texture is just a way you can precompute this in local space and use that instead of continuing to unwrap the mesh. But that won’t work if you need animations to work and match up to the exact spot where hits occurred.