Wrinkle Maps? Tension Maps? How do you set them up? (workflow, requirements)

I want to create wrinkle maps (some call them stress maps) for my characters.

(I know that I have to create a map for stretching/extending and one for squeezing/contracting in a modeling or texturing program; I’m using Blender, btw.)

My question is: what do I do with them once I get them into UE4? How do I set up wrinkle maps to work based on the tension of the surface polygons? And if that isn’t possible, what about driving them with bone rotation and/or morph targets?

I saw this on Answers, but it doesn’t go into enough detail to explain the setup or typical workflow.

(please feel free to correct any of my terminology; information on this topic is somewhat scarce, so I may be using the terms wrong)

Nicolas3D may have some info on this for you; I know he managed to get wrinkle maps into UE4.

I’m really interested in this myself.

Well, I hope he stops by then :slight_smile:

I would really like to know how this is done in UE4.

I have never done this inside UE4, but wrinkle maps to me sound like a custom-sculpted high-res model baked to normal maps and blended in using a tension map based on edge length.
You can probably export something like that with the SOUP plugin for Maya during animation baking (since measuring that in real time might be overkill).
Or do it a little simpler and just have custom attributes (that get exported with the facial animation) tied to blood flow and wrinkle maps applied to certain areas.
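To make the edge-length idea concrete: the “tension” for one edge is basically the ratio of its deformed length to its rest-pose length. A minimal sketch (plain math, nothing engine-specific, all names made up):

```cpp
// Minimal illustration of per-edge "tension": ratio of deformed edge length to
// rest-pose edge length. Positive = stretched, negative = compressed, 0 = rest.
#include <cmath>

struct Vec3 { float X, Y, Z; };

static float Dist(const Vec3& A, const Vec3& B)
{
    const float DX = B.X - A.X, DY = B.Y - A.Y, DZ = B.Z - A.Z;
    return std::sqrt(DX * DX + DY * DY + DZ * DZ);
}

static float EdgeTension(const Vec3& RestA, const Vec3& RestB,
                         const Vec3& PosedA, const Vec3& PosedB)
{
    const float RestLen = Dist(RestA, RestB);
    return (RestLen > 0.0f) ? Dist(PosedA, PosedB) / RestLen - 1.0f : 0.0f;
}
```

Doing that for every edge of a dense mesh every frame is exactly the cost I’m talking about, which is why I’d rather bake or fake it.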

This is the part that I want to know how to do in UE4.

I already know how to bake out different normal maps; I just want to know how to make them work this way in UE4.

You could, in theory, use parameter collections with the value being driven by the rotation of certain bones, etc., to get this functionality, but that would be a pretty manual process to set up.
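Roughly, the C++ side of that could look like the sketch below, called from the character’s Tick. Everything here is assumed/made up: the AMyCharacter class, a WrinkleParams UPROPERTY pointing at a Material Parameter Collection asset, and the “ElbowBend” scalar parameter that the wrinkle material would read through a CollectionParameter node; “lowerarm_l” is just the default skeleton’s bone name.

```cpp
#include "Kismet/KismetMaterialLibrary.h"
#include "Components/SkeletalMeshComponent.h"
#include "Materials/MaterialParameterCollection.h"

void AMyCharacter::UpdateWrinkleDriver()
{
    USkeletalMeshComponent* Mesh = GetMesh();
    if (!Mesh || !WrinkleParams) // WrinkleParams: UMaterialParameterCollection* (assumed UPROPERTY)
    {
        return;
    }

    // Reduce the bone's component-space rotation to a single value in degrees.
    // (A real setup would probably measure the rotation relative to the parent
    // bone or a rest pose instead of the raw component-space angle.)
    const FQuat BoneQuat = Mesh->GetBoneQuaternion(TEXT("lowerarm_l"), EBoneSpaces::ComponentSpace);
    const float BendDegrees = FMath::RadiansToDegrees(BoneQuat.GetAngle());

    // Remap into 0..1 for the material; the 0-140 range is a guess to tune per joint.
    const float BendAlpha = FMath::GetMappedRangeValueClamped(
        FVector2D(0.f, 140.f), FVector2D(0.f, 1.f), BendDegrees);

    UKismetMaterialLibrary::SetScalarParameterValue(this, WrinkleParams, TEXT("ElbowBend"), BendAlpha);
}
```

In the material you would read the same collection parameter and use it as the blend alpha for the wrinkle normal map.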

As I mentioned, tracking per-edge length in real time on a deforming character mesh is probably very resource costly. I’d rather go with a pre-baked technique.
There are quite a few options here:

  • If you have a bone-only rig (like I’m currently working on), you could base those things on the distance of certain facial joints from reference points, and then trigger pre-made map blends in the materials off of those ranges (see the sketch after this list).
  • Alternatively, or if you have a blendshape-driven facial rig, you could just animate custom attributes that you export along with the facial animation itself (i.e. foreheadWrinkles, noseBlush, cheekBlush, eyeFieldWrinkles, mouthWrinkles, chinBlush, chinWrinkles, etc.). Those then get connected to the map blends in the material, and that would be their only purpose. That way you can ensure a more artist-driven approach to your facial FX performance that would otherwise be driven by rather expensive systems.
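For the joint-distance option, the measurement itself is simple; here is a sketch. The bone names and the two distances are placeholders you would measure per character, and the function just remaps the current distance into a 0..1 weight for the material blend:

```cpp
#include "Components/SkeletalMeshComponent.h"

float GetBrowWrinkleWeight(USkeletalMeshComponent* Mesh)
{
    const FVector Brow      = Mesh->GetBoneLocation(TEXT("brow_inner_l"));   // hypothetical facial joint
    const FVector Reference = Mesh->GetBoneLocation(TEXT("head_reference")); // hypothetical reference joint

    const float Distance = FVector::Dist(Brow, Reference);

    const float RestDist   = 6.0f; // rest-pose distance in cm (placeholder)
    const float RaisedDist = 8.0f; // fully-raised distance in cm (placeholder)

    // 0 at rest, 1 when the joint has moved its full range away from the reference.
    return FMath::GetMappedRangeValueClamped(
        FVector2D(RestDist, RaisedDist), FVector2D(0.f, 1.f), Distance);
}
```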

In film I would simply hook these things to procedurally generated maps based on edge length, but those processes are quite expensive and will certainly not run anywhere near real time. I can’t imagine them being used at a large scale in current games.
We have to find ways to optimize resource usage for these types of features.

Ok, I see.

Say I go with the bone rotation/position method. How do I control it so only the deforming areas blend into the next map? Because just blending the whole surface/material will cause other areas to blend when they’re not even deforming.

I’m thinking of Arkham Knight, where the characters’ clothes have wrinkle maps that add folds and wrinkles depending on whether a joint is extended or contracted. That’s my goal.

A specific joint controls a specific area of the face. You can take a masked, layered-map approach where you have a bunch of wrinkle maps, each only for a specific area of the face, which then trigger independently based on their driving attribute.
You should have a look at layered maps or masking methods in materials for that.
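As an illustration, the per-pixel logic of one masked layer boils down to something like this. It’s written as plain C++ math only to show what the material’s texture samples and Lerp node would be doing; it is not shader code to paste anywhere, and the names are made up:

```cpp
#include "CoreMinimal.h"

// BaseNormal / WrinkleNormal: sampled tangent-space normals for this pixel.
// RegionMask: 0..1 from a mask texture (or vertex color) isolating the area.
// Driver:     0..1 from whatever joint/attribute triggers that region.
FVector BlendRegionNormal(const FVector& BaseNormal, const FVector& WrinkleNormal,
                          float RegionMask, float Driver)
{
    const float Alpha = RegionMask * Driver; // wrinkles only where masked AND driven
    return FMath::Lerp(BaseNormal, WrinkleNormal, Alpha).GetSafeNormal();
}
```

In the actual material you would probably use a proper normal-map blend (the BlendAngleCorrectedNormals material function, for example) rather than a straight lerp, but the masking logic stays the same.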

I’ll look into those masking methods in the docs, thanks :slight_smile:
So if I’m understanding this right, the vertex weight/influence of a bone can be used as a mask in the material? So all I would have to do to set that up is create bones that influence the areas I want to unmask before I bring the mesh into UE4, and then have UE4 reference that weight data as a mask/unmask via material nodes tracking the rotation/location of the control bones?

I don’t really need a bunch of wrinkle maps, just 2 plus the base normal map (extend map, contract map, base map)

Batman: Arkham Knight uses only 3 normal maps in its wrinkle solution, and that’s what I’m seeking to recreate. (My main application for this is clothes, btw, but I couldn’t find a good example video that used clothes.)

Using bone weight maps from the skinCluster to mask out material overrides sounds interesting, but that wasn’t what I meant, though.
In the end you manually create a normal map for each situation and area of the mesh and blend them into a layered map solution based on a single float driver attribute. That’s about the simplest way of doing it I could think of.

Also, if it’s for clothes, you might want to find a way to measure when the cloth is supposed to show wrinkles. That could be by measuring the distance between two bones at the outer ends of an area, or the angle between some links in a bone chain, or by somehow measuring the overall surface area of the mesh (not sure how to implement that in UE4, though)…
There are a lot of ways to go about this.
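A quick sketch of the “angle between links in a bone chain” measurement, since that one maps well to a knee or elbow. The bone names are the default UE4 mannequin leg bones; 0 degrees means the leg is straight, larger values mean more bend at the knee:

```cpp
#include "Components/SkeletalMeshComponent.h"

float GetKneeBendDegrees(USkeletalMeshComponent* Mesh)
{
    const FVector Thigh = Mesh->GetBoneLocation(TEXT("thigh_l"));
    const FVector Calf  = Mesh->GetBoneLocation(TEXT("calf_l"));
    const FVector Foot  = Mesh->GetBoneLocation(TEXT("foot_l"));

    // Direction of the upper and lower leg; the angle between them is the bend.
    const FVector UpperDir = (Calf - Thigh).GetSafeNormal();
    const FVector LowerDir = (Foot - Calf).GetSafeNormal();

    const float CosAngle = FVector::DotProduct(UpperDir, LowerDir);
    return FMath::RadiansToDegrees(FMath::Acos(FMath::Clamp(CosAngle, -1.f, 1.f)));
}
```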

Maybe if you can post a screenshot of the cloth piece and of the wrinkle maps, we can help find a more optimized solution.

Sadly the model isn’t ready yet (still being modeled, not yet sculpted); my goal was to figure this out before getting that far, so the method can be considered during the workflow. But here are some references that illustrate what I’m trying to achieve:

The angle of the joints would blend the maps in for the area that’s bending.

Leg(s) extended: folds and creases are added on the front of the knee(s)

Leg(s) contracted: front of the knee(s) are stretched smooth, and folds and creases are added behind the knee(s)

This image is pretty much what I’m trying to do, but it would be achieved by blending just 3 maps in and out:


See how the rotation of the shoulder changes the wrinkles and folds in his jacket.

Pretty much every joint would reveal a different area of the maps, and which map it reveals would depend on the rotation. For example:

(any axis)
rotation 0 = neutral map
rotation + = contraction map
rotation - = extension map

But this needs to work for every mesh-deforming joint, as each wrinkle map (neutral, contract, extend) would cover the whole body, with only small areas revealed as the joints move.
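The mapping I have in mind would be something like the sketch below: one signed rotation value per joint split into two blend weights, with the neutral map implicitly getting whatever is left over in the material blend. The range is a made-up placeholder:

```cpp
#include "CoreMinimal.h"

void GetWrinkleWeights(float SignedRotationDegrees, float& OutContractWeight, float& OutExtendWeight)
{
    const float MaxBend = 120.f; // rotation at which a map is fully blended in (placeholder)

    OutContractWeight = FMath::Clamp( SignedRotationDegrees / MaxBend, 0.f, 1.f); // rotation +
    OutExtendWeight   = FMath::Clamp(-SignedRotationDegrees / MaxBend, 0.f, 1.f); // rotation -
    // At rotation 0 both weights are 0, so only the neutral map shows.
}
```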
Uncharted 4 also achieved this, for clothes and Nathan Drake’s arms:

This is what they sculpted in Zbrush

And this is it in their engine:

Batman: Arkham Knight did this using just a base normal map, an extending normal map, and a contracting normal map that covered the whole character (well, they sectioned out the head, the cowl, and the body). I can’t post Rocksteady’s normal maps because that would be copyright infringement, but they only created 3 normal maps for a given section (head, body, cowl). I’m just trying to find out how to blend between them in UE4.

Perhaps something can be done with vertex colors? (But I wouldn’t know how to use them dynamically for blending the material in real time.)

The number of normal maps is less a problem than having the material know when to apply them. Have a look at Activision’s Digital Ira (the real time version of Ira) - they store per-vertex stress in textures, but there’s also some kind of solver for vertex deformation from a reference pose. If you can figure out what they’re doing, you could likely get an animation blueprint to push the relevant data to a material. I don’t think there’s anything in the material editor alone that will get you there.
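One possible way to push that data across from the animation side (a sketch only, not Digital Ira’s actual setup): a C++ anim instance that reads an animation curve every update and writes it to a scalar material parameter. The UWrinkleAnimInstance class, the “CheekStress” curve, and the “WrinkleAmount” parameter are all made up for the example:

```cpp
#include "Animation/AnimInstance.h"
#include "Components/SkeletalMeshComponent.h"

void UWrinkleAnimInstance::NativeUpdateAnimation(float DeltaSeconds)
{
    Super::NativeUpdateAnimation(DeltaSeconds);

    USkeletalMeshComponent* Mesh = GetSkelMeshComponent();
    if (!Mesh)
    {
        return;
    }

    // Returns 0 if the curve isn't present on the currently playing animation.
    const float Stress = GetCurveValue(TEXT("CheekStress"));

    // Creates/updates dynamic material instances on the mesh under the hood.
    Mesh->SetScalarParameterValueOnMaterials(TEXT("WrinkleAmount"), Stress);
}
```

The curve would be authored on the facial animations themselves, which keeps the trigger artist-driven rather than solved at runtime.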

Thanks, I’ll take a look at that.
I thought that this would involve Blueprint at some point. In fact, I wasn’t sure if I should post this thread in Rendering or in Blueprints :confused:

What caught my eye in that Digital Ira demo is the “Dynamic Weight Map Texture Blending”.

I think this is what’s needed to accomplish this. Would anyone know how to set up something with a similar effect in Blueprint or the material editor?

I’m looking for this as well. Has anyone managed to get a tension map working in a material shader?

Seems like just masking and blending between different versions of the normal map should go a long way.

Yes, but generating those masks is where I’m stuck. Blender has a way of keeping track of vertex deformation and translating that data into vertex colors (which are then used to mask different textures).

I meant just masking body parts, such as the hand, and then blending between flexed, neutral, and extended normals based on bone orientation.