I want to create wrinkle maps (some call them stress maps) for my characters.
(I know that I have to create one map for stretching/extending and one for squeezing/contracting in a modeling or texture program; I’m using Blender, btw.)
My question is: what do I do with them once I get them into UE4? How do I set up wrinkle maps to work based on the tension of the surface polygons? And if not that, then what about bone rotation and/or morph targets?
I have never done this inside the UE4 engine, but wrinkle maps to me sound like a custom-sculpted high-res version of the model baked to normal maps and blended in using a tension map based on edge length.
You could probably export something like that with the SOuP plugin for Maya during animation baking (since measuring that in realtime might be overkill).
Or do it a little simpler and just have custom attributes (that get exported with the facial animation) tied to blood flow and wrinkle maps applied to certain areas.
As I mentioned, tracking per-edge length in realtime on a deforming character mesh is probably very resource-costly. I’d rather go with a pre-baked technique.
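To make that cost concrete: the per-edge measurement described above boils down to comparing each edge’s deformed length against its rest-pose length. A minimal Python sketch of the math (function and parameter names are illustrative, not any engine API):

```python
import math

def edge_tension(rest_positions, posed_positions, edges):
    """For each edge (i, j), return deformed length / rest length.
    > 1.0 means the edge is stretching, < 1.0 means compressing."""
    tensions = []
    for i, j in edges:
        rest = math.dist(rest_positions[i], rest_positions[j])
        posed = math.dist(posed_positions[i], posed_positions[j])
        tensions.append(posed / rest)
    return tensions
```

Running this over every edge of a character mesh every frame is exactly the expense being described, which is why baking the result per pose (or per animation frame) is the more practical route.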
There are quite a few options here:
If you have a bone-only rig (like the one I’m currently working on), you could base those things on the distance of certain facial joints from reference points, and then trigger pre-made map blends in the materials off of those ranges.
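That “trigger blends off of those ranges” step is just a remap-and-clamp of the measured distance into a 0–1 weight. A sketch of the idea, assuming a rest-pose distance and a maximum distance you calibrate per joint (names are hypothetical):

```python
def blend_weight(distance, rest_distance, max_distance):
    """Remap a joint-to-reference-point distance into a 0-1 blend weight.
    0 at the rest pose, 1 once the joint reaches max_distance."""
    if max_distance == rest_distance:
        return 0.0
    t = (distance - rest_distance) / (max_distance - rest_distance)
    return max(0.0, min(1.0, t))  # clamp so overshoot doesn't over-blend
```

In UE4 the same remap could live in the Animation Blueprint and be pushed to the material as a scalar parameter on a dynamic material instance, or be rebuilt from material nodes; either way it is the same clamped linear remap.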
Alternatively, or if you have a blendshape-driven facial rig, you could just animate custom attributes that you export along with the facial animation itself (e.g. foreheadWrinkles, noseBlush, cheekBlush, eyeFieldWrinkles, mouthWrinkles, chinBlush, chinWrinkles, etc.). Those then get connected to the map blends in the material, and that would be their only purpose. That way you ensure a more artist-driven approach to your facial FX performance that would otherwise be driven by rather expensive systems.
In film I would simply hook these things up to procedurally generated maps based on edge length, but those processes are quite expensive and certainly won’t run anywhere near realtime. I can’t imagine them being used at any large scale in current games.
We have to find ways to optimize resource usage for these types of features.
Say I go with the bone rotation/position method. How do I control it so only the deforming areas blend into the next map? Just blending the whole surface/material would cause other areas to blend even when they aren’t deforming.
I’m thinking of Arkham Knight, where the characters’ clothes have wrinkle maps that add folds and wrinkles depending on whether a joint extends or contracts. That’s my goal.
A specific joint controls a specific area of the face. You can take a masked, layered-map approach where you have a bunch of wrinkle maps, each covering only a specific area of the face, that trigger independently based on their driving attribute.
You should have a look at layered maps or masking methods in materials for that.
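In math terms, that layered-mask approach is a chain of lerps: each region has a grayscale mask texture and a driver weight, and a layer only shows where its mask is non-zero *and* its driver is raised. A per-channel Python sketch of the blend (illustrative only; a real material would blend normals with a dedicated normal-blend node rather than a raw lerp):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def layered_blend(base, layers):
    """base: one texel value (e.g. a single normal-map channel).
    layers: list of (wrinkle_value, mask_value, driver_weight).
    Areas where the mask is 0, or the driver is 0, keep the base map."""
    result = base
    for wrinkle, mask, weight in layers:
        result = lerp(result, wrinkle, mask * weight)
    return result
```

Because the mask multiplies the driver, a fully raised attribute still leaves the rest of the surface untouched, which is exactly the “only the deforming area blends” behavior asked about above.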
I’ll look into those masking methods in the docs, thanks
So if I’m understanding this right, the vertex weight/influence of a bone can be used as a mask in the material? So all I would have to do to set that up is create bones that influence the area I want to unmask before bringing it into UE4, then have UE4 reference that weight data as a mask/unmask via material nodes tracking the rotation/location of the control bones?
I don’t really need a bunch of wrinkle maps, just two plus the base normal map (an extend map, a contract map, and the base map).
Batman: Arkham Knight uses only three normal maps in its wrinkle solution, and that’s what I’m seeking to recreate. (My main application for this is clothes, btw, but I couldn’t find a good example video that used clothes.)
Using bone weight maps from the skinCluster to mask out material overrides sounds interesting; that wasn’t what I meant, though.
In the end you manually create a normal map for each situation and area of the mesh and blend them in a layered-map solution based on a single float driver attribute. That’s about the simplest way of doing it I can think of.
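With the three-map setup discussed in this thread (contract/neutral/extend), a single signed driver in [-1, 1] is enough: negative values blend toward the contract map, positive toward the extend map. A Python sketch of that single-float blend, per texel channel (names are illustrative):

```python
def three_map_blend(contract, neutral, extend, driver):
    """driver in [-1, 1]: -1 = fully contracted, 0 = neutral pose,
    +1 = fully extended. Returns the blended texel value."""
    d = max(-1.0, min(1.0, driver))
    if d < 0.0:
        return neutral + (contract - neutral) * (-d)
    return neutral + (extend - neutral) * d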
Also, if it’s for clothes, you might want to find a way to measure when the cloth is supposed to show wrinkles. That could be by measuring the distance between two bones on the outer ends of an area, or the angle between some links in a bone chain, or by somehow measuring the overall surface area of the mesh (not sure how to implement that in UE4, though).
… there are a lot of ways to go about this.
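Of those options, the bone-chain angle is the cheapest to compute: it is a single dot product between the two bone directions meeting at the joint. A hypothetical sketch:

```python
import math

def bone_chain_angle(parent_pos, joint_pos, child_pos):
    """Angle in degrees at joint_pos between the two bone links.
    180 = chain fully extended, smaller values = more bent."""
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    u = sub(parent_pos, joint_pos)
    v = sub(child_pos, joint_pos)
    cos_a = dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))
    cos_a = max(-1.0, min(1.0, cos_a))  # guard against float error
    return math.degrees(math.acos(cos_a))
```

That angle can then be remapped into the material’s driver value, e.g. 180 degrees maps to the extend map for the knee front and small angles map to the contract map behind it.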
Maybe if you can post a screenshot of the cloth piece and of the wrinkle maps, we can help find a more optimized solution.
Sadly the model isn’t ready yet (still being modeled, not yet sculpted); my goal was to figure this out beforehand so the method is accounted for in the workflow. But here are references that illustrate what I’m trying to achieve:
The angle of the joints would blend the maps in for the area that’s bending.
Leg(s) extended: folds and creases are added on the front of the knee(s)
But this needs to work for every mesh-deforming joint, as each wrinkle map (neutral, contract, extend) would cover the whole body but only have small areas revealed as the joints move.
Uncharted 4 also achieved this, for clothes and Nathan Drake’s arms:
Batman: Arkham Knight did this with just a base normal map, an extending normal map, and a contracting normal map that covered the whole character (well, they sectioned out the head, the cowl, and the body). I can’t post Rocksteady’s normal maps, since that would be copyright infringement, but they only created three normal maps for a given section (head, body, cowl). I’m just trying to find out how to blend between them in UE4.
Perhaps something can be done with vertex colors? (But I wouldn’t know how to use them dynamically for blending the material in realtime.)
The number of normal maps is less of a problem than having the material know when to apply them. Have a look at Activision’s Digital Ira (the real-time version of Ira): they store per-vertex stress in textures, but there’s also some kind of solver for vertex deformation from a reference pose. If you can figure out what they’re doing, you could likely get an animation blueprint to push the relevant data to a material. I don’t think there’s anything in the material editor alone that will get you there.
Yes, but generating those masks is where I’m stuck. Blender has a way of tracking vertex deformation and translating that data into vertex colors (which can then be used to mask different textures).
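The core of that Blender-style bake is measuring each vertex’s displacement from the rest pose and normalizing it into a 0–1 value that gets written into a vertex color channel. A hypothetical Python sketch of that bake step (not Blender API code):

```python
import math

def deformation_to_vertex_colors(rest_positions, posed_positions, max_displacement):
    """Per-vertex displacement from the rest pose, normalized to 0-1.
    The result would be baked into a vertex color channel and sampled
    in the material as a mask."""
    colors = []
    for rest, posed in zip(rest_positions, posed_positions):
        d = math.dist(rest, posed)
        colors.append(max(0.0, min(1.0, d / max_displacement)))
    return colors
```

One caveat: raw displacement also picks up rigid motion (a rotated arm moves every vertex even where nothing stretches), so in practice the measurement is usually done relative to the skinned position, or via edge-length change, rather than against the raw rest pose.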