How can I stop animation retargeting from deforming my mesh?

I have followed the documentation on the UE website down to the last sentence, but I am having issues retargeting my animations. The source animations came from Mixamo and the target model is from Blender with a custom rig. The humanoid rig is definitely set up correctly for both models.

I first exported the custom rig from Blender with the default Blender units, and the result can be seen below with the huge feet. In the viewport both models appear at the correct size by default. In the retargeting window you can see the poses are similar, and not so different as to cause the problems I'm having. The models both look like they are the same size, but after the animation is retargeted the model is basically unrecognisable; the feet appear to be 100x their original size. So then I tried changing the units in Blender to centimetres and exporting, but the model imported at 0.01 of the size I wanted. When I then retargeted, the lower half of the body seemed fine, but the top half was messed up, as can be seen in the last photo.

If I export as centimetres and scale by 100 on export, I get the same result as the first attempt, with the big feet.

I really don't know what else to do now. Does anyone have any suggestions?

I have been importing from Blender to Unreal for over a year now and have seen this exact problem many times, so I feel your pain. I have some advice that I think can chase away many of the sources of error for others to follow. However, I have to admit that even following my own advice, I still get this issue from time to time and cannot explain it. Like everybody else, I am still learning and my knowledge has limits.

I am not a programmer myself, so I will be honest and admit that I don't know Python or C++. I want to be clear: I don't want to disrespect the work that the developers on the Blender or Unreal side of the pipeline have done. I think they are both awesome tools, and they have brought me so much fun and creative satisfaction that it gives me great pain to be negative about them in any way. On top of that, the community of creators behind Blender and Unreal have given me so much free help and inspiration with other aspects of modelling, rigging and animation; I would not be in a position to even partly answer this post without all the things they have taught me. Having said all that, viewing the behaviour objectively from the outside, without access to the source code, I would say there are bugs in the tools. I cannot pinpoint exactly where, because it is actually so close to working that you only have to miss or forget one detail and boom, it doesn't work any more.

Tip#1: Scale
I can't say this enough: scale is probably the single most important thing to get right. Make sure your scale settings are correct, preferably before you start doing anything. If you forget and change the scale part way through your work, you might get issues. So, go to your Scene tab and make sure it is set to Metric, Degrees and Unit Scale 0.01. If you change the scale part way through, you have to remember to apply it. If in doubt, just before you export to the engine as FBX, always do CTRL-A Scale, CTRL-A Location, CTRL-A Rotation to apply scale, location and rotation. This avoids a lot of trouble.

Now, when you believe your scale is correct, do this simple sanity check. Delete everything from your Blender scene (cameras, lights, whatever), add the default cube in Object Mode using Add/Mesh/Cube and then export just that single object as FBX, using the exact export settings I have shown. Import it into Unreal using the Import button on the Content Browser, then drag the mesh into the arena that comes with a Third Person project containing the mannequin. If you did not know, the edges of that cube are all 2 metres in length, so the mannequin should comfortably fit inside it with some headroom; in fact, all the humans I know personally would fit inside that cube. See my screen capture of the scale reference between the mannequin and the standard startup cube.

You can reduce your pain by creating your own startup file and FBX export operator presets in Blender; there are plenty of resources where these are explained. I would keep this cube FBX handy, somewhere on your file system where you will have no trouble finding it. Whenever you are in doubt, import it into your Blender scene as a scale reference. I also keep one half the size, with edges 1 m in length (a cubic metre), which I use all the time. I cannot recommend doing any scaling via the export settings or inside the engine; just rescale your Blender model until it is the correct size. Check out the guide to player scale on the World of Level Design site, which is a good reminder that getting scale wrong breaks a player's immersion in your game.

Once you have a clean forward pipeline from Blender to Unreal in terms of scale, open up the blend file with your skeletal mesh and make sure it is the correct size. And I don't mean just believe it is correct; prove to yourself it is correct. You can import your cube FBX into your Blender scene, or you can actually export your character's skeletal mesh to the engine, drag it into a level and check that it compares well with the size of the mannequin.
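For anyone who prefers to script this, here is a minimal bpy sketch of that checklist: set the scene units, apply transforms and export the selection as FBX. Run it from Blender's Scripting workspace with your objects selected; the export path is just a placeholder.

```python
import bpy

# Scene units: Metric, Degrees, Unit Scale 0.01 (the settings described above)
scene = bpy.context.scene
scene.unit_settings.system = 'METRIC'
scene.unit_settings.system_rotation = 'DEGREES'
scene.unit_settings.scale_length = 0.01

# The script equivalent of CTRL-A Location / Rotation / Scale on the selection
bpy.ops.object.transform_apply(location=True, rotation=True, scale=True)

# Export just the selected objects as FBX for Unreal
bpy.ops.export_scene.fbx(
    filepath="//scale_test.fbx",   # placeholder: saved next to the .blend file
    use_selection=True,
    apply_unit_scale=True,
)
```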

Tip#2: Armatures
Now the fun begins with the bones. I guess the most intuitive approach for a Blender user would be to use one of the Blender armatures (the human meta-rig or the Pitchipoy version of it). A few clicks and you get a complete armature. Nice. The problem with this approach is that the bone naming conventions do not match the Unreal naming convention. If you don't know what the Unreal convention is, you can open the mannequin's skeletal mesh in the editor and look at the Skeleton Tree tab, or turn on the bone hierarchy (using Show/Bone Names and Show/Bone/All Hierarchy).

Now, I don't know how much you know about retargeting in Unreal, but the bone names shouldn't really matter: as long as the skeleton roughly matches the hierarchy, you can map each bone in your skeleton to the corresponding bone in the Unreal skeleton, just using drop-down lists. It is often quite easy to do with random skeletal meshes you have downloaded for free from various 3D model sites. I have to say, though, I have never been successful doing this with the Blender armatures. I have tried many times to retarget by mapping bones or by renaming the bones in the Blender rigs. I just can't get it to work. I don't know what the rules are. I don't know which rule is being broken by which bone or bones. UE4 will always complain about some insurmountable difference between the skeletons. I have fiddled around with it for months and, although I learned a lot, I still don't know enough to get it to work, or to know if it is even worth pursuing. So, anyway, the short story is you might have to abandon your efforts to use the standard Blender armatures as your forward pipeline from Blender to Unreal.
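If you do want to experiment with the renaming route, the usual way to do it in Blender is a lookup table run over the armature's bones. The mapping below is only a sketch: the source names are Rigify-style examples I made up, so you would have to fill the table in for your own rig, and as I said above, I have never got this approach to retarget cleanly myself.

```python
import bpy

# Hypothetical source names -> Unreal mannequin names (extend for your own rig)
rename_map = {
    "hips": "pelvis",
    "spine": "spine_01",
    "chest": "spine_02",
    "upper_arm.R": "upperarm_r",
    "forearm.R": "lowerarm_r",
    "hand.R": "hand_r",
}

arm = bpy.context.object          # select your armature first
for bone in arm.data.bones:
    if bone.name in rename_map:
        bone.name = rename_map[bone.name]
```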

Another approach that may have occurred to people is one that should, in theory, be really easy. All you need to do is find the mannequin's skeletal mesh in the Content Browser in the engine, right-click on it and select Asset Actions/Export. Then you import that FBX file into Blender and you should have a 100% compatible skeleton to work with, inside a mannequin mesh. All you need to do is delete the mannequin mesh and parent your own character mesh to the Unreal skeleton. Sweet! Right? Well, no, unfortunately. Look at my picture of what happens: the mannequin looks like a porcupine! All the bones on the left-hand side of his body face backwards and all the bones on the right-hand side face forwards. All the bones in his central column (spine bones, neck and head) also point backwards. The freaky, non-anatomical looking, larger bones are the IK bones, by the way. So, I wanted to show you all that. You may have to abandon your plans of a backward pipeline too, to take things from Unreal into Blender. I don't know how to fix all those bone rotation and scaling issues, at least not quickly. To me, from the outside like I said earlier, that just looks like a bug in the Unreal exporter or the Blender importer; I can't say which. Maybe there is a quick and easy way to fix the issues in Blender, I just don't know it. But I do know that trying to test your armature within Blender using those bones is very difficult for a beginner.

So, I am afraid the only remaining option to get a working armature is to build one yourself in Blender, adding a single bone at a time. This might sound super difficult to a beginner, but actually it is very quick and painless if you know two little secrets: (1) bones can be subdivided, and (2) you can use X-mirror on your armature. Most characters are symmetrical, so you only have to do one side of the body and Blender will mirror the bones and rename them all correctly. There are plenty of resources that explain how to build armatures like this, but essentially the simplest approach is the following.

Add a single vertical bone, tip pointing up, and subdivide it into 6. Name each piece as follows (bottom to top): pelvis, spine_01, spine_02, spine_03, neck_01, head. Then add another new vertical bone for the right leg, tip pointing down, and subdivide it into 4. Name them (top to bottom) thigh_r, calf_r, foot_r and ball_r, and make pelvis the parent of thigh_r. Now you can mirror that over for the left leg. Next, add a horizontal bone for the right arm, tip pointing away from the spine, and subdivide it into 4 (we will ignore the fingers for now). Name them (closest to the spine to furthest from it) clavicle_r, upperarm_r, lowerarm_r and hand_r, and set the parent of clavicle_r to spine_03. Mirror this over to the left side. That's it. If your character is in a classic T pose and you have a small amount of anatomical knowledge, you should be able to position the bones within your character mesh.

If you feel adventurous later on and want to do the fingers and thumbs, they are fiddly but easy. Create five bones named thumb_r, index_r, middle_r, ring_r and pinky_r and subdivide them all into 3. After subdividing, the bone that kept the original name should be parented to the hand bone, and then you rename them all. Taking the thumb as an example: after subdividing, you will have three bones, thumb_r, thumb_r.001 and thumb_r.002. Parent thumb_r to hand_r and rename it to thumb_01_r, rename thumb_r.001 to thumb_02_r and rename thumb_r.002 to thumb_03_r. Repeat the same process for each finger.

Sebastian Lague's YouTube video 'Blender Character Creation: Rigging 1/2' might seem a little more complicated because he doesn't use the bone subdivide trick, but it is still simple enough to show the kind of thing I am talking about. The whole thing is very quick, a 15 minute job. Anyway, to cut a long story short, depending on where you got your armature from, you might be bringing yourself a lot of trouble. It is far, far simpler to create a Blender armature with the right bone names, positions and relationships. And the beauty of this is, once you have created the armature, you can save it as a blend file called unreal-compatible-skeleton and keep it for all of your characters. All you have to do is import your mesh and fit your compatible bones inside it. But read the next tip before you set up the parenting with your mesh.
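If you would rather not place every bone by hand each time, here is a rough bpy sketch of that same armature (spine column plus the right leg and arm) with the Unreal-compatible names. The coordinates are placeholder guesses for a character roughly 1.8 m tall at the default unit scale; you would still drag the heads and tails into place inside your own mesh, and mirror the _r bones over to the left side (Armature > Symmetrize).

```python
import bpy

bpy.ops.object.armature_add(enter_editmode=True)
arm = bpy.context.object
ebones = arm.data.edit_bones
ebones.remove(ebones[0])          # drop the default bone

def add_bone(name, head, tail, parent=None):
    b = ebones.new(name)
    b.head, b.tail = head, tail
    if parent:
        b.parent = ebones[parent]
        b.use_connect = True      # head sits on the parent's tail in these chains
    return b

# spine column, bottom to top
add_bone("pelvis",   (0, 0, 0.95), (0, 0, 1.10))
add_bone("spine_01", (0, 0, 1.10), (0, 0, 1.25), "pelvis")
add_bone("spine_02", (0, 0, 1.25), (0, 0, 1.40), "spine_01")
add_bone("spine_03", (0, 0, 1.40), (0, 0, 1.55), "spine_02")
add_bone("neck_01",  (0, 0, 1.55), (0, 0, 1.65), "spine_03")
add_bone("head",     (0, 0, 1.65), (0, 0, 1.80), "neck_01")

# right leg, top to bottom (thigh_r parented to pelvis but not connected)
thigh = add_bone("thigh_r", (-0.10, 0, 0.95), (-0.10, 0, 0.50))
thigh.parent = ebones["pelvis"]
add_bone("calf_r", (-0.10, 0, 0.50), (-0.10, 0, 0.10), "thigh_r")
add_bone("foot_r", (-0.10, 0, 0.10), (-0.10, -0.15, 0.05), "calf_r")
add_bone("ball_r", (-0.10, -0.15, 0.05), (-0.10, -0.25, 0.05), "foot_r")

# right arm, spine outwards (clavicle_r parented to spine_03 but not connected)
clav = add_bone("clavicle_r", (-0.05, 0, 1.50), (-0.18, 0, 1.50))
clav.parent = ebones["spine_03"]
add_bone("upperarm_r", (-0.18, 0, 1.50), (-0.45, 0, 1.50), "clavicle_r")
add_bone("lowerarm_r", (-0.45, 0, 1.50), (-0.70, 0, 1.50), "upperarm_r")
add_bone("hand_r",     (-0.70, 0, 1.50), (-0.80, 0, 1.50), "lowerarm_r")

bpy.ops.object.mode_set(mode='OBJECT')
```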

Tip#3: Weight painting
So, with all your scale and armature woes behind you, the next source of trouble is weight painting. In some cases, you might think that your skinning, export from Blender and import into Unreal has failed, when in fact it hasn't; it just looks so ugly that you make that assumption. You can get these weird, sometimes comical, sometimes slightly disturbing looking animated creatures. They have body parts that seem to project out in odd directions and walk as though they are made out of chewing gum that sticks to the ground or to other bones. I have done this myself many times, and when I have fixed the weight painting issues, sometimes it works and sometimes it doesn't. Unfortunately, weight painting is not really for beginners. If you have an expensive mouse, keyboard or monitor, you should probably not weight paint alone, in case you feel the urge to smash them into a million pieces; consider having a chaperone, a calming influence beside you. I have been using Blender for a while and I have to say, I am just not intelligent enough to figure out how the weight tools are supposed to work. I am befuddled by them. The documentation isn't too helpful; for me, it focuses too much on detail without giving any kind of wider perspective on what you are actually trying to achieve. I had to try and figure it out for myself. Let me try and explain…

Blender supports three types of animation: (1) transformations, (2) deformations and (3) inheritance, which is movement based on the movement of another object. For rigging we are talking about method 3: individual bones inherit the transformations of their parent bones, and your character mesh inherits the movement of the bones within the armature. When you skin the armature with automatic weights, Blender creates a vertex group for each bone and gives it the same name as the bone. If you look at your mesh's data tab after parenting the armature to the mesh, you will see all the groups that have been created. Each vertex in a group has been assigned a weight, a number which defines how much the vertex is going to transform in response to the bone's movement. I am not sure exactly how Blender figures out which vertices belong to which bone, but I guess it uses some kind of distance calculation. The only trouble is, it sometimes doesn't work very well. For one, it seems to work really badly when your mesh is broken into lots of different pieces.

Now, I know there are lots of really good reasons for having your mesh in lots of different pieces: you can have clothes, weapons, bags and equipment that are removable, and it makes it easier to manage your materials, UV maps and so on. But skinning a rig should not be on that list unless you really know what you are doing. I don't want to upset your creative workflow too much, but I cannot recommend doing anything other than the following: before you skin your mesh, join all of the separate pieces into one mesh using CTRL-J in Object Mode (or Object/Join in the menu). You don't have to save your file that way from then onwards, but you should always save the latest version of all your separate pieces somewhere different from the joined-together version that is ready to be skinned. Remember, I am only saying all this because I am not experienced or clever enough to figure out the weight painting tools myself. If keeping two separate versions of the file is too much of a headache for you, but you also want to keep separate meshes for your character, then you will have to get really good at weight painting.
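If you want the script equivalent of that join-then-skin step, something like this sketch does it: join every mesh piece in the scene into one object, then parent the result to the armature with automatic weights (CTRL-J followed by CTRL-P > With Automatic Weights). The object name "Armature" is a placeholder; substitute your own.

```python
import bpy

armature = bpy.data.objects["Armature"]        # your armature object (placeholder name)
pieces = [o for o in bpy.context.scene.objects
          if o.type == 'MESH']                 # every mesh piece in the scene

# Join all the pieces into one mesh (the active object keeps its name)
bpy.ops.object.select_all(action='DESELECT')
for o in pieces:
    o.select_set(True)
bpy.context.view_layer.objects.active = pieces[0]
bpy.ops.object.join()
body = bpy.context.object

# Parent the joined mesh to the armature with automatic weights;
# this creates one vertex group per deforming bone, named after the bone.
bpy.ops.object.select_all(action='DESELECT')
body.select_set(True)
armature.select_set(True)
bpy.context.view_layer.objects.active = armature
bpy.ops.object.parent_set(type='ARMATURE_AUTO')
```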

After you have skinned your armature, it is usually a good idea to go into Pose Mode and move the bones around to see if the weight painting has worked. This is where you need to be prepared to see the chewing gum appear, along with some hilarious and some rather grotesque body deformations that you might only witness in cartoons or horror movies. Another issue that will add to your problems: if you think you are clicking on a bone to select it but you are actually in Weight Paint mode, you will have unwittingly changed the weight on every vertex that is below your cursor. And if you are in wireframe view, that paint can be applied throughout the entire model, following your line of sight below the cursor. You might not even be aware you have done it. So you are clicking around your model, clicking all the bones to test them out, but at the same time you might actually be ruining it, changing all the automatic weights. Once you have done this, it is actually very hard to correct; in all cases, once I have spotted an issue like that, I just go back to the automatic weights and start again. That's because I do not know how the weight paint brushes work. They are named Add, Subtract, Blur, Darken, Lighten, Mix and Multiply. The weight painting is blue where the bone has no influence and red where the bone has 100% influence, with a gradient between the two colours for any in-between value. So, say I have a vertex that is red but I want it to be blue. If you instantly know which of those 7 brushes is going to deliver that result, then maybe you can do weight painting after all. I would love to help you, but I can't; I have fiddled with it for hours and I cannot tell you with any degree of certainty what any of those brushes really does, or which ones are opposite in effect. Anyway, if you can figure it out, great. Whenever you are happy with a bone, press ALT-G, ALT-R, ALT-S to reset your armature back to its reference pose (usually a T pose) before moving on to the next bone.
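One small escape hatch if the brushes defeat you: you can set a weight directly on a vertex group from the Python console, without touching a brush at all. This sketch forces one vertex to zero weight for one bone, i.e. turns a red vertex blue for that bone; the object name, group name and vertex index are all placeholders.

```python
import bpy

mesh_obj = bpy.data.objects["Body"]            # your joined character mesh (placeholder)
group = mesh_obj.vertex_groups["thigh_r"]      # vertex groups are named after bones
group.add([42], 0.0, 'REPLACE')                # weight 0.0 = no influence (blue)
```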

Don't get the wrong impression from all that: as long as you join all your mesh pieces together and have a clean mesh model, the automatic weights are usually really, really good and only require a little bit of tweaking, say in the armpit area for example. But if you do not create the models yourself, and are just quickly trying to get a model into a game, there are often separate pieces, duplicate vertices, overlapping UVs, unsmoothed faces and other issues to take into account. I have tried to get lots of character models into the engine, some that I actually paid for, and have run into these kinds of issues.
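For downloaded models with those kinds of issues, a quick cleanup pass before skinning sometimes helps. This is only a sketch: it merges duplicate vertices and recalculates normals on the selected mesh, and the merge threshold is a conservative guess.

```python
import bpy

# Run with the mesh selected in Object Mode
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.remove_doubles(threshold=0.0001)        # "Merge by Distance" in the UI
bpy.ops.mesh.normals_make_consistent(inside=False)   # Recalculate Normals (outside)
bpy.ops.object.mode_set(mode='OBJECT')
```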

So anyway, those are my tips and techniques. If you follow this advice and you are still having trouble, then I will probably be as stuck as you are. I think there is still a lot for me to learn.
I hope that was useful to anybody facing these kinds of issues.
