Changing the proportions of a MetaHuman to fit mocap exactly

For the project I’m working on, it’s really important to us to keep 100% of the fidelity of our mocap performances. This requires preserving the precise proportions of all of our characters (relative heights, limb lengths, etc.), i.e. no retargeting. This is especially important for scenes where characters interact. Does anyone have a workflow for achieving this with MetaHuman currently?

You can take the source Maya file that you get from Quixel, modify the limbs/height of your character as you wish, create a new bind pose, and export it to Unreal, replacing the current base skeleton so that all of the skeletal meshes are updated as well.

An alternative that might work better is to use IK to drive specific situations, so that interactions with objects are more precise.
In short, you dynamically adjust the hand’s position/orientation as it approaches an object, so that when the character grabs or interacts with it the contact lands exactly.
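The idea above can be sketched outside the engine. The snippet below is a minimal, hypothetical illustration (plain Python, not Unreal API): a planar two-bone IK solve via the law of cosines, plus a blend that pulls the mocap hand toward the object’s grab point as it enters a capture radius. All function and parameter names are assumptions for illustration only.

```python
import math

def two_bone_ik(root, target, len1, len2):
    """Planar two-bone IK solve (shoulder -> elbow -> hand) using the law
    of cosines. Returns (elbow, hand) positions. This is just the math an
    IK node performs; none of these names are engine API."""
    dx, dy = target[0] - root[0], target[1] - root[1]
    d = math.hypot(dx, dy)
    # Clamp the reach so the chain is never over- or under-extended.
    d = max(abs(len1 - len2) + 1e-9, min(len1 + len2 - 1e-9, d))
    # Angle at the root between bone 1 and the root->target line.
    cos_a = (len1 ** 2 + d ** 2 - len2 ** 2) / (2.0 * len1 * d)
    a = math.acos(max(-1.0, min(1.0, cos_a)))
    base = math.atan2(dy, dx)
    elbow = (root[0] + len1 * math.cos(base + a),
             root[1] + len1 * math.sin(base + a))
    hand = (root[0] + d * math.cos(base),
            root[1] + d * math.sin(base))
    return elbow, hand

def blended_hand_target(animated, grab_point, dist_to_object, radius):
    """Blend the mocap hand position toward the object's grab point as the
    hand enters `radius` (blend weight goes 0 -> 1), so the grab is exact
    while the rest of the performance stays untouched."""
    alpha = max(0.0, min(1.0, 1.0 - dist_to_object / radius))
    return tuple(a + (g - a) * alpha for a, g in zip(animated, grab_point))
```

In-engine you would do the equivalent with an IK node (e.g. Two Bone IK in the Anim Graph) and drive the blend alpha from the hand-to-object distance.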

But I strongly recommend using real-time IK Rig retargeting, where your source can be a character with the exact limb lengths/height you need. The IK Retargeter is driven by IK, so interactions can be matched between the source character and the retargeted MetaHuman without modifying the MetaHuman skeleton externally.


Thanks very much for that. I’ll look into those options.

Easiest thing is to set the skeleton’s bone translation retargeting to Animation Scaled.

If you then bring in an animation with the correct bone sizes, the mesh should adjust to match it exactly.
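For reference, per Unreal’s documentation the Animation Scaled mode uses the translation from the animation, with its length scaled by the skeleton’s proportions. A minimal sketch of that scaling (plain Python, illustrative names only, not engine API):

```python
def retarget_translation(anim_t, anim_ref_len, skel_ref_len):
    """Sketch of 'Animation Scaled' translation retargeting, assuming the
    documented behavior: take the bone translation from the animation
    frame and scale its length by the ratio of the target skeleton's
    reference-pose bone length to the animation's reference-pose bone
    length. Names and signature are hypothetical.

    anim_t:       (x, y, z) bone translation from the animation frame
    anim_ref_len: bone length in the animation's reference pose
    skel_ref_len: bone length in the target skeleton's reference pose
    """
    scale = skel_ref_len / anim_ref_len if anim_ref_len else 0.0
    return tuple(c * scale for c in anim_t)
```

So a bone whose animated translation is 10 units long on a skeleton whose reference bone is 12 units long comes out 12 units long after retargeting.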

Since you are using MetaHumans, you are very unlikely to be making a video game. So for a quick cinematic, the above solution would take a whole 5 minutes to implement.

Maybe 10 if you have to actually figure out how to right-click and retarget a mesh to the same skeleton with different bone spans…
