Using multiple models with different proportions on the same rig

I am generating different models through MakeHuman (all of which have identical skeletal structure) and then I wish to use a single set of animations on them. My current process, which involves retargeting the animations for each set of models, doesn’t scale well, so I am curious to know whether there is a better way to do this.

One issue with importing them onto the same skeleton is that the models differ in proportion (height, width, etc.), which means their meshes will “swell”, possibly due to different bone weights.
Any suggestions on a better workflow would be appreciated!

We make use of a single framework setup, meaning all resources are applied to differently proportioned models by making each model a full-body morph, or injector, and activating the target at run time.

So, do you suggest retopologizing the different models, all based on the same rig? From your previous posts (in other topics like this one: https://forums.unrealengine.com/development-discussion/animation/1423389-best-way-to-create-dialogue-talking-animations-using-only-blender-unreal) it seems the injector was created in ZBrush. Any suggestions on how I could do this in Blender?
Thanks!

Well, MakeHuman already does the retopology for you as far as overall character design goes, so the trick is in converting the results into a valid morph target. I’m not up on Blender, so I’m not even sure it supports blend shapes, but it would be easy enough to test by exporting the base model along with a reshaped version done in MakeHuman. Since a morph is not affected by skin weights in UE4, the reshaping is not affected by the joint rotations that would otherwise cause the model to crush and distort once a common animation set is applied. Eye position, for example, would tend to bug out due to the change in relative position.
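
For what it’s worth, Blender does support blend shapes under the name “shape keys”. A minimal, untested sketch of that test using Blender’s Python API (2.8+ naming), assuming both MakeHuman exports are already imported with identical vertex count and order; the object names are placeholders:

```python
import bpy

# Placeholder names: "BaseMesh" is the default MakeHuman export,
# "ReshapedMesh" a reproportioned export with the same vertex order.
base = bpy.data.objects["BaseMesh"]
reshaped = bpy.data.objects["ReshapedMesh"]

# Make sure the base mesh has a Basis key to blend from.
if base.data.shape_keys is None:
    base.shape_key_add(name="Basis")

# Select the reshaped mesh, make the base mesh active, then use
# "Join as Shapes" to capture the reshaped vertex positions as a
# new shape key on the base mesh.
bpy.ops.object.select_all(action='DESELECT')
reshaped.select_set(True)
base.select_set(True)
bpy.context.view_layer.objects.active = base
bpy.ops.object.join_shapes()

# Rename the new key; the FBX exporter turns shape keys into morph
# targets when "Shape Keys" is enabled on export.
base.data.shape_keys.key_blocks[-1].name = "Reshaped"
```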

Retargeting animation will not work effectively here, as all it does is remap animations from one naming convention onto another.

Since the target morph is activated at run time, the result is additive to the base rig running the animation, so in theory you can have as many different shapes as you like while using the same resources as a single character model.
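
For context, activating a target at run time comes down to a single call on the skeletal mesh component; in a game it would be the Blueprint “Set Morph Target” node (C++ SetMorphTarget). A hedged sketch of the same call through UE4’s editor Python, assuming a skeletal mesh actor is selected in the level and a morph named “Reshaped” (a placeholder) was imported with it:

```python
import unreal

# Grab the selected skeletal mesh actor and its mesh component.
actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
component = actor.get_component_by_class(unreal.SkeletalMeshComponent)

# Drive the morph: 0.0 = base shape, 1.0 = fully morphed character.
component.set_morph_target("Reshaped", 1.0)
```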

Practical-use-wise, let’s say you need a large number of NPCs, as in Assassin’s Creed: you could create an unlimited number of characters using a single framework.

To put it out there, a nice feature addition to UE4 would be a morph loader.

Thanks again for your help on this, FrankieV. Regarding the above point, I’ll likely use Maya to create the morph targets following the steps in the tutorial suggested here (FBX Morph Target Pipeline in Unreal Engine | Unreal Engine 5.3 Documentation). Is there anything I should keep in mind to avoid the scenario you mentioned above (besides not changing the joint rotations, of course :slight_smile: )?

Well, I would need a lot more information about your intended player model design and requirements, but if you make all of your characters morph targets (injectors), the progression works much like the stack in 3ds Max: the action below is performed first, followed by the reaction of the morph if it is animated and activated at run time. If you bake the shape as a unique character, then you “might” have mesh-crushing issues if the affected area does not match the pivot points of the host rig.

If you make the arms bigger but maintain the original pivot positions, the joints will still rotate properly, but if you move an area of mesh away from its joint (let’s say you move the eyes around to account for different eye shapes), then the animation is going to pull the eyes back to their original position.
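
As a concrete illustration, setting up such a pivot-preserving proportion morph in Maya’s Python API could look like the sketch below; “bigArms_target” and “baseBody” are placeholder mesh names, and the target is assumed to share the base’s vertex order:

```python
import maya.cmds as cmds

# Create a blend shape node driving "baseBody" toward the target
# mesh, which has thicker arms sculpted without moving geometry
# away from the joint pivots.
blend_node = cmds.blendShape("bigArms_target", "baseBody",
                             name="proportionMorphs")[0]

# Drive the first target weight: 0.0 = base proportions, 1.0 = big arms.
cmds.blendShape(blend_node, edit=True, weight=(0, 1.0))
```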

For our needs, we just licensed Daz Studio and Genesis 3 and generate all of our player models as a procedural process in UE4, and so far there seem to be no issues doing it this way.

I attempted to follow what you suggested by creating morph targets from different variations of MakeHuman’s base model. The goal is to create the same kinds of modifications (age, gender, height, weight, etc.) that MakeHuman offers, but inside the game within its character creator. To achieve this, I simply used the base model (with default values on all modifiers) as the base mesh, imported extreme variations of each modifier, and created a blend shape for each (one for age, another for gender, and so on). In all of these cases, I applied the morph target to the entire body.

Importing them into UE4 seemed to suggest they at least worked in theory, as long as the model was in T-pose. However, the moment I tried to apply an animation, I got the results you can see below.
They seem to be “pinned” to the joints. Any idea how I can uncouple that, or whether I’m indeed doing this the wrong way? Thank you again for your help!

As pictured, the problem is not with the applied morph targets but with a difference in bind pose position: using T-pose animation on a rig that was set up in, say, the A-pose position. A strong indicator that this is the problem is the extreme rotation of the arms relative to the rest of the model, which otherwise “might” be close. At first it would look OK, as nothing affects the fidelity of the model until you add an animation that was made to match a character in a different bind pose.

Of course this is just a guess, but it is a common problem in character animation when the animation data is simply retargeted onto a character whose bind pose does not match the one the animation was originally authored for. From what I gather this can be fixed, but I do my animation work in MotionBuilder, which does more than just the naming-convention retargeting used in UE4.
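
One way to sanity-check that guess without MotionBuilder is to compare the rest orientations of matching joints on the two skeletons. A rough diagnostic sketch in Maya Python, assuming both rigs share joint names under the placeholder namespaces “anim:” (the rig the animation was authored on) and “char:” (your character):

```python
import maya.cmds as cmds

def find_bind_pose_mismatches(src_ns="anim:", tgt_ns="char:", tolerance=2.0):
    """Print joints whose orientations differ by more than `tolerance`
    degrees, a telltale sign of an A-pose vs T-pose mismatch."""
    for src in cmds.ls(src_ns + "*", type="joint"):
        tgt = src.replace(src_ns, tgt_ns, 1)
        if not cmds.objExists(tgt):
            continue
        src_orient = cmds.getAttr(src + ".jointOrient")[0]
        tgt_orient = cmds.getAttr(tgt + ".jointOrient")[0]
        if any(abs(a - b) > tolerance for a, b in zip(src_orient, tgt_orient)):
            print("Mismatch on %s: %s vs %s" % (src, src_orient, tgt_orient))

find_bind_pose_mismatches()
```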

Ah I see, that makes much more sense! How would one go about fixing such a problem in character animation, but with Maya or Blender perhaps? I feel Maya’s HumanIK tool does the retargeting efficiently but doesn’t cover problems like this.

Personally I use MotionBuilder, as retargeting is one of the tasks it does by design, but Maya has a subset that I believe they call FBIK (Full Body IK), which can also retarget animation from an imported clip onto a changed or custom character rig. I’ve never used it, so I cannot advise you on the process, but the basic function is to characterize the two data sets and transfer the matching animations from one rig to the other via the Control Rig.

A very basic overview of using FBIK to retarget animation in Maya.

For it to work the way it’s supposed to, you need to characterize both character inputs/outputs.

Not sure why Epic made an animation tool for use in Maya, as FBIK is already an advanced animation tool that uses the same technology used in MotionBuilder.

Thanks, FrankieV! I was in fact using the HumanIK/FBIK tool you linked to for retargeting animations to my rig, and the issue wasn’t arising from that. As you can see, the animation works fine when retargeted from the original rig, but the moment the full-body blend shape is applied, the skin cluster messes up (you can see that in image 2). Strangely, only the arms misbehave. Also, changing the input order of the skin cluster and blend shapes (pre-deformation versus post-deformation) results in the third image.
At this point, my newbie intuition suggests it is something about the way the model is skinned, and that applying blend shapes to it is affecting the way it interacts with the joints.
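
For reference, this is roughly how I switched the deformation order in Maya Python; the node names are placeholders for whatever Maya created on my mesh:

```python
import maya.cmds as cmds

# Option A: build the blend shape at the front of the chain, so the
# morph is evaluated before the skinCluster (pre-deformation):
cmds.blendShape("morphTarget_mesh", "body", frontOfChain=True)

# Option B: reorder existing deformers so the blendShape and the
# skinCluster swap their relative order on the shape:
cmds.reorderDeformers("skinCluster1", "blendShape1", "bodyShape")
```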

What do you think might be the issue here? Thanks again for all your help! :slight_smile:

Well, if the result is outside what is expected, then the issue could be in the setup at the source level.

A test you can do is to divide the two requirements of a rigged character with morph targets into their base components and test them in UE4 to see if they work as expected.

At this point I would start suspecting an issue with the setup relative to what UE4 requires to make it function as expected, so I would suggest:

  1. Import the character without blend shapes and see if it functions as expected without morphs.
  2. Import the character model as a static model and test the morphs to see if they function as expected.

This should tell you which component is causing the issue.
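
If it helps, recent UE4 builds can script this two-part test through editor Python. A hedged sketch, with placeholder paths, importing the same FBX once without and once with morph targets:

```python
import unreal

def import_character(fbx_path, dest_path, with_morphs):
    # Mirror of the FBX Import Options dialog settings.
    options = unreal.FbxImportUI()
    options.import_mesh = True
    options.import_as_skeletal = True
    options.skeletal_mesh_import_data.set_editor_property(
        "import_morph_targets", with_morphs)

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = dest_path
    task.options = options
    task.automated = True
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

# 1. Rig and animation alone, no morphs.
import_character("C:/exports/character.fbx", "/Game/Test/NoMorphs", False)
# 2. Same file with the blend shapes included.
import_character("C:/exports/character.fbx", "/Game/Test/WithMorphs", True)
```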

The thing is, morph targets are by nature additive values when applied to a mesh object; they only add to the current vertex positions. So if the result messes up when only the blend shape is applied, I would suspect a problem with the blend target.
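
To make the additive behaviour concrete, here is a tiny plain-Python illustration (the numbers are made up): each target stores per-vertex offsets from the base that get scaled by the weight and added on top.

```python
base = [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]    # base-mesh vertex positions
target = [(0.0, 1.2, 0.0), (1.1, 1.2, 0.0)]  # same vertices, morphed shape
weight = 0.5                                 # morph target strength

# final = base + weight * (target - base), per vertex component
deformed = [
    tuple(b + weight * (t - b) for b, t in zip(bv, tv))
    for bv, tv in zip(base, target)
]
print(deformed)  # vertices land halfway between base and target
```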

I did a video blog series on how to use Daz Studio to make injectors using the Genesis 3 framework that might help you understand a working workflow. :smiley:

I looked at your video series; they were very informative and got me looking into Daz Studio and Genesis 3 as a possible option. You approach the issue I’m having around the 1-hour mark in the second video, about sharing animations while applying full-body morph injectors. If I created a Genesis 3 figure and applied a morph injector (a child or adult slider, anything that changes the morphology enough), how would I go about applying a sample animation from Daz3D’s default animations folder? I am able to export the morph targets into UE4 and modify the base model, but just like earlier, once I play the animation I exported along with it, I have a similar deformation issue.

How would I go about solving such an issue with a Genesis 3 model? Or perhaps you can guide me to a video you made on that topic. :slight_smile:

Well, making use of Daz Studio and Genesis 3 will at the very least maintain the fidelity of the source, as the two work hand in hand, but now you’re talking injector technology and not just a simple adjustment of the proportions of the base character model.

As for deformation issues, you should not be having them to the extent shown in the pictures; the result is so far outside the expectation curve that something has to be wrong with the setup.

To put the result into some kind of context, ever see The Wolf of Wall Street? The scene where Jordan Belfort tries to drive his Lambo home stoned and thinks he made it home in one piece? The fact of the matter was the Lambo was totally destroyed, which makes you wonder what the hell happened on the drive home that would make Belfort think he made it home safely in the first place. :wink:

No disrespect, but by the looks of the pics the only thing I can think is that you drove head-on into a Mack truck. :smiley:

Now, injectors are much easier to deal with at the source level, as the morphs always match the rigging as exported, and it becomes more about how the data is implemented in UE4 to get the desired result. If the data is baked as unique to the character model, then you might have to retarget the animation to the character, “or” in some cases you might not, if the joint rotations and relative positions still line up.

For example: four unique characters that all use an instance of the same G3 framework, with custom animations targeted onto and used with each unique character.

The other option is to export the injector as part of the morph export rules within Daz Studio. Let’s say you have Tom, Dick, and Harry characters set up as injectors. If you export them as part of the morph export rules, you only have to set the target to 100% in UE4 to get a totally different character model.


Bottom line, though: Daz Studio makes an excellent development tool, as the usable result is as expected, and since UE4.16 the import issues with using G3 have been solved, so one can now sort out the ways and means in a much quicker time frame. :wink:

Personally, I’ve been using DS and Genesis for years, so at the very least I can confirm that it “does” work… You just have to figure out the process.

By the way, I did a retarget of the base Epic animations onto the base G3 rig, including facial animations using cluster shapes, along with instructions on how to do the setup.

Using DS with G3, along with the SDK I provided, the combination should at the very least work.

Thank you very much for the examples! All the videos and the character theory stuff were very interesting.
I realized that what I was doing wrong all along was trying to force one set of animations onto all possible full-body morphs, rather than retargeting animations that take the changes in joint position into account and then exporting them as a separate layer. I’ll hopefully be able to fix this and significantly improve my character pipeline, largely thanks to your help! :slight_smile: