Morph Target Skeleton Size Problem

I retargeted the character model that I exported from Daz for Mixamo. But when I start using morph targets, the skeleton stays small, which causes animation issues.

I have been looking at the same problem for a couple of weeks now. I figured out how to scale the skeleton, but then the mesh scales even more and still does not match. I could not find a way to temporarily decouple the two, scale the skeleton, morph the mesh, and then couple them back. I am looking for a C++ solution, but any hints will do…

Ok man, I found a solution, but trust me, you do not want to go down this path… So first, a few recommendations:
Daz morphs come in three basic types, and for the sake of sanity you only want to deal with the first two.

Type 1 are mesh-only morphs that do not affect the skeleton at all. There are quite a few of those, for example the Bodybuilder morph. No problem with these: export them using the Daz to Unreal bridge (or whatever method you are using) and use them as normal morph targets in Unreal.

Type 2 are not really morphs but scaling dials in Daz; PBM Chest Propagating Scale or PBM Shoulder Width are of this type. Do not export these at all: they are much easier to deal with directly in Unreal by adjusting the ReferenceSkeleton pose. Keep in mind, though, that in Unreal a transform's scale propagates to children, while in Daz it does not. So PBM Shoulder Width in Daz sets the scale of only two bones (the Left and Right Collars), while in Unreal you need to set it on the collars (let's say to X) and then set a compensating scale of 1/X on their child bones (the Left and Right Shoulder Bends in this case) so that the rest of the arms are not affected.

And if you do not want to do any C++ hacking, stop here. There are enough morphs of those two types in Daz to get a nice variety of characters.

The third type of morphs in Daz (and most facial morphs are of this type) offsets, and sometimes scales, the bones as well as moving vertices around. The solution I found is to decouple bone-induced vertex changes from pure vertex changes. I do it at runtime in Unreal: at the editor level I have one skeletal mesh, and at runtime, for each spawned/created character, I bake a morph set into an instance of the mesh for that particular character, which is saved on the server and replicated to the client. In the baking process I do the following steps:

  1. Duplicate the base mesh, tweak the reference skeleton in the copy, set it as the mesh for my component, and get all the vertex coordinates:

    USkeletalMesh* meshWithSkeleton = DuplicateObject(femaleMesh, nullptr);
    // Copy the whole reference pose; assigning element-by-element into an
    // unsized TArray would be out of bounds.
    TArray<FTransform> fixedTransforms = meshWithSkeleton->RefSkeleton.GetRefBonePose();
    // Stack-allocate the modifier instead of `new` so it is not leaked.
    FReferenceSkeletonModifier skeletonModifier(meshWithSkeleton->RefSkeleton, meshWithSkeleton->Skeleton);
    const int32 BoneIndex = skeletonModifier.FindBoneIndex("root");
    fixedTransforms[BoneIndex].SetScale3D(FVector(1.0f - morphValue * 0.45f));
    skeletonModifier.UpdateRefPoseTransform(BoneIndex, fixedTransforms[BoneIndex]);
    meshWithSkeleton->RefSkeleton.RebuildRefSkeleton(meshWithSkeleton->Skeleton, false);
    auto meshComponent = npc->GetSkeletalMeshComponent();
    meshComponent->SetSkeletalMesh(meshWithSkeleton, false);
    meshComponent->SetCPUSkinningEnabled(true, true);
    TArray<FFinalSkinVertex> skeletonVertexBuffer;
    meshComponent->GetCPUSkinnedVertices(skeletonVertexBuffer, 0);

In the example code above I just scale the root bone.

  2. Make another duplicate of the base mesh and create a vertex buffer for the vertices with the morphs applied. The code is a slightly modified version of RuntimeMorphBaker from the Morph Tools Plugin on the Marketplace. Not mine, but very useful.

  3. Now that I have vertex positions after the skeleton changes and vertex positions after the morph changes, I subtract the first from the second and get a vertex shift that is only mesh dependent. Add this shift to the original mesh vertex positions, and store the new vertex positions in the final mesh as RuntimeMorphBaker does.

Don’t forget to set your component back to GPU skinning.
Since the final mesh is based on the original skeleton, you will need to reapply the ReferenceSkeleton to it before rendering to see the scaling effect.

Also make sure this ReferenceSkeleton is used when you put animations on top of it; I did not go that far yet, though. I also hope that someone more familiar with the engine than I am will suggest a better way to do it…


So after testing this solution extensively I can confirm that it mostly works, but there are a few caveats.

Dealing with LODs is messy. The only way I have found so far is to use the render thread to create mesh vertex buffers for all LODs, but there are race conditions there: sometimes you try to read the buffer before the LOD switch has happened and you get the wrong buffer.

Duplicating a mesh is a very time-consuming operation; it takes 1.8 seconds on my box. Add the LOD thing and I am at about 6 seconds per character, which is prohibitive.

Two morphs cause particular problems for me: the Youth morph from the Growing Up package and Caryn. Both are complex morphs, but the real issue is that the skeleton offsets change linearly with the morph value, while the vertex offsets seem to change nonlinearly. I get a perfect match between body and skeleton for morph values 0 and 1, but at 0.5 there is a mismatch. The skeleton matches the one you get if you export the morph with value 0.5 from Daz, but the mesh does not…
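A toy model of that mismatch (made-up curves and numbers, not Daz data) shows why the endpoints agree but the midpoint does not:

```cpp
#include <cmath>

// Assume the bone offset interpolates linearly in the morph weight w, while
// the vertex offset follows some nonlinear curve. Both agree at w = 0 and
// w = 1, but disagree in between.
inline float BoneOffset(float W)   { return W * 10.0f; }     // linear in w
inline float VertexOffset(float W) { return W * W * 10.0f; } // hypothetical nonlinear curve

inline float Mismatch(float W) { return std::fabs(VertexOffset(W) - BoneOffset(W)); }
```

Any nonlinear vertex curve that shares the two endpoints with the linear bone curve produces exactly this pattern: zero error at 0 and 1, visible error at 0.5.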

EDIT in April 2023

It does not let me reply to my own posts more than 3 times, so I will edit this one:

Mind you, I did not spend three years doing exclusively this, but in the last two weeks I was porting my solution from UE4 to UE5 and found some new insights.

Firstly, since UE5 has no tessellation, I went from regular-resolution G8 to level 2 subdivision, with about 250k tris per mesh at LOD 0. That makes duplication absolutely prohibitive (15 s per mesh, and double it, since you need a copy for morphing and a copy for posing in the old approach).

So I went a different path: use only one copy and create it through

FGenericPlatformProcess::ConditionalSleep([]() { return !IsGarbageCollecting(); }, 0.01f);
bakingMesh = NewObject<USkeletalMesh>(this, FName(id + "-bake"), RF_Standalone, record->baseMesh.Get());

Do not sweat over ConditionalSleep for now, I’ll explain it later on…

Then I added a Duplicate method to FSkeletalMeshRenderData in the engine:

void FSkeletalMeshRenderData::Duplicate(FSkeletalMeshRenderData* Other, USkinnedAsset* Owner)
{
	int32 index = 0;
	for (auto& otherLODData : Other->LODRenderData)
	{
		auto lodData = new FSkeletalMeshLODRenderData();
		lodData->Duplicate(&otherLODData, Owner, index++);
		// TIndirectArray takes ownership of the new LOD data
		LODRenderData.Add(lodData);
	}

	NumInlinedLODs = Other->NumInlinedLODs;
	NumNonOptionalLODs = Other->NumNonOptionalLODs;
	ensure(LODRenderData.Num() >= NumInlinedLODs);
	LODBiasModifier = Other->LODBiasModifier;
	CurrentFirstLODIdx = LODRenderData.Num() - NumInlinedLODs;
	PendingFirstLODIdx = CurrentFirstLODIdx;
	bSupportRayTracing = Owner->GetSupportRayTracing();
}

Similarly, FSkeletalMeshLODRenderData needs its own Duplicate:

void FSkeletalMeshLODRenderData::Duplicate(FSkeletalMeshLODRenderData* Other, USkinnedAsset* Owner, int32 Idx)
{
	RequiredBones = Other->RequiredBones;
	RenderSections = Other->RenderSections;
	ActiveBoneIndices = Other->ActiveBoneIndices;
	BuffersSize = Other->BuffersSize;

	auto NumVertices = Other->StaticVertexBuffers.PositionVertexBuffer.GetNumVertices();
	auto NumTexCoords = Other->StaticVertexBuffers.StaticMeshVertexBuffer.GetNumTexCoords();
	StaticVertexBuffers.StaticMeshVertexBuffer.Init(NumVertices, NumTexCoords);

	if (HasClothData())
	{
//		ClothVertexBuffer.SerializeMetaData(Ar);
	}

	if (Owner)
		Owner->SetSkinWeightProfilesData(Idx, SkinWeightProfilesData);

	SourceRayTracingGeometry = Other->SourceRayTracingGeometry;

//	MorphTargetVertexInfoBuffers = Other->MorphTargetVertexInfoBuffers;
}

Note that there is no need to copy the vertices, as they will be overwritten anyway. Also note that I did not play with cloth, and MorphTargetVertexInfoBuffers appears unnecessary here.
MultiSizeIndexContainer and SkinWeightVertexBuffer need their own duplication methods, but that is about it on the engine side.

Now in my own project I need to run something like

for (int32 lodIndex = 0; lodIndex < numberOfLODs; ++lodIndex, ++targetLodRenderData, ++sourceLodRenderData)
{
	TArray<FMorphTargetInfo> morphEvalInfos = initEvalInfos(morphMap, morphWeights, lodIndex);
	const auto& sourceBuffer = (*sourceLodRenderData)->StaticVertexBuffers;
	auto& targetBuffer = (*targetLodRenderData)->StaticVertexBuffers;

	int32 totalVerts = sourceBuffer.PositionVertexBuffer.GetNumVertices();

	for (int32 vertexIndex = 0; vertexIndex < totalVerts; ++vertexIndex)
	{
		auto uv = sourceBuffer.StaticMeshVertexBuffer.GetVertexUV(vertexIndex, 0);
		targetBuffer.StaticMeshVertexBuffer.SetVertexUV(vertexIndex, 0, uv);

		UpdateMorphedVertex(targetBuffer, sourceBuffer, vertexIndex, morphEvalInfos, morphWeights);
	}
}

UpdateMorphedVertex here is just a modified version of the similar method in the CPU skinner; it takes whole buffers rather than individual vertices as its first two parameters.

After that you need to run bakingMesh->CalculateInvRefMatrices();, and here is the fun part: ABSOLUTELY NOTHING here has to be on the game thread. So I protect my NewObject at the very beginning from accidentally running during garbage collection with ConditionalSleep, and happily run the whole thing on a thread:

		AsyncThread([this, id]()

It takes from 100 ms with just a few morphs to 800 ms with about 400 morphs, all done in the background. The one slow thing you have to do on the game thread (part of it requires it) is
bakingMesh->InitResources();, which takes about 150 ms, so you probably do not want to stream these meshes in, but man, compared to 15 s…
Oh, and when you do it on the thread do


on your base mesh before you use it as a template, and


on your new mesh, so that GC is not complaining on game exit.

That’s basically it, I am loading 24 unique G8 characters with 250k tris in LOD 0, 12 of each gender, from 2 base meshes using up to 400 morphs each in about 4 seconds total. I would have posted a video, but like I said, I did not play with clothes yet :stuck_out_tongue:

One more note: morphs that use any kind of scale modification are best dealt with in the following way:

  1. Copy the original *.dsf, open it in a text editor, and remove everything that mentions scale.
  2. When exporting from Daz, use your version of the morph instead of the original one.
  3. Apply the bone scaling on the Unreal side.

This way you will have perfect alignment of skeleton to skin no matter what value you apply…

Well, there are some minor details that I skipped to make this post shorter, but the gist is here…

Bit dated as an OP but info is info.

Interesting thing with Daz Studio: there is a button for almost everything, and since character development is what it does, there is a push-button solution.


Interesting. Ty.
But can anyone make this for UE4/5? I don't have Daz, and that was 2 years ago.

  • I've seen 3 sellers of skeleton/morph changers on the Marketplace that work incorrectly (when you make the legs shorter, the root bone and capsule size aren't adjusted to match).

  • They all need this Daz feature where we can load 2 states/sets: the base skel/mesh set (normal scale), then the changed skel (scaled) / mesh (vert change) set. And it makes a morph target 0–1 between the 2 states.

  • I've also seen modular characters for sale that have morphs on the body mesh, but not on every clothing mesh! Thus when I change the body morph, the clothing mesh no longer matches. And this is the problem with using morphs to make big body changes (that affect clothing/accessory boundaries).

  • And that is why we need an easy asset that loads 2 states: a set of mesh and matching skel.

That’s your game code - not the asset’s morphs.

That’s not a thing a GAME engine should do, it’s the job of any DCC…

That’s a bad product.
But you can pass morphs from one object to another using Blender with a bit of time and patience.
Also, cloth usually assumes that the simulation will take care of things with morphs.
The reason it doesn't is that Unreal is a bust on that front… If you were to, say, use the same model inside the Apex clothing tool and configure the cloth as cloth, it would simulate its positions properly, respecting the modded mesh…

Last but not least: all games (stupid or not: ARK!) that do this in Unreal manipulate the length of each character's bones rather than using a morph target.
That's because the bone stretching DIRECTLY morphs everything along with it, rather than just the individual mesh.

So, the real baseline problem in this topic is "you are doing things wrong at the root" rather than "this engine sucks" (which it does, but really this would be the same for anything).

Drop the idea of using morphs.
Edit your character's ABP and pass in values to manipulate bones with transforms.

You probably only need to use a single axis on each (scalar) or sometimes, say for the chest area growing bigger, a combination of axes.

Use that baseline ABP as the start for all your characters, and/or code it up in C++ to be assigned to your characters so that all of them have the same variables available.

At worst it’s like a 2 day job to get it working…

Mess with morphs, and you’ll really need a drink and 3 years to get something working somewhat Ok…
(not that morphs aren’t great for some other things related to characters!)


That's their (sellers' plugin) code. And I know it's done with AnimBP > bone scaling (not morphs). But I was mentioning it (related to the morph topic) because the sellers do not scale their bones correctly to adjust the root when anything below the root changes in length. And they don't scale the capsule height or character mesh offset to account for the longer SK leg bones.

  • I asked a few sellers to fix their plugin; I told one how to do it (communicate the AnimBP change to the CharBP > change the capsule). But he hasn't yet.

I am saying that the Marketplace sellers of the body/bone scalers need to add this correct/easy method to their product.

  • (Technically, when their product scales bones, it changes all connected verts. But that Daz addon does it intelligently, by allowing users to input a small elf body mesh > then match the normal skeleton scale to the small body. That's what I want the UE sellers to add to their asset: an easier way to perform a set of multi-bone scaling/body sizing all at once, e.g. aging or changing race. I don't want 20 sets of anims for my different proportional races.
    But does UE 5.2 have live-anim bone-scale matching? Or must those IK-solved anims be presaved?)

Thanks. Blender calls them Shape Keys, correct? Got a link on how to transfer shapes from a body to non-similar armor? Or does it only work on similar verts?
I found this, which also helps explain how to drive the change in UE.

Agreed. I tell people not to use Morphs for large body changes, only for ears or the face.

  • But I found this topic while searching for whether we can change the morphs of an SK (instance) placed in the level, without any code.
  • (I clicked on my SK in the level > Details panel and could not find the names of my morphs. So I did the tut above and added my SK to a BP to access the morph with a widget.)

I'm doing it now :frowning: lol. But for small things. Tests are going well. But if I could have a plugin that let me import a changed mesh > and UE matches my skel to the new mesh size (and calculates the delta bone scaling to save as a template), that would save time.

Yes, shape keys.
Not really, but it’s done by proxy.
No; being done by proxy basically means they take the value(s) of the nearest vertex that is available in 3D space.

There are free plugins, but there should also be a built-in way by now. Check the manual on copying shape keys.
