We did a mocap session with OptiTrack for the body and an iPhone for the face. In UE5, the import created ONE animation clip with both the blendshape and skeletal animation merged.
We cleaned up the skeletal data using the original OptiTrack output, so it has no blendshape information.
Now we have the cleaned OptiTrack data and we want to use it in sync with the blendshape data from the UE5 anim sequence asset.
I tried copying the blendshape curves from the UE5 clip, but I can only paste them into the cleaned animation file starting at frame 0, and it is impossible to move the keys’ start frames to a specific time with accuracy, which I need to do in order to sync everything.
Perhaps there is a way to have two animation files playing from Sequencer, where one controls the skeletal anim and the other controls the blendshapes, but I have yet to discover a way to do this.
You can copy/paste blend shape animation curves from one animation sequence to the other; here is a video showing how to do that.
You can’t really be super precise while moving the curves, though.
The alternative non-Sequencer setup would be to:
Create an AnimBP for the character
Drag & drop the OptiTrack animation with no blend shape information, and also the anim sequence asset that contains both body and blend shape data.
Use a Layered blend per bone node: plug the cleaned OptiTrack data into the Base pose and the UE asset into Blend Poses 0, select the Layered blend per bone node, and in the Details panel make sure to add a layer, expand the branch filters, and in “Bone name” put the name of the head joint (which I guess is head).
By doing so, you’re telling the blend to use the UE animation asset only from the head up, meaning it is going to take into account only the animation curves.
Create a BP, set the skeletal mesh and your AnimBP, drag & drop the BP into the scene and hit play; you should see both animations running at the same time, both facial and body.
In the AnimBP, tweak the Start Position value for either the body or the head, then repeatedly hit play to check whether the two match.
As another alternative solution, a while ago I created a simple script in Maya that gets the animation curves for the blend shapes (which are stored in the root joint of the FBX exported from Unreal) and transfers the values to each blend shape, using a proxy rig for the transfer of the data: Maya - UE4 facial animation pipeline
I think you can use something similar in MoBu/Maya to merge the blend shape animation back with the body.
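As a rough illustration of that idea (without the proxy rig step), here is a minimal Maya Python sketch. The root joint name, the blendShape node name, and the assumption that the curve attribute names on the joint match the blend shape target names are all placeholders for your own scene:

```python
# Minimal sketch, assuming the facial curves live as custom attributes on the root joint
# (as described above) and that their names match the blendShape target names.
import maya.cmds as cmds

ROOT_JOINT = "root"                  # joint carrying the exported facial curves (assumption)
BLENDSHAPE_NODE = "faceBlendShapes"  # blendShape node on the face mesh (assumption)

# List every target weight on the blendShape node, e.g. "jawOpen", "browInnerUp", ...
targets = cmds.listAttr(BLENDSHAPE_NODE + ".w", multi=True) or []

for target in targets:
    src_attr = "{}.{}".format(ROOT_JOINT, target)
    dst_attr = "{}.{}".format(BLENDSHAPE_NODE, target)
    if not cmds.objExists(src_attr):
        continue  # no matching curve on the root joint for this target
    # Copy the animation curve from the root joint attribute onto the blend shape weight
    if cmds.copyKey(src_attr):
        cmds.pasteKey(dst_attr, option="replaceCompletely")
```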
Thanks for your reply, and awesome suggestions. I was aware of copying the keys, but as we both know, moving an extreme number of keys precisely is impossible in UE5.
What I ended up doing was copying all of the curves from the facial asset to the asset with the body animation, then exporting this and bringing it into Maya, where I was able to grab all of the facial blend shape curves and move them a specific number of frames to resync.
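For anyone doing the same resync step in Maya, something along these lines does the shift in one go — a minimal sketch, assuming the facial curves sit on a blendShape node and a plain frame offset is all you need (the node name and offset are placeholders):

```python
# Minimal sketch: shift all facial blend shape keys by a fixed number of frames in Maya.
# BLENDSHAPE_NODE and FRAME_OFFSET are placeholders for your own scene.
import maya.cmds as cmds

BLENDSHAPE_NODE = "faceBlendShapes"  # blendShape node holding the facial curves (assumption)
FRAME_OFFSET = 12                    # number of frames to shift the facial keys (example value)

targets = cmds.listAttr(BLENDSHAPE_NODE + ".w", multi=True) or []
for target in targets:
    attr = "{}.{}".format(BLENDSHAPE_NODE, target)
    if cmds.keyframe(attr, query=True, keyframeCount=True):
        # relative=True moves the existing keys instead of setting absolute times
        cmds.keyframe(attr, edit=True, relative=True, timeChange=FRAME_OFFSET)
```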
It would be nice if Epic could add this ability in engine. It would be very easy for them to implement.