Has anyone succeeded in using the Sequence Recorder with Morph Targets?

Dear all,

I’m currently testing the sequence recording capabilities available starting from UE 4.12.

At the moment, I have a rig with bones and morph targets. I’ve made a plugin that retrieves data from the network and provides, at the Blueprint level, the necessary coefficients for constraining the bones and morph targets.

The .fbx import of the rig is successful, and I’m able to constrain my rig using the “Transform (Modify) Bone” node in the Anim Graph. In the Event Graph, “Set Morph Target” lets me constrain the morph targets. As no events are usable in the Anim Graph, I use a shared variable between the Event Graph and the Anim Graph to pass the bone coefficients. Below is a basic representation of the data flow (a rough C++ sketch of the same setup follows the diagram):



1. [Network] --> [Event in Event Graph] --> [Retrieve bone & morph target coeffs] --> [Fill the shared variable] --> [Read the variable] --> [Constrain Morph Targets]
2.                                                                                                               |--> [Read the variable] --> [Constrain Bones]

Caption: 1. Code in the Event Graph.
         2. Code in the Anim Graph.
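
My graphs are actually Blueprint-only, but for reference here is a rough C++ sketch of this shared-variable setup. All class, variable and morph target names (UFaceCaptureAnimInstance, MouthOpenCoeff, “MouthOpen”, …) are made up for illustration; the only engine calls used are NativeUpdateAnimation, GetSkelMeshComponent and SetMorphTarget.

// FaceCaptureAnimInstance.h -- minimal sketch of the setup described above.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "Components/SkeletalMeshComponent.h"
#include "FaceCaptureAnimInstance.generated.h"

UCLASS()
class UFaceCaptureAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Shared variable: filled from the network event (Event Graph side),
    // read by a Transform (Modify) Bone node in the Anim Graph.
    UPROPERTY(BlueprintReadWrite, Category = "Capture")
    FRotator HeadBoneRotation = FRotator::ZeroRotator;

    // Morph target coefficient received from the network plugin.
    UPROPERTY(BlueprintReadWrite, Category = "Capture")
    float MouthOpenCoeff = 0.f;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        // Equivalent of the "Set Morph Target" call in the Event Graph:
        // drives the morph target directly on the skeletal mesh component.
        if (USkeletalMeshComponent* MeshComp = GetSkelMeshComponent())
        {
            MeshComp->SetMorphTarget(TEXT("MouthOpen"), MouthOpenCoeff);
        }
    }
};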


This is my Anim Graph Blueprint:

When I use the Sequence Recorder, I notice that my bone modifications are recorded correctly, while the morph targets are not. It seems likely that the Event Graph is not recorded.

I don’t think I fully understand how the Anim Graph and the Event Graph work under the hood.

So these are my questions:

  1. Is it possible to record morph targets as part of an animation?
  2. How can I change my Blueprint so that blendshapes are recorded?
  3. What drives the Anim Graph (in terms of thread/tick/refresh rate)? Is the Anim Graph a state machine, as this online tutorial (https://www.youtube.com/watch?v=ZezNr-DOSRI) suggests?
  4. I suspect my design isn’t good and that I’m not following Unreal’s animation principles. Any suggestions?

Thank you for your time and your help.

B/R

Any solution please?

Does no one have an answer?

So, is it impossible?

Well, I have an idea for creating such a system, but I have no time to implement it in the engine.
I have opened an issue on the bug tracker, as well as an AnswerHub question. This feature has been backlogged.

For 4.14 and later, there is a new “Modify Curve” node that allows procedural control of morph and material curve animation.

Some other things to be aware of:

  • Morph target curves (and material curves) are recorded using the final output of the animation graph.
  • There is a bug with the Pose Blend node where all curves get filtered out of that branch of the tree (it’s like the Pose Blend node ‘absorbs’ the curve data when it uses it to drive pose weights), so the curve data never reaches the root. A quick way to check this is sketched below.
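
One (hypothetical) way to check whether a named curve actually survives to the final output is to query it on the anim instance after the graph has evaluated. The helper name LogFinalCurveValue below is made up; UAnimInstance::GetCurveValue and USkeletalMeshComponent::GetAnimInstance are real engine calls, and GetCurveValue returns 0 when the curve is not present in the evaluated output.

// Hypothetical debug helper: logs the value of a curve as seen at the final
// Anim Graph output. If the Pose Blend node has absorbed the curve, this
// should report 0 even while the pose weights are being driven.
#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "Components/SkeletalMeshComponent.h"

static void LogFinalCurveValue(USkeletalMeshComponent* MeshComp, FName CurveName)
{
    if (MeshComp != nullptr)
    {
        if (UAnimInstance* AnimInstance = MeshComp->GetAnimInstance())
        {
            UE_LOG(LogTemp, Log, TEXT("Curve %s = %f"),
                *CurveName.ToString(), AnimInstance->GetCurveValue(CurveName));
        }
    }
}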

I did get this working after a lot of painful trial and error. I spent days trying to get this to work with an Actor Blueprint, to no avail.

When I was about to give up, I tried the Third Person template.

I’m using 4.19.

I replaced the default mannequin skeletal mesh in the Third Person Character Blueprint with my own skeletal mesh that has morph targets. I also created an Animation Blueprint for my mesh and assigned it in the Third Person Character Blueprint.

As Tom suggested above, I added a Modify Curve node for my morph target in the Anim Graph of my Animation Blueprint, updated by my variable (a rough sketch of the variable side is below).
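
For what it’s worth, this is roughly what that variable looks like if you declare it in C++ instead of the Animation Blueprint (the class and property names like EyeBlinkWeight are made up; the Modify Curve node itself still lives in the Anim Graph, with a curve pin named after the morph target and bound to this property):

// MorphCaptureAnimInstance.h -- sketch only: the float that feeds the
// Modify Curve node's curve pin.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "MorphCaptureAnimInstance.generated.h"

UCLASS()
class UMorphCaptureAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Updated every frame by the capture code / character Blueprint.
    // In the Anim Graph, a Modify Curve node with a curve pin named after the
    // morph target reads this value, so the weight flows through the final
    // graph output and gets recorded by the Sequence Recorder.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Capture")
    float EyeBlinkWeight = 0.f;
};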

Hey presto: when I record the third person character with the Sequence Recorder, the morphs work in PIE and, thank goodness, the recording also plays back and scrubs in Sequencer.

BOOOOOM!

There is something about how the Third Person template is set up that allows its morph targets to be recorded. What that is, I couldn’t tell you. My guess is that it’s because it uses the Character Blueprint and not just an Actor Blueprint.

One small problem: I can’t seem to find the generated keyframes in Sequencer, and when I export the sequence as an FBX the morphs are nowhere to be seen.

Opening the FBX in 3ds Max confirms the morph targets aren’t exported.

I’m currently working on a workaround where the mesh has a separate dummy bone corresponding to each morph target. In the Anim Graph I use the Transform (Modify) Bone node to rotate the dummy bone around the X axis by an amount equivalent to the morph value.

As it’s a bone rotation, it is recorded by Sequencer and can then be exported to 3ds Max, where I can link it back up to the morph target. This is a bit hacky, but it should work; a rough sketch of the mapping is below.
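
Roughly, the mapping I have in mind looks like this. The helper name and the scale factor are arbitrary; the resulting rotator would feed the Transform (Modify) Bone node’s rotation input for the dummy bone.

// Hypothetical helper for the dummy-bone workaround: turn a 0..1 morph weight
// into an X-axis (roll) rotation that Transform (Modify) Bone can apply, so
// the value is baked as regular bone animation and survives the FBX export.
#include "CoreMinimal.h"

// Arbitrary scale: a 0..1 weight becomes 0..100 degrees, which is easy to
// divide back down once it is wired up to the morph target in 3ds Max.
static constexpr float MorphWeightToDegrees = 100.f;

static FRotator MorphWeightToDummyBoneRotation(float MorphWeight)
{
    const float Clamped = FMath::Clamp(MorphWeight, 0.f, 1.f);
    // In an FRotator, Roll is the rotation around the X axis.
    return FRotator(/*Pitch=*/0.f, /*Yaw=*/0.f, /*Roll=*/Clamped * MorphWeightToDegrees);
}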

I’m trying to build a performance capture setup, similar to how MotionBuilder works, so in an ideal world I’d like to be able to just record and export cleanly. Hopefully this workflow will be improved in future releases.

If anyone can tell me how the result of the Modify Curve node can be made to show up in the recorded sequence as keyframes, I’d love to know.


Hmmm

The way MotionBuilder works, the animation data is contained within some kind of container depending on the device configuration, and once you’re ready to export you need to plot the animation down to the base model. If you don’t, the container is not exported and the animations don’t show up in the target app, be it UE4 or 3ds Max.

Thinking along the same lines: since adding a curve modifies the output, the result will need to be plotted/baked down to an animation take.

Another consideration is that 3ds Max only supports 100 morph channels; any more than that will either cause 3ds Max to crash or the morph data won’t even load.

Assuming of course you have your importer set up to accept morph data. :wink: