Control rig and mocap

If I’m using a mocap suit to create animations for my character in UE5, can I also use Control Rig to edit those animations later? Or is it an either/or situation? Right now I’m only using mocap and have not created a control rig for my character. It would sometimes be easier to pose my character manually without having to put the suit on and connect everything; I just wasn’t sure if I can use both together.

You can build your Control Rig so that controls drive the joints (forward solve), or so that joints drive the controls (backward solve).
Take a look at a MetaHuman Control Rig and you’ll see the Forward and Backward Solve nodes used for exactly this.
When using a mocap suit, what usually happens is that the AnimBP retargets the animation data from the mocap software onto the joint hierarchy; once recorded, you can bake those animation curves onto the Control Rig and do cleanup in Unreal.

See, that’s what I was unsure of. Since I already have the Animation Blueprint set up for mocap and Live Link, I wasn’t sure if I could add a Control Rig as well. I don’t want to break my mocap animations when using it, but I’d also like to use Control Rig for posing and simple animations when I’m not using mocap.

You are always better off exporting the result into a DCC for cleanup/editing.

That way you can remove extra keyframes and clean up the motion properly.

I’ve found it much easier and faster to just use mocap and Take Recorder inside Unreal.

Sure, but after that’s recorded you usually export it, clean it up, and import it into the real project…

Control Rig is mostly used to create animations from scratch, or to do animation cleanup after you’ve baked the data onto the controls.
The data from the mocap software arrives via Live Link and is applied to the joint hierarchy by the AnimBP, which also helps performance, since it just plays animation curves without any extra logic.

The usual professional workflow involves one project where you record/do stuff.
You need a bunch of different rigs and junk that won’t ever make it into a game/video/whatever.

From there you export the final result out to a DCC and/or directly into another project (in rare cases where cleanup isn’t required).

So, essentially, you can have different rigs to do whatever recording you want without having to worry about much of anything.

Leave the Live Link version alone and make a new one to use Control Rig with.

And maybe even make a hybrid version of the two once you know how to handle Control Rig.

Generally speaking, that’s how a production pipeline works.


Hello,
So @MostHost_LA, are you saying that the live-linked character is mostly used to calibrate the final character to the environment and lighting in virtual space? I have dabbled with both live-linking and regular import/export of animations, and have found that recording with a live-linked character inside Unity/Unreal in a production environment is harder than working with imported animations that originally came from mocap pipelines. Please clarify. Thanks.

I’m saying that, at the moment, the only thing this engine is good for is the actual take: recording the base animations, which you then export and clean up in a DCC, and then import into your final project.

And that you shouldn’t use Live Link at all in anything but a test project specifically set up to do the take recording.