Hi everyone,
I’m working on a MoCap setup using Mocopi for full-body tracking, but I want to replace the Pelvis/Root rotation with a dedicated hardware tracker for better accuracy.
Goal: I want to combine the rotation data so that the Pitch (Y) and Roll (X) come from the Mocopi stream (to maintain body tilt/gravity alignment), while the Yaw (Z) is driven by my external tracker to define the character’s heading.
Current Setup:
- Engine: Unreal Engine 5.6
- Input: Live Link (Mocopi) + Vive Ultimate Trackers
- Logic: Currently calculating the blend in the Animation Blueprint Event Graph using Break/Make Rotators and converting back to a Quaternion.
The Issue: I’m running into issues with gimbal lock and jitter when interpolating between the two sources in the AnimGraph.
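For context, my understanding is that the snapping comes from interpolating Euler angles across the -180/180 seam; blending in quaternion space with a hemisphere (dot-product sign) check avoids it. Here's a minimal plain-Python sketch of that idea (my own helper names, not the UE API):

```python
import math

def normalize(q):
    """Scale q to unit length; quaternions are (x, y, z, w) tuples."""
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def nlerp_shortest(a, b, t):
    """Blend two unit quaternions, flipping sign so the blend never takes
    the long way around -- the source of the -180/180 'snap'."""
    if sum(x * y for x, y in zip(a, b)) < 0.0:
        b = tuple(-c for c in b)  # q and -q encode the same rotation
    return normalize(tuple((1.0 - t) * x + t * y for x, y in zip(a, b)))
```

Blending yaw +179° toward yaw -179° at t=0.5 with this gives ~180°, not a snap through 0°.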
Questions:
- What is the most efficient way to isolate the Yaw from one Quaternion and the Pitch/Roll from another without causing "snapping" at the -180/180-degree flip point?
- Should I be using a Modify Bone node in World Space or Component Space to avoid fighting with the parent/child hierarchy of the Mocopi skeleton?
- Are there specific nodes (like Combine Rotators vs. Compose Transform) that handle this specific type of "Upright Orientation" blending better?
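For what it's worth, the math I'm trying to express in nodes is (I believe) a swing-twist decomposition: extract the twist about the world up axis (Z in UE) from each source, then keep the Mocopi swing (pitch/roll) and the tracker twist (yaw). A plain-Python sketch of that idea, assuming Z-up and (x, y, z, w) quaternions (function names are mine, not UE's):

```python
import math

def normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def conjugate(q):
    x, y, z, w = q
    return (-x, -y, -z, w)

def multiply(a, b):
    """Hamilton product of two (x, y, z, w) quaternions."""
    ax, ay, az, aw = a
    bx, by, bz, bw = b
    return (
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
        aw * bw - ax * bx - ay * by - az * bz,
    )

def swing_twist(q, axis):
    """Split unit quaternion q into (swing, twist) with q = swing * twist,
    where twist is the rotation about the unit `axis`."""
    x, y, z, w = q
    d = x * axis[0] + y * axis[1] + z * axis[2]  # project rotation axis
    twist = (axis[0] * d, axis[1] * d, axis[2] * d, w)
    if all(abs(c) < 1e-9 for c in twist):
        twist = (0.0, 0.0, 0.0, 1.0)  # degenerate: 180 deg perpendicular to axis
    twist = normalize(twist)
    swing = multiply(q, conjugate(twist))
    return swing, twist

def blend_upright(q_mocopi, q_tracker, up=(0.0, 0.0, 1.0)):
    """Pitch/roll (swing) from Mocopi, yaw (twist about `up`) from tracker."""
    swing_m, _ = swing_twist(normalize(q_mocopi), up)
    _, twist_t = swing_twist(normalize(q_tracker), up)
    return multiply(swing_m, twist_t)
```

Because the yaw is isolated as a quaternion twist rather than via Euler angles, there is no -180/180 seam to snap across.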
Any advice or screenshots of an optimized Blueprint node graph for this would be greatly appreciated! I've attached the node flow I have so far, but the Z axis still glitches.