Own mocap data to MetaHuman animation

Hi, I have my own generated mocap data, in the COCO 17 keypoint format, that I want to use to animate a MetaHuman. The issue I have is how to encode the animation when all I have is positional data for the keypoints.

I should say that I don’t have much experience in UE – I’ve had some success, but a lot of the time I’m not 100% sure how I got things working.

My desired workflow is: my program creates the FBX file (using the FBX SDK) and I then import that into the Sequencer in UE5. I have got this working for basic models, but I’m a bit stuck on the best way to do it for a MetaHuman. The issue is that I’d ideally like to use the IK solvers (since I have position data), but the control rig only seems to support specifying the hands and feet as goals, with the other parts being driven relatively in a way I can’t quite understand. Ideally I’d pass shoulder, elbow, hand, hip, knee and foot positions for each frame and let the IK solver come up with an acceptable solution.
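To make it concrete, extracting per-frame IK goals from COCO-17 data is the easy part – the open question is only which rig controls to drive. A minimal sketch of that extraction (the effector names on the right-hand side are placeholders I made up, not actual MetaHuman control rig names):

```python
import numpy as np

# Standard COCO-17 keypoint order (indices 0–16)
COCO17 = [
    "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee", "left_ankle", "right_ankle",
]

# Hypothetical keypoint -> IK effector mapping; the names on the right
# are placeholders, not real MetaHuman control names.
EFFECTORS = {
    "left_wrist": "hand_l_ik",
    "right_wrist": "hand_r_ik",
    "left_ankle": "foot_l_ik",
    "right_ankle": "foot_r_ik",
}

def ik_goals(frames):
    """frames: (n_frames, 17, 3) array of COCO keypoint positions.
    Returns, per frame, a dict of effector name -> 3D goal position."""
    idx = {name: i for i, name in enumerate(COCO17)}
    return [
        {eff: frame[idx[kp]] for kp, eff in EFFECTORS.items()}
        for frame in frames
    ]
```

Extending this from four effectors to shoulders, elbows, hips and knees is just more entries in the mapping – the question is whether the rig exposes goals for them.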

Quality needs to be ok but not perfect, as this is for internal use only. It is for sequence rendering, not gameplay. I’m not so bothered about head animation.

My thought is to either create a new control rig (or modify the existing one) so it exposes direct control of these keypoints, or to specify the pose via FK. The problem with FK is working out the angle or matrix info to match everything. It’s something I could probably work out, but I’m rusty on the maths, not experienced in the rigging process or in FBX structure, and generally see this as a lot of hard work. I’m wondering if there is a simpler solution with the control rig – given that the behaviour of the hands and feet is exactly what I’m after, I’d just like the same control for a few more points.

Quite open to the idea that I’ve missed something obvious. Appreciate any help.