Applying Live Link Face ARKit recordings to a custom character in UE5

So, a workflow question here for those familiar with the Live Link Face app and importing its CSV files.

I’m trying to find the simplest workflow to get the ARKit CSV data onto a custom skeletal mesh that has ARKit blendshapes, and to blend it with some body animations I already have saved for the character.

Currently, after importing the CSV file/level sequence, it involves:

  1. Creating an Anim BP for the character: Live Link Pose node > adaptive blend node > Output Pose, and selecting the iPhone CSV take from the Live Link Pose node's subject dropdown.

  2. With this I can Simulate and bake the face keys into an animation for this skeleton.

  3. Then I can edit it with the FK Control Rig and copy just the face keys (all the body bone keys will just be 0).

  4. Then I open the body mocap animation I already have saved in a level sequence on the character, edit it with the FK Control Rig, and paste in the face keys I copied. (I realized that if I didn't do steps 1-2, I would be prompted to paste only one bone row of keys at a time, which takes way too long; see the Python sketch after this list for a scripted alternative to this copy/paste.)

  5. With the face keys pasted onto the FK Control Rig alongside the body animation, I can bake everything out into a single animation sequence with body and face together.
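For what it's worth, here is roughly what that copy/paste step looks like if you script it through the editor's Python API instead of going through the FK Control Rig, on the assumption that the baked face take is mostly a set of float (morph target) curves, which is what the Live Link Pose node typically produces. This is only a sketch: the asset paths are placeholders, and the `unreal.AnimationLibrary` curve functions may differ slightly between engine versions, so treat the exact calls as unverified.

```python
# Rough sketch (untested): copy the baked ARKit face curves from the face-only
# AnimSequence onto the body AnimSequence, instead of copy/pasting keys through
# the FK Control Rig in Sequencer. Paths are placeholders; signatures of the
# AnimationLibrary curve functions can vary by engine version.
import unreal

FACE_ANIM_PATH = "/Game/Mocap/Face_Take01"   # baked from steps 1-2 (placeholder path)
BODY_ANIM_PATH = "/Game/Mocap/Body_Run01"    # existing body animation (placeholder path)

face_anim = unreal.load_asset(FACE_ANIM_PATH)
body_anim = unreal.load_asset(BODY_ANIM_PATH)

float_type = unreal.RawCurveTrackTypes.RCT_FLOAT
face_curves = unreal.AnimationLibrary.get_animation_curve_names(face_anim, float_type)

for curve_name in face_curves:
    # Pull every key of this blendshape curve from the baked face take.
    times, values = unreal.AnimationLibrary.get_float_keys(face_anim, curve_name)

    # Replace any existing curve of the same name on the body animation,
    # then write the face keys onto it.
    if unreal.AnimationLibrary.does_curve_exist(body_anim, curve_name, float_type):
        unreal.AnimationLibrary.remove_curve(body_anim, curve_name)
    unreal.AnimationLibrary.add_curve(body_anim, curve_name, float_type)
    unreal.AnimationLibrary.add_float_curve_keys(body_anim, curve_name, times, values)

unreal.EditorAssetLibrary.save_loaded_asset(body_anim)
```

The idea is just that the ARKit data lives in named float curves, and UE uses those curves to drive morph targets with matching names, so copying them onto the body AnimSequence should give the same result as pasting the keys in Sequencer.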

This is probably not the most efficient way to do it, so if you know a better way, please let me know.

(This is not for MetaHumans or Echo characters that already have Live Link functionality built in for these iPhone takes; this is for a custom skeletal mesh character from Blender with ARKit blendshapes.)
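One sanity check that can help with a Blender-exported mesh: UE drives morph targets from anim curves with matching names, so the blendshape names on the custom mesh need to line up with the curve names recorded by Live Link Face. A quick sketch to dump those curve names from the baked face take (placeholder path; API names may vary by engine version, and the exact ARKit name casing may differ from what's shown in the comment):

```python
# Sketch: print the float curves on the baked face take so you can confirm the
# Blender-side blendshape names line up with what Live Link recorded.
import unreal

face_anim = unreal.load_asset("/Game/Mocap/Face_Take01")  # placeholder path
names = unreal.AnimationLibrary.get_animation_curve_names(
    face_anim, unreal.RawCurveTrackTypes.RCT_FLOAT)
for name in names:
    print(name)  # expect ARKit-style names such as EyeBlinkLeft, JawOpen, ...
```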