Hi all,
I’ve integrated the iPhone X into my full-body mocap setup (using either the Perception Neuron, IKinema Orion, or the Vive Mocap Kit), and using Sequence Recorder I was able to record the entire realtime performance.
The facial rig is built using Blend Shapes, and in order to export the facial animation as well, I created a proxy rig within the skeletal hierarchy of the character, with one joint per Blend Shape.
In UE4 I then read the curve values from the realtime facial performance (a simple 0-to-1 float per curve) and used them to drive the Z axis of each “proxy” joint.
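The encoding step is essentially a one-to-one mapping from each facial curve to its proxy joint’s translateZ. Here’s a minimal, engine-agnostic sketch of that idea; the curve and joint names are hypothetical, not the actual names from my setup:

```python
def encode_curves_to_joints(curve_values):
    """Map each facial curve value (0..1) onto a matching proxy joint's translateZ.

    curve_values: dict of {blend_shape_name: float}
    returns: dict of {proxy_joint_name: translateZ value}
    """
    joints = {}
    for shape, value in curve_values.items():
        # Clamp to the 0..1 range the facial curves are expected to use.
        clamped = max(0.0, min(1.0, value))
        # Convention assumed here: 1 unit of Z translation == full weight.
        joints["proxy_" + shape] = clamped
    return joints
```

With this convention, a curve value of 0.42 on `jawOpen` becomes a translateZ of 0.42 on a `proxy_jawOpen` joint, which survives the FBX export intact.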
I then exported the animation to Maya, and using Set Driven Keys (SDKs, via a MEL script) I was able to transfer the 0-to-1 values back to the Blend Shape facial rig.
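For this setup, each Set Driven Key is just a linear remap from a proxy joint’s translateZ back to the matching Blend Shape weight (in Maya itself this is wired up with `setDrivenKeyframe`). A pure-Python sketch of that transfer, with hypothetical names:

```python
def driven_key_linear(driver_value, driver_min=0.0, driver_max=1.0,
                      weight_min=0.0, weight_max=1.0):
    """Linear Set-Driven-Key style remap of a driver value to a weight.

    Two keys are assumed: (driver_min -> weight_min) and (driver_max -> weight_max),
    with the output clamped outside the keyed range, as SDK curves behave by default.
    """
    t = (driver_value - driver_min) / (driver_max - driver_min)
    t = max(0.0, min(1.0, t))
    return weight_min + t * (weight_max - weight_min)

def joints_to_blend_shapes(joint_z):
    """{"proxy_<shape>": translateZ} -> {shape: blend shape weight}."""
    return {name.replace("proxy_", "", 1): driven_key_linear(z)
            for name, z in joint_z.items()}
```

Since both ends use the same 0-to-1 range, the transfer is lossless: the weight that drove the proxy joint in UE4 comes back out unchanged on the Blend Shape in Maya.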
Regards,
Nicolas Esposito