I’m receiving AI-generated facial animation data over a socket — around 51 values per frame, similar to the values Live Link produces — and I’m having trouble applying these to UE5’s animation curves (what Live Link normally produces and animates).
I’m an experienced Python developer but new to UE5. I tried driving the facial shapes through a custom C++ Blueprint node, but that approach hasn’t worked.
Any advice on how to map those 51 values to Unreal’s curve system (or another recommended workflow) would be greatly appreciated!
There are various options, such as creating a LiveLinkSource or manually setting curve values from an Animation Blueprint, but since you are familiar with Python, I think it would be quicker to refer to the NeuroSync code.
NeuroSync implements facial animation by reusing UE’s standard ARKit support without any extensions on the UE side, and sending ARKit-compatible signals from Python.
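As a rough sketch of the Python side of that approach (assuming your sender emits each frame as 51 little-endian float32s over a stream socket, and that you know the fixed index→blendshape ordering your model uses — the names and ordering below are illustrative, not NeuroSync’s actual layout):

```python
import socket
import struct

# Illustrative subset of Apple's ARKit blendshape names. Replace with the
# full list, in the exact order your AI model emits its 51 values
# (this ordering is an assumption for the example, not NeuroSync's).
ARKIT_CURVE_NAMES = [
    "EyeBlinkLeft", "EyeBlinkRight", "JawOpen", "MouthSmileLeft",
    "MouthSmileRight", "BrowInnerUp",  # ... extend to all 51 curves
]

def frame_to_curves(values, names=ARKIT_CURVE_NAMES):
    """Pair one frame of float values with ARKit curve names.

    The resulting dict is what you would then pack into a Live
    Link-compatible packet (or hand to a library that does so).
    """
    if len(values) < len(names):
        raise ValueError("frame has fewer values than curve names")
    return dict(zip(names, values))

def read_frame(sock, count=51):
    """Read one frame of `count` little-endian float32s from a stream socket."""
    need = count * 4
    buf = b""
    while len(buf) < need:
        chunk = sock.recv(need - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return struct.unpack(f"<{count}f", buf)
```

From there, each per-frame dict can be serialized into the Live Link Face UDP format (the route NeuroSync takes), so the stock ARKit face subject in UE picks the curves up with no engine-side changes.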