I have a skeletal mesh of a head.
I would like to animate it from C++ so that it can show facial expressions.
Is it possible for me to adjust the shape of the mesh based on the Live Link Face data inside the AnimGraph of the Animation Blueprint, so that real-time facial expressions are possible?
The reason I want to use C++ is that I already have logic that transforms bones based on ARKit coefficients, and this is hard to realize in Blueprints without coding.
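To make the question concrete, here is a minimal sketch of the pattern I have in mind, assuming the Live Link Pose node in the AnimGraph already streams the ARKit blendshapes as anim curves (the class name, the jawOpen mapping, and the 25-degree range are all placeholders for my rig, not real values):

```cpp
// HeadAnimInstance.h: a UAnimInstance subclass that reads an ARKit
// coefficient streamed by Live Link Face and exposes a bone rotation
// that the AnimGraph can consume.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "HeadAnimInstance.generated.h"

UCLASS()
class UHeadAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Read by a Transform (Modify) Bone node in the AnimGraph.
    UPROPERTY(BlueprintReadOnly, Category = "Face")
    FRotator JawRotation = FRotator::ZeroRotator;

protected:
    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        // "jawOpen" is one of the ARKit blendshape names that Live Link
        // Face streams; GetCurveValue reads it from the last evaluated pose.
        const float JawOpen = GetCurveValue(TEXT("jawOpen"));

        // Map the 0..1 coefficient onto a bone rotation; the axis and
        // range here are placeholder values, not a real calibration.
        JawRotation = FRotator(JawOpen * 25.f, 0.f, 0.f);
    }
};
```

The Animation Blueprint would then be reparented to this class, and JawRotation would feed the Rotation pin of a Transform (Modify) Bone node placed after the Live Link Pose node, targeting the jaw bone.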
Thanks!!!