Can I control MetaHuman face expressions using a Python script?

Hello MetaHuman Creator,
I saw some videos on the Unreal Engine YouTube channel about controlling MetaHuman facial expressions using Control Rig.
So I think it should also be doable via a Python script, for example a talking expression. Is that available?
And could you make a tutorial video or documentation guide for doing that?
Thank you.

You may be able to write something that leverages the Python bindings for Live Link: Search — Unreal Python 4.27 (Experimental) documentation

Live Link is the API that tools like the iPhone or iClone use to provide animations for the body and face of characters. You’d need to find or implement something to drive that API from Python, and you’d need to adjust the MetaHuman’s facial animation Blueprints so they have a reference pose that can translate the inputs you’re providing into the appropriate animation parameters.
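For illustration, here is a minimal sketch of what the external Python side of such a setup could look like. It assumes you have found or implemented a custom Live Link source inside the engine that accepts JSON packets over UDP; the packet layout, the port number, the subject name `PyFace`, and the curve names are all assumptions made for this sketch, not part of any shipping Live Link protocol.

```python
# Sketch: stream per-frame facial curve weights to a HYPOTHETICAL custom
# Live Link source that listens for JSON over UDP. The packet layout, port,
# subject name, and curve names below are assumptions for illustration only.
import json
import socket

UE_HOST = "127.0.0.1"
UE_PORT = 54321  # assumed port of your custom Live Link UDP source


def make_packet(subject, curves):
    """Serialize one frame of named curve weights (0.0-1.0) as JSON bytes."""
    return json.dumps({"subject": subject, "curves": curves}).encode("utf-8")


def send_frame(sock, curves, subject="PyFace"):
    """Send a single animation frame to the (assumed) in-engine receiver."""
    sock.sendto(make_packet(subject, curves), (UE_HOST, UE_PORT))


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Example frame: half-open jaw, slight smile. The curve names are made up;
    # a real setup would map them to the MetaHuman ARKit-style curve names.
    send_frame(sock, {"JawOpen": 0.5, "MouthSmileLeft": 0.2})
```

The in-engine half (a Live Link source that parses these packets and pushes the curves to a subject) is the part you would have to implement yourself, e.g. as a plugin.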

Hello Guy_Paddock, thanks for your reply.
Can I use a Python script to control a Control Rig, e.g. to change the locations and rotations of bones? I saw something like that in Blender, and it’s great for automation. Is that available in Unreal?
Here is the YouTube link to it, which is very simple but very useful for automating speech.

I’m not sure. I haven’t ever tried using Python with UE myself, but if there are bindings for it, I suspect it’s possible.

Yes, we can… unlike Blender, here we can automate facial expressions with more feeling and sentiment (combined with NLP and NLU techniques) in real time with Python + Control Rig + Blueprint. Refer to this YouTube channel: link #pycharacter #pycomponent #development-discussion
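As a small illustration of the kind of automation being described, the sketch below generates a crude per-frame "talking" curve in plain Python. The commented-out Unreal portion (the `unreal` import, the rig lookup, and the `CTRL_expressions_jawOpen` control name) is an assumption about how you might apply the values from the editor's Python console, not a verified API call.

```python
# Sketch: generate a simple per-frame "jaw open" weight curve for a crude
# talking motion, then (inside the Unreal editor) apply it to a Control Rig
# control. Only the curve generation below is real, runnable Python.
import math


def jaw_open_curve(num_frames, fps=30.0, rate_hz=4.0, amplitude=0.6):
    """Per-frame jaw-open weights in [0, amplitude], oscillating at rate_hz.

    Uses a raised-cosine so the mouth starts closed (weight 0) at frame 0.
    """
    return [
        amplitude * 0.5 * (1.0 - math.cos(2.0 * math.pi * rate_hz * f / fps))
        for f in range(num_frames)
    ]


# Inside Unreal's editor Python console (HYPOTHETICAL usage -- the rig lookup
# and control name are assumptions about your MetaHuman's face rig):
#
# import unreal
# rig = ...  # obtain the running Control Rig instance for your MetaHuman
# for frame, weight in enumerate(jaw_open_curve(60)):
#     rig.set_control_value("CTRL_expressions_jawOpen", weight)
```

In practice you would replace the synthetic cosine with weights derived from audio analysis or NLP output, as the post above suggests.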