Integrating MetaHuman Facial Expressions with Convai Plugin for Real-Time Interaction

Hello Unreal Engine Community,

I’m embarking on an exciting project and could use your collective wisdom. I want to bring my MetaHuman character to life by having it converse with users in real time while showing a range of facial expressions. I’m comfortable animating the control rig poses stored in the ‘Metahumans/common/common’ folder, but I’m stumped on how to adapt them for real-time dialogue.

I plan to use the Convai plugin for the conversation mechanics, but I need insight into triggering the facial expression poses dynamically during a chat. The documentation covers creating animations with control rig poses, yet it doesn’t address real-time use.

Has anyone ventured into this territory and is willing to share their map? Any guidance or resources would be a treasure.

Thanks a million in advance!


I’m currently looking into the same thing. Have you found anything on this topic?