How do I get data coming from a real-time source to drive MetaHuman facial animation?

Hey guys!

I am looking to track, gather, inspect, and manipulate the data coming in real time from either my webcam or my microphone to my MetaHuman. I have two cases here:

1. I am using my webcam to drive facial animation and lip sync in real time.

2. I am speaking into my microphone to drive my MetaHuman's lip sync in real time.

What does this data look like? How do I find it? How can I use it?

This is what I have so far: I am using ‘Evaluate Live Link Frame’ with my subject ‘Webcam’, but I can only print the result on Initialize, as calling it on every tick crashes. When I do print the values, they look like this in my output log: ‘LogBlueprintUserMessages: [BP_NewMetaHumanCharacter_C_0] 1.0’
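In case it helps to see the same thing outside Blueprints, here is a rough C++ sketch of evaluating a Live Link subject and dumping its curves. This is an assumption-heavy sketch, not my working code: it assumes the Live Link plugin is enabled, that the subject name ‘Webcam’ matches yours, and that the subject exposes basic property curves; it is engine-bound code and won't compile outside an Unreal project.

```cpp
// Sketch: reading Live Link curve data in C++ (could be called from an actor's Tick).
// Assumes the LiveLink plugin is enabled and "LiveLinkInterface" is in the
// module's dependency list. The subject name "Webcam" is taken from my setup.
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkBasicRole.h"

void ReadWebcamCurves()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link not available yet -- may be why evaluating on tick crashes
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    FLiveLinkSubjectFrameData FrameData;
    if (Client.EvaluateFrame_AnyThread(FLiveLinkSubjectName(TEXT("Webcam")),
                                       ULiveLinkBasicRole::StaticClass(), FrameData))
    {
        const FLiveLinkBaseStaticData* StaticData =
            FrameData.StaticData.Cast<FLiveLinkBaseStaticData>();
        const FLiveLinkBaseFrameData* Frame =
            FrameData.FrameData.Cast<FLiveLinkBaseFrameData>();
        if (StaticData && Frame)
        {
            // Curve names live in the static data; per-frame float values
            // (like the 1.0 in my log) line up with them by index.
            for (int32 i = 0;
                 i < StaticData->PropertyNames.Num() && i < Frame->PropertyValues.Num(); ++i)
            {
                UE_LOG(LogTemp, Log, TEXT("%s = %f"),
                       *StaticData->PropertyNames[i].ToString(), Frame->PropertyValues[i]);
            }
        }
    }
}
```

If something like this is the right direction, the availability check before evaluating might also explain why the Blueprint version only works on Initialize.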

Is there another way? A better way to find and access this data? Any help appreciated :slight_smile: