I have Audio2Face streaming live audio and facial animation to a MetaHuman. I would like to play an animation asset on the body when the audio starts playing and revert to the Blueprint animation when it's done. It seems simple, but it isn't: there is no property or flag that indicates when Audio2Face is streaming data. I did notice a small yellow dot beside the Live Link subject name and role that turns green while Audio2Face is sending data, but that state isn't exposed by name anywhere. So far I have tried a few approaches.
- Added an Open Sound Control (OSC) server with the localhost IP address 127.0.0.1 (Audio2Face's default) and port 1230, bound an event to On OSC Message Received, and tried to play the animation asset from that event. I also tried port 1231, which my Audio2Face subject is listening on for the audio.
I have tried both On OSC Message Received and On OSC Bundle Received, but the event never fires. The Audio2Face documentation mentions that it uses burst mode with a submix player. I haven't had a breakthrough yet. Any pointers or help would be greatly appreciated; I am sure many of us are trying to do this, so it would help others too.
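One way to narrow down whether the problem is on the Unreal side or the Audio2Face side is to check whether any UDP traffic arrives on the port at all, since OSC is carried over UDP. This is a hypothetical diagnostic sketch, not part of any Audio2Face or Unreal API: it binds a raw socket on the chosen port (run it while Unreal's OSC server is stopped, since only one process can bind the port) and reports whether any datagram shows up while Audio2Face is speaking. The host/port values are the ones from my setup above and may differ in yours.

```python
import socket

def sniff_udp(host="127.0.0.1", port=1230, timeout=5.0):
    """Listen for a single raw UDP datagram on (host, port).

    If Audio2Face is really emitting OSC on this port, the raw bytes
    will show up here even if Unreal's OSC server fails to parse or
    dispatch them. Returns (data, sender_address), or (None, None)
    if nothing arrives before the timeout.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.bind((host, port))
    try:
        data, addr = sock.recvfrom(4096)
        return data, addr
    except socket.timeout:
        return None, None
    finally:
        sock.close()

if __name__ == "__main__":
    data, addr = sniff_udp(port=1230, timeout=10.0)
    if data is None:
        print("no UDP traffic on port 1230; Audio2Face may not be sending OSC here")
    else:
        print(f"received {len(data)} bytes from {addr}")
```

If nothing arrives on either 1230 or 1231 while audio is clearly streaming, then no OSC binding in Unreal will ever fire, and a different signal (e.g. polling the Live Link subject from Blueprint) would be needed instead.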