Hello,
I’m facing a problem using the LipSync plugin with a MetaHuman head. In the project I’m working on, we use Articy to feed text to the LipSync plugin, which turns it into an animation. I already have this working on other heads: I was able to feed text into the system and, at the end of the chain, send a ready anim montage to the mesh and start the animation.
Now we want to use MetaHuman heads in our project, so I need to adjust the settings accordingly. This is my setup:
For some reason, it’s not working.
What’s interesting is that if I feed the “Create Slot Animation as Dynamic Montage with Blend Settings” node an animation I made in the editor, everything works and the animation plays.
So apparently the whole setup from that node onward is correct, and the communication with the AnimBP is correct. My best guess is that “ATLAudio to Lipsync” is, for some reason, producing an animation that the mesh later won’t accept. I’ve tried every setup option and haven’t been able to nail it down for about three days now.
Another interesting detail: if I go to the content browser, pick any WAV file, and use the Create Lipsync Animation function in the editor, I get an animation that works with my setup without any problem. But when I call the same function at runtime, the resulting animation seems to be incompatible for some reason.
I have no idea how to fix this… I think I’m very close to the solution, but I’m missing something important…