Hello everyone,
regarding the MetaHuman SDK available for free on the Marketplace, I have some issues with the runtime application.
I have followed the suggested instructions:
Select the desired .wav file and set the skeleton to "Face_Archetype_Skeleton", which corresponds to the MetaHuman's face.
After doing the above, the lipsync animation does not play: the MetaHuman shows no animation, the output log shows no related warnings or errors, and debugging the Blueprint gives no useful information beyond the fact that execution stops at the "ATLAudio to lipsync" node.
First, I want to know: why is the code not working properly? And then, what fixes should I make to get things working?
I would also like to ask whether there is any useful documentation on the SDK's Blueprint nodes. For example, the "Make DigitalHumanATLInput" node has some pins that are not very clear, as well as unknown values that lack any documentation.
Hi,
Well, the last time I tried that was 4 months ago, and there were no fixes yet.
I don't know if there have been any new updates to the MetaHuman SDK lipsync.
If anyone has tried it, please tell us. @EURO.SM try this link: Metahuman animation
I don't have issues with Audio To Lipsync and this plugin, at least with version 1.6.0.
You need to make sure that you applied the right skeleton - it's better to get it from your skeletal mesh.
Yeah, for MetaHumans it's FaceArchetype.
Then you need to choose the right mapping for the MetaHuman - you can choose it from the plugin, or use the default ARKit mapping with a standard MetaHuman.
And don't forget to enable the bSetupForMetahuman flag.
In case anyone needs it, I recently created a plugin called Runtime MetaHuman Lip Sync that enables lip sync for MetaHuman-based characters across UE 5.0 to 5.5. It supports real-time microphone capture with lip sync, separate capture with lip sync during playback, and text-to-speech lip sync.