MetaHuman Performance Blueprint Generation

When using a MetaHuman Performance to generate an animation from an audio clip, Unreal Engine 5.6 will sometimes freeze for a long period of time (maybe 20 seconds or more). Is this a known issue?

Also, on a related subject: I believe it was mentioned at Unreal Fest that we can now call the MetaHuman lipsync animation generation from a Blueprint. Can you confirm, and if so, can you point me to an example? (I haven't been able to find one.)

Steps to Reproduce

Hi Jerry,

At the start of processing audio data, the MetaHuman Performance asset will solve across the whole frame range before processing individual frames and updating the UI. The time to do this initial solve is proportional to the length of the clip, and so will be longer as the length of the clip increases.

Does the freeze you describe correlate with clip length? If so, this is likely what you’re seeing - this hasn’t changed in Unreal Engine 5.6 and so we’d expect you to see the same behaviour in Unreal Engine 5.5. If it doesn’t appear to be related to clip length then it suggests a different cause.

We have some Python examples that show scripting MetaHuman Animator, either from audio or other sources such as depth data. While we don’t have specific examples of using a Blueprint, everything you can do in Python can be done in a Blueprint as well.
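For reference, a minimal sketch of what such a script can look like, based on the example scripts shipped with the MetaHuman plugin. The `unreal.MetaHumanPerformance` properties and methods shown here (`input_type`, `audio`, `set_blocking_processing`, `start_pipeline`) and the asset paths are assumptions that may differ by engine version, so treat this as an outline rather than a verified implementation. It must be run from the Unreal Editor's Python environment, so the `unreal` import is deferred into the function.

```python
def process_audio_performance(performance_path, audio_path):
    """Load a MetaHuman Performance asset, point it at an audio clip,
    and run the processing pipeline synchronously.

    Sketch only: assumes the MetaHuman plugin's Python API as exposed
    in its shipped example scripts; paths are hypothetical.
    """
    import unreal  # only available inside the Unreal Editor

    performance = unreal.load_asset(performance_path)
    audio = unreal.load_asset(audio_path)

    # Drive the solve from audio rather than footage/depth data.
    performance.set_editor_property("input_type", unreal.DataInputType.AUDIO)
    performance.set_editor_property("audio", audio)

    # Block until processing completes -- this is the same whole-range
    # solve described above, so expect it to scale with clip length.
    performance.set_blocking_processing(True)
    return performance.start_pipeline()
```

You would call this with the paths to your Performance asset and imported audio, e.g. `process_audio_performance("/Game/MyPerformance", "/Game/MyAudioClip")`, and then export the resulting animation in a follow-up step.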

Thanks,

Mark.

Thanks Mark.

My experience with generating lipsync has been very inconsistent. In some cases, using an audio clip, it processes the clip and returns fairly quickly. In other cases, Unreal appears to completely lock up and freeze (using the same clip as before); the editor will even report as "not responding" during this time. Eventually it does come back, but it may take much longer. I mentioned 20+ seconds earlier, but I've seen this take minutes as well.

Thanks for pointing me towards the Python examples. I’ll take a look at them.

That’s interesting - have you used audio driven animation in Unreal Engine 5.5 (to act as a comparison for whether this is new behaviour), or is it the first time with Unreal Engine 5.6?

Yes, I’ve used it in 5.5 frequently, and I’ve never seen the issue there.

Quick question on scripting MetaHuman Animator: is this something we can do dynamically while the editor/game is running? We want to be able to dynamically bring in audio clips, generate the lipsync animation, and play them.

MetaHuman Animator can be scripted while the engine is running, yes. However, we don't directly support this outside of editor mode (e.g. if you are running a game): while audio-driven animation is realtime, it hasn't been fully tested as suitable for runtime use.

As for the freeze, it sounds like a regression, but not one our QA team has been able to reproduce. Could you perhaps share a log file from a session in which the freeze occurred?

Hi Jerry -- For the observed freeze when processing an audio clip, it would be great if you could share any logs from that session, as Mark requested, as well as an Insights trace file so we can analyze more granularly what the bottleneck in the engine is during the freeze. Instructions for capturing an Insights trace file can be found here:

https://dev.epicgames.com/documentation/en-us/unreal-engine/trace-quick-start-guide-in-unreal-engine#option4:fromtheeditor