Tutorial: How to Use MetaHuman Animator in Unreal Engine

Watch this video and learn how to use MetaHuman Animator to produce facial animation for your digital human in minutes.

MetaHuman Animator enables you to capture a facial performance using just an iPhone and a PC and turn it into facial animation for your MetaHuman. You can use a stereo head-mounted camera instead of an iPhone to achieve even higher quality results.

In this video, we’ll show you, step by step, how to use MetaHuman Animator to turn an actor’s performance into high-fidelity facial animation in Unreal Engine.

Want to learn more about MetaHuman Animator? Check out our blog post:
MetaHuman Animator delivers high-quality facial animation in just minutes and is available now! - Unreal Engine

https://dev.epicgames.com/community/learning/tutorials/eKbY/how-to-use-metahuman-animator-in-unreal-engine


YEA BOII!!!


Thank you. Can’t wait to put this to work on some of our digi-doubles.
Cheers,
b


I’ve been hyped for this moment since forever, you know what I’m sayin’? Ever since I peeped that GDC showcase, I knew this sh*t I’m waitin’ for is legit. This right here is a major moment for all the indie homies out there! It’s a massive step forward on the grind to our dreams! Much love and respect! <3


GAME CHANGING TECH :star_struck:

$100M movie quality for $100K indie budget. :money_mouth_face:

Democratized Filmmaking. Thank you!


Can we get some recommendations on head-mounted cameras (HMCs) that work with MHA?


Is it possible to just use the solver without having to go through making a MetaHuman that looks like you, or is that necessary for the whole calibration setup?

Basically, I only want the facial capture animation, to use on a custom character.

Also, am I able to export this animation to third-party software like Maya and clean it up there?

It looks great as is already, but I figured I’d ask. Thanks for all that you make!

Awesome. Love to see how this will enhance my stories.


Does anyone know if it’s possible to extract the raw depth and video data from Live Link Face on its own, for things other than MetaHuman Animator?

Great work!
I’d like to know how to generate the face texture automatically.


Where’s the actual documentation for this thing?


Update 2:
So the workaround (if this doesn’t get patched) to using your MHA MetaHuman but still animating it in Maya would be to export a standard MetaHuman in 5.2 while keeping the same body proportions as your MHA one. The animations will work on any MetaHuman once exported to UE, so it’s all fine as is.

UPDATE:
I upgraded an old MetaHuman to 5.2 and it exported fine to Maya. It seems like only the MetaHuman created from MHA is the one that doesn’t export correctly, which makes sense since it’s still in beta.
//////

MetaHumans generated from MHA don’t export to Maya :frowning: Let me try a workaround though.

I’m getting really bad memory leaks when starting the neutral pose step after importing data from Live Link. (Archive) Anyone else unable to get past this step due to memory leaks?

I have 24 GB of RAM, so that should be sufficient(?)

The tutorials provided are not sufficient for demonstrating how to achieve what was presented at GDC 23. At GDC it seemed that you could go from a Live Link capture to a realistic animated avatar of the performer within “10 minutes”, but it doesn’t look like this is the case. In the Animator tutorial, the Live Link performance is used to animate a stock MetaHuman. It also skips over applying hair to the model.

I clicked ‘Process’ the same as at 7:50, but got an error: ‘The Processing Pipeline failed with an error.’

Hey,

I’m having the same error. Have you found a fix?

Such a powerful tool. I just succeeded in following the video step by step and got an animation of myself. One more question: how can I export the animation back to Maya? I mean with the facialboard_rig, so that I can modify details in Maya. Any way to do that?

Hello all. I updated to version 5.2 and decided to test the MetaHuman animation. At the last step of preparing the MetaHuman, clicking “Prepare for Performance” gives me this error. The card is an RTX 3070 laptop GPU. I downloaded Vulkan drivers, but it didn’t help. Has anyone had this happen?

(screenshot of the error attached)

Wow! This is amazing! Just wondering, can it only be done with an iPhone as the capture device, or do Android/Samsung devices work too?

Love what you’re doing with MetaHuman Animator, it’s seriously awesome. But gotta say, a lot of us are having a tough time getting the facial animations just right with MetaHumans. There are many workarounds and videos about that. Any chance you could drop some tips or make it a bit easier?