Epic_ELT
(Epic Education, Learning, and Training)
Watch this video and learn how to use MetaHuman Animator to produce facial animation for your digital human in minutes.
MetaHuman Animator enables you to capture a facial performance using just an iPhone and a PC and turn it into facial animation for your MetaHuman. You can use a stereo head-mounted camera instead of an iPhone to achieve even higher quality results.
In this video, we’ll show you, step by step, how to use MetaHuman Animator to turn an actor’s performance into high-fidelity facial animation in Unreal Engine.
I’ve been hyped for this moment since forever, you know what I’m sayin’? Ever since I peeped that GDC showcase, I knew this sh*t I’m waitin’ for is legit. This right here is a major moment for all the indie homies out there! It’s a massive step forward on the grind to our dreams! Much love and respect! <3
Is it possible to just use the solver without having to make a MetaHuman that looks like you, or is the whole calibration setup necessary? Basically, I only want the face capture animation so I can use it on a custom character.
Also, am I able to export this animation to third-party software like Maya and clean it up there?
It looks great as is, but I figured I’d ask. Thanks for all that you make!
Update 2:
So the workaround (if this doesn’t get patched) for animating your MHA MetaHuman in Maya would be to export a standard MetaHuman in 5.2 while keeping the same body proportions as your MHA one. The animations will work on any MetaHuman once exported back to UE, so it’s all fine as is.
UPDATE
I upgraded an old MetaHuman to 5.2 and it exported to Maya fine. It seems it’s just the MetaHuman created from MHA that doesn’t export correctly, which makes sense since it’s still in beta.
//////
MetaHumans generated from MHA don’t export to Maya. Let me try a workaround, though.
I’m getting really bad memory leaks when starting the neutral pose step after importing data from Live Link. Anyone else unable to get past this step due to memory leaks?
I have 24 GB of RAM, so that should be sufficient(?)
The tutorials provided are not sufficient for demonstrating how to achieve what was presented at GDC 23. At GDC it seemed you could go from a Live Link capture to a realistic animated avatar of the performer within “10 minutes”, but that doesn’t appear to be the case. In the Animator tutorial, the Live Link performance is used to animate a stock MetaHuman… It also skips over applying hair to the model.
Such a powerful tool. I just succeeded in following the video step by step and got an animation of myself. One more question: how can I export the animation back to Maya, I mean with the facialboard_rig, so that I can modify details in Maya? Is there any way to do that?
Hello all. I updated to version 5.2 and decided to test MetaHuman Animator. At the last step of preparing the MetaHuman, when clicking on “Prepare for Performance”, I get this error. The card is a laptop RTX 3070. I downloaded the Vulkan drivers, but it didn’t help. Has anyone else had this happen?
Love what you’re doing with MetaHuman Animator; it’s seriously awesome. But I’ve got to say, a lot of us are having a tough time getting the facial animations just right with MetaHumans, and there are many workarounds and videos about that. Any chance you could drop some tips or make it a bit easier?