Suggestion: idle face animations without ARKit

Is there an easy way to add idle facial animations without ARKit?
I don't have an iPhone (and have no desire to get one), and I can't afford the other, far pricier facial mocap solutions either. My characters honestly don't need that level of facial performance anyway: just some blinks,
a few expressions, maybe a talking animation not saying anything in particular.
A few animation presets would suffice,
even the ones shown in the Character Creator.
I can retarget the body to premade animations or the ThirdPerson player BP easily enough, but frozen, expressionless faces make using these characters pointless for me, when I can so easily import talking DAZ people pulling all the faces I want. :rofl:


Thanks, I will try to digest that, but I'm a very simple user who mostly just uses animation presets, though I am trying to learn. :kissing_closed_eyes:


Check out the Facial Control Rig. It's fairly easy to use for creating eye blinks and basic facial expressions. However, I agree with you about wanting some default face idles. In theory, someone could post some basic facial animation capture, baked out from an iPhone, to the Marketplace or somewhere else. I've also suggested they let people download the example animations in the Creator, so you could put them on your character in a scene to test out.


Yes, I was thinking more along the lines of facial mocap of a live person, rather than poses made with the control rig, as it looks more natural.
Just a few looped animations of a person blinking, talking, making a smile, etc.

I’ve seen a few videos that utilize iClone and its integration with LiveLink to feed animations into MetaHumans. iClone isn’t cheap, but it has the advantage that it has a library of animations you can choose from, including idle animations.

Exciting stuff, though there are some bumps along the way when getting iClone to read the skeleton exported from UE4, due to what seems like an FBX export bug in UE.

Check out these videos, though:

I want iC7 and CC3 (I use the iC6 pipeline), but as I said, price is my main reason for not having them yet. I thought they still used an iPhone, though.

OK, I have found a solution that works for me: I can copy and paste the sample MetaHumans' keyframes from their Sequencer into my own. I did part of the whole animation here, but I can see how I could grab just the blinks etc. and build up an idle/talking cinematic, coupled with other retargeted body movements, for a cinematic sequence.

Yes, this would be very useful:
"I've also suggested they let people download the example animations in the Creator, so you could put them on your character in a scene to test out."

I just saw that the MetaHuman SDK does audio-file lipsync using a Russian cloud server, but I'm unsure about it. I don't tend to go to sites in that country. Not saying it's unsafe (I do use DuckDuckGo sometimes, after all, and some 3D assets made there), but it's not quite the same.
I am wary of anything cloud-based anywhere, TBH.

I agree with the suggestions in this thread to add the idle animations shown in the MetaHuman Creator to an asset pack for easy downloading/testing. Is this something that's likely to happen?

I doubt you need ARKit or anything else to generate animations.

You can copy poses from a skeletal setup to morph targets.

The process would involve posing the bones manually into an animation, and then recording the output for the 52 ARKit morph targets you need to make the face do things.
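To make the idea concrete, here is a rough sketch of hand-building a looped blink track as keyframes on ARKit-style blendshape curves. The blendshape names `eyeBlinkLeft`/`eyeBlinkRight` are standard ARKit names, but the timings and curve shape below are assumptions for illustration, not values taken from any official MetaHuman asset; you'd paste or import the resulting keys into your sequencer/DCC of choice.

```python
# Sketch: generate (time, weight) keyframes for a looped idle blink
# on ARKit-style blendshapes. Timings below are guesses, not canon.

def blink_keys(start, close_time=0.08, hold=0.04, open_time=0.12):
    """Keyframes for one blink beginning at `start` seconds."""
    return [
        (start, 0.0),                                  # eye open
        (start + close_time, 1.0),                     # lid fully closed
        (start + close_time + hold, 1.0),              # brief hold
        (start + close_time + hold + open_time, 0.0),  # reopen
    ]

def idle_blink_track(length=6.0, interval=3.0):
    """Blink keys for a loop of `length` seconds, one blink per `interval`."""
    keys = []
    t = interval * 0.5  # first blink midway through the first interval
    while t + 0.3 < length:  # leave room for the blink to finish
        keys.extend(blink_keys(t))
        t += interval
    # Both eyes share the same curve for a simple symmetric blink.
    return {name: keys for name in ("eyeBlinkLeft", "eyeBlinkRight")}

track = idle_blink_track()
print(len(track["eyeBlinkLeft"]))  # keyframe count per eye
```

The same pattern extends to any of the other ARKit shapes (a slight `mouthSmileLeft`/`mouthSmileRight` ramp, jaw movement for a mumble loop, etc.), which is essentially what hand-keying an idle comes down to.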

Naturally, making facial animations without a mocap setup is hell, but it's how it was done for ages before we invented better solutions…

As someone who hates Apple, I have to say that the iPhone 11 I got specifically for face mocap is great at it.
It's probably better than mocap equipment that costs 60 times as much.
Though it's not as good as face markers and multi-camera setups, the fact that you can fire it up and have it working in 10 seconds is where the value is.

If you want to animate a mesh, you have to learn how to rig and generate morphs either way. In fact, if anyone here doesn't know how, it's probably MetaHuman's fault:
if it didn't exist, you'd have already learned how to make it happen…