We see the boy from the "A Boy and His Kite" demo fully animated using the Metahuman Animator pipeline.
The thing is, I can't seem to find proper documentation on how to bring fully custom characters like this one into the Metahuman rig pipeline. The "Mesh to Metahuman" plugin just uses a base mesh to generate a matching Metahuman, but it still uses the Metahuman mesh, hair, materials etc., while this looks like a fully custom mesh with the Metahuman rig adapted to it.
There are some people on YouTube who managed to reuse the Metahuman face rig for arbitrary characters in Maya (with some issues), but it would be nice to have official files and a workflow from Epic. Hopefully they will explain more once it's released.
The meerkat does not use the Metahuman rig though, so it's not the exact same use case. It really looks like the boy model (from my initial post) is using the full Metahuman rig.
Do you have some links to those YouTube videos you mentioned? I'd be curious to take a look as well.
The face thing uses the ARKit morph targets, so every face that can be animated following that standard can work with the animator - in theory.
If you want to learn how to make your custom mesh have the correct morph targets, you should make a separate topic on it.
It’s at least 2 pages worth of a reply.
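As a starting point though, a quick sanity check is just comparing your mesh's morph target names against the 52 ARKit blendshape names. Here's a minimal Python sketch; the ARKit names are Apple's standard set, while the example mesh names passed in at the bottom are purely hypothetical stand-ins for whatever your DCC or export actually reports.

```python
# Apple's standard 52 ARKit face blendshape names.
ARKIT_BLENDSHAPES = {
    "eyeBlinkLeft", "eyeLookDownLeft", "eyeLookInLeft", "eyeLookOutLeft",
    "eyeLookUpLeft", "eyeSquintLeft", "eyeWideLeft",
    "eyeBlinkRight", "eyeLookDownRight", "eyeLookInRight", "eyeLookOutRight",
    "eyeLookUpRight", "eyeSquintRight", "eyeWideRight",
    "jawForward", "jawLeft", "jawRight", "jawOpen",
    "mouthClose", "mouthFunnel", "mouthPucker", "mouthLeft", "mouthRight",
    "mouthSmileLeft", "mouthSmileRight", "mouthFrownLeft", "mouthFrownRight",
    "mouthDimpleLeft", "mouthDimpleRight", "mouthStretchLeft", "mouthStretchRight",
    "mouthRollLower", "mouthRollUpper", "mouthShrugLower", "mouthShrugUpper",
    "mouthPressLeft", "mouthPressRight",
    "mouthLowerDownLeft", "mouthLowerDownRight", "mouthUpperUpLeft", "mouthUpperUpRight",
    "browDownLeft", "browDownRight", "browInnerUp", "browOuterUpLeft", "browOuterUpRight",
    "cheekPuff", "cheekSquintLeft", "cheekSquintRight",
    "noseSneerLeft", "noseSneerRight",
    "tongueOut",
}

def check_arkit_coverage(mesh_morph_names):
    """Report which ARKit shapes a mesh is missing and which non-ARKit extras it carries."""
    names = set(mesh_morph_names)
    missing = sorted(ARKIT_BLENDSHAPES - names)
    extra = sorted(names - ARKIT_BLENDSHAPES)
    print(f"Missing ARKit shapes ({len(missing)}): {missing}")
    print(f"Non-ARKit shapes ({len(extra)}): {extra}")
    return missing, extra

# Hypothetical example: morph target names exported from your own head mesh.
check_arkit_coverage(["jawOpen", "eyeBlinkLeft", "eyeBlinkRight", "smileBig"])
```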
Have they stated anywhere that Metahuman Animator uses the ARKit standard? Seems to me that the Metahuman rig is very very different from the ARKit template…
That Boy was released many years before the Metahuman systems were created. I think it was rigged by the same people, but the complexity of the rig doesn't seem to be that high.
I really think they implied a universal approach for any rig with blendshapes / morph targets, so you could even animate an abstract piece of wood with beans as eyes or whatever you want - without anything resembling an actual Metahuman rig. This is my assumption at this moment.
I suppose the biggest difference between Metahuman Animator and ARKit is that it doesn't rely at all on the iPhone's own detection of facial parameters. It only captures the raw feed (RGB + depth) and streams it to a PC, and then they do the tracking with their own solution (which seems to handle lips much better and to be more oriented towards offline quality for pre-recorded performances instead of real-time), so it can also work with industry-standard dual-camera rigs, not just iPhones.
The Boy and His Kite model was fixed up when they introduced the iPhone's ARKit functionality, in order for the related demonstration to take place.
The standards and definitions are all part of the ARKit specification.
A simple search for ARKit blend shapes would have turned this up.
Currently, this is the only "standard", mostly because Apple makes the only product that is used by the masses and allows face capture by any owner of an iPhone.
Partially, because no one in the dev community has stepped up to put Apple in its place with a better standard - and Metahuman-related stuff will never be that.
I don't see why it would be its own universal animation thing that works with any rig; it looks more to me like it's tailor-made for Metahuman characters.
@MostHost_LA I really don't see what you are basing all your assumptions on, tbh. Unless I'm really missing something obvious, everything seems to point towards a Metahuman rig being fitted to the Boy model rather than Metahuman Animator somehow magically working with any rig (which would be great, don't get me wrong).
The fact that the animations you buy are mostly going to be based on the Apple ARKit standard.
But yes, chances are you need to fit the model onto whatever makes Metahuman Animator work, right on top of it.
The only thing that will always be compatible is the names of the 52 blend shapes…
Again, that's because it's the only "standard".
It's like skeleton nomenclature, though. You can call a bone "hand" or "apple"; it doesn't change anything at all, since what you have is data, and that data can be labeled anything at all.
The contents are what matter…
Of course, if the Metahuman people went in and renamed all the facial morphs to something else, they would literally have done so just to annoy people. So it's 99.9% sure they did not…
They definitely used morphs up to the initial release. After that I'm not sure, but I'm 99% certain they did not mess with it.
And if they added bones for keying on top, that's just good practice - my rigs all have both options.
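And since the labels really are just labels, remapping a custom rig's shape names onto the ARKit names (or the other way around) is trivial in principle. A rough Python sketch below - the custom names in the mapping are hypothetical, and the actual deformation data still has to be authored in your DCC; this only translates the curve names/values that the capture data arrives under.

```python
# Hypothetical mapping from a custom rig's own shape names to the ARKit names
# that incoming capture curves are keyed by. The values are what matter;
# the labels only need to line up on both ends.
CUSTOM_TO_ARKIT = {
    "Mouth_Open":  "jawOpen",
    "Blink_L":     "eyeBlinkLeft",
    "Blink_R":     "eyeBlinkRight",
    "BrowsUp_Mid": "browInnerUp",
}

def remap_frame(arkit_values):
    """Translate one frame of ARKit curve values (name -> 0..1 weight)
    into weights keyed by the custom rig's own shape names."""
    return {custom: arkit_values.get(arkit, 0.0)
            for custom, arkit in CUSTOM_TO_ARKIT.items()}

# Example frame of incoming capture data:
frame = {"jawOpen": 0.42, "eyeBlinkLeft": 1.0, "eyeBlinkRight": 0.97}
print(remap_frame(frame))
```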