What skeleton does MetaHuman use?

Got it; using them as a “preview” is a good way to establish your game world or a specific scene.

If your game has lots of character interaction (à la Mass Effect/The Witcher), it would be good for you to understand exactly what you need for those interactions to work: proper body mocap animations (generic or specific), proper facial animations (recorded together with the body mocap or separately, but either way decoupled from the body animations), eye gaze, some contact behaviour (characters grabbing objects from one another, so a proper IK setup), and NPC behaviour in general.
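On the IK side: contact behaviour like hand-offs usually comes down to a two-bone IK solve per limb (shoulder/elbow or hip/knee). Unreal has nodes for this built in, but as a rough illustration of the underlying math, here is a minimal 2D analytic solver (a hypothetical standalone sketch, not Unreal's API):

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK in 2D.

    Returns (shoulder, elbow) rotations in radians so that a chain of
    bone lengths l1, l2 rooted at the origin reaches target (tx, ty).
    Targets outside the reachable range are clamped to it.
    """
    d = math.hypot(tx, ty)
    # Clamp target distance to the reachable annulus [|l1-l2|, l1+l2]
    d = max(abs(l1 - l2) + 1e-6, min(l1 + l2 - 1e-6, d))
    # Interior elbow angle via the law of cosines
    cos_elbow = (l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2)
    elbow = -(math.pi - math.acos(max(-1.0, min(1.0, cos_elbow))))
    # Shoulder = direction to target plus the offset caused by the bend
    cos_off = (l1 * l1 + d * d - l2 * l2) / (2 * l1 * d)
    shoulder = math.atan2(ty, tx) + math.acos(max(-1.0, min(1.0, cos_off)))
    return shoulder, elbow

def fk(l1, l2, shoulder, elbow):
    """Forward kinematics: end-effector position for the angles above."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

In-engine you'd use the Two Bone IK node (Anim Graph) or Control Rig instead; the point is just that grabbing/contact needs the limb driven by a target, not baked animation alone.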

Nowadays you have a lot of different options for mocap, from a base Rokoko/Perception Neuron mocap suit to an iPhone for facial tracking or Facegood/Faceware, and you can stream the results directly into Unreal and see everything running in realtime.

For dialogue, if you’re concerned about lipsync, you can use Audio2Face, which is free and does a good job translating an audio track into lipsync. You can also try an iPhone, which produces better results than Audio2Face, but you do need to record yourself talking/acting. Or you can go a step up and use Facegood/Faceware, where you literally track and solve the actor’s performance onto a digital character.

I strongly suggest you take a look at what’s available on the market, from mocap animations you can buy on the Marketplace to a full body mocap suit, then decide which way to go.

Even though 99% of the software available can stream into Unreal, you will still need to work on your animations to get them working properly in your game: cutting the animation clips to length, tweaking/smoothing the anim curves, adjusting the body posture, and so on.
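"Smoothing the anim curves" normally happens in the curve editor of your DCC or in Unreal, but conceptually it's just a low-pass filter over the key values. A minimal moving-average sketch (hypothetical standalone code, not any tool's API) to show what the cleanup does to noisy mocap keys:

```python
def smooth_curve(values, window=3):
    """Moving-average smoothing over a list of animation key values.

    window must be odd; endpoints are handled by repeating edge values,
    so the output has the same length as the input.
    """
    assert window % 2 == 1, "window must be odd"
    half = window // 2
    padded = [values[0]] * half + list(values) + [values[-1]] * half
    return [sum(padded[i:i + window]) / window for i in range(len(values))]
```

Real cleanup tools do fancier things (Butterworth filters, key reduction), but the idea is the same: knock the jitter out of raw mocap without flattening the motion.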
