What skeleton does MetaHuman use?

Got it, using them as a “preview” is a good way to establish your game world or specific scene.

If your game has lots of character interaction (à la Mass Effect/The Witcher), it would be good for you to understand exactly what you need in order for the character interactions to work, such as proper body mocap animations (generic or specific), proper facial animations (recorded together with the body mocap or separately, but either way those animations are kept separate from the body animations), eye gaze, some contact behaviour (grabbing objects from one another, so a proper IK setup), and NPC behaviour in general.

Nowadays you have a lot of different options for mocap, from a basic Rokoko/Perception Neuron mocap suit to using an iPhone for facial tracking or Facegood/Faceware, and you can stream the results directly into Unreal and see everything running in real time.
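
As a rough illustration of what that streaming looks like on the code side: most suits ship a Live Link plugin, so you rarely write this yourself, but here is a minimal sketch of polling a Live Link animation subject in C++. The subject name "MocapActor" is a made-up example, and the module would need a "LiveLinkInterface" dependency:

```cpp
// Minimal sketch (assumes the vendor's Live Link source is already connected).
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkAnimationRole.h"
#include "Roles/LiveLinkAnimationTypes.h"

void LogMocapRootBone()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link plugin not loaded.
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // "MocapActor" is a hypothetical subject name; use whatever your suit exposes.
    FLiveLinkSubjectFrameData Frame;
    if (Client.EvaluateFrame_AnyThread(FLiveLinkSubjectName(FName(TEXT("MocapActor"))),
                                       ULiveLinkAnimationRole::StaticClass(),
                                       Frame))
    {
        const FLiveLinkAnimationFrameData* Anim =
            Frame.FrameData.Cast<FLiveLinkAnimationFrameData>();
        if (Anim && Anim->Transforms.Num() > 0)
        {
            UE_LOG(LogTemp, Log, TEXT("Root bone: %s"), *Anim->Transforms[0].ToString());
        }
    }
}
```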

For dialogue, if you're concerned about lipsync, you can use Audio2Face, which is free and does a good job translating an audio track into lipsync. You can also try an iPhone, which produces better results than Audio2Face, but you do need to record yourself talking/acting. Or you can go a step up and use Facegood/Faceware, where you literally track and solve the actor's performance onto a digital character.

I strongly suggest you take a look at what's available on the market, from mocap animations that you can buy on the Marketplace to a full-body mocap suit, then decide which way to go.

Even though 99% of the software available can stream into Unreal, you will need to work on your animations to have them working properly in your game… I'm talking about everything from cutting the animation clips to length, to tweaking/smoothing the anim curves, to tweaking the body posture, and so on.

Understood!

There are quite a few different ways to do… well, everything, so I'm pretty stubborn when there are other methods.

Don’t get me started on AI. I avoid behavior trees like the plague. I’m comfortable with blueprints, and I’m not budging.

Okay, I will check out Audio2Face. I’m learning Faceware and it’s pretty great, but it has trouble tracking me in certain lights… probably because I’m black. Software still has some trouble with darker skin in certain lighting.

I actually fully intend to act in the game (I've got some chops), so it's funny you said that. So far I've successfully made fully functional missions with audio dialogue, combat, object interactions, inventory, etc… (Just no facial animations… and I don't intend to go the way of "Choo-Choo Charles" - shudder).

Lastly, my game will be small in scope: an RPG, yes, with quite a bit of dialogue, but only a handful of characters (six tops). It'll be post-apocalyptic, which really, REALLY helps with scope, let me tell you.

Thank you for the insight! If anything else comes to mind, I'm listening.

I believe I already told you this: hire a professional mocap studio and save yourself three years of work and learning.

Put your time and money into properly creating a custom character that you own the rights to.

First of all, the tech is junk. That's an assessment shared by several others with years of experience in character design. Adjusting a bone structure based on variables is nothing new; it's been around for ages. Adjusting animations based on whatever is nothing new either; it has been around for ages as well.
Unawareness doesn't make something "amazing" to the rest of the industry.

Second of all, it's absolutely not free. Read the licensing for it. It's only "free" when used in specific ways.
This may not matter to a Joe Small making a game that will never see the light of day, but it matters to anyone actually intending to eventually publish. Particularly if you won't be using this sorry excuse of an engine due to its constant performance loss.

Third of all, all that MetaHumans have done since release is devalue the actual artists behind character design.
"Oh, you don't need to hire a character designer, just use a MetaHuman. You won't even need to know how to create a rig or how to create animations! Heck, you won't even need to know what hair cards are or what the tri count of a mesh should be!"…

This has been the most informative conversation on these forums that I've ever read. From the future, thank you guys for putting in the time to share your opinions with the rest of us. LA, I've been running into issues that evoke temptations to switch engines, but there's sunk cost in Unreal. Your opinion is my opinion, but I hammer forward, always telling myself it'll get better. Technically, for the most part, it has, but I'm wondering your brief opinion on the engine in general: does your team still use it in projects? Heavy C++, I assume?

UE4 use is sporadic at best - usually when required by a client, paid for, or on a legacy project.

We rely very heavily on C++; actually, we really only run things off source in custom-built engines, so it's basically only C++.

Blueprint has a few uses for smaller things and prototyping.
For instance, you wouldn't go and code a custom class just to get one actor to turn and face another.
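
For what it's worth, that Blueprint node (Find Look at Rotation) is only a couple of lines in C++ anyway; a minimal sketch, assuming plain AActor pointers and a hypothetical free function:

```cpp
// Minimal sketch: point one actor's forward (X) axis at another, yaw only.
#include "GameFramework/Actor.h"
#include "Kismet/KismetMathLibrary.h"

void FaceOtherActor(AActor* Self, const AActor* Target)
{
    if (!Self || !Target)
    {
        return;
    }

    // Equivalent of the "Find Look at Rotation" Blueprint node.
    const FRotator LookAt = UKismetMathLibrary::FindLookAtRotation(
        Self->GetActorLocation(), Target->GetActorLocation());

    // Keep the actor upright by only applying the yaw.
    Self->SetActorRotation(FRotator(0.f, LookAt.Yaw, 0.f));
}
```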

In general, the engine has really just gotten worse these past 3 or 4 years.

It started to dip around 4.22 with nonsensical updates that added stuff which sounded good in theory but was never fully realized (like RTV, for one).
During COVID it seriously fizzled down into the most inconsistent clump of bloatware, with a side of real chaos thrown in by the nonsensical switch of physics engine to one with the same nomenclature.

By now, the latest engine can't even run a Vulkan project properly. They have had countless core issues and attempts at fixing them which I don't think went anywhere… at least judging by the current trend of forum posts bashing the very same issues.

Nanite could have been a welcome addition, but most people turn it off because it makes things worse or non-functional.

Lumen seems the same, I guess, since shadows get blurred and other issues come up with it on.

Compared to direct competitors, its ray tracing is trash at best, since CryEngine does it a heck of a lot faster by doing it volumetrically
(though Cry has other shortcomings, like needing to code a bunch of things yourself, so it's not for everyone).

On topic, yet non-engine-related…
They doubled down on MetaHumans, which now have a galaxy's worth of bones and weird Control Rig setups.
They created some sort of "animator" thing that apparently takes your face and puts the animation on a MetaHuman via an iPhone (only saw the demo really, so I can't comment too much beyond this: all of the stuff that was filmed already existed and was possible by just doing some modeling legwork).

They added more fluff to the engine, calling it AI-powered, when in reality the only AI in the engine is the Blackboard tree you build yourself… not that those level-replication things don't look cool on tape, btw; the fact is they'll never work on a real project, and they all know it.
(And sure, their idea is that it gets you close so you can do the editing, which is somewhat fair. I'd rather just do the actual work, but that's just me.)

Fortunately (or unfortunately, if that's your viewpoint), I just never really shared any of your optimism towards this engine getting better…
Most projects (all of which use 100% custom-made assets anyway) were migrated out.

I don't really know of anyone who published at all (indie, of course) on anything that's not 4.25 or below in the past 3 or 4 years.
I do know of several who cannot publish if they update their project(s), due to the endless slew of bugs it introduces…

One thing the engine is good for at the moment is passing motion capture into it and using it as a studio/recording spot.
This is only because the face capture layers on top of any Live Link input and is free to use, while also letting you view the actual final performance of an actor's face (only as good as your rigged poses anyway, but it's a step up from most DCCs: they don't have the same low threshold of precision the engine does, and therefore render the expression much better than the engine will, which leads you to believe you have something when you in fact really don't. Running it in engine, you are at least sure it's as good as you can get within it)…
