Training Livestream - Getting Started with Character Morph Targets - May 9 - Live from Epic HQ

WHAT

Ed Burgess joins us to talk about how to get started with morph targets using a character’s head. Ed explores how to bring a face to life in a few steps. He’ll explain how to export, import, and control the mesh using UE4 and 3ds Max. If you’re interested in building a custom character creator for your game or in driving facial animations with UE4, make sure to tune in!
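
For anyone who wants to follow along in code: controlling a morph target at runtime is a single call on USkeletalMeshComponent. A minimal sketch, assuming your imported mesh has a morph target named “Smile”; the function and morph name here are just illustrative:

```cpp
#include "Components/SkeletalMeshComponent.h"

// Sketch: drive a morph target weight at runtime. "Smile" is a
// placeholder; it must match a morph target on your imported mesh.
void ApplySmile(USkeletalMeshComponent* Mesh, float Weight)
{
	if (Mesh)
	{
		// SetMorphTarget is a built-in UE4 API; weights are typically in [0, 1].
		Mesh->SetMorphTarget(TEXT("Smile"), FMath::Clamp(Weight, 0.f, 1.f));
	}
}
```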

Ed has provided an example project to go with this stream. Go to the DOWNLOAD PAGE here

WHEN
Tuesday, May 9th @ 2:00PM ET

WHERE
Twitch
Facebook
Youtube

WHO
Ed Burgess

Feel free to ask any questions on the topic in the thread below, and remember, while we try to give attention to all inquiries, it’s not always possible to answer questions as they come up. This is especially true for off-topic requests, as it’s rather likely that we don’t have the appropriate person around to answer. Thanks for understanding!

Archive:

It seems you accidentally linked ’s Twitter account. :rolleyes:

Very interesting topic. I was wondering if you could make a quick example of driving morph targets using sound, as in this thread: “Experimental support for facial animation” - Feedback & Requests - Epic Developer Community Forums?

Doesn’t have to be super fancy, just a morph target that changes with a sound file playing.
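
For what it’s worth, once you have an amplitude value per frame, the mapping itself is simple. A minimal sketch, assuming an ACharacter subclass with a float member LastWeight, and a hypothetical GetCurrentAmplitude() standing in for whatever audio analysis you use (UE4’s built-in envelope following is still experimental):

```cpp
// Sketch: map a per-frame loudness value to a "MouthOpen" morph target.
// GetCurrentAmplitude() is a hypothetical stand-in for your own audio
// analysis; AMyTalkingHead is assumed to be an ACharacter subclass with
// a float member LastWeight used for smoothing.
void AMyTalkingHead::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);

	const float Amplitude = GetCurrentAmplitude(); // hypothetical, returns [0, 1]

	// Smooth the raw amplitude so the mouth doesn't jitter frame to frame.
	LastWeight = FMath::FInterpTo(LastWeight, Amplitude, DeltaSeconds, 10.f);

	GetMesh()->SetMorphTarget(TEXT("MouthOpen"), LastWeight);
}
```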

[Question] How do you use the facial animation importer and what are its use cases?
[Question] Is it reasonable to update curve data from outside an anim instance? E.g. another asset contains curves for phonemes and I want to send them directly to the animation graph for evaluation.

Great topic. Though I admit I’d love to see a good example setup using the improved 4.16 RBF Pose Driver to drive corrective/muscle-bending morphs.

need some Blender Luv’n - just sayin’ :cool:

Can you post the 3ds file so we can follow along?

Or a tweaked version, if there are copyright issues.

Yes, yes. I tested morph targeting out of Blender a while ago and it works fine in Unreal.
Please cover the Blender workflow too. :slight_smile:

exactly this… :slight_smile:

Hi,
Can we download the project before the livestream?

Good stream, I’ll be watching the recording for sure.

Questions:

  1. The PoseDriver node seems a bit unintuitive to use. Does it need one PoseAsset per corrective? And it barely has any control over value mapping. Can you quickly show how to drive corrective morph targets using the pose reader node from the AnimBP?

  2. How would you do facial animation? With a phoneme-based system or with a FACS setup? (This might be explained in the stream by default, just asking to make sure.)

  3. How would you deal with implementing body corrective morphs that might have influence on a character’s costume? If the character has, say, 20 different costumes he can wear, would the only solution be for EACH of those to have a matching set of correctives?
    In that case, would it be easier to solve rough volume problems on the body with extra bones instead? (I’m currently doing it that way.)

What’s memory performance like for morph targets, anyhow? One of my people is really, really concerned about additive morph targets on characters - mixing ‘smile’ and ‘angry’, say. They’re both float weights, so if you combine them, well, it can get ugly.

I’d like to have the 3ds Max file and the UE4 project as well, to experiment with later after watching the stream. Is that possible?

I suppose a better way to ask the question is to ask about optimization tricks, especially when dealing with multiple morphs on the same area, with regard to hardware performance and load.
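
On the stacking concern specifically: one common safeguard when several morphs drive the same region is to rescale the weights so their sum never exceeds 1. A sketch of that idea; the names and the normalization rule are illustrative, not the engine’s built-in behavior:

```cpp
#include "Components/SkeletalMeshComponent.h"

// Sketch: apply several emotion morphs, but rescale them if their sum
// exceeds 1 so stacked shapes (e.g. "Smile" + "Angry") can't over-drive
// the vertex deltas. Names and the normalization rule are illustrative.
void ApplyEmotionWeights(USkeletalMeshComponent* Mesh, const TMap<FName, float>& Weights)
{
	if (!Mesh)
	{
		return;
	}

	float Sum = 0.f;
	for (const TPair<FName, float>& W : Weights)
	{
		Sum += FMath::Max(W.Value, 0.f);
	}
	const float Scale = (Sum > 1.f) ? (1.f / Sum) : 1.f;

	for (const TPair<FName, float>& W : Weights)
	{
		Mesh->SetMorphTarget(W.Key, FMath::Max(W.Value, 0.f) * Scale);
	}
}
```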

[Question] How would you use a pose asset to do facial animation on the fly rather than using preset curves?

Using Face Mo Cap in UE4 Sequencer

[QUESTION] :: Any way you can describe a pipeline for how to capture facial animation, use it in Sequencer, and get it into Adobe Premiere? Our team’s goal is to use UE4 for a short animated film. Any recommended way to go about this, without having to use Maya or anything else, would be helpful.

Furthermore: would we be able to “record” morph targets - i.e. “Character A raises eyebrows” - in UE4, then use that as a pre-canned animation that can be activated in Sequencer? If I imported a facial model, could I “ragdoll puppet” it in UE4, capture the animation, and then use it the way other animations work in the Sequencer editor?

Hey there,

Was watching the stream and wondering if you could explain the 2-axis radial control you made in UMG to control 4 morph targets at the same time and blend them together. You were using it for mouth movement control.

Later on, if you could release the project for review and reverse engineering, that would also be useful. I think a lot of people would be interested in learning about this system for facial animation and body morphs as well.
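
My guess at the math behind that control, in case it helps others: split the 2D stick value into four non-negative weights, one per direction. A sketch with placeholder morph names; this is not necessarily how the stream’s UMG widget is actually wired up:

```cpp
#include "Components/SkeletalMeshComponent.h"

// Sketch: convert a 2D control value in [-1, 1] x [-1, 1] into four
// morph weights (left/right/up/down mouth shapes). This is a guess at
// the stream's UMG control; the morph names are placeholders.
void ApplyMouthControl(USkeletalMeshComponent* Mesh, FVector2D Stick)
{
	if (!Mesh)
	{
		return;
	}

	Stick = Stick.ClampAxes(-1.f, 1.f);

	// Each direction gets the positive part of the corresponding axis.
	Mesh->SetMorphTarget(TEXT("MouthRight"), FMath::Max(Stick.X, 0.f));
	Mesh->SetMorphTarget(TEXT("MouthLeft"),  FMath::Max(-Stick.X, 0.f));
	Mesh->SetMorphTarget(TEXT("MouthUp"),    FMath::Max(Stick.Y, 0.f));
	Mesh->SetMorphTarget(TEXT("MouthDown"),  FMath::Max(-Stick.Y, 0.f));
}
```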

[Question] Any lipsync feature (à la Source 2) planned for the near future?
[Question] Would you be able to isolate marker locations (green or red) on the face (from an image sequence or a streaming webcam, using multiple raycasting/Blueprint functions), average the pixel locations, place an empty actor at the average location, get its XY movement, and then transfer the motion to the morphs?
Alternatively, could you manipulate the image (getting alphas from the color channels) so that UE4 is able to recognize the shapes (contour of the mouth/eyes, similar to the tracking technology Faceware is using), and get tracking data from them?
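
The averaging-and-mapping step described above could look roughly like this. Everything here is hypothetical: the tracker feeding the marker positions, the rest-pose calibration, and the morph names:

```cpp
#include "Components/SkeletalMeshComponent.h"

// Sketch of the idea above: average tracked marker positions, measure
// the offset from a calibrated rest position, and map the X/Y
// displacement onto two morph targets. The tracker that fills
// MarkerPositions and the morph names are hypothetical.
void ApplyTrackedMarkers(USkeletalMeshComponent* Mesh,
                         const TArray<FVector2D>& MarkerPositions,
                         const FVector2D& RestCenter)
{
	if (!Mesh || MarkerPositions.Num() == 0)
	{
		return;
	}

	FVector2D Center(0.f, 0.f);
	for (const FVector2D& P : MarkerPositions)
	{
		Center += P;
	}
	Center /= MarkerPositions.Num();

	// Displacement from the calibrated rest pose, assumed pre-normalized.
	const FVector2D Delta = Center - RestCenter;

	Mesh->SetMorphTarget(TEXT("JawOpen"),   FMath::Clamp(Delta.Y, 0.f, 1.f));
	Mesh->SetMorphTarget(TEXT("MouthWide"), FMath::Clamp(Delta.X, 0.f, 1.f));
}
```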

Did this get uploaded somewhere? Sadly I couldn’t be there, and I really want to watch it.

Click the latest video here: Twitch