Faceware & Unreal Troubleshooting for MetaHumans

Hi everyone! I am creating this topic for anyone who needs help with troubleshooting when working with MetaHumans and Faceware for facial motion capture.


Hey Gabriella, do you have any tips on calibration/tuning for Faceware 1.5? I get quite mixed results.

I’m using Faceware 1.5 with a DIY HMC and an iPhone. The DIY HMC is solid, and I’m using the iPhone light with a diffuser, capturing at 120fps 720p with the Professional Head Cam setting. I have a profile set up and can sometimes get some really good results, but it’s quite hit and miss, and I’m trying different rest poses depending on the mood I’m going for. Is it standard practice to have to calibrate lots of times until you get a good result on mouth shapes?

The last video you put out on YT was really good. Are you just using Faceware with the Mark IV HMC, or are you cleaning up the animation as well?
Thanks
Jonny


Hi! Quick question: what do you mean by “with a diffuser at 120fps”? I believe iPhones can only stream in at 30fps, which is fine. You want the fps of your phone to match the fps in Faceware.

For calibration/neutral pose, try this. Before you make a neutral pose, try and scrunch your face first, like you just tasted a lemon, then let your face muscles completely relax. That should give you a good neutral pose.

Depending on what part of the face you really want the solver to hone in on (the mouth, for instance), move your camera slightly lower so that it's looking not straight ahead but just a tad lower, angled up toward your nose.

Another tip: the Pro Cam setting does not always give the best results. I find that whether I am using prerecorded DSLR footage or footage from the Mark IV HMC, I sometimes get better results from the Stationary Cam setting.

One last thing: make sure your light isn’t too strong. You want even lighting, so adding a little reading light helps, but if the lighting is too strong and reflects on your lips, that reflection can confuse the solver in the mouth area. It takes some time to really perfect things, so play around with the camera position and lighting before you record or stream and see if you can find the sweet spot for your specific face.

Hi, I mean I have a diffuser over the iPhone light so it isn’t too bright. I stream through the OBS app, where you can choose your output FPS settings; that’s how I’m getting the 120fps. It will go to 240fps, but I thought that would probably be too much. I’m thinking 120 might even be too much.

Thanks I’ll continue playing around with the position and might see if I can tweak it some more.

Thanks

Hello Gabriella, nice idea to have a thread about Faceware Studio. It would be nice to have an example of how you managed to achieve such nice facial mocap in your latest videos. I’m using a good stationary Logitech 60fps webcam, but I have not managed to get the results I would like. Besides the Animation Tuning tab, did you also use the Motion Effects tab to adjust the curves?

Thank you


The left and right sides of the actor’s mouth are tracking differently.
(screenshot attached)

How do I set the mouth in FWS?
Thank you for any help you can offer.

So the thing is, though: regardless of OBS, if you are using an iPhone that is capturing at 30fps, I would leave it at 30 and not mess with it. The highest you can stream into Faceware is 60fps. Faceware will automatically pick up the fps you are streaming in, so I always leave that alone. I have tried increasing the fps in FW but encountered other issues, such as audio sync problems, later on down the road.


Using a webcam to achieve good facial motion can be tricky. First, get even lighting on your face; then make sure your face is close enough to the camera that it takes up the majority of the screen. The closer your face is, the better the solver will be able to track your expressions.

If you are not using the head/neck rotation, try not to move your head a lot. If you notice that moving your head a lot is throwing the solver off (watch the tracking lines around your eyes, nose, brows, and mouth after you calibrate), keeping your head steadier will give you cleaner, smoother data in the end.
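
If you still see some residual jitter in the exported curves, a light smoothing pass in post can also help. Here is a minimal sketch of an exponential moving average in Python; the curve name and sample values are made up purely for illustration, and this is not Faceware’s own filter:

```python
# A minimal smoothing sketch: exponential moving average over per-frame
# animation values. "jaw_open" and its samples are made-up illustration data.
def smooth(values, alpha=0.3):
    """Lower alpha = smoother but laggier; higher alpha = more responsive."""
    out = [values[0]]
    for v in values[1:]:
        out.append(alpha * v + (1 - alpha) * out[-1])
    return out

jaw_open = [0.10, 0.12, 0.45, 0.11, 0.13, 0.40, 0.12]  # jittery raw samples
print([round(v, 3) for v in smooth(jaw_open)])
```

Lower alpha smooths more aggressively but adds lag, so it is worth tuning per curve.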

I personally do not use the male head in Faceware as my reference. The best way to see how the data is translating onto your MetaHuman is to stream directly into Unreal and make your adjustments with the multiplier in Faceware that way. You can also try switching from Stationary Cam to Pro Cam.

I got it. Thank you for your help.

Ah, OK, I will try some different settings. The iPhone I’m using is capable of capturing 240fps natively, and when I enable it on the phone, Faceware picks it up as that FPS, so I assumed it was capturing at the frame rate set in Faceware. I also have it wired over USB. I’ll check to confirm whether Faceware is actually capturing at the frame rate I have set. Thanks for your time and help again.
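
One way I figure I can verify it is to record a short clip and inspect its actual average frame rate with ffprobe. A minimal sketch in Python, assuming ffmpeg/ffprobe is installed and with “clip.mov” as a placeholder for the recording:

```python
# Minimal sketch: check a recorded clip's real average frame rate with
# ffprobe. Assumes ffmpeg/ffprobe is installed; "clip.mov" is a placeholder.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=avg_frame_rate",
     "-of", "default=noprint_wrappers=1:nokey=1",
     "clip.mov"],
    capture_output=True, text=True, check=True,
)
num, den = result.stdout.strip().split("/")  # e.g. "120/1" or "30000/1001"
print(f"average frame rate: {int(num) / int(den):.2f} fps")
```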

Does the multiplier in Faceware refer to the data streaming values of the Animation Tuning sliders?

When I adjust the data streaming value to suit one facial expression, sometimes it doesn’t suit another facial expression.

I posted in your Discord as well, Gabriella. Here is a problem I discovered with the Faceware Studio to Unreal Engine workflow: when the head turns left and right, it looks nice in Faceware Studio, but in Unreal, when the head is (how to say it) in front of the camera (at the initial position), it stops for some milliseconds. This gives a somewhat jittery movement. I believe the same small issues exist in other movements as well. Check here carefully:
(video clip attached)

Do you have the same issues, guys?

Hello Gabriella,

My issue is a straightforward one, really. My trial of Faceware Studio runs out tomorrow, and I spent all my cash on a mic to make tutorials and the massive vet bill for my dog. Other than taking on extra hours and extending my night shift at work, do you have any idea how I can fix the problem and get another 6 months free?

My poverty to one side, I have been having a problem where using an mp4 as a media source does not always work; I mean it won’t play back. Would it be better to use an image sequence in that case?
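
If an image sequence is the way to go, I assume I could extract one from the mp4 with ffmpeg, something like this minimal sketch (assuming ffmpeg is installed; “take01.mp4” and the “frames” folder are placeholder names):

```python
# Minimal sketch: convert an mp4 into a JPEG image sequence with ffmpeg.
# Assumes ffmpeg is installed; "take01.mp4" and "frames/" are placeholders.
import pathlib
import subprocess

pathlib.Path("frames").mkdir(exist_ok=True)
subprocess.run(
    ["ffmpeg", "-i", "take01.mp4",
     "-qscale:v", "2",             # high JPEG quality
     "frames/take01_%05d.jpg"],    # zero-padded frame numbers
    check=True,
)
```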


Anyone know if there are any Faceware Studio updates planned? I was hoping there might be a few refinements, as I still find it kind of hit and miss. It’s never a straightforward operation to get in and record some facial mocap; I normally spend hours trying to refine and recalibrate, only to end up with something I can’t use.
Thanks

OK, so I’ve just been watching the online learning for the MetaHuman workflow with Faceware. There’s lots of great info in there that I didn’t know about; looks like I’ll have to put some more work in instead of just thinking it’s an out-of-the-box solution. If anyone hasn’t seen it, there is some great info in there and it’s well worth it.


Hi Gabriella,

I have a question regarding combining the head rotation animation for the body with a custom animation like a dance or crouch. If I try to bake the animation and then add another animation, it just doesn’t seem to register. Is there any documentation, or do you have a tutorial, on how to do this?

Thanks!

Hello everyone,

I have a question regarding opening the Anim Blueprints in the motion logic folder. I’m receiving multiple errors regarding “shape points”. Does anyone know why this happens?

So there are two motion logic blueprints: the Faceware Live Link plugin has its own motion logic blueprint, and so does Glassbox. It looks like you are trying to use the Glassbox motion logic blueprint with the Faceware plugin. If you go to the Faceware Unreal course on here and download the zip folder, it should be in there. The motion logic BP labeled FWLL is the Faceware one, FYI.

Hey there. I must have missed a step when trying to set up my Faceware Studio to UE5 MetaHumans workflow. I have the Faceware Live Link plugin installed, I’ve imported my MetaHuman character, Live Link is running, and Faceware Studio is successfully streaming. However, it appears that I’m missing the Live Link blueprints that drive the ABPs for the face and body. Where should I look to get these two assets? Thanks.