
Faceware & Unreal Troubleshooting for Metahumans

Hi everyone! I am creating this topic for anyone who needs help troubleshooting when working with Metahumans and Faceware for facial motion capture.


Hey Gabriella, do you have any tips on calibration/tuning for Faceware 1.5? I get quite mixed results.

I’m using Faceware 1.5 with a DIY HMC and an iPhone. The DIY HMC is solid, and I’m using the iPhone light with a diffuser, at 120fps 720p with the Professional Head Cam setting. I have a profile set up and can sometimes get some really good results, but it’s quite hit and miss. I’m trying different rest poses depending on the mood I’m going for. Is it standard practice to have to calibrate lots of times until you get a good result on mouth shapes?

The last video you put out on YT was really good. Are you just using Faceware with the Mark IV HMC, or are you cleaning up the animation as well?
Thanks,
Jonny


Hi! Quick question: what do you mean by “with a diffuser at 120fps”? I believe iPhones can only stream in at 30fps, which is fine. You want the fps of your phone to match the fps in Faceware.

For calibration/neutral pose, try this. Before you make a neutral pose, try and scrunch your face first, like you just tasted a lemon, then let your face muscles completely relax. That should give you a good neutral pose.

Depending on what part of the face you really want the solver to hone in on, like the mouth for instance, move your camera slightly lower so that it’s not looking straight ahead but angled up just a tad, toward your nose.

Another tip: the Pro Cam settings do not always give the best results. I find that whether I am using prerecorded DSLR footage or footage from the Mark IV HMC, sometimes I get better results from the Stationary Cam setting.

One last thing: make sure your light isn’t too strong. You should have even lighting, so adding a little reading light helps, but if the lighting is too strong and reflecting on your lips, that reflection can confuse the solver in the mouth area. It takes some time to really perfect things, so play around with the camera position and lighting before you record or stream, and see if you can find the sweet spot for your specific face.

Hi, I mean I have a diffuser over the iPhone light so it isn’t too bright. I stream through the OBS app, where you can choose your output settings for the FPS; that’s how I’m getting the 120fps. It will go to 240fps, but I thought that would probably be too much. I’m thinking 120 might even be too much.

Thanks, I’ll continue playing around with the position and see if I can tweak it some more.

Thanks

Hello Gabriella, nice idea to have a thread about Faceware Studio. It would be nice to have an example of how you manage to achieve such nice facial mocap in your latest videos. I’m using a good stationary Logitech 60fps webcam, but I have not managed to get the results I would like. Besides the Animation Tuning tab, did you also use the Motion Effects tab to adjust the curves?

Thank you


The left and right sides of the actor’s mouth are different.
[screenshot: Snipaste_2022-01-17_18-39-14]

How do I set the mouth in FWS?
Thank you for any help you can offer.

So the thing is, though: regardless of OBS, if you are using an iPhone that is capturing at 30fps, I would leave it at 30 and not mess with it. The highest you can stream into Faceware is 60. Faceware will automatically pick up the fps you are streaming in, so I always leave that alone. I have tried increasing the fps in FW but encountered other issues later on down the road, such as audio sync.
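If it helps, here’s a rough back-of-the-envelope sketch of why a capture/solve fps mismatch drifts out of sync over a take (just the arithmetic, with made-up numbers, not Faceware’s internals):

```python
# Hypothetical numbers illustrating audio drift from an fps mismatch.
capture_fps = 120   # what the phone actually records
solve_fps = 60      # what the solver assumes it was given
clip_seconds = 10   # real length of the take, by the audio clock

frames = capture_fps * clip_seconds       # 1200 frames captured
playback_seconds = frames / solve_fps     # replayed at 60fps -> 20 seconds
drift = playback_seconds - clip_seconds   # animation ends 10 seconds after the audio

print(f"{frames} frames, plays back in {playback_seconds:.0f}s, drift {drift:+.0f}s")
```

The longer the take, the worse the drift, which is why leaving the fps at whatever Faceware auto-detects is the safe move.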


Using a webcam to achieve good facial motion can be tricky. What you want to do is get even lighting on your face first, then get close enough to the camera that your face takes up the majority of the screen. The closer your face is, the better the solver will be able to track your expressions.

If you are not using the head/neck rotation, try not to move your head a lot. If you notice that moving your head is throwing the solver off (watch those lines around your eyes, nose, brows, and mouth after you calibrate), keep as still as you can; the less you move, the cleaner and smoother the data will be in the end.

I personally do not use the male head in Faceware as my reference. The best way to see how the data is translating onto your Metahuman is to stream directly into Unreal and make your adjustments with the multiplier in Faceware that way. You can also try switching from Stationary Cam to Pro Cam.
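If it helps to picture what the multiplier is doing, here’s a minimal sketch of the idea (my own illustration with hypothetical channel names, not Faceware’s actual code or API): each Animation Tuning slider is basically a per-channel scale applied to the solver’s output before it streams to your Metahuman.

```python
# Illustration only: per-channel multipliers scaling solver output.
# Channel names and values are hypothetical, not Faceware's API.

def tune(channels: dict[str, float], multipliers: dict[str, float]) -> dict[str, float]:
    """Scale each animation channel, clamped to the 0..1 range the rig expects."""
    return {
        name: min(max(value * multipliers.get(name, 1.0), 0.0), 1.0)
        for name, value in channels.items()
    }

frame = {"mouthSmile_L": 0.4, "mouthSmile_R": 0.7, "jawOpen": 0.3}
tuned = tune(frame, {"mouthSmile_L": 1.5})  # boost a channel the solver under-drives
print(tuned)  # mouthSmile_L becomes 0.6; the other channels pass through unchanged
```

Because each slider is one linear scale, a value dialed in for an extreme expression can over- or under-drive a subtler one, so it usually takes some iteration while watching the Metahuman live.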

I got it. Thank you for your help.

Ah ok, I will try some different settings. The iPhone I’m using is capable of capturing 240fps natively, and when I enable it on the phone, Faceware picks it up as that FPS, so I assumed it was capturing at the frame rate set in Faceware. I also have it wired over USB. I’ll check to confirm whether Faceware is actually capturing at the frame rate I have set. Thanks for your time and help again.

Does the multiplier in Faceware refer to the data streaming values of the Animation Tuning sliders?

When I adjust a data streaming value to suit one facial expression, sometimes it doesn’t suit another facial expression.