Face capture pipeline

I’ve done a lot of experimenting with facial capture for Metahuman. Where I am with my research:

1a. I’ve experimented for a few hours with Faceware and a web camera; the results aren’t great.
1b. I haven’t tried using a high-quality camera/lens.

2a. I’ve experimented for about 20 hours with iOS Live Link on a new iPhone over an Ethernet cable. The results are promising.
2b. I haven’t tried incorporating Reallusion’s iClone Unreal Live Link plugin, which I’ve heard improves the capture.
2c. I’m curious whether there are other solutions besides Reallusion’s that work with the new iPhone’s depth camera.

3a. Anything else out there?

It’s best to calibrate the system for your rest face and your extremes, which may include making some adjustments in the AnimGraph.
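To make the calibration idea concrete, here's a minimal sketch (in Python, outside Unreal) of the kind of per-curve remapping an AnimGraph setup would perform: measure each blendshape curve's value at your rest face and at your extreme, then remap the raw capture onto 0..1. The curve names and numbers below are made up for illustration; you'd measure your own by recording a rest pose and a range-of-motion pass.

```python
def calibrate(raw, rest, extreme):
    """Remap a raw capture value so the performer's rest face reads as 0
    and their observed extreme reads as 1, clamped to [0, 1]."""
    span = extreme - rest
    if span <= 0:
        return 0.0
    return min(max((raw - rest) / span, 0.0), 1.0)

# Hypothetical per-curve calibration table measured for one performer.
calibration = {
    "JawOpen":     {"rest": 0.05, "extreme": 0.80},
    "MouthSmileL": {"rest": 0.10, "extreme": 0.65},
}

def apply_calibration(frame):
    """Apply the table to one frame of captured curve values;
    curves without an entry pass through unchanged."""
    return {
        name: calibrate(value, **calibration[name])
        if name in calibration else value
        for name, value in frame.items()
    }

raw_frame = {"JawOpen": 0.42, "MouthSmileL": 0.10}
print(apply_calibration(raw_frame))
```

Because the smile curve's raw value equals its rest value, it correctly reads as 0 after calibration, even though the raw capture never quite sat at zero.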

So is working with the AnimGraph a better route than incorporating iClone or some other program?

It’s been a while since I used iClone, and they’ve released a new version since then, so I’d suggest exploring it as well. No system will be perfect out of the box, so any of them will require some means to calibrate or adjust for your particular face and the character you’re animating. Some setups are easier/faster to adjust than others.