Hi, brand-new beginner to UE here. I'm trying to find a workflow that lets me use an iOS app to track facial movements, then have that data mirrored/brought into UE5 and projected onto a character.
I understand there is plenty of documentation out there for such a workflow, but I've gotten overwhelmed trying to figure out what's even possible on macOS, or where to go from here. I figured the MetaHumans plugin for UE5 would be the way to go (connected with the Live Link Face app for iOS) – but the MH plugin isn't even supported on macOS yet?
Do I even need the MH plugin for UE5 to create the workflow I'm seeking? Or can it be accomplished another way? I'm not necessarily attached to my subject character being a MetaHuman – it can be any kind of character.
Thanks so much to whoever is kind enough to answer this noob-level question!