@SimBim - Here’s a really hasty little test of the facial motion capture in iClone, with only a couple of settings changed.
This wasn’t under ideal circumstances – the only place I could find to mount the phone (rather than holding it) was a bit farther away than recommended, and from that position a window was throwing glare onto my glasses.
I’m still pretty impressed. With proper positioning and some time spent tuning the capture parameters so my face maps more closely onto Alex’s, I suspect I could get something truly seamless. (Alex being my game’s protagonist there, doing freelance work as a test dummy since she happens to already be rigged in iClone.)
(I didn’t bother driving this in Unreal directly, since that seemed like more setup than a quick test was worth; I just rendered Alex straight out of iClone with the recorded expression track.)