Is there any way we can use this on MetaHuman? I mean to record facial animations for MetaHuman.
I second that… buying a new RTX is expensive enough, and buying an Apple device on top of that is too much just to use MetaHuman.
I loved it very much. I would need a tutorial on how to connect this system to the character.
It's possible. With the Android app you get value pairs like {blendshape: float}, so you have to link those values to the curves used in the MetaHuman AnimBlueprint. You would have to do this with Blueprint code.
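As a very rough sketch of just the renaming step (not the app's actual code, and the blendshape and curve names below are placeholders you would need to check against what the app really sends and what the MetaHuman Face_AnimBP exposes): you take the incoming pairs, rename them to the curve names the face rig expects, and then push each value into the AnimBlueprint, e.g. through exposed variables read by a Modify Curve node or a Live Link source. In an actual UE project you would use TMap/FName instead of std types.

```cpp
// Hypothetical sketch: remap the app's {blendshape: float} pairs to the
// curve names a MetaHuman face AnimBP expects. All names are placeholders.
#include <string>
#include <unordered_map>

using CurveValues = std::unordered_map<std::string, float>;

// Placeholder name table: app blendshape name -> MetaHuman curve name.
static const std::unordered_map<std::string, std::string> kAppToMetaHuman = {
    {"jawOpen",      "JawOpen"},
    {"mouthSmile_L", "MouthSmileLeft"},
    {"mouthSmile_R", "MouthSmileRight"},
    {"browInnerUp",  "BrowInnerUp"},
};

// Rename each incoming pair; drop anything we have no mapping for.
CurveValues RemapToMetaHumanCurves(const CurveValues& fromApp)
{
    CurveValues out;
    for (const auto& [appName, value] : fromApp)
    {
        auto it = kAppToMetaHuman.find(appName);
        if (it != kAppToMetaHuman.end())
        {
            out[it->second] = value; // values are already 0..1 floats
        }
    }
    return out;
}
```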
Have you looked into using this? Detect faces with ML Kit on Android | Google Developers
Downloaded it, seems pretty cool!
Hi, I tried to install it on two Android phones and your app says the phone isn't compatible. Where can I see which phones are compatible with your app? Thanks
Hello @paranoio, excellent work!
Is it possible to implement Facemocap for UE 5.1?
Recently I found a YouTube channel with tutorials for animating MetaHuman characters with an iPhone app, but maybe you can make it work with your app… What do you think?
Best regards
Jerome
I could do it, but I don't have too much time these days; I'm focused on getting a job. Besides, I was waiting for ARCore (my app's face-tracking library) to be updated, since currently blinking and eye tracking are not available, which I think is important.