Hi all
We are capturing face motion using the Live Link Face app in UE5.
It works great.
We want the option to capture the face of someone remote who doesn't have UE5 installed, so they can't record the movements in-engine themselves.
Seeing that the app has all the data locally anyway, is there a way to record the data in the app without Unreal and then send it to a developer to place in a scene and apply to a MetaHuman?
The record option in the app seems to only save video.
It uses the animation exported as CSV to create animation curves, but it only works with characters that use the ARKit FACS blendshapes. Perhaps you should contact the seller with your questions.
It seems that the LL Face app records video and CSV files. There are other apps that also record FBX. I have never used the LL app, but perhaps @alberto is correct: if you can record the videos, your iPhone should be able to re-capture the data from the pre-recorded videos. It's worth a try. The remaining problems will be calibration and tweaking the facial expressions.
If your voice actor can record the videos, you can use Faceware Studio, which works with pre-recorded video and has an animation Blueprint already set up for MetaHumans. There is a 30-day trial available, so you can test it out before production.
That is the problem with niche tech: it gets outdated really fast. NVIDIA's Audio2Face is "free", provided you have an RTX graphics card. iClone was my best choice given its compatibility with MetaHuman, DAZ Studio, and Audio2Face, as well as its features for automatic lip-syncing, motion blending, and retargeting.
Well, let's see next year… perhaps iClone won't even exist and there will be a similar new product on the App Store or from Adobe, like my Substance!! I paid a lot and now…
Yes, like Apple acquired Faceshift, and Dynamixyz was acquired by a game studio.
You'll still find Substance available on Steam with a lifetime license, no subscription needed.
Yes, I wasn't really sure the iPhone would do it. Perhaps Faceware is your next choice. It does capture from video, but the actors need to learn the proper intensity of their facial expressions, otherwise it will be difficult to get visually accurate results with an offline approach. It can be calibrated later in the software before transferring to UE5.
Here is a result using video without any calibration whatsoever.
Whenever you initiate a new recording from the Live Link Face app, the capture is recorded to the device in two files:
A .mov file that contains a reference video recorded by the camera…
A .csv file that contains the raw animation data captured by ARKit during the recording. This data file is not currently used by Unreal Engine or by the Live Link Face app. However, the raw data in this file may be useful for developers who want to build additional tools around the facial capture.
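For anyone who wants to build tooling around that .csv, here is a minimal sketch of reading it into per-blendshape animation curves. The `Timecode`/`BlendshapeCount` column names and the tiny sample data are assumptions for illustration (a real capture has ~61 ARKit blendshape columns), so check the header of an actual recording before relying on them:

```python
# Minimal sketch: read a Live Link Face capture CSV into per-blendshape curves.
# Column names below are assumptions; verify against a real capture's header.
import csv
import io

# Tiny stand-in for a real capture file (real files have ~61 blendshape columns).
sample = """Timecode,BlendshapeCount,EyeBlinkLeft,JawOpen
00:00:00:00.000,2,0.05,0.10
00:00:00:01.000,2,0.80,0.12
00:00:00:02.000,2,0.10,0.55
"""

def read_curves(f):
    """Return {blendshape_name: [(timecode, value), ...]} from a capture CSV."""
    reader = csv.DictReader(f)
    curves = {}
    for row in reader:
        timecode = row["Timecode"]
        for name, value in row.items():
            if name in ("Timecode", "BlendshapeCount"):
                continue  # metadata columns, not animation curves
            curves.setdefault(name, []).append((timecode, float(value)))
    return curves

curves = read_curves(io.StringIO(sample))
print(curves["JawOpen"])  # three (timecode, value) samples for the JawOpen shape
```

From there, a developer could resample the values onto a frame rate and bake them onto a MetaHuman's ARKit-named curves inside Unreal.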