Live Link Face offline capture

Hi all,
We are capturing face motion into UE5 using the Live Link Face app.
It works great.
We want the option to capture the face of someone remote who doesn't have UE5 installed, so they can't record the movements themselves.
Seeing that the app has all the data locally anyway, is there a way to record the data in the app without Unreal and then send it to a developer to place within a scene and onto a MetaHuman?

The record option in the app seems to only save video.

Thanks

Hi there,
I haven't tested it, so I am not sure if this is really helpful. The only "offline" converter I know of, which is NOT compatible with UE5/MetaHuman, is this one:
UE4 Marketplace - CSV To Animation Tutorial - YouTube
CSV Animation Converter in Blueprints - UE Marketplace (unrealengine.com)

It uses the animation exported as CSV to create animation curves, but it only works with characters rigged for the ARKit FACS blendshapes. Perhaps you want to contact the seller with your questions.
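
Conceptually, that kind of converter just reads the CSV back into per-blendshape curves. As a rough illustration only (a minimal Python sketch, assuming the usual Live Link Face export layout of a timecode column followed by one column per ARKit blendshape; the file name and frame rate are placeholders, not taken from that plugin):

```python
import csv
from collections import defaultdict

def load_blendshape_curves(csv_path, fps=60.0):
    """Turn a Live Link Face CSV export into per-blendshape curves.

    Assumes the first column is a timecode/frame reference and every
    remaining column is one ARKit blendshape weight in the 0.0-1.0 range.
    Returns {blendshape_name: [(time_in_seconds, value), ...]}.
    """
    curves = defaultdict(list)
    with open(csv_path, newline="") as f:
        reader = csv.DictReader(f)
        blendshape_names = reader.fieldnames[1:]  # skip the timecode column
        for frame_index, row in enumerate(reader):
            t = frame_index / fps  # naive frame -> seconds mapping
            for name in blendshape_names:
                curves[name].append((t, float(row[name])))
    return dict(curves)

# Example: see how many keys each curve ended up with
curves = load_blendshape_curves("MySlate_1_iPhone.csv")
for name, keys in list(curves.items())[:5]:
    print(name, len(keys), "keys")
```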


Well, you could send the video to a computer, play it back, and then capture the screen again as a face, hahah.

Not the best way, but it works haha.

I don't understand why you can record video but not use it later; it makes no sense :+1:

It seems that the LL Face app records video and CSV files. There are other apps that also record FBX. I have never used the LL app, but perhaps @alberto is correct: if you can record the videos, your iPhone should be able to re-capture the data from the pre-recorded videos. It's worth a try. The problem will be related to calibration and tweaking the facial expressions.

If your voice actor can record the videos, you can use Faceware Studio, which uses pre-recorded video and has an animation blueprint already set up for MetaHuman. There is a 30-day trial available, so you can test it out before production.

Faceware Studio and Epic’s MetaHumans - Faceware Technologies, Inc. | Award-winning, Gold Standard Facial Motion Capture Solutions.

But capturing the face from a video on the screen… that's the coolest, haha.

Just joking.

I tried Faceware and others and, nah, sadly nothing compares…

I've done hundreds of facial captures, and when I saw it in real time for the first time… I couldn't believe it.

I heard that LL Face is the best and most accurate. I'll be using Audio2Face/Audio2Emotion combined with iClone Live Link for the lip-syncing part.

And how much does it cost? I bought a second-hand iPhone because of the crazy prices, hehe.

In 2013 I was using Faceshift and Kinect,
and iPi for the body.

I bought licenses.

And now I have more than 100 animations that aren't compatible, and a lot of devices that are good for nothing, haha :heart_hands:

That is the problem with niche tech: it gets outdated really fast. NVIDIA's Audio2Face is "free", provided you have an RTX graphics card. And iClone was my best choice given its compatibility with MetaHuman, DAZ Studio and Audio2Face, as well as its features for automatic lip-syncing, motion blending and retargeting.


Well, let's see next year… perhaps iClone won't even exist and there will be a similar new product in the App Store or from Adobe, like my Substance!! I paid a lot and now…

Yes, like how Apple acquired Faceshift, and Dynamixyz was acquired by a game studio.
You'll still find Substance available on Steam with a lifetime license, no subscription needed.

Look what I found!!! :face_with_peeking_eye:
https://www.ipisoft.com/2022/07/ipi-mocap-live-link-plugin-for-unreal-engine-5-0-released/


The thing is, it's the iPhone's sensor that is used to capture the facial expressions in real time, not video analysis.

Yes, I wasn't really sure the iPhone would do it. Perhaps Faceware is your next choice; it does capture from video, but the actors need to learn the proper intensity of their facial expressions, otherwise it will be difficult to get visually accurate results with an offline approach. It can be calibrated later in the software before transferring to UE5.

Here is a result using video without any calibration whatsoever.


Mmmm

Raw Facial Recordings

Whenever you initiate a new recording from the Live Link Face app, the capture is recorded to the device in two files:

  • A .mov file that contains a reference video recorded by the camera…

  • A .csv file that contains the raw animation data captured by ARKit during the recording. This data file is not currently used by Unreal Engine or by the Live Link Face app. However, the raw data in this file may be useful for developers who want to build additional tools around the facial capture.

I don't understand this part very well.


The plugin I mentioned does the conversion, but not for UE5/MetaHuman:
UE4 Marketplace - CSV To Animation Tutorial - YouTube

I know there are plugins for Maya and Blender, but I am not familiar with their workflows for importing back into UE.
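
If you do end up going through Blender, the rough idea would be something like the sketch below. It's just a minimal, untested illustration of keying shape keys from the CSV, assuming the mesh already has shape keys named exactly like the CSV columns; the object name, file path and one-row-per-frame mapping are placeholders, not taken from any specific plugin:

```python
# Minimal Blender (bpy) sketch: key shape keys from a Live Link Face CSV.
# Assumptions: the mesh has shape keys named exactly like the CSV columns
# (e.g. "jawOpen", "eyeBlinkLeft"); object name, path and frame mapping
# are placeholders.
import csv
import bpy

obj = bpy.data.objects["FaceMesh"]           # mesh that carries the ARKit shape keys
key_blocks = obj.data.shape_keys.key_blocks

with open("/tmp/MySlate_1_iPhone.csv", newline="") as f:
    reader = csv.DictReader(f)
    for frame, row in enumerate(reader, start=1):
        for name, value in row.items():
            kb = key_blocks.get(name)
            if kb is None:                   # skip the timecode / unmatched columns
                continue
            kb.value = float(value)
            kb.keyframe_insert(data_path="value", frame=frame)
```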


Well, at the moment I don't have many more facial captures to do, and this is quick; you just click record.


Do you have some video of how it looks using your setup? I mean, something you did.

No, I don't have that project anymore. Here is an old video from a YT channel demonstrating Faceware.

This is the first retarget I did from the old Faceshift animations.

And here's a quick comparison:


If you use natural ambient light, you reduce the jittering in your recordings.