Live Link Face app CSV

Hello. Does anybody have experience using the CSV files generated by the Live Link Face app? My issue is that I want to generate this face data remotely from different performers/talent and then be able to import it and use it in my project. Maybe it is possible to get the remote recordings, put them on my iPad Pro and Live Link those recordings? Does anyone have experience with this?

@VictorLerp

Well, the CSV is just data, right?
You would have to parse the data and populate the curve/morph target values.

How you go about it depends on how the CSV is laid out.

The better solution would be to convert it to an animation (animating morph targets) within Blender and export an FBX file (with curves).
Thanks to Python scripting, you can also manipulate the CSV file much more easily, imho.
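Roughly like this, for the parsing part. I haven’t verified this against a real take, so treat the column names as an assumption (a Timecode column, a BlendshapeCount column, then one float column per ARKit blendshape) and check your own export:

```python
import csv
from collections import defaultdict

def load_livelink_csv(path):
    """Parse a Live Link Face take CSV into {blendshape_name: [(timecode, value), ...]}."""
    curves = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for name, value in row.items():
                # Skip the metadata columns and any trailing empty cells
                if name in ("Timecode", "BlendshapeCount") or value in (None, ""):
                    continue
                curves[name].append((row["Timecode"], float(value)))
    return dict(curves)

curves = load_livelink_csv("MySlate_1_iPhone.csv")  # hypothetical file name
print(len(curves), "blendshape curves loaded")
```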

There’s currently no existing feature to use the .CSV files, but there’s planned support for an integrated easy-to-use workflow.

2 Likes

Ok thank you for the input!!

Is there any info on how to use this CSV data in Blender?
I’d love to use UE for capturing, but then I want to be able to edit the animation in external tools. As I understand it, I can’t export the recorded morph data from Sequencer, only by transferring it to empty joints first. But the CSV is supposed to be for this purpose, so any workflow tips will be very welcome. (Other packages are OK too; my main is Houdini.)

I was wondering the same thing. Just found this about using the data in Blender. About to read it, but wanted to share it here for you guys.

https://faceit-doc.readthedocs.io/en/latest/epic_utils/
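Even without an add-on, the basic idea in Blender boils down to keyframing shape keys straight from the CSV rows. A bare-bones, untested sketch of that (it assumes a mesh object named “Face” whose shape keys are named exactly like the CSV column headers, and it simply treats one CSV row as one frame):

```python
import bpy
import csv

CSV_PATH = "/path/to/MySlate_1_iPhone.csv"  # hypothetical path
OBJ_NAME = "Face"                           # hypothetical object name

obj = bpy.data.objects[OBJ_NAME]
key_blocks = obj.data.shape_keys.key_blocks

with open(CSV_PATH, newline="") as f:
    # Treat each CSV row as one frame; this ignores the timecode, so dropped
    # frames and the capture rate are not accounted for in this sketch.
    for frame, row in enumerate(csv.DictReader(f), start=1):
        for name, value in row.items():
            # Skip metadata columns, empty cells and blendshapes the mesh doesn't have
            if name in ("Timecode", "BlendshapeCount") or value in (None, "") or name not in key_blocks:
                continue
            kb = key_blocks[name]
            kb.value = float(value)
            kb.keyframe_insert(data_path="value", frame=frame)

# From there, export an FBX with the animation baked in, e.g.:
# bpy.ops.export_scene.fbx(filepath="/tmp/face_take.fbx", bake_anim=True)
```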

Any updates on this? At the moment, of course, we can record “video reference” with the Live Link Face app, but not actually record and play back facial mocap. It records that pesky CSV, but there is no way to import it into Unreal. I just tweeted at Victor in a cheeky way to remind him we’re all jonesing for this ability!

I have not tested this, but apparently this will import the CSV into Blender and can also create the morphs/blendshapes to use with the face Live Link: Faceit: Facial Expressions and Performance Capture - Blender Market

1 Like

Yes, really looking for this functionality asap, especially considering lockdown etc.

Live Link is great, but the talent needs to be providing the input live.

I may be wrong, but does the input rate when using Live Link straight into the Unreal Editor depend on the Editor’s update speed?

e.g. if you’re capturing at 60 fps on the iPhone but the workstation is only running at 25 fps, will the data only be captured at 25 fps?

Or would you get better fidelity by being able to import the CSV at a higher fps?
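What I mean by importing at a higher fps: since every CSV row carries its own timecode, you could in principle resample it to whatever rate you need afterwards, independent of how fast the editor was ticking. A rough sketch of what I’m picturing, assuming the Timecode column is “HH:MM:SS:FF.fff” with FF being frames at the capture rate (I haven’t checked that against an actual export):

```python
def timecode_to_seconds(tc, capture_fps=60):
    """'HH:MM:SS:FF.fff' -> seconds. The FF-at-capture-rate format is an assumption."""
    hh, mm, ss, ff = tc.split(":")
    return int(hh) * 3600 + int(mm) * 60 + int(ss) + float(ff) / capture_fps

def resample(samples, target_fps=60):
    """samples: sorted list of (seconds, value) pairs for one blendshape.
    Returns one linearly interpolated value per output frame, so a 60 fps capture
    isn't limited to whatever rate the editor happened to run at."""
    start, end = samples[0][0], samples[-1][0]
    out, i = [], 0
    for f in range(int((end - start) * target_fps) + 1):
        t = start + f / target_fps
        while i + 1 < len(samples) and samples[i + 1][0] < t:
            i += 1
        t0, v0 = samples[i]
        t1, v1 = samples[min(i + 1, len(samples) - 1)]
        out.append(v0 if t1 == t0 else v0 + (v1 - v0) * (t - t0) / (t1 - t0))
    return out
```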

Ta,

Adam.

1 Like

I’ve used the CSV file in a plugin called **Faceit** in Blender. There’s an option, after you rig the character’s face, to use Live Link’s CSV file for animation.

@VictorLerp Is there any roadmap for importing raw CSV, or any direction you could point us in? I would love to convert the CSV file into an animation asset that could drive the MetaHuman ARKit mapping pose. Right now my recording workflow is destructive, since all enhancements to the ARKit animation are baked into my animation.

It’s a feature we want to add but we currently don’t have a timeline as to when it might be implemented.

3 Likes

Thinking about a workaround: would it be possible to select which animation curves are recorded in Take Recorder? Then we could record only the ARKit blendshape curves while applying additional cleanup logic in the Anim BP (like driving secondary motion, curves, etc.), but that logic would not be baked into the anim file. This way the raw ARKit data would be preserved in an animation.

Edit: It would be helpful to be able to choose the point of recording in the chain of evaluation inside an Anim BP. For example, if there are three nodes:

  1. LiveLink Pose
  2. Modify Curve
  3. Output Pose

If we could set the recording to happen after Node 1, it would make the whole system a lot more flexible. We could still modify the animation during recording to get accurate live previz/monitoring, but keep the flexibility to change it after recording. Hope this makes sense.

It’s possible. I ask the actors to record their performances with their iPhones. They share the .zip files, and I parse the .csv files with Python inside Cinema 4D.

I wish there was an option to record the performance as an FBX.

Could you share more details about this process?

I wrote a Python script that parses those Live Link Face .csv files and redirects the raw data to a dummy cube that contains all the pose morphs, named as they are supposed to be in Unreal. Then I apply Key Reduction (Timeline) to get rid of most of the jitter. After that, I run another script that transfers the animation to my characters. From there, it’s a matter of fine-tuning the performance pass by pass, from the larger movements down to the small details. I use Motion Clips to deal with the tongue. Never fails.
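The key reduction step is nothing exotic, conceptually: throw away any key that linear interpolation already predicts. My production script is tied to my rigs, but a minimal stand-in for that one step would look something like this:

```python
def reduce_keys(keys, tolerance=0.01):
    """keys: list of (frame, value) pairs sorted by frame.
    Drops any key whose value the line from the last kept key to the following key
    already predicts within `tolerance`; crude, but it removes most per-frame jitter.
    """
    if len(keys) < 3:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        f0, v0 = kept[-1]
        f1, v1 = keys[i]
        f2, v2 = keys[i + 1]
        predicted = v0 if f2 == f0 else v0 + (v2 - v0) * (f1 - f0) / (f2 - f0)
        if abs(v1 - predicted) > tolerance:
            kept.append(keys[i])
    kept.append(keys[-1])
    return kept
```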

4 Likes

Hello! Just wanted to show my absolute interest in this feature. Also, @Leo_Saramago, are you planning some sort of release of that script? Thank you so much!

1 Like

Hi! It wouldn’t be of much help, because I wrote it as a prototype, meaning it was not written to be universal. My rigs are very specific. Sorry, I don’t have the time to pursue that goal.

The good news is I know for a fact that the Live Link Face app developers have plans to make the .csv files readable in Unreal. This will allow everyone to have performances captured remotely.

1 Like

Any update on this?
Thanks.

4 Likes