Download

Face capture with Android - Metahuman - Download Links | Free | Open Source | Demo

Update:

Current bugs and limitations:

  • The Android app doesn’t handle client disconnection perfectly; you might have to restart the app if you don’t get any incoming data.

  • The project requires the free ‘TCP Socket plugin’

  • The biggest limitation right now is converting the incoming data into blend weights. Putting together a robust system to generate this data can be fairly complex and advanced. Currently I’m simply interpolating between 0.0 and 1.0 based on the average vertex-to-vertex distance, over a subset of vertices defined per blendshape, between each vertex’s current position and its position when the blendshape is maxed out.
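
As a rough illustration of that interpolation, here is a minimal Python sketch. The post only mentions averaging vertex-to-vertex distances, so normalizing by the neutral-to-max travel distance is an assumption on my part:

```python
import math

def blend_weight(current, neutral, maxed):
    """Weight in [0, 1] from average vertex-to-vertex distances.

    current / neutral / maxed are parallel lists of (x, y, z) points for the
    subset of face vertices assigned to one blendshape.
    """
    # average remaining distance from the current pose to the maxed-out pose
    d_cur = sum(math.dist(c, m) for c, m in zip(current, maxed)) / len(maxed)
    # average full neutral-to-max travel distance (assumed normalizer)
    d_full = sum(math.dist(n, m) for n, m in zip(neutral, maxed)) / len(maxed)
    if d_full == 0.0:
        return 0.0
    # 0.0 at the neutral pose, 1.0 at the maxed-out pose, clamped in between
    return min(max(1.0 - d_cur / d_full, 0.0), 1.0)
```

A pose halfway between neutral and maxed-out yields a weight of 0.5 under this scheme.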

You can download both the Apk and the Unreal Project here:
https://drive.google.com/drive/folders/1xmO57xO6RTyviZEkmkYsE2YY4AAzxlsp?usp=sharing

Android app to stream Face data over Network:

Unreal Engine Project to read this stream of data, generate blend weights, and hook it into LiveLink:


Hello Unreal Community, 

Opening a new topic regarding my recent effort (24 hours, literally) to get Android face capture going. The goal is to create an Android alternative to the Apple FaceAR solution, with MetaHuman compatibility.

I am starting from scratch and everything is going to be open source, starting with the Android app, which sends ARCore face data to Unreal Engine through TCP (using the free TCP Socket plugin for Unreal).
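
To illustrate the idea of streaming face points over TCP, here is a minimal Python sketch. The app’s actual wire format is not documented in this thread, so the count-prefixed little-endian float32 layout below is purely an assumption:

```python
import socket
import struct

def send_face_points(sock, points):
    # Hypothetical wire format: uint32 point count, then x, y, z as
    # little-endian float32 per point. The real app's protocol may differ.
    payload = struct.pack("<I", len(points))
    payload += b"".join(struct.pack("<fff", *p) for p in points)
    sock.sendall(payload)

def _recv_exact(sock, n):
    # TCP is a byte stream, so keep reading until n bytes have arrived.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the socket mid-message")
        buf += chunk
    return buf

def recv_face_points(sock):
    (count,) = struct.unpack("<I", _recv_exact(sock, 4))
    data = _recv_exact(sock, count * 12)  # 3 float32 = 12 bytes per point
    return [struct.unpack_from("<fff", data, i * 12) for i in range(count)]
```

The length prefix matters because TCP gives no message boundaries; the receiver must know how many bytes one face frame occupies.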

This is what I have so far (recorded on a Galaxy S9):

quote: "wow wow wow wow wow wow"
https://www.youtube.com/watch?v=_kLu96hfot0
https://i.imgur.com/w20bTLD.png

Got the head bone movement imported; time to lay the groundwork for blendshapes!
Now that the whole Android part is usable enough and sends the data I need, I am back in known territory.


we need this!


Everything is working just fine!


Free and open source, updated the various links.


Updated the App presentation photo :smiley:

I fixed the demo project requirements:
The initial version required 4.26.2-chaos built from source, whereas it now works with 4.26 downloaded from the Epic Games Launcher.

Hi, the project file provided on Google Drive gives an error when using regular Unreal Engine 4.26.2. Here is the error screen:

error


Does the rebuild fail?
You can also try the version available on GitHub:

Downloading the source code zip file:

  • I get a “need manually source building” error after trying to rebuild the Google Drive version (as seen at the top of the picture):

  • The GitHub version asks for an Unreal Engine version, and picking 4.26 gives the bottom error in the picture:


This is a C++ project; you’re going to need your Unreal Engine installation to be set up to handle such a project (installing Visual Studio, etc.).

Thanks, I rebuilt with VS2019 and followed all the steps in the video. It works on my side now.
I am looking forward to it being improved for more accurate results (such as blinks).
Keep up the good work.


I always wanted to figure out if it’s possible to convert face mesh landmarks to blendshapes. It’s great to see your implementation. Thank you!

In your project, you use the AndroidBlendshape assets to accomplish this. I am wondering how you get the values in these AndroidBlendshape assets? Do you have utilities to generate these assets?

If you right-click in the content browser you should see the category “FaceAndroid”, which allows you to create two objects:

  • AndroidBlendShape: a blendshape location, a subset of relevant indices from the face, and the positions of those face points when the blendshape is maxed out.

  • AndroidBlenshapeWeights: holds a map of blend weights computed at runtime, purely based on aggregated distances between each relevant face point’s runtime position and its maxed-out blendshape position.

Using distances like this is really basic; I wouldn’t be surprised if machine learning were the industry standard beyond this point for converting points to weights. I might give it a shot later if I find relevant data.
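
Conceptually, the two asset types and the runtime weight map could look like the Python sketch below (in Python rather than Unreal C++; field names and the "jawOpen" shape name are illustrative, and normalizing against the neutral pose is my assumption):

```python
import math
from dataclasses import dataclass

@dataclass
class AndroidBlendShape:
    name: str
    indices: list        # "Vertices to Control": face-mesh indices this shape uses
    max_positions: list  # "Vertices Max Mvt": those vertices at full blendshape

def compute_weight_map(shapes, mesh, neutral):
    """AndroidBlendshapeWeights-style map: shape name -> weight in [0, 1].

    mesh and neutral are full face-mesh vertex lists; each shape only looks
    at its own subset of indices.
    """
    weights = {}
    for s in shapes:
        pairs = list(zip(s.indices, s.max_positions))
        # aggregated remaining distance to the maxed-out pose
        d_cur = sum(math.dist(mesh[i], m) for i, m in pairs)
        # aggregated full neutral-to-max travel (assumed normalizer)
        d_full = sum(math.dist(neutral[i], m) for i, m in pairs)
        w = 0.0 if d_full == 0 else 1.0 - d_cur / d_full
        weights[s.name] = min(max(w, 0.0), 1.0)
    return weights
```

The resulting name-to-weight map is the shape of data LiveLink-style consumers typically expect.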

Hi, thank you for the detailed explanation. There are two arrays in the AndroidBlendShape asset, Vertices to Control and Vertices Max Mvt, and both have hundreds of elements. I think it’s quite difficult to fill these arrays by hand, so I am wondering whether you have any utilities. Thank you.



On the default demo map there’s a blueprint representing the face point indices, which also allows the extraction of relevant indices:

From the blueprint viewport, go into top view/lit mode and adjust the inclusion box extent/position (the extent, not the scaling) to select the indices you want to extract (e.g. the points around the left eye).

Once your box is placed, go into the construction script, select the blendshape you want to send the extracted indices to, toggle the safety boolean off, and compile the blueprint.

You have extracted: Vertices to Control.
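
The box-extraction step the blueprint performs can be sketched as an axis-aligned inclusion test in Python (function and parameter names are mine, not the blueprint’s):

```python
def extract_indices(points, box_center, box_extent):
    """Indices of face points inside an axis-aligned inclusion box.

    Mirrors the workflow above: place a box over a region of the face
    (e.g. around the left eye) and collect the vertex indices it contains.
    box_extent is the half-size of the box along each axis.
    """
    lo = [c - e for c, e in zip(box_center, box_extent)]
    hi = [c + e for c, e in zip(box_center, box_extent)]
    return [i for i, p in enumerate(points)
            if all(lo[k] <= p[k] <= hi[k] for k in range(3))]
```

Those indices would then be written into a blendshape’s Vertices to Control array, which is what the construction script does when the safety boolean is toggled off.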

I see, that’s great, thank you for the reply.