Current bugs and limitations:
The Android app doesn’t handle client disconnection perfectly; if you stop receiving data, you may have to restart the app.
The project requires the free ‘TCP Socket Plugin’.
The biggest limitation right now is converting the incoming data into blend weights. Putting together a robust system to generate this data can get fairly complex. Currently I simply interpolate between 0.0 and 1.0 based on the average vertex-to-vertex distance, over a subset of vertices defined per blendshape, between each vertex’s current position and its position at the blendshape’s maximum.
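To make that concrete, here is a minimal sketch of the idea (not the actual project code, which lives in Blueprints; the function and parameter names are mine): for one blendshape, take the tracked vertex subset, measure how far each vertex still is from its max-pose position, and map that linearly to a weight between 0.0 and 1.0.

```python
import math

def blend_weight(current, rest, target, indices):
    """Estimate a weight in [0, 1] for one blendshape.

    current/rest/target: lists of (x, y, z) vertex positions for the live
    mesh, the neutral mesh, and the blendshape's fully-applied pose.
    indices: the subset of vertex indices tracked for this blendshape.
    """
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    # Average remaining distance from each tracked vertex to its max-pose position.
    remaining = sum(dist(current[i], target[i]) for i in indices) / len(indices)
    # Full travel: average distance from the rest pose to the max pose.
    full = sum(dist(rest[i], target[i]) for i in indices) / len(indices)
    if full == 0.0:
        return 0.0
    # Linear interpolation: 0.0 at rest, 1.0 at the blendshape's max.
    return min(1.0, max(0.0, 1.0 - remaining / full))
```

A vertex halfway between rest and max then yields a weight of 0.5; clamping keeps noisy tracking from pushing the weight outside [0, 1].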
You can download both the APK and the Unreal project here:
Android app that streams face data over the network:
Unreal Engine project that reads this data stream, generates blend weights, and hooks them into Live Link:
Hello Unreal Community! Opening a new topic about my recent effort (literally 24 hours) to get Android face capture going. The goal is to create an Android alternative to Apple’s Face AR solution, with MetaHumans compatibility. I am starting from scratch and everything is going to be open source, beginning with the Android app, which sends ARCore face data to Unreal Engine through TCP (using the free TCP Socket Plugin for Unreal). This is what I have so far (recorded on a Galaxy S9): https://www.youtube.com/watch?v=_kLu96hfot0 https://i.imgur.com/w20bTLD.png
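For anyone curious what “sending face data over TCP” can look like on the wire, here is an illustrative framing sketch. This is an assumption, not the app’s actual protocol: each packet carries a 4-byte little-endian vertex count followed by three float32 values (x, y, z) per vertex, matching how ARCore exposes face mesh positions.

```python
import struct

def pack_face_packet(vertices):
    """Serialize a list of (x, y, z) tuples into one packet:
    <u32 count> then count * 3 little-endian float32 values."""
    payload = struct.pack("<I", len(vertices))
    for x, y, z in vertices:
        payload += struct.pack("<3f", x, y, z)
    return payload

def unpack_face_packet(data):
    """Parse a packet produced by pack_face_packet back into tuples."""
    (count,) = struct.unpack_from("<I", data, 0)
    return [struct.unpack_from("<3f", data, 4 + i * 12) for i in range(count)]
```

A length-prefixed format like this matters over TCP because the stream has no message boundaries of its own; the receiver (the Unreal side, via the TCP Socket Plugin) needs the count to know where one face snapshot ends and the next begins.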