Note:
the facial tracking of ARCore is not as good and is somewhat limited compared to the iOS app;
for example, it won't detect blinking.
I hope this app serves as a starting point for anyone who wants to experiment with facial movement.
Maybe, if the app gets enough users, the ad revenue will let me implement better tracking with a commercial tracking SDK.
Enjoy and have fun,
and please share if you find the app useful.
Wow! Can you explain how to use it with Unreal? A video tutorial? Thanks! This is the first alternative to the iPhone solution, which is especially great if you don't like iPhones or don't have one, like me.
For those who are searching for “mocap”: in your app store, search for “AugmentedFace”.
Install the app on your phone, make sure you are connected to your Wi-Fi, and run the app (it should show you your IP address).
Download the example project, open map0, and run the game; it should ask you for the IP, then just press the button.
How it works:
the Android app creates a TCP server and sends all the tracked data every frame;
the UE4 game acts as a client which connects to the Android TCP server.
The UE4 game uses a TCP plugin to make the connection, and there is one Blueprint which uses the TCP component to
parse all the data coming from the server.
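To make the client side concrete outside of UE4, here is a minimal sketch of a standalone TCP client in C++ that connects to the phone and reads the per-frame stream. The port number (5000) and the newline-delimited text payload are assumptions for illustration only; the actual app defines its own port and data layout, and in the example project this role is played by the TCP plugin and the parsing Blueprint.

```cpp
// Minimal standalone TCP client sketch (POSIX sockets).
// Assumptions: the phone's IP is passed as argv[1]; the port (5000) and the
// newline-delimited text payload are hypothetical -- the real app defines
// its own port and wire format.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <string>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: %s <phone-ip>\n", argv[0]);
        return 1;
    }
    const int kPort = 5000;  // hypothetical port, check the app for the real one

    int sock = socket(AF_INET, SOCK_STREAM, 0);
    if (sock < 0) { std::perror("socket"); return 1; }

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(kPort);
    if (inet_pton(AF_INET, argv[1], &addr.sin_addr) != 1) {
        std::fprintf(stderr, "bad ip\n");
        return 1;
    }
    if (connect(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        std::perror("connect");
        return 1;
    }

    // Read the stream and print one line per tracked frame
    // (assuming a newline-delimited text format).
    std::string pending;
    char buf[4096];
    ssize_t n;
    while ((n = recv(sock, buf, sizeof(buf), 0)) > 0) {
        pending.append(buf, static_cast<size_t>(n));
        size_t pos;
        while ((pos = pending.find('\n')) != std::string::npos) {
            std::printf("frame: %s\n", pending.substr(0, pos).c_str());
            pending.erase(0, pos + 1);
        }
    }
    close(sock);
    return 0;
}
```

Compile it with `g++ client.cpp -o client` and run `./client <phone-ip>` while the app is showing its IP address.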
Nice! This is really great. @paranoio Will you make it open-source and put it up on GitHub or somewhere? I would definitely love to work on the blinking-detection feature! Cheers.
I had not thought about sharing the source; I guess it's possible if more users ask for it.
Blink detection is not currently provided by ARCore (which is responsible for the tracking),
and the overall face tracking needs to be improved, so
I'm kind of waiting for Google to update the ARCore framework. They already added a depth image,
so it should be possible for them to improve the tracking, at least for phones with LiDAR.
Hi, you may use any TCP client you prefer; there are some free TCP clients available on the Marketplace, but you will have to modify the example project to use the plugin you chose.
Hi, first of all, I love the app, it's really useful. I also wanted to know whether it is possible to use the app over a wired connection instead of the provided wireless connection, because there seems to be a huge delay between what I do on the phone and the actions that happen in the demo project.
We reached 1,000 users!!!
Thank you, everyone. Please send your comments and share your work; I may not answer all the time, but let's help each other.
@nywixtv I want to create a tutorial, but it's been a busy few months; I can't promise when, but I will try to do it.
@TheAlienJD I know that on some wireless networks it could be slow. As I recall, the code in the app will look for the first network connection on the device, so I guess you could try connecting a direct Ethernet cable to your device and disabling the Wi-Fi, but again, I don't know if that would work.
Hey, thanks for the quick reply. Sadly my PC is connected with an Ethernet cable and my phone is connected through Wi-Fi; the router is fairly close by, but it still lags. I'm not a genius at coding or anything, but I'm fairly sure it should be possible to change the connection type to USB. But if that is not an option currently, is there a method where I can use my PC webcam instead of the phone's camera? Because that would be awesome.
Hi, I just tried the app and it works perfectly. I am definitely going to try this on some cartoon characters I have. Any idea how I could export the point data to a 3D software? I am familiar with the marker-based facial mocap method in 3D using a pre-recorded video, but this live thing is amazing. I am doing cinematics, so this app would definitely be useful if there were any way to export the point animation data as FBX or Alembic. This is cool; I am using Ethernet for my PC and Wi-Fi for my Moto X4, and they seem to work pretty well.
Hello Paranoio,
Thank you for this nice code; I'm testing with my OnePlus 5T and it's working fine.
With the MetaHuman project soon to be released, do you know of a way to link the data coming from an Android device to those characters?
@Mostah_cg Hey, I was thinking about this a few months ago; looking into the code of UE4, it seems possible.
But there is a catch: I require either an iPhone X (or above) or someone to send me a network capture of the UDP data transmitted by the iPhone to the UE4 computer.
I don't want to get an iPhone just for this,
so to try it I need that UDP network traffic. If anyone reading this post wants to help, just run Wireshark or some UDP capture utility
while using the iPhone with Live Link and send me the captured data.
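For anyone willing to help, here is a small sketch of a UDP capture utility as an alternative to Wireshark. It binds a port and dumps every incoming datagram to a file with a length prefix so the capture can be shared and replayed later. The port value (11111) is an assumption; use whatever target port the Live Link app on the iPhone is actually configured to stream to.

```cpp
// Minimal UDP capture sketch (POSIX sockets).
// Assumption: the iPhone's Live Link app is configured to stream to this
// machine, and the port below (11111) matches the port set in the app.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

int main() {
    const uint16_t kPort = 11111;  // set this to the port the Live Link app targets

    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { std::perror("socket"); return 1; }

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(kPort);
    if (bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        std::perror("bind");
        return 1;
    }

    // Each datagram is written as: 4-byte length prefix + raw payload.
    std::FILE* out = std::fopen("livelink_capture.bin", "wb");
    if (!out) { std::perror("fopen"); return 1; }

    char buf[65536];
    for (;;) {
        ssize_t n = recvfrom(sock, buf, sizeof(buf), 0, nullptr, nullptr);
        if (n < 0) { std::perror("recvfrom"); break; }
        uint32_t len = static_cast<uint32_t>(n);
        std::fwrite(&len, sizeof(len), 1, out);
        std::fwrite(buf, 1, len, out);
        std::fflush(out);
        std::printf("captured datagram of %u bytes\n", len);
    }
    std::fclose(out);
    close(sock);
    return 0;
}
```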
@paranoio Thanks for the app! It's great! I would like to use it to control a 3D character, but I'm new to UE and have no idea how to move forward. Do you have any tips or a tutorial you could point me to so I can try to get something done?