The facial tracking of ARCore is not as good and is somewhat limited compared to the iOS app; for example, it won't detect blinking.
I hope this app serves as a starting point for anyone who wants to experiment with facial movement.
Maybe if the app gets enough users, the ad revenue will let me implement better tracking with a commercial tracking SDK.
Enjoy and have fun, and please share if you find the app useful.
I hadn't thought about sharing the source; I guess it's possible if more users ask for it.
Blink detection is not currently provided by ARCore (which is responsible for the tracking),
and the overall face tracking needs to be improved, so
I'm kind of waiting for Google to update the ARCore framework. They already added a depth image,
so it should be possible for them to improve the tracking, at least for phones with LiDAR-style depth sensors.
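Since ARCore's Augmented Faces only exposes a face mesh (no blink blendshape), one workaround is to approximate blinking yourself from eyelid vertex positions with an eye-aspect-ratio heuristic. The sketch below is an assumption, not part of the app or of ARCore; which mesh indices correspond to the eyelids and corners, and the threshold value, would have to be worked out per device:

```python
# Sketch: blink detection from face-mesh vertices via an
# eye-aspect-ratio (EAR) heuristic. ARCore does not provide this;
# the vertex choices and threshold here are assumptions to tune.

def eye_aspect_ratio(upper, lower, left_corner, right_corner):
    """Ratio of vertical eyelid gap to horizontal eye width.
    Each argument is an (x, y, z) vertex taken from the face mesh."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    return dist(upper, lower) / dist(left_corner, right_corner)

def is_blinking(ear, threshold=0.15):
    # Threshold is a guess; calibrate it against an open-eye baseline.
    return ear < threshold
```

The same idea works for mouth-open detection by swapping in lip vertices.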
Hi, first off, love the app, it's really useful. I also wanted to know: is it possible to use the app over a wired connection instead of the provided wireless connection? There seems to be a huge delay between what I do on the phone and the actions that happen in the demo project.
We reached 1000 users!!!
Thank you everyone. Please send your comments and share your work; I may not answer all the time, but let's help each other.
@nywixtv I want to create a tutorial, but it's been some busy months. I can't promise when, but I will try to do it.
@TheAlienJD I know it can be slow on some wireless networks. As I recall, the code in the app looks for the first network connection on the device. I guess you could try connecting a direct Ethernet cable to your device and disabling Wi-Fi, but again, I don't know if that would work.
Hey, thanks for the quick reply. Sadly my PC is connected with an Ethernet cable and my phone is connected through Wi-Fi; the router is fairly close by, but it still lags. I'm not a genius at coding, but I'm fairly sure it should be possible to change the connection type to USB. But if that is not an option currently, is there a method where I can use my PC webcam instead of the phone's camera? That would be awesome.
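On the USB idea: `adb reverse` can forward a port from the phone to the PC over USB, but it only forwards TCP sockets, so if the app streams over UDP the datagrams would need to be wrapped in a TCP stream first. A minimal framing sketch (an assumption about how one could tunnel it, not something the app supports today) is to prefix each datagram with its length:

```python
# Sketch: length-prefixed framing so UDP-style datagrams can travel
# over a TCP tunnel such as `adb reverse tcp:<port> tcp:<port>`.
# The port and transport change are hypothetical, not app features.

import struct

def frame_datagram(payload: bytes) -> bytes:
    """Prefix a datagram payload with its 4-byte big-endian length."""
    return struct.pack(">I", len(payload)) + payload

def read_datagrams(stream: bytes):
    """Split a received TCP byte stream back into whole datagrams.
    Trailing incomplete frames are left for the next read."""
    out, i = [], 0
    while i + 4 <= len(stream):
        (n,) = struct.unpack(">I", stream[i:i + 4])
        if i + 4 + n > len(stream):
            break  # incomplete frame; wait for more bytes
        out.append(stream[i + 4:i + 4 + n])
        i += 4 + n
    return out
```

The phone side would send framed payloads to a localhost TCP port that `adb reverse` maps to the PC; the PC side unwraps them and re-emits UDP locally.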
Hi, just tried the app and it works perfectly. I am definitely going to try this for some cartoon characters I have. Any idea how I could export the point data to any 3D software? I am familiar with the marker-based facial mocap method in 3D with a pre-recorded video, but this live thing is amazing. I am doing cinematics, so this app would definitely be useful if there were a way to export the point animation data as FBX or Alembic. This is cool. I am using Ethernet for my PC and Wi-Fi for my Moto X4, and they seem to work pretty well.
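There is no FBX/Alembic export in the app, but if you can log the tracked points per frame on the receiving side, a plain CSV is enough for a Blender or Maya Python script to turn into keyframes. The field layout below is hypothetical, just one way to lay the data out, not the app's actual format:

```python
# Sketch: dump per-frame tracked points to CSV so a DCC-side script
# can rebuild the animation. Columns are a made-up layout, not a
# format the app emits.

import csv

def write_point_animation(path, frames):
    """frames: list of frames; each frame is a list of (x, y, z) tuples."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["frame", "point", "x", "y", "z"])
        for fi, points in enumerate(frames):
            for pi, (x, y, z) in enumerate(points):
                w.writerow([fi, pi, x, y, z])
```

From there, a Blender import script can set one keyframe per point per frame, and Blender itself can export the result to Alembic.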
Thank you for this nice code; I'm testing with my OnePlus 5T and it's working fine.
With the MetaHuman project soon to be released, do you know of a way to link the data coming from an Android device to those characters?
@Mostah_cg Hey, I was thinking about this a few months ago; looking into the code of UE4, it seems possible.
But there is a catch: I need either an iPhone X (or above), or someone to send me a network capture of the UDP data transmitted by the iPhone to the UE4 computer.
I don't want to get an iPhone just for this,
so to try it I need that UDP network traffic. If anyone reading this post wants to help, just run Wireshark or some UDP capture utility
while using the iPhone with Live Link, and send me the captured data.
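If you don't have Wireshark handy, a tiny script can stand in as the capture utility: point the iPhone's Live Link target at the capture machine and dump whatever arrives. The port number below is only a guess; use whichever port the UE4 Live Link source is configured to listen on (and run this instead of UE, since only one process can bind the port):

```python
# Sketch: minimal UDP capture utility. Writes each received payload
# to disk, length-prefixed, so the packets can be replayed later.
# The port is an assumption; match it to your Live Link settings.

import socket

def capture_udp(port, out_path, max_packets=1000):
    """Bind a UDP socket and append each received datagram to a file."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    with open(out_path, "wb") as f:
        for _ in range(max_packets):
            data, _addr = sock.recvfrom(65535)
            # 4-byte big-endian length prefix keeps packet boundaries.
            f.write(len(data).to_bytes(4, "big") + data)
    sock.close()
```

Replaying the file later is just the reverse: read a 4-byte length, read that many bytes, and send them to UE's port.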
@paranoio Thanks for the app! It's great! I would like to use it to control a 3D character, but I'm new to UE and have no idea how to go forward. Do you have any tips or a tutorial you could point me to so I could try to get something done?