WHAT
This stream will cover the FaceAR Sample project recently released by Epic. Zak Parrish will discuss what the sample includes, how it works, and how you can use the ARKit tech to drive your own facial animation projects or use facial capture data in other ways!
At this time, no. Currently only the iPhone X has the necessary hardware and only ARKit (Apple) has an API we can tap into for this kind of feature. Not to say it’ll be that way forever; it’s just the way it is right now.
Wow! This was something I got really excited about when the iPhone X came out and I saw the emoji stuff. All I was thinking was "how can I rip this data to use for games?" Can't wait to see what you guys have learned and see what tools I should invest in for this!
Just wishing to piggyback on this thread since it was mentioned in the stream.
I've been trying to set up remote compiling from a PC to a Mac for iOS development, specifically for AR and hopefully this, and I have been pulling my hair out. I followed the documentation and scraped every bit of info from the forums, AnswerHub, and guides around the net. I had a post on the forum explaining my problem (and another on AnswerHub). I will circle back to this project, but I had to put it on hold.
Any tips from anyone who has experience setting this up would be great.
Would it be possible (and straightforward) to do facial mocap for UE4 (Paragon characters) using F-clone or Brekel Pro Face 2? (I believe both use the Kinect v2.)
I'm very interested in the next step, which is importing my own customized FBX head into the project.
I have tried to do that without success. I followed all the steps in the original doc and the video.
Can you please give us a clue about how we can import our own models into the example?
I would really appreciate any clue or tip to get this working.
We create our own 3D characters and need to implement facial capture in real time.
We are using Unity but are trying to switch to Unreal.
Hey Robert, did you manage to import your own character? Are there any new tutorials? I want to set this up with my own character too, but can’t get it to work.
Hey, I tried to deploy this project to my iPhone but I get this error: "Remote compiling requires a server name. Please specify one in the remote server name settings field." I also followed the Face AR documentation with a model that has the 51 blendshapes from Polywink, but had no luck. I'm working on a Windows machine; could that be the problem? Thanks in advance, guys, any help would be appreciated.
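For anyone hitting that error: when building for iOS from Windows, UE4 needs the Remote Build fields filled in under Project Settings → Platforms → iOS, which are saved to the project's `Config/DefaultEngine.ini`. A minimal sketch of what that section looks like, where the hostname and username below are placeholders you'd replace with your own Mac's details:

```ini
; Config/DefaultEngine.ini
; mac-build-host and builder are placeholder values for your Mac's
; hostname (or IP) and the macOS account used for the remote build.
[/Script/IOSRuntimeSettings.IOSRuntimeSettings]
RemoteServerName=mac-build-host
RSyncUsername=builder
```

After setting these, the editor will prompt you to generate an SSH key for the connection the first time you package for iOS, so the Mac must be reachable over SSH from the Windows machine.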
We're struggling to get a custom character set up as well. I got it to push to our iPhone, and Live Link shows the character moving in the editor, but there's no movement in the phone version.
I was wondering if the "searching for a face to track" pop-up is really necessary? It happens often, even in well-lit rooms. I've never noticed this kind of behavior with Animojis or the Face Cap app. Why is the UE4 ARKit integration so sensitive? I'm going to try disabling the check in the blueprint and see what happens.
Hi,
I understand that I need a Mac to build the app to the iPhone but can I then use UE4 on Windows to connect or will the live link only work if I’m using a Mac?
Still hoping someone can share some insight into why the ARKit tracking is so finicky compared to all the other options. It keeps throwing up the "looking for a face to track" dialog.