Unreal Engine Livestream - Introducing the FaceAR Sample - August 9 - Live from Epic HQ


WHAT
This stream will cover the FaceAR Sample project recently released by Epic. Zak Parrish will discuss what the sample includes, how it works, and how you can use the ARKit tech to drive your own facial animation projects or use facial capture data in other ways!

WHEN
Thursday, August 9th @ 2:00PM ET - Countdown

WHERE
Twitch
YouTube
Facebook

WHO
Zak Parrish - Sr. DevRel Tech Artist - [@Zakparrish](http://www.twitter.com/zakparrish)
Tim Slager - Community Manager - @Kalvothe
Amanda Bott - Community Manager - [@amandambott](http://twitter.com/amandambott)

ARCHIVE

Can we use ARCore for this too? If not, why not?

Excited to watch this stream!

At this time, no. Currently only the iPhone X has the necessary hardware and only ARKit (Apple) has an API we can tap into for this kind of feature. Not to say it’ll be that way forever; it’s just the way it is right now.

Wow! This was something I got really excited about when the iPhone X came out and I saw the Animoji stuff. All I was thinking was "how can I rip this data to use for games?" Can't wait to see what you guys have learned and to see what tools I should invest in for this!

cool nice style

And Mass Effect's animators just set their calendars.

I am so excited for both :D, using an iPhone X to do mocap and Zak Parrish being the one who talks about it.

Looking forward to watching this one; I've learned a lot by listening to Zak since Unreal 3. Good man.

Just want to piggyback on this thread since it was mentioned in the stream.

Been trying to set up remote compiling from a PC to a Mac for iOS development, specifically for AR and hopefully this, and have been pulling my hair out. I followed the documentation and scraped every bit of info from the forums, AnswerHub, and guides around the net. I had a post on the forum explaining my problem (and another on AnswerHub). I will circle back to this project but had to put it on hold.

Any tips from anyone who has experience setting this up would be great.

@ActualZak

Would it be possible (and straightforward) to do facial mocap for UE4 (Paragon characters) using F-clone or Brekel Pro Face 2? (I believe both use Kinect v2.)

Thanks in advance

Amazing job!! Congrats!!

I tried the sample and it worked perfectly!

I'm very interested in the next step, which is importing my own customized FBX head into the project.
I have tried to do that without success, even following all the steps in the original doc and the video.

Can you please give us a clue about how we can import our own models into the example?

Is there another doc where we can find that out besides the original one:
https://docs.unrealengine.com/en-us/…R/FaceARSample ?

I would really appreciate any clue or tip to get this working.
We create our own 3D characters and need to implement facial capture in real time.
We are using Unity but are trying to switch to Unreal.

Thanks in advance!
Robert.
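[Editor's note: one common stumbling block with custom heads is that Live Link drives morph targets by matching curve names, so a character whose morph targets don't use the ARKit blend shape names (e.g. `jawOpen`, `eyeBlinkLeft`) receives no animation. A minimal sketch of the name-remapping idea, in plain Python with illustrative (not official) morph target names:]

```python
# Sketch: ARKit face tracking publishes blend shape coefficients in the
# range 0.0-1.0, keyed by names such as "jawOpen". Name-based matching
# means a custom head needs morph targets with matching names, or a remap
# from ARKit names to the character's own names. The right-hand names
# below are hypothetical examples, not the FaceAR sample's exact list.

ARKIT_TO_MORPH = {
    "jawOpen": "Jaw_Open",
    "eyeBlinkLeft": "Blink_L",
    "eyeBlinkRight": "Blink_R",
    "mouthSmileLeft": "Smile_L",
}

def remap_curves(arkit_frame):
    """Rename ARKit coefficients to the character's morph target names,
    dropping any coefficient the character has no target for."""
    return {
        ARKIT_TO_MORPH[name]: value
        for name, value in arkit_frame.items()
        if name in ARKIT_TO_MORPH
    }

# One captured frame: the unmapped "browInnerUp" curve is simply ignored.
frame = {"jawOpen": 0.8, "eyeBlinkLeft": 0.2, "browInnerUp": 0.5}
print(remap_curves(frame))  # {'Jaw_Open': 0.8, 'Blink_L': 0.2}
```

In UE4 itself this remapping is done with a Live Link retarget asset rather than a script, but the principle is the same: every incoming curve name must resolve to a morph target on the mesh.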

Hey Robert, did you manage to import your own character? Are there any new tutorials? I want to set this up with my own character too, but can’t get it to work.

Hi, I'd love to know how it's possible to export a recording to FBX with blend shape animation?

Hey, I tried to deploy this project to my iPhone but I get this error: "Remote compiling requires a server name. Please specify one in the remote server name settings field". I also followed the Face AR documentation with a model that has the 51 blendshapes from Polywink but haven't had any luck. I'm working on a Windows machine; could that be the problem? Thanks in advance, guys; any help would be appreciated.

We're struggling to get a custom character set up as well. I got it to push to our iPhone, and Live Link shows the character moving in the editor, but there's no movement on the phone version.

I was wondering if the "searching for a face to track" pop-up is really necessary? It happens often, even in well-lit rooms. I've never noticed this kind of behavior with Animoji or the Face Cap app. Why is the UE4 ARKit integration so sensitive? I'm going to try disabling the check in the blueprint and see what happens.

Hi,
I understand that I need a Mac to build the app for the iPhone, but can I then use UE4 on Windows to connect, or will Live Link only work if I'm using a Mac?

Yes, you need a Mac to compile. And yes, you can use a Windows machine: Live Link will connect between the iPhone and the Windows machine running UE4.

Still hoping someone can share some insight into why the ARKit tracking is so finicky compared to all the other options. It keeps throwing up the "looking for a face to track" dialog.