Unreal Engine Livestream - Introducing the FaceAR- August 9 - Live from Epic HQ


  • Unreal Engine Livestream - Introducing the FaceAR- August 9 - Live from Epic HQ

    [Stream banner image: 20180807.png]




    WHAT
    This stream will cover the FaceAR Sample project recently released by Epic. Zak Parrish will discuss what the sample includes, how it works, and how you can use the ARKit tech to drive your own facial animation projects or use facial capture data in other ways!

    WHEN
    Thursday, August 9th @ 2:00PM ET

    WHERE
    Twitch
    Youtube
    Facebook

    WHO
    Zak Parrish - Sr. DevRel Tech Artist - @Zakparrish
    Tim Slager - Community Manager - Kalvothe
    Amanda Bott - Community Manager - @amandambott

    ARCHIVE
    Last edited by Kalvothe; 10-05-2018, 01:38 PM.

  • replied
    Still hoping someone can share some insight into why the ARKit tracking is so finicky compared to all the other options. It keeps throwing up the "looking for a face to track" dialogue.



  • replied
    Originally posted by davidqvist:
    Hi,
    I understand that I need a Mac to build the app to the iPhone but can I then use UE4 on Windows to connect or will the live link only work if I'm using a Mac?
    Yes, you need a Mac to compile. And yes, you can use a Windows machine - Live Link will connect between the iPhone and the Windows machine running UE4.
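
    (Side note for the Windows-editor half of that setup: the sample project already ships with the needed plugins enabled, but if you rebuild this in your own project, the .uproject needs roughly the entries sketched below. Plugin names are from the 4.20-era sample and are an assumption worth double-checking in the Plugins browser for your engine version.)

        "Plugins": [
            { "Name": "LiveLink", "Enabled": true },
            { "Name": "AppleARKit", "Enabled": true },
            { "Name": "AppleARKitFaceSupport", "Enabled": true }
        ]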



  • replied
    Hi,
    I understand that I need a Mac to build the app to the iPhone but can I then use UE4 on Windows to connect or will the live link only work if I'm using a Mac?



  • replied
    I was wondering if the "searching for a face to track" pop-up is really necessary? It happens often even in well-lit rooms. I've never noticed this kind of behavior with Animojis or the Face Cap app. Why is the UE4 ARKit setup so sensitive? I'm going to try disabling the check in the Blueprint and see what happens.
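
    (For anyone else poking at that check, a rough C++ sketch of the kind of tracking test involved, using UE4's AugmentedReality module. The helper name is made up and the header paths may differ per engine version; the UARBlueprintLibrary / UARFaceGeometry calls are the engine ones as I understand them, so verify before relying on this.)

        #include "CoreMinimal.h"
        #include "ARBlueprintLibrary.h"   // UARBlueprintLibrary (AugmentedReality module)
        #include "ARTrackable.h"          // UARTrackedGeometry / UARFaceGeometry

        // Illustrative helper: true while ARKit reports an actively tracked face,
        // i.e. the situation in which the "searching for a face" prompt would be hidden.
        static bool HasTrackedFace()
        {
            // Overall session quality: NotTracking / OrientationOnly / OrientationAndPosition.
            if (UARBlueprintLibrary::GetTrackingQuality() == EARTrackingQuality::NotTracking)
            {
                return false;
            }

            // Look for a face geometry that is still in the Tracking state.
            for (UARTrackedGeometry* Geometry : UARBlueprintLibrary::GetAllGeometries())
            {
                if (const UARFaceGeometry* Face = Cast<UARFaceGeometry>(Geometry))
                {
                    if (Face->GetTrackingState() == EARTrackingState::Tracking)
                    {
                        // Face->GetBlendShapeValue(EARFaceBlendShape::JawOpen) etc. would
                        // give the per-curve capture values at this point.
                        return true;
                    }
                }
            }
            return false;
        }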



  • replied
    We're struggling to get a custom character set up as well. I got it to push to our iPhone, and Live Link shows the character moving in the editor, but there's no movement on the phone version.



  • replied
    Hey, I tried to deploy this project to my iPhone but I get this error: "Remote compiling requires a server name. Please specify one in the remote server name settings field". I also followed the Face AR documentation with a model that has the 51 blendshapes from Polywink, but haven't had any luck. I'm working on a Windows machine - could that be the problem? Thanks in advance, guys; any help would be appreciated.
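
    (In case it helps: that particular error is about the remote-build fields under Project Settings > Platforms > iOS, which end up in Config/DefaultEngine.ini. A minimal sketch with placeholder values is below - you still need a reachable Mac with Xcode and the SSH/rsync pairing described in the remote-build docs, since, as noted elsewhere in the thread, this sample has to be compiled on a Mac.)

        [/Script/IOSRuntimeSettings.IOSRuntimeSettings]
        ; Placeholder values: use your Mac's hostname or IP and the account used for rsync/ssh.
        RemoteServerName=my-mac.local
        RSyncUsername=macbuilduser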



  • replied
    Hi, I'd love to know how to export a recording to FBX with blend shape animation.



  • replied
    Originally posted by Robert-77:
    Amazing job!! Congrats!!

    I tried the sample and it worked perfectly!

    I'm very interested in the next step, which is to import my own customized FBX head into the project.
    I have tried to do that without success. I followed all the steps in the original doc and the video.

    Can you please give us a clue how can we import our own models to the example?

    Is there another doc where we can find that out besides the original doc:
    https://docs.unrealengine.com/en-us/...R/FaceARSample ?

    I will really appreciate any clue or tip to get the work done.
    We create our own 3d characters and need to implement facial capture in real time.
    We are using Unity but we are trying to switch to Unreal.

    Thanks in advance!
    Robert.
    Hey Robert, did you manage to import your own character? Are there any new tutorials? I want to set this up with my own character too, but can't get it to work.
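
    (Not a full answer, but for anyone stuck at the same point: the FaceARSample doc linked above routes the incoming ARKit curves onto your own mesh by remapping curve names to your morph target names via a Live Link remap asset. The sketch below only illustrates that name-mapping idea as a plain helper; the function and the morph target names on the right are made up, and the actual hook is the remap asset described in the doc.)

        #include "CoreMinimal.h"

        // Illustrative only: translate ARKit blend shape curve names to a custom rig's
        // morph target names. In practice this logic lives in the remap asset, and the
        // exact incoming curve names depend on the sample/engine version.
        static FName RemapARKitCurveToMyRig(FName InCurveName)
        {
            static TMap<FName, FName> CurveMap;
            if (CurveMap.Num() == 0)
            {
                CurveMap.Add(FName("jawOpen"),       FName("MyHead_JawOpen"));
                CurveMap.Add(FName("eyeBlinkLeft"),  FName("MyHead_BlinkL"));
                CurveMap.Add(FName("eyeBlinkRight"), FName("MyHead_BlinkR"));
                // ...one entry per blend shape your rig actually has
            }

            if (const FName* Mapped = CurveMap.Find(InCurveName))
            {
                return *Mapped;
            }
            return InCurveName; // pass anything unmapped straight through
        }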



  • replied
    Amazing job!! Congrats!!

    I tried the sample and it worked perfectly!

    I'm very interested in the next step, which is to import my own customized FBX head into the project.
    I have tried to do that without success. I followed all the steps in the original doc and the video.

    Can you please give us a clue how can we import our own models to the example?

    Is there another doc where we can find that out besides the original doc:
    https://docs.unrealengine.com/en-us/...R/FaceARSample ?

    I will really appreciate any clue or tip to get the work done.
    We create our own 3d characters and need to implement facial capture in real time.
    We are using Unity but we are trying to switch to Unreal.

    Thanks in advance!
    Robert.



  • replied
    ActualZak

    Would it be possible (and straightforward) to do facial mocap for UE4 (Paragon characters) using F-clone or Brekel Pro Face 2? (I believe both use Kinect v2)

    Thanks beforehand



  • replied
    Just wish to piggyback on this thread since it was mentioned in the stream.

    I've been trying to set up remote compiling from a PC to a Mac for iOS development, specifically for AR and hopefully this, and I have been pulling my hair out. I followed the documentation and scraped every bit of info from the forums, AnswerHub, and guides around the net. I have a post on the forum explaining my problem (and another on AnswerHub). I will circle back to this project but had to put it on hold.

    Any tips from anyone who has experience setting this up would be great.



  • replied
    Looking forward to watching this one. I've learned a lot by listening to Zak since Unreal 3. Good man.



  • replied
    I am so excited for both: using an iPhone X to do mocap, and Zak Parrish being the one who talks about it.



  • replied
    And Mass Effect's animators just set their calendars.

