Unreal Engine Livestream - Introducing the FaceAR Sample - August 9 - Live from Epic HQ

    WHAT
    This stream will cover the FaceAR Sample project recently released by Epic. Zak Parrish will discuss what the sample includes, how it works, and how you can use the ARKit tech to drive your own facial animation projects or use facial capture data in other ways!

    WHEN
    Thursday, August 9th @ 2:00PM ET - Countdown

    WHERE
    Twitch
    YouTube
    Facebook

    WHO
    Zak Parrish - Sr. DevRel Tech Artist - @Zakparrish
    Tim Slager - Community Manager - Kalvothe
    Amanda Bott - Community Manager - @amandambott

    ARCHIVE

  • replied
    We have followed the instructions and have everything running.
    However, Live Link does not seem to be active: the Windows 10 app does not show any of the updates the iOS app should be sending, even though the iOS app does detect the face and updates within the app itself.
    -> Both machines are on the same network and can see each other (verified with a small test setup using OSC communication).
    -> Tried this in 4.20.3 and 4.22.1.
    Do we need to add any special rules to the (default) Windows firewall?

    In the docs (https://docs.unrealengine.com/en-US/...ple/index.html) it says:
    "Launch your Project, and in Project Settings set Supports AR to On."
    I have searched back and forth, but I cannot find this setting in the Project Settings. Could that be the cause?

    Thanks in advance for any tips that will push us in the right direction towards a working setup.
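
    One angle worth checking (a sketch, not a confirmed fix): Live Link discovery runs over the UDP Messaging plugin, which by default uses the multicast endpoint 230.0.0.23:6666, and multicast is a common casualty of Windows firewalls and some routers. Assuming stock plugin settings, the relevant config on the editor machine looks roughly like this (the StaticEndpoints address is a placeholder for the phone's IP):

    Code:
    ; DefaultEngine.ini -- UDP Messaging transport used by Live Link
    [/Script/UdpMessaging.UdpMessagingSettings]
    EnableTransport=True
    UnicastEndpoint=0.0.0.0:0
    MulticastEndpoint=230.0.0.23:6666
    ; If multicast discovery is blocked, try pinning the phone directly:
    +StaticEndpoints=192.168.1.50:6666

    An inbound Windows firewall rule allowing UDP on that port for the editor process is the other usual suspect.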

  • replied
    I've been using the FaceAR sample for quite a while now, and I've also been experimenting with optimizing it and making it easier to use in order to drive different characters.

    You can see an example of a retargeted character here.

    You can either retarget the KiteBoy AnimBP to your character, or you can read the values from each morph target curve (as I did in the video above) and apply them to your character; a rough sketch of that second approach is below.
    To increase quality I also added an option to switch between 30 and 60 fps, which improves the facial tracking: it does look better and smoother, so it's something you might want to try.
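
    For the curve-driven approach, here is a minimal UE4 C++ sketch (an illustration under assumptions, not the sample's actual code): it copies a few ARKit-style anim curves from a source mesh's anim instance onto another mesh's morph targets. The curve names follow Apple's ARKit blend shape naming; adjust them to whatever your Live Link subject actually publishes, and call this from Tick or your anim update.

    Code:
    #include "Components/SkeletalMeshComponent.h"
    #include "Animation/AnimInstance.h"

    // A few of the ~51 ARKit blend shape curves; extend as needed.
    static const FName GFaceCurves[] = {
        TEXT("jawOpen"), TEXT("eyeBlinkLeft"), TEXT("eyeBlinkRight")
    };

    void CopyFaceCurves(USkeletalMeshComponent* Source,
                        USkeletalMeshComponent* Target)
    {
        UAnimInstance* Anim = Source ? Source->GetAnimInstance() : nullptr;
        if (!Anim || !Target)
        {
            return;
        }
        for (const FName& Curve : GFaceCurves)
        {
            // GetCurveValue returns 0 for a missing curve, which is a
            // safe default weight for a morph target.
            Target->SetMorphTarget(Curve, Anim->GetCurveValue(Curve));
        }
    }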

    Together with IKinema Orion and the Hi5 VR gloves, I'm now able to get a true full-body motion capture setup, and it's working nicely. I also did a live demo of the entire setup, and people were shocked when they found out its overall cost: it's really something innovative and easy to use, and the pricing is far from being as expensive as professional solutions like Cubic Motion/Vicon.

  • replied
    Should remote build work for this project when launching to device from Windows 10, using a Mac on macOS Mojave with an updated Xcode as the remote? I'm on 4.22.1 and getting BUILD ERROR (missing receipt, target not built, missing UE4 binary). I don't know whether I broke my setup while updating to Mojave and to 4.22 on both machines, or whether this project has to be launched to the device directly from a Mac with Xcode.

  • replied
    Why does the Live Link connection fail when I package the FaceARSample?

  • replied
    What would the steps be to replace the character in the demo with our own?

  • replied
    Still wondering if anyone has found a workaround for the UE4 ARKit losing tracking in this Face AR sample. I've never seen another TrueDepth ARKit face-tracking app on the iPhone lose tracking the way this one does. It's hard to believe I'm alone here.

  • replied
    Still hoping someone can share some insight into why the ARKit tracking is so finicky compared to all the other options. It keeps throwing up the "looking for a face to track" dialogue.

  • replied
    Originally posted by davidqvist View Post
    Hi,
    I understand that I need a Mac to build the app for the iPhone, but can I then use UE4 on Windows to connect, or will Live Link only work if I'm using a Mac?
    Yes, you need a Mac to compile. And yes, you can use a Windows machine: Live Link will connect between the iPhone and the Windows machine running UE4.

  • replied
    Hi,
    I understand that I need a Mac to build the app for the iPhone, but can I then use UE4 on Windows to connect, or will Live Link only work if I'm using a Mac?

  • replied
    I was wondering whether the "searching for a face to track" pop-up is really necessary? It happens often, even in well-lit rooms. I've never noticed this kind of behavior with Animoji or the Face Cap app. Why is the UE4 ARKit integration so sensitive? I'm going to try disabling the check in the Blueprint and see what happens.
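
    For anyone poking at the same check, here is a hedged C++ equivalent of what that Blueprint test presumably boils down to (assumes an active AR session; the types are from UE4's stock AugmentedReality module):

    Code:
    #include "ARBlueprintLibrary.h"
    #include "ARTrackable.h"

    // Returns true while ARKit reports at least one actively tracked
    // face -- the case where the "searching" prompt should stay hidden.
    bool HasTrackedFace()
    {
        for (UARTrackedGeometry* Geo : UARBlueprintLibrary::GetAllGeometries())
        {
            UARFaceGeometry* Face = Cast<UARFaceGeometry>(Geo);
            if (Face && Face->GetTrackingState() == EARTrackingState::Tracking)
            {
                return true;
            }
        }
        return false;
    }

    Relaxing the prompt so it only appears after the state has been non-tracking for a second or two might be a gentler experiment than removing the check outright.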

  • replied
    We're struggling to get a custom character set up as well. I got it to push to our iPhone, and Live Link shows the character moving in the editor, but there's no movement in the version on the phone.

  • replied
    Hey, I tried to deploy this project to my iPhone but I get this error: "Remote compiling requires a server name. Please specify one in the remote server name settings field". I also followed the Face AR documentation using a model with the 51 blendshapes from Polywink, but haven't had any luck. I'm working on a Windows machine; could that be the problem? Thanks in advance, guys, any help would be appreciated.
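
    That error points at the iOS remote-build settings, which live under Project Settings > Platforms > iOS (Remote Build section) or directly in config. A sketch of the relevant entries (the host name and user are placeholders for your own Mac):

    Code:
    ; DefaultEngine.ini -- remote build settings for iOS from Windows
    [/Script/IOSRuntimeSettings.IOSRuntimeSettings]
    RemoteServerName=my-mac.local
    RSyncUsername=builduser

    Building from Windows is supported this way for code projects, but the Mac still performs the actual compile, so it must be reachable over SSH.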

  • replied
    Hi, I'd love to know how it's possible to export a recording to FBX with the blend shape animation.

  • replied
    Originally posted by Robert-77 View Post
    Amazing job!! Congrats!!

    I tried the sample and it worked perfectly!

    I'm very interested in the next step, which is to import my own customized FBX head into the project.
    I have tried to do that without success. I followed all the steps in the original doc and the video.

    Can you please give us a clue about how we can import our own models into the example?

    Is there another doc where we can find that out besides the original one:
    https://docs.unrealengine.com/en-us/...R/FaceARSample ?

    I would really appreciate any clue or tip to get this working.
    We create our own 3D characters and need to implement facial capture in real time.
    We are using Unity, but we are trying to switch to Unreal.

    Thanks in advance!
    Robert.
    Hey Robert, did you manage to import your own character? Are there any new tutorials? I want to set this up with my own character too, but can't get it to work.
