Advice please: facial tracking to drive the animation response

hi - I’m an artist in research mode, interested in the potential of UE. I wish to use the facial tracking data of a viewer who stands in front of a screen, and to use that data to control/trigger the pre-set responses of an animated character.

Basically: if the viewer looks at the character, the character responds one way; if the viewer looks away, it responds another way. (The viewer would be about 3 feet from the camera, so eye tracking is not possible.)

Q: Are there facial tracking plug-ins/packages for use in UE?
Q: can these be used to target and trigger event sequences of animation for a UE character?
Q: can this work as a published gaming file on a Mac or PC?

thank you!

You could place a huge gyroscope tracking device on the player’s forehead that sends rotation and accelerometer linear movement, sort of like the iPhone or Oculus. There is also a Kinect plugin released for UE4. You could use OpenCV and the many tracking solutions out there, and there are other ways to do it: with shadows, normal-map conversion, geometry reconstruction, or the difference in pixels over a set number of frames. If you check the difference in size between the two halves of a person’s face, it’ll probably work to determine which way they’re looking. The first suggestion is the easiest choice.
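That face-halves idea is simple enough to sketch in a few lines of Python. Note this is just a rough illustration of the heuristic, not any library's API: the landmark names and the 0.6 threshold are placeholders you'd tune against whatever tracker (OpenCV, Kinect, etc.) actually supplies the landmark x-coordinates.

```python
def head_yaw_ratio(left_cheek_x, nose_x, right_cheek_x):
    """Ratio of the apparent widths of the two face halves.

    Close to 1.0 means the face is roughly frontal (viewer looking at
    the screen); values far below 1.0 mean the head is turned, because
    perspective shrinks the far half of the face. Landmark
    x-coordinates are pixels, left-to-right in the camera image.
    """
    left_half = nose_x - left_cheek_x
    right_half = right_cheek_x - nose_x
    if min(left_half, right_half) <= 0:
        return 0.0  # degenerate detection; treat as "looking away"
    return min(left_half, right_half) / max(left_half, right_half)


def is_looking_at_screen(left_cheek_x, nose_x, right_cheek_x, threshold=0.6):
    """Classify one frame as looking-at vs looking-away (threshold is a guess to tune)."""
    return head_yaw_ratio(left_cheek_x, nose_x, right_cheek_x) >= threshold
```

For example, landmarks at (100, 200, 300) give a ratio of 1.0 (frontal), while (100, 260, 300) gives 40/160 = 0.25, i.e. the head is turned away.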

a huge gyroscope… uh huh. thanks. Maybe I could tape a map over their face, too.

I was thinking more along the lines of what’s available in Brekel or FaceShift.

Haha, yeah, couldn’t help myself. Well, you can use FaceShift by creating an interface that collects data from the tool and modifies any attributes in game. Building it that way means you’d be using their application as a base to collect the data, which would probably require their approval for resale; I’m not sure. You could also check out the Kinect plugin and recreate the tool yourself in probably less than a month. In case you didn’t see it, the Kinect plugin helper scene is available in the Marketplace under the Blueprint section.
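The "interface to collect data" part usually amounts to listening on a network port for whatever the tracking app streams. A minimal sketch in Python, assuming a made-up packet layout (the real FaceShift wire format is documented in its SDK, so the two little-endian floats here are purely illustrative, as are the port number and field names):

```python
import socket
import struct

# Hypothetical packet layout: head yaw then head pitch, two
# little-endian 32-bit floats. Replace with the tracker's real format.
PACKET_FORMAT = "<2f"
PACKET_SIZE = struct.calcsize(PACKET_FORMAT)


def parse_tracking_packet(payload: bytes) -> dict:
    """Decode one datagram from the tracking app into named values."""
    yaw, pitch = struct.unpack(PACKET_FORMAT, payload[:PACKET_SIZE])
    return {"yaw": yaw, "pitch": pitch}


def listen(port=33433):
    """Yield decoded tracking frames as they arrive (port is a placeholder)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    while True:
        payload, _addr = sock.recvfrom(1024)
        yield parse_tracking_packet(payload)
```

On the Unreal side you'd do the equivalent with a socket in C++ or a networking plugin, and route the decoded values into the character's animation attributes.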

no worries, it was a good image. :wink:
thanks for the suggestions. So you think I can harness input data from an app like FaceShift to affect the animation responses of an avatar?
I’ll check out what’s said about the Kinect plugin helper and how it directs data. btw, I’m not re-selling anything; this is to be used in a public installation environment.

Oh yes, you definitely can.

I’ll check into it when I get some free time, but you’ll want to watch the incoming data on the assigned port that’s being used to modify the face controllers, or use the translation values of the controllers to directly modify data that will be read in Unreal. In Maya you can usually dump all of that data in the Output panel. Anyway, you just need to intercept the data, and then you can use it however you want.
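"Use it however you want" in this installation means turning a noisy per-frame looking/not-looking signal into clean animation triggers. A rough sketch of that debounce step, assuming a boolean per camera frame; the event names and the five-frame hold are invented placeholders for whatever events your character Blueprint actually exposes:

```python
class GazeResponder:
    """Debounce a noisy looking-at/looking-away signal into events.

    Requires `hold_frames` consecutive frames of agreement before
    switching state, so one bad tracking frame doesn't retrigger the
    character's animation.
    """

    def __init__(self, hold_frames=5):
        self.hold_frames = hold_frames
        self.state = None        # "engaged", "idle", or None at startup
        self._candidate = None   # state we are tentatively switching to
        self._count = 0          # consecutive frames supporting it

    def update(self, looking: bool):
        """Feed one frame; return an event name on a state change, else None."""
        candidate = "engaged" if looking else "idle"
        if candidate == self.state:
            self._candidate, self._count = None, 0
            return None
        if candidate == self._candidate:
            self._count += 1
        else:
            self._candidate, self._count = candidate, 1
        if self._count >= self.hold_frames:
            self.state = candidate
            self._candidate, self._count = None, 0
            # In UE this is where you'd fire the matching preset
            # response, e.g. a custom event on the character Blueprint.
            return "play_greeting" if candidate == "engaged" else "play_idle"
        return None
```

So with `hold_frames=3`, three consecutive "looking" frames fire one `play_greeting` event, and a single stray "away" frame afterwards fires nothing.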

hey Garner, so sorry that I missed this response to the conversation. I didn’t see it until I returned to the thread just now. That’s a helpful post.
So do you think (or know) whether FaceShift can supply this kind of live reading of a viewer’s face through a finished, published game project file, so that facial tracking is part of the gaming interactivity?