
"Experimental support for facial animation"?

I found this little nugget while looking back through the release notes on 4.15:

Can someone who knows a bit more elaborate on this? Does this mean we can expect to see something like Valve’s face-poser which came with the Source SDK?

The work for this consisted of an initial proof-of-concept for how facial animation could be dealt with in the engine in terms of:

  • Import pipeline
  • Audio synchronization
  • Usage of pose-based and blendshape-based animation

It is by no means meant as a way of creating facial animation, and is nowhere near production-ready. No work besides bugfixes was done on it for 4.16, so its status stands as ‘experimental’.

It consists of a single plugin that provides some UI for batch import of sounds/curves and a component that allows for audio sync.

One engine feature was completed as part of this work and is not experimental: the “Curve Source” anim node. This allows animation curves to be programmatically driven by any component or actor that implements a particular interface.

Thanks very much for the reply, was worried this would get buried.

So to make sure I understand correctly: it will not allow any kind of automatic lipsync? Or is that what “Curve Source” does?

How do you implement that interface? Is it possible using blueprints? Does this have anything to do with the AudioCurveSource component? And how do you set it up generally? Question bombardment :rolleyes:

I was thinking you could use this to play wave files of lipsync lines split into phonemes and hook these up to blendshapes, does this seem possible?

Yes, this is the general idea behind the system at present, although we are focusing more on pose-blending of skeletal animations than on blendshapes; the principles are roughly the same.

Yes, and in code. Create a component or actor that implements the ICurveSourceInterface interface. There are three functions to implement, although only two are important right now: GetBindingName() and GetCurves(). The curve source anim node binds at runtime to the name returned from GetBindingName(). The node can bind to components of the current actor, the current actor itself, or components or actors stored as member variables of that actor. GetCurves() returns an array of name/value pairs that specify the value each named curve will receive in the anim graph. Note that curves can drive pose weights, blendshapes, material parameters, etc.
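To make the shape of that contract concrete, here's a plain-C++ sketch. Engine types are replaced with std equivalents (FName → std::string, TArray → std::vector), and the JawFlapSource class and the "JawOpen" curve name are hypothetical examples, not engine API — in UE4 you would implement ICurveSourceInterface on a UActorComponent or AActor.

```cpp
#include <string>
#include <vector>

// Simplified stand-in for the engine's FNamedCurveValue.
struct NamedCurveValue {
    std::string Name;
    float Value;
};

// Simplified stand-in for ICurveSourceInterface: the two functions that
// matter right now.
class ICurveSourceInterface {
public:
    virtual ~ICurveSourceInterface() = default;
    // The Curve Source anim node binds to this name at runtime.
    virtual std::string GetBindingName() const = 0;
    // Returns the value each named curve should receive this frame.
    virtual std::vector<NamedCurveValue> GetCurves() const = 0;
};

// Hypothetical example source driving a single 'JawOpen' curve. A curve
// like this could drive a pose weight, blendshape, or material parameter.
class JawFlapSource : public ICurveSourceInterface {
public:
    explicit JawFlapSource(float InAmplitude) : Amplitude(InAmplitude) {}

    std::string GetBindingName() const override { return "Default"; }

    std::vector<NamedCurveValue> GetCurves() const override {
        return { { "JawOpen", Amplitude } };
    }

private:
    float Amplitude;
};
```

In the anim graph, a Curve Source node with its source binding set to "Default" would then pick up the "JawOpen" value every frame.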

Yes, UAudioCurveSourceComponent implements ICurveSourceInterface, and as such is the only example of the system working.

I have not looked at the addition yet, but I assume it is there to correct the sync problem. The thing about UE4 is that it generally interpolates everything as far as keyframed animation goes, so if lip sync is added to the anim graph as yet another animation source, the result generally does not match the audio as authored.

Thanks, sounds pretty cool!

I don’t really know how to set this up. I’ve got a character with an AudioCurveSource component set up with a random sound wave; the binding name is Default. In the anim BP I’ve got a random animation without curves hooked up to a Curve Source node with the source binding named Default. I guess this should complete the binding, but what creates the actual curves? I assumed this was something that turned the amplitude of the audio into curves, but looking at the code it seems it’s supposed to work with .fbx files containing curves and sound somehow.

Can you use Audio Source in 4.16 or are there parts missing? I’m testing with a morph target called Key 1 but I’m a bit lost on how to continue.

Interesting. Will the result match the fidelity of animation imported from MotionBuilder using the voice device?

Right now unless you have a third party library (such as FaceFX) creating the curves for you, or you are prepared to write your own amplitude/RMS lip-flap code, the system will not do anything. As I said, it is experimental :slight_smile:
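For anyone wondering what "amplitude/RMS lip-flap code" might look like, here's a minimal standalone sketch: compute the RMS of a window of audio samples and map it to a 0..1 value suitable for feeding GetCurves(). Both function names and the MaxExpectedRms normalization constant are assumptions for illustration, not engine API.

```cpp
#include <cmath>
#include <vector>

// Root-mean-square amplitude of a window of audio samples in [-1, 1].
float ComputeRms(const std::vector<float>& Samples)
{
    if (Samples.empty()) return 0.0f;
    double SumSquares = 0.0;
    for (float S : Samples) SumSquares += static_cast<double>(S) * S;
    return static_cast<float>(std::sqrt(SumSquares / Samples.size()));
}

// Map RMS to a curve value, clamped to [0, 1] so a blendshape weight
// driven by it stays in range. MaxExpectedRms is an assumed constant you
// would tune per voice recording.
float RmsToCurveValue(float Rms, float MaxExpectedRms = 0.25f)
{
    float Value = Rms / MaxExpectedRms;
    return Value < 0.0f ? 0.0f : (Value > 1.0f ? 1.0f : Value);
}
```

A curve source component could call this per frame on the most recent audio buffer and return the result as a named curve value.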

Fidelity is entirely down to the system generating the curves and the quality of the content, so I guess it could be? Sorry, I’m not familiar with MotionBuilder at all!

Talk to Zack :wink: He knows. :smiley:

Okay, I understand now. The new audio engine in 4.16 has a way to set up an envelope follower so the code has been written for me already! It only outputs a float though, so I’m guessing it can’t be used with Curve Source as it is.

I was able to attach the envelope follower preset to a sound and have it change a morph target directly as the sound was playing so that works. It doesn’t allow you to use third party libraries in any way so I guess it’s a bit different but it should work for what I’m doing.

Quite the contrary, as a ‘curve’ at animation evaluation time is just a float value. We just call them curves for legacy reasons because usually the values are defined by curve evaluation.

Okay, but is it possible to hook up this envelope follower to the Curve Source node then? :confused: Or any float in general?
edit: (only using blueprint)

I’ve not tried it, but from the looks of it, yes indeed. You will need an envelope follower-derived component that also implements the curve source interface. It would output the named value of the curve you want to drive in GetCurves().

I tried, but I couldn’t create a blueprint class based on the component (the option was grayed out in the right-click menu), so I guess you can’t do it in blueprint right now. For the coders out there, you can give this a try though. For myself, I’ll try to hack together a lipsyncing system of my own. Thanks a lot for the help! :smiley:

You could create a ‘bridge’ component at the moment to do this.

That’s the first time I’ve heard of a bridge component. How would you accomplish that?

Also I updated to 4.16 preview 3 (testing using 4.16) and noticed that I could see code interfaces again which is helpful. :rolleyes:

It’s a term I just made up :slight_smile: The idea being that you could create a component that implements the curve source interface, but all it does is grab the parent actor, look for any envelope followers, and forward their data.
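A rough standalone sketch of that bridge idea, with simplified stand-in types: EnvelopeFollower here is just a stub holding the latest value (in the engine the follower reports its output via an event), and the names CurveSourceBridge and "MouthOpen" are hypothetical.

```cpp
#include <string>
#include <vector>

struct NamedCurveValue { std::string Name; float Value; };

// Stub for an envelope follower living on the same actor; in practice the
// audio engine would update LatestEnvelope each frame.
struct EnvelopeFollower {
    float LatestEnvelope = 0.0f;
};

// The 'bridge': implements the curve source contract but does no analysis
// of its own -- it just forwards the follower's latest output.
class CurveSourceBridge {
public:
    explicit CurveSourceBridge(const EnvelopeFollower* InFollower)
        : Follower(InFollower) {}

    std::string GetBindingName() const { return "Default"; }

    std::vector<NamedCurveValue> GetCurves() const {
        // If the parent actor has no follower, report silence.
        float Value = Follower ? Follower->LatestEnvelope : 0.0f;
        return { { "MouthOpen", Value } };
    }

private:
    const EnvelopeFollower* Follower;
};
```

The anim graph side stays unchanged: a Curve Source node bound to "Default" would read the forwarded "MouthOpen" value like any other curve.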

Looking again, however, with the implementation the way it is you may be best off implementing the curve source interface on your actor, as the envelope follower uses a BlueprintAssignable event to communicate its status, which IIRC you can’t handle on a component.