    "Experimental support for facial animation"?

    I found this little nugget while looking back through the release notes on 4.15:

    • New: Added experimental support for facial animation.
    Can someone who knows a bit more elaborate on this? Does this mean we can expect to see something like Valve's Face Poser, which came with the Source SDK?

    #2
    Originally posted by The_1990:
    Can someone who knows a bit more elaborate on this? Does this mean we can expect to see something like Valve's Face Poser, which came with the Source SDK?
    The work for this consisted of an initial proof-of-concept for how facial animation could be dealt with in the engine in terms of:

    - Import pipeline
    - Audio synchronization
    - Usage of pose-based and blendshape-based animation

    It is by no means meant as a complete way of creating facial animation, and is nowhere near production-ready. No work besides bug fixes was done on it for 4.16, so its status remains 'experimental'.

    It consists of a single plugin that provides some UI for batch import of sounds/curves and a component that allows for audio sync.

    One engine feature was completed as part of this work and is not experimental: the "Curve Source" anim node. This allows animation curves to be driven programmatically by any component or actor that implements a particular interface.
    Tom Sarkanen | Unreal Engine Developer | Epic Games UK



      #3
      Thanks very much for the reply, was worried this would get buried.

      So, to make sure I understand correctly: it will not allow any kind of automatic lip sync on its own? Or is that what "Curve Source" does?
      Last edited by The_1990; 05-05-2017, 11:38 AM.



        #4
        Originally posted by Tom Sarkanen:
        One engine feature was completed as part of this work and is not experimental: the "Curve Source" anim node. This allows animation curves to be driven programmatically by any component or actor that implements a particular interface.
        How do you implement that interface? Is it possible using Blueprints? Does this have anything to do with the AudioCurveSource component? And how do you set it up generally? Sorry for the question bombardment!

        I was thinking you could use this to play wave files of lipsync lines split into phonemes and hook these up to blendshapes, does this seem possible?



          #5
          Originally posted by cyaoeu:
          I was thinking you could use this to play wave files of lipsync lines split into phonemes and hook these up to blendshapes, does this seem possible?
          Yes, this is the general idea behind the system at present, although we are focusing more on pose-blending of skeletal animations than on blendshapes; the principles are roughly the same.

          Originally posted by cyaoeu:
          How do you implement that interface? Is it possible using blueprints?
          Yes, and in code too. Create a component or actor that implements the ICurveSourceInterface interface. There are three functions to implement, although only two are important right now: GetBindingName() and GetCurves(). The binding name is used by the curve source anim node to bind at runtime to the name returned from GetBindingName(). The node can bind to components of the current actor, the current actor itself, or components or actors stored as member variables of that actor. GetCurves() returns an array of name/value pairs that specify the value each named curve will receive in the anim graph. Note that curves can drive pose weights, blendshapes, material parameters etc.

          Originally posted by cyaoeu:
          Does this have anything to do with the AudioCurveSource component?
          Yes, UAudioCurveSourceComponent implements ICurveSourceInterface, and as such is the only example of the system working.



            #6
            I have not looked at the addition yet, but I assume it is there to correct the sync problem. The thing about UE4 is that it generally interpolates everything as far as keyframed animation goes, so if lip sync is added to the anim graph as yet another animation source, the result generally does not match the audio as authored.
            Clarke's third law: Any sufficiently advanced technology is indistinguishable from magic.
            Custom Map Maker Discord
            https://discord.gg/t48GHkA
            Urban Terror https://www.urbanterror.info/home/



              #7
              Thanks, sounds pretty cool!



                #8
                Originally posted by Tom Sarkanen:
                Yes, and in code too. Create a component or actor that implements the ICurveSourceInterface interface. There are three functions to implement, although only two are important right now: GetBindingName() and GetCurves(). The binding name is used by the curve source anim node to bind at runtime to the name returned from GetBindingName(). The node can bind to components of the current actor, the current actor itself, or components or actors stored as member variables of that actor. GetCurves() returns an array of name/value pairs that specify the value each named curve will receive in the anim graph. Note that curves can drive pose weights, blendshapes, material parameters etc.
                I don't really know how to set this up. I've got a character with an AudioCurveSource component set up with a random sound wave; the binding name is Default. In the anim BP I've got a random animation without curves hooked up to a Curve Source node with the source binding named Default. I guess this should complete the binding, but what creates the actual curves? I assumed this was something that turned the amplitude of the audio into curves, but looking at the code it seems it's supposed to work with .fbx files containing curves and sound somehow.

                Can you use Audio Source in 4.16 or are there parts missing? I'm testing with a morph target called Key 1 but I'm a bit lost on how to continue.



                  #9
                  Originally posted by Tom Sarkanen:
                  Yes, this is the general idea behind the system at present, although we are focusing more on pose-blending of skeletal animations over blendshapes, the principles are roughly the same.
                  Interesting. Will the result match the fidelity of animation imported from MotionBuilder using the voice device?



                    #10
                    Originally posted by cyaoeu:
                    I don't really know how to set this up. I've got a character with an AudioCurveSource component set up with a random sound wave. The binding name is Default. In the anim BP I've got a random animation without curves hooked up into a Curve Source node with the source binding named Default. I guess this should complete the binding, but what creates the actual curves? I had the assumption that this was something that turned the amplitude of the audio into curves but looking in the code it looks like it's supposed to work with .fbx with curves and sound somehow.

                    Can you use Audio Source in 4.16 or are there parts missing? I'm testing with a morph target called Key 1 but I'm a bit lost on how to continue.
                    Right now, unless you have a third-party library (such as FaceFX) creating the curves for you, or you are prepared to write your own amplitude/RMS lip-flap code, the system will not do anything. As I said, it is experimental.



                      #11
                      Originally posted by FrankieV:
                      Interesting. Will the result be equal to the fidelity as imported from MotionBuilder using the voice device?
                      Fidelity is entirely down to the system generating the curves and the quality of the content, so I guess it could be? Sorry, I'm not familiar with MotionBuilder at all!



                        #12
                        Originally posted by Tom Sarkanen:
                        Fidelity is entirely down to the system generating the curves and the quality of the content, so I guess it could be? Sorry, I'm not familiar with MotionBuilder at all!
                        Talk to Zack. He knows.



                          #13
                          Originally posted by Tom Sarkanen:
                          Right now unless you have a third party library (such as FaceFX) creating the curves for you, or you are prepared to write your own amplitude/RMS lip-flap code, the system will not do anything. As I said, it is experimental
                          Okay, I understand now. The new audio engine in 4.16 has a way to set up an envelope follower, so the code has been written for me already! It only outputs a float, though, so I'm guessing it can't be used with Curve Source as it is.

                          I was able to attach the envelope follower preset to a sound and have it change a morph target directly as the sound was playing, so that works. It doesn't involve third-party libraries in any way, so I guess it's a bit different, but it should work for what I'm doing.



                            #14
                            Originally posted by cyaoeu:
                            It only outputs a float though, so I'm guessing it can't be used with Curve Source as it is.
                            Quite the contrary: a 'curve' at animation evaluation time is just a float value. We only call them curves for legacy reasons, because the values are usually produced by curve evaluation.



                              #15
                              Originally posted by Tom Sarkanen:
                              Quite the contrary, as a 'curve' at animation evaluation time is just a float value. We just call them curves for legacy reasons because usually the values are defined by curve evaluation.
                              Okay, but is it possible to hook up this envelope follower to the Curve Source node, then? Or any float in general?
                              edit: (using only Blueprint)
                              Last edited by cyaoeu; 05-10-2017, 07:26 AM.
