    [LIVESTREAM] Training Livestream - Getting Started with Character Morph Targets - May 9 - Live from Epic HQ


    WHAT

    Ed Burgess joins us to talk about how to get started with morph targets, using a character's head as the example. Ed explores how to bring a face to life in a few steps. He'll explain how to export, import, and control the mesh using UE4 and 3ds Max. If you are interested in building a custom character creator for your game, or in driving facial animations with UE4, make sure to tune in!

    Ed has provided an example project to go with this stream. Go to the DOWNLOAD PAGE here
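
    For anyone who wants to poke at this before the stream: a minimal C++ sketch of driving an imported morph target at runtime. This is not from Ed's example project; the actor class and the "Smile" morph target name are hypothetical stand-ins.

    [CODE]
    // MorphDemoActor.h -- hypothetical demo actor.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "MorphDemoActor.generated.h"

    UCLASS()
    class AMorphDemoActor : public AActor
    {
        GENERATED_BODY()

    public:
        AMorphDemoActor();
        virtual void Tick(float DeltaSeconds) override;

    private:
        // The head mesh imported from 3ds Max, with its morph targets.
        UPROPERTY(VisibleAnywhere)
        class USkeletalMeshComponent* HeadMesh;
    };

    // MorphDemoActor.cpp
    #include "MorphDemoActor.h"
    #include "Components/SkeletalMeshComponent.h"

    AMorphDemoActor::AMorphDemoActor()
    {
        PrimaryActorTick.bCanEverTick = true;
        HeadMesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("HeadMesh"));
        RootComponent = HeadMesh;
    }

    void AMorphDemoActor::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        // Oscillate the "Smile" weight between 0 and 1 as a stand-in for whatever
        // slider or gameplay value a character creator would actually feed in.
        const float Weight = 0.5f * (1.0f + FMath::Sin(GetWorld()->GetTimeSeconds()));
        HeadMesh->SetMorphTarget(TEXT("Smile"), Weight);
    }
    [/CODE]

    SetMorphTarget is also exposed to Blueprint, so the same idea works without any C++.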

    WHEN
    Tuesday, May 9th @ 2:00PM ET

    WHERE
    Twitch
    Facebook
    Youtube

    WHO
    Ed Burgess - Engine Support Technician
    Alexander Paschall - Community Manager - @UnrealAlexander

    Feel free to ask any questions on the topic in the thread below, and remember, while we try to give attention to all inquiries, it's not always possible to answer everyone's questions as they come up. This is especially true for off-topic requests, as it's rather likely that we don't have the appropriate person around to answer. Thanks for understanding!

    Archive:

    Last edited by Alexander Paschall; 05-19-2017, 04:47 PM.
    Twitch /unrealalexander | Twitter @UnrealAlexander
    How to report a bug? | Installation & Setup issues?
    Call me to a thread by posting this: [MENTION]Alexander Paschall[/MENTION]

    #2
    It seems you accidentally linked Ryan Bruck's Twitter account.

    Very interesting topic. I was wondering if you could make a quick example of driving morph targets using sound with this method: https://forums.unrealengine.com/show...l=1#post705891

    It doesn't have to be super fancy, just a morph target that changes while a sound file plays.
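
    Not an official workflow, but one low-tech way to get exactly that, assuming the sound's amplitude has been baked offline into a UCurveFloat asset (the class, the curve, and the "MouthOpen" morph target name below are all hypothetical):

    [CODE]
    // SoundMorphActor.h -- hypothetical; the curve asset is assigned in the editor.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "SoundMorphActor.generated.h"

    UCLASS()
    class ASoundMorphActor : public AActor
    {
        GENERATED_BODY()

    public:
        ASoundMorphActor();
        virtual void Tick(float DeltaSeconds) override;

    private:
        UPROPERTY(VisibleAnywhere)
        class USkeletalMeshComponent* HeadMesh;

        UPROPERTY(VisibleAnywhere)
        class UAudioComponent* AudioComp;

        // Amplitude-over-time for the sound, baked offline into a curve asset.
        UPROPERTY(EditAnywhere)
        class UCurveFloat* EnvelopeCurve;

        float PlaybackTime = 0.0f;
    };

    // SoundMorphActor.cpp
    #include "SoundMorphActor.h"
    #include "Components/SkeletalMeshComponent.h"
    #include "Components/AudioComponent.h"
    #include "Curves/CurveFloat.h"

    ASoundMorphActor::ASoundMorphActor()
    {
        PrimaryActorTick.bCanEverTick = true;
        HeadMesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("HeadMesh"));
        RootComponent = HeadMesh;
        AudioComp = CreateDefaultSubobject<UAudioComponent>(TEXT("AudioComp"));
        AudioComp->SetupAttachment(HeadMesh);
        EnvelopeCurve = nullptr;
    }

    void ASoundMorphActor::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        if (AudioComp->IsPlaying() && EnvelopeCurve)
        {
            // Sample the baked envelope at the current playback time and push it
            // straight into the mouth morph target.
            PlaybackTime += DeltaSeconds;
            const float Amplitude = EnvelopeCurve->GetFloatValue(PlaybackTime);
            HeadMesh->SetMorphTarget(TEXT("MouthOpen"), FMath::Clamp(Amplitude, 0.0f, 1.0f));
        }
        else
        {
            PlaybackTime = 0.0f;
        }
    }
    [/CODE]

    Real-time amplitude analysis of the playing sound would be nicer, but that is beyond this quick sketch.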



      #3
      [Question] How do you use the facial animation importer and what are its use cases?
      [Question] Is it reasonable to update curve data from outside an anim instance? For example, another asset contains curves for phonemes, and I want to send them directly to the animation graph for evaluation.
      Last edited by MatzeOGH; 05-08-2017, 11:24 AM.
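
      Regarding the second question: one common pattern (a minimal sketch below; the UPhonemeAnimInstance class and property names are hypothetical) is to expose the weights as properties on a custom AnimInstance and let the AnimGraph turn them into curves (for example with a Modify Curve node whose curve names match the morph targets), rather than writing curve data into the instance from outside.

      [CODE]
      // PhonemeAnimInstance.h -- hypothetical.
      #pragma once

      #include "CoreMinimal.h"
      #include "Animation/AnimInstance.h"
      #include "PhonemeAnimInstance.generated.h"

      UCLASS()
      class UPhonemeAnimInstance : public UAnimInstance
      {
          GENERATED_BODY()

      public:
          // Written from outside the anim instance on the game thread,
          // read inside the AnimGraph each update.
          UPROPERTY(BlueprintReadWrite, Category = "Phonemes")
          float Weight_AA = 0.0f;

          UPROPERTY(BlueprintReadWrite, Category = "Phonemes")
          float Weight_OO = 0.0f;
      };

      // Elsewhere, whatever owns the phoneme curves pushes its values each frame:
      #include "PhonemeAnimInstance.h"
      #include "Components/SkeletalMeshComponent.h"

      void PushPhonemeWeights(USkeletalMeshComponent* Mesh, float AA, float OO)
      {
          if (UPhonemeAnimInstance* Anim = Cast<UPhonemeAnimInstance>(Mesh->GetAnimInstance()))
          {
              Anim->Weight_AA = AA;
              Anim->Weight_OO = OO;
          }
      }
      [/CODE]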



        #4
        Great topic, though I admit I'd love to see a good example setup using the improved 4.16 RBF Pose Driver to drive corrective/muscle-bending morphs.



          #5
          need some Blender Luv'n - just sayin'



            #6
            Can you post the 3ds file so we can follow along?

            Or a tweaked version, if there are copyright issues.
            Last edited by Ohriginal; 05-08-2017, 03:08 PM.



              #7
              Originally posted by ayretek
              need some Blender Luv'n - just sayin'
              Yes, yes. I tested morph targeting out of Blender a while ago and it works fine in Unreal.
              Please cover the Blender workflow too. :-)



                #8
                Originally posted by Etienne Andlau
                Yes, yes. I tested morph targeting out of Blender a while ago and it works fine in Unreal.
                Please cover the Blender workflow too. :-)
                exactly this....



                  #9
                  Hi,
                  Can we download the project before the livestream?



                    #10
                    Good stream, I'll be watching the recording for sure.

                    Questions:

                     1. The PoseDriver node seems a bit unintuitive to use: does it need one PoseAsset PER corrective? It also barely has any control over value mapping. Can you quickly show how to drive corrective morph targets using the pose driver node from the AnimBP?

                     2. How would you do facial animation: with a phoneme-based system or with a FACS setup? (This might be covered in the stream by default; just asking to make sure.)

                     3. How would you deal with implementing body corrective morphs that influence a character's costume? If the character has, say, 20 different costumes he can wear, would the only solution be to give EACH of those a matching set of correctives? (See the sketch below.)
                     In that case, would it be easier to solve rough volume problems on the body with extra bones instead? (I'm currently doing it that way.)
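
                     For question 3, a minimal sketch of the shared-curve idea (the function and the "Elbow_Corrective" name are hypothetical). It only pays off if every costume mesh authors its correctives under the same names; meshes that lack a given morph target are simply unaffected by the call, and the extra-bones approach avoids duplicating the shape work entirely.

                     [CODE]
                     // Push one set of corrective weights to every costume mesh that is
                     // currently attached to the character.
                     #include "CoreMinimal.h"
                     #include "Components/SkeletalMeshComponent.h"

                     void ApplyCorrective(const TArray<USkeletalMeshComponent*>& CostumeMeshes,
                                          FName CorrectiveName, float Weight)
                     {
                         for (USkeletalMeshComponent* Mesh : CostumeMeshes)
                         {
                             if (Mesh)
                             {
                                 // e.g. CorrectiveName = "Elbow_Corrective"
                                 Mesh->SetMorphTarget(CorrectiveName, Weight);
                             }
                         }
                     }
                     [/CODE]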



                      #11
                      What's memory performance like for morph targets, anyhow? One of my people is really concerned about additive morph targets on characters, e.g. mixing 'smile' and 'angry'. They're both driven by float weights, so if you combine them it can get ugly.
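
                      No answer on the memory side, but for the mixing concern, a minimal sketch of one simple guard (the "Smile" and "Angry" names are hypothetical): rescale the weights so their sum never exceeds 1 before applying them.

                      [CODE]
                      #include "CoreMinimal.h"
                      #include "Components/SkeletalMeshComponent.h"

                      void ApplyExpressionMix(USkeletalMeshComponent* Face, float Smile, float Angry)
                      {
                          // If the raw inputs sum past 1.0, scale both back proportionally.
                          const float Total = Smile + Angry;
                          const float Scale = (Total > 1.0f) ? (1.0f / Total) : 1.0f;

                          Face->SetMorphTarget(TEXT("Smile"), Smile * Scale);
                          Face->SetMorphTarget(TEXT("Angry"), Angry * Scale);
                      }
                      [/CODE]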



                        #12
                        I'd like to have the 3ds Max file and the UE4 project as well, to experiment with after watching the stream. Is that possible?
                        Nilson Lima
                        Technical Director @ Rigel Studios Ltda - twitter: @RigelStudios
                        Join us at Discord: https://discord.gg/uFFSEXY

                        UE4 Marketplace: Cloudscape Seasons
                        supporting: Community FREE Ocean plugin



                          #13
                          I suppose a better way to ask my question is: what optimization tricks are there, especially when dealing with multiple morphs on the same area, with regard to hardware performance and load?



                            #14
                            [Question] How would you use a pose asset to do facial animation on the fly rather than using preset curves?



                              #15
                              Using Face Mo Cap in UE4 Sequencer

                              Is there any way you can describe a pipeline for capturing facial animation, using it in Sequencer, and then getting the result into Adobe Premiere? Our team's goal is to use UE4 for a short animated film. Any recommended way to go about this, without having to use Maya or anything else, would be helpful.

