Training Livestream - Getting Started with Character Morph Targets - May 9 - Live from Epic HQ


    #16
    [QUESTION] :: Is there any way you could describe a pipeline for capturing facial animation, using it in Sequencer, and then getting it into Adobe Premiere? Our team's goal is to use UE4 to make a short animated film. Any recommended way to go about this, without having to use Maya or anything else, would be helpful.

    Furthermore: would we be able to "record" morph targets, i.e. "Character A raises eyebrows", in UE4, then use that as a pre-canned animation that can be activated in Sequencer? And if I imported a facial model, could I "ragdoll puppet" it in UE4, capture the animation, and then use it the way other animations work in the Sequencer editor?



      #17
      Hey there,

      Was watching the stream and wondering if you could explain the two-axis radial control you made in UMG to drive four morph targets at the same time and blend them together. You were using it for mouth movement control.

      Also, if you could release the project later for review and reverse engineering, that would be useful; I think a lot of people would be interested in learning about this system for facial animation and body morphs as well.



        #18
        [Question] Is any lip-sync feature (à la Source 2) planned for the near future?
        [Question] Would you be able to isolate marker locations (green or red) on the face from an image sequence or a streaming webcam (using multiple raycasting/Blueprint functions), average the pixel locations, place an empty actor at the average location, get the XY movement, and then transfer the motion to the morphs?
        Alternatively, could you manipulate the image (getting alphas from the color channels) so that UE4 can recognize shapes (the contour of the mouth/eyes, similar to the tracking technology Faceware uses) and get tracking data from them?
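        The marker-averaging part of that question can at least be sketched outside the engine. This is a minimal, hypothetical example (not Faceware's or Epic's method; the function name and threshold values are my assumptions): scan an RGB frame for strongly green pixels and average their coordinates to get one tracked point.

        ```cpp
        #include <cstdint>
        #include <vector>

        struct Pixel { uint8_t r, g, b; };
        struct Point { float x, y; };

        // Hypothetical marker tracker: average the locations of all "green
        // enough" pixels in a row-major RGB frame. The threshold is an
        // arbitrary assumption, not a calibrated value.
        Point AverageGreenMarker(const std::vector<Pixel>& frame, int width, int height)
        {
            double sumX = 0.0, sumY = 0.0;
            int count = 0;
            for (int y = 0; y < height; ++y) {
                for (int x = 0; x < width; ++x) {
                    const Pixel& p = frame[y * width + x];
                    // green clearly dominates both other channels
                    if (p.g > 128 && p.g > p.r + 64 && p.g > p.b + 64) {
                        sumX += x; sumY += y; ++count;
                    }
                }
            }
            if (count == 0) return { -1.0f, -1.0f }; // no marker visible this frame
            return { float(sumX / count), float(sumY / count) };
        }
        ```

        Run per frame, this gives one point per marker color; the deltas between frames are the XY movement that would drive an empty actor or the morphs.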
        ENTER REALITY

        VR Solutions

        Contact us for more information



          #19
          Did this get uploaded somewhere? I couldn't be there, sadly, and I really want to watch it.



            #20
            Originally posted by belgianwizard:
            Did this get uploaded somewhere? I couldn't be there, sadly, and I really want to watch it.
            Click the latest video here: https://www.twitch.tv/unrealengine/videos/all



              #21
              Originally posted by cloganart:
              Was watching the stream and wondering if you could explain the two-axis radial control you made in UMG to drive four morph targets at the same time and blend them together. You were using it for mouth movement control.
              Also wondering about this
              Character Customizer



                #22
                [MENTION=26573]mlindborg[/MENTION]

                You have four targets: smile left, smile right, frown left, frown right,
                and a control with two axes: x and y.
                x = 0 and y = 1 means the control is pushed up, so both smiles turn on; if x is -1 or 1, only one of the two smiles turns on. This is done with a simple multiplication on a clamped output of the x axis.
                Same deal for y = -1, just with the frowns. Now you control two morph targets for each side.

                That's one way of doing it.
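                That scheme can be sketched in plain C++ (the stream used Blueprint, not code, and the names and the exact left/right fade here are my assumptions): clamp y into a smile strength and a frown strength, then multiply each by a clamped left/right factor derived from x.

                ```cpp
                #include <algorithm>

                struct MorphWeights {
                    float smileLeft, smileRight, frownLeft, frownRight;
                };

                // Hypothetical mapping from a 2-axis pad (x, y in [-1, 1]) to four
                // morph weights: y > 0 drives the smiles, y < 0 the frowns, and x
                // blends between the left and right side.
                MorphWeights MapPadToMorphs(float x, float y)
                {
                    auto clamp01 = [](float v) { return std::clamp(v, 0.0f, 1.0f); };
                    float up    = clamp01(y);        // smile strength
                    float down  = clamp01(-y);       // frown strength
                    float left  = clamp01(1.0f - x); // 1 at x <= 0, fades to 0 at x = 1
                    float right = clamp01(1.0f + x); // 1 at x >= 0, fades to 0 at x = -1
                    return { up * left, up * right, down * left, down * right };
                }
                ```

                At (0, 1) both smiles are fully on; at (1, 1) only the right smile fires, matching the behavior described above.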



                  #23
                  Originally posted by Adeptus:
                  [MENTION=26573]mlindborg[/MENTION]

                  You have four targets: smile left, smile right, frown left, frown right,
                  and a control with two axes: x and y.
                  x = 0 and y = 1 means the control is pushed up, so both smiles turn on; if x is -1 or 1, only one of the two smiles turns on. This is done with a simple multiplication on a clamped output of the x axis.
                  Same deal for y = -1, just with the frowns. Now you control two morph targets for each side.

                  That's one way of doing it.
                  Yeah, I was more wondering about the interface. I don't know much about UMG, and I can't figure out how to make the selection surface.
                  Character Customizer



                    #24
                    Hey everyone, just an update. The archive is up and Ed is working on getting a version of his project out so you can deconstruct it. I'll update the thread again when that is available.
                    Twitch /unrealalexander| Twitter @UnrealAlexander
                    How to report a bug? | Installation & Setup issues?
                    Call me to a thread by posting this: [MENTION]Alexander Paschall[/MENTION]



                      #25
                      Hey Alex, wondering about the status of the project!



                        #26
                        https://forums.unrealengine.com/showthread.php?145339
                        It's right here! It was hiding!



                          #27
                          Hi all, I think the download link for the example project is broken. Is there any other link available?



                            #28
                            Originally posted by Alexander Paschall:

                            WHAT

                            Ed Burgess joins us to talk about how to get started with morph targets using a character's head. Ed explores how to bring a face to life in a few steps. He'll explain how to export, import and control the mesh using UE4 and 3ds Max. If you are interested in building a custom character creator for your game or how to drive facial animations with UE4, then make sure to tune in!

                            Ed has provided an example project to go with this stream. Go to the DOWNLOAD PAGE here

                            WHEN
                            Tuesday, May 9th @ 2:00PM ET [Countdown]

                            WHERE
                            Twitch
                            Morpheus TV
                            Facebook
                            Youtube

                            WHO
                            Ed Burgess - Engine Support Technician
                            Alexander Paschall - Community Manager - @UnrealAlexander

                            Feel free to ask any questions on the topic in the thread below, and remember, while we try to give attention to all inquiries, it's not always possible to answer everyone's questions as they come up. This is especially true for off-topic requests, as it's rather likely that we don't have the appropriate person around to answer. Thanks for understanding!

                            Archive:

                            Hello,
                            I have a question regarding this.
                            Can we download the project before the livestream?



                              #29
                              In this stream you mention documentation for something called dynamic textures, for making random wrinkles, etc. I cannot find this documentation; where should I look?



                                #30
                                @Kiwikah Inc - you could totally save the X/Y axis points (plus time?) while you animate, and then rerun that! I would be careful how, though; you could easily end up with a huge set of data. (Does the original BP check the time since the last tick?)

                                One part of this would be to make a function out of the Y/X interpretation, taking only Y/X as input... then you could even vary the playback time. Another option would be to expand the widget to let you select from all available morph targets... and playbacks. So you would have a morph blendspace!

                                You could probably also have several heads, to compare your recordings.
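                                A minimal sketch of that record-and-replay idea, under my own assumptions (timestamped (time, x, y) samples with linear interpolation; the actual Blueprint would feed the sampled values into its morph-target logic rather than return them): store pad samples while animating, then sample them back at an arbitrary speed.

                                ```cpp
                                #include <cstddef>
                                #include <utility>
                                #include <vector>

                                struct PadSample { float time, x, y; };

                                // Hypothetical recorder for the 2-axis pad: store (time, x, y)
                                // while animating, then play it back with linear interpolation.
                                // A speed factor lets you vary the playback time, as suggested.
                                class PadRecording {
                                public:
                                    void Add(float time, float x, float y) { samples.push_back({ time, x, y }); }

                                    std::pair<float, float> Sample(float t, float speed = 1.0f) const
                                    {
                                        float u = t * speed; // remap playback time into recording time
                                        if (samples.empty()) return { 0.0f, 0.0f };
                                        if (u <= samples.front().time) return { samples.front().x, samples.front().y };
                                        for (std::size_t i = 1; i < samples.size(); ++i) {
                                            if (u <= samples[i].time) {
                                                const PadSample& a = samples[i - 1];
                                                const PadSample& b = samples[i];
                                                float f = (u - a.time) / (b.time - a.time);
                                                return { a.x + f * (b.x - a.x), a.y + f * (b.y - a.y) };
                                            }
                                        }
                                        return { samples.back().x, samples.back().y };
                                    }

                                private:
                                    std::vector<PadSample> samples;
                                };
                                ```

                                To keep the data set small, only call Add when the pad has actually moved since the last tick.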

