Best practice to implement and animate hands

    Hey guys,

    I recently switched from Unity to Unreal and I made some really good progress so far. Right now I'm trying to figure out the best way to add hands to my VR project.

    Basically I want to attach the hands when, for example, holding a rifle, and I also want to animate the trigger finger. I obviously could just add the hand meshes to the gun and toggle their visibility, but for a rifle I'd need to add 4 meshes (2 for the left hand and 2 for the right hand). I'm a bit concerned that it might not be the best way to do it. Or is it?

    Another issue I'm facing is the positioning of the hand bones. In Unity I could manipulate each bone easily and see how it looks. In Unreal it seems like I would need to create an individual animation for every weapon grip shape. And how would I animate the trigger finger? I'm afraid that manipulating a single bone via an exposed variable might work but look weird on some weapons. Should I also use an animation here that I can blend into the other one?

    Maybe my biggest issue is finding the right workflow to actually create the animations that make the hands look like they are truly holding the object.

    I'm really looking forward to some ideas and suggestions.

    #2
    Originally posted by cooldiE View Post
    Another issue I'm facing is the positioning of the hand bones. In Unity I could manipulate each bone easily and see how it looks. In Unreal it seems like I would need to create an individual animation for every weapon grip shape. And how would I animate the trigger finger?
    You can manipulate any bone by using the 'Transform (Modify) Bone' node in an Animation Blueprint.
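
    The 'Transform (Modify) Bone' node just applies a rotation you compute elsewhere. As a minimal sketch of the trigger-finger part (plain C++, not UE API; `TriggerToCurlDegrees` and the 70° curl limit are illustrative assumptions), you can map the controller's trigger axis to a clamped curl angle and feed that into the node's Rotation pin via an exposed variable:

    ```cpp
    #include <algorithm>
    #include <cassert>

    // Hypothetical helper: maps the trigger axis (0..1) to a curl angle in
    // degrees for the index-finger bone. In an Animation Blueprint this value
    // would drive the Rotation pin of a 'Transform (Modify) Bone' node.
    float TriggerToCurlDegrees(float TriggerAxis, float MaxCurlDegrees = 70.0f)
    {
        // Clamp the input so a noisy axis never over-rotates the bone.
        const float Alpha = std::clamp(TriggerAxis, 0.0f, 1.0f);
        return Alpha * MaxCurlDegrees;
    }

    int main()
    {
        assert(TriggerToCurlDegrees(0.0f) == 0.0f);  // finger relaxed
        assert(TriggerToCurlDegrees(0.5f) == 35.0f); // half pull
        assert(TriggerToCurlDegrees(1.5f) == 70.0f); // clamped at full curl
        return 0;
    }
    ```

    Driving one bone this way keeps the trigger finger analog (matching the physical trigger) while the rest of the hand uses a grip pose.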
    Stand-alone mocap app for Vive Trackers: vrmocapstudio.com
    Marketplace: Vive Mocap Kit / Fingers Solver / VR IK Body Solver / Subtitles to LipSync / Dialogue System with Character Animation
    And random stuff at Youtube



      #3
      Originally posted by YuriNK View Post
      You can manipulate any bone by using the 'Transform (Modify) Bone' node in an Animation Blueprint.
      Yeah, I could use that to animate the trigger finger, but in some instances I might need to manipulate even more bones to make it look authentic. Furthermore, I still need a way to make the hand look like it is holding the weapon. I probably could do that in code by exposing every bone and manipulating it in a Blueprint, but I guess that might be a very bad way of doing it.

      I guess I'm mostly looking for the correct workflow to achieve authentic hand positioning and basic animations, because I will probably need to do it for a lot more objects.



        #4
        You can set up animation assets in the Animation Editor if you don't want to use other software, and blend them in a Blueprint.

        And what's the problem with hand positioning? I always have a hand mesh attached to the motion controller. Any other objects are attached to this mesh with different offsets. For different objects you need different relative offsets to the hand and different hand poses.
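
        The "objects attached to the hand with per-object offsets" idea can be sketched like this (plain C++, not UE API; `GripOffsets`, `WorldOf`, and the offset values are illustrative assumptions — in Unreal you'd attach the object to the hand mesh and set its relative location/rotation per object type):

        ```cpp
        #include <cassert>
        #include <map>
        #include <string>

        struct Vec3 { float X, Y, Z; };

        Vec3 Add(const Vec3& A, const Vec3& B)
        {
            return { A.X + B.X, A.Y + B.Y, A.Z + B.Z };
        }

        // One relative offset per holdable object type (values are made up).
        const std::map<std::string, Vec3> GripOffsets = {
            { "Rifle",  { 2.0f, 0.0f, -1.0f } },
            { "Pistol", { 1.0f, 0.5f,  0.0f } },
        };

        // World location of an attached object = hand location + its stored
        // offset (rotation is ignored here to keep the sketch short).
        Vec3 WorldOf(const std::string& Object, const Vec3& HandLocation)
        {
            return Add(HandLocation, GripOffsets.at(Object));
        }

        int main()
        {
            const Vec3 Hand{ 10.0f, 0.0f, 5.0f };
            const Vec3 Rifle = WorldOf("Rifle", Hand);
            assert(Rifle.X == 12.0f && Rifle.Y == 0.0f && Rifle.Z == 4.0f);
            return 0;
        }
        ```

        Keeping the hand on the motion controller and storing one offset plus one grip pose per object type scales much better than baking hand meshes into every weapon.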



          #5
          Thank you very much. I think I can make this work then. I just wasn't sure if using animations was the right way to do it, but it seems that's exactly what I have to do.
