
[Plugin] Leap Motion - Event Driven


    Hello,
    I'd like to know if there is some way to capture one hand on top of the other hand while using the Leap.
    I've been stuck on this for a while now and would appreciate any help. Thank you.



      Hi, thank you for this great plugin.

      I have a question about using this plugin with the Flex branch of Unreal: how do I make the left hand work with Flex objects?
      I am using Flex UE4 + LeapMotion + Oculus DK2 to accomplish something similar to this: https://www.youtube.com/watch?v=Fv9QGPCppwE.

      I have created a new Game Mode blueprint that uses the LeapFloatingHandsCharacter pawn, as per the instructions.

      When I try to interact with a flexSphere, only the right hand works:
      [Screenshots: 2017-07-17 18_45_13-TestProject Game Preview Standalone (64-bit_PCD3D_SM5).png, 2017-07-17 18_46_20-TestProject Game Preview Standalone (64-bit_PCD3D_SM5).png]

      Interestingly, the left hand collisions work with non-Flex meshes.

      Does anyone have an idea what is wrong? Thanks



        Hi, sorry to bother you. I need to use this plugin in a UE4 C++ project, but on the GitHub page I see the "C++ how to" guide has not been updated. Is there any chance it could be refreshed soon? [MENTION=548]getnamo[/MENTION]



          Originally posted by m4k View Post
          Hello,
          I'd like to know if there is some way to capture one hand on top of the other hand while using the Leap.
          I've been stuck on this for a while now and would appreciate any help. Thank you.
          Do you mean when you occlude one hand with the other? The current Leap tracking should still be able to capture the top hand while it is visible, but the other hand will be physically occluded from the cameras. You may wish to elaborate on what you specifically wish to accomplish.

          Originally posted by jacksparow View Post
          Hi, sorry to bother you. I need to use this plugin in a UE4 C++ project, but on the GitHub page I see the "C++ how to" guide has not been updated. Is there any chance it could be refreshed soon? [MENTION=548]getnamo[/MENTION]
          PM me your github username and I'll add you to the preview plugin which has C++ support.

          Originally posted by funkmeisterb View Post
          Hi, thank you for this great plugin.

          I have a question about using this plugin with the Flex branch of Unreal: how do I make the left hand work with Flex objects?
          I am using Flex UE4 + LeapMotion + Oculus DK2 to accomplish something similar to this: https://www.youtube.com/watch?v=Fv9QGPCppwE.

          I have created a new Game Mode blueprint that uses the LeapFloatingHandsCharacter pawn, as per the instructions.

          When I try to interact with a flexSphere, only the right hand works:
          [attached screenshots]

          Interestingly, the left hand collisions work with non-Flex meshes.

          Does anyone have an idea what is wrong? Thanks
          The current v2 plugin uses two meshes, one for each hand; check the blueprint to ensure both of these are set up to work with your Flex objects, e.g. LeapRiggedEchoHandsActor.uasset.
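          If you'd rather set this up from C++ than in the blueprint, a minimal sketch might look like the following. Note this is only an illustration: ECC_GameTraceChannel1 stands in for whatever collision channel your Flex branch actually assigns to Flex objects, so check your project's collision settings, and remember to apply it to both hand meshes.

          Code:
          // Minimal sketch (not plugin code): make a hand's skeletal mesh block
          // the channel your Flex objects collide on. ECC_GameTraceChannel1 is
          // only a placeholder assumption for that channel.
          #include "Components/SkeletalMeshComponent.h"

          void SetupHandCollision(USkeletalMeshComponent* HandMesh)
          {
              HandMesh->SetCollisionEnabled(ECollisionEnabled::QueryAndPhysics);
              HandMesh->SetCollisionResponseToChannel(ECC_GameTraceChannel1, ECR_Block);
          }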
          Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo



            Originally posted by getnamo View Post
            The current v2 plugin uses two meshes, one for each hand, check the blueprint to ensure both of these are setup to work with your Flex objects. e.g. LeapRiggedEchoHandsActor.uasset
            Thank you for your answer. What I had to do was indeed to add extra capsules to the finger meshes and to make sure that their collision system was set to block Flex objects.

            I have another problem now: the hand meshes appear at the wrong position. In the screenshot below, my hand is right in front of my character, but the mesh appears closer to his head.
            [Screenshot: 2017-08-08 14_58_14-TestProject Game Preview Standalone (64-bit_PCD3D_SM5).png]

            I have tried disabling the plugin, restarting UE4, re-enabling the plugin, and restarting UE4 again, and the problem is still there. Even when I create a new project and follow these instructions, I still get the error.

            This problem appeared when I double-clicked the LeapEchoHands_AnimBlueprint animation blueprint in my project. Before opening it, UE4 asked me to rebuild the mesh (or something to that effect) and I clicked yes. Now I can't get back to the working setup of the Leap hands.

            Interestingly enough, my project is source controlled and I do not see any changes to the LeapEchoHands_AnimBlueprint file.

            Any ideas on how to restore the working leap animations?
            Last edited by funkmeisterb; 08-08-2017, 03:13 PM. Reason: More details about screenshot



              I found a bug concerning the Anim Body component. The function 'Enable' (used in LeapHandsActorBase in the function ShowHandsBasedOnTracking) can sometimes report a reversed situation for the Anim Hand references: it states that a left hand is being tracked when it is actually a right hand (which can be correctly determined from other values, like the hand reference from the 'HandMoved' interface event). The same happens with the 'Alpha' variable of the left and right hand references. This bug causes the skeletal mesh hands to freeze in these situations.



                I've seen some discussion here about the lag in the tracking being caused by the use of time warping, but no clear solution has been offered.
                https://community.leapmotion.com/t/u...latency/5791/3
                https://github.com/getnamo/leap-ue4/issues/16
                https://github.com/getnamo/leap-ue4/issues/9

                Is there a clear solution that I can use to get the same quality of tracking that I get in Unity? It would be great if I could get this in Unreal because Unreal kicks ***.

                Thanks!



                  Originally posted by funkmeisterb View Post
                  I've seen some discussion here about the lag in the tracking being caused by the use of time warping, but no clear solution has been offered.
                  https://community.leapmotion.com/t/u...latency/5791/3
                  https://github.com/getnamo/leap-ue4/issues/16
                  https://github.com/getnamo/leap-ue4/issues/9

                  Is there a clear solution that I can use to get the same quality of tracking that I get in Unity? It would be great if I could get this in Unreal because Unreal kicks ***.

                  Thanks!
                  There's a currently private preview plugin available to people willing to give some feedback; it already has this fixed, along with some other cool features. PM me your GitHub username and I'll add you to it.
                  Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo



                    Hi,
                    It's me again with another bug.
                    It seems the Tip Velocity of the Leap Finger references gets the camera offset added in HMD mode. Please fix!
                    [Screenshot: TipVelocity.JPG]



                      Originally posted by BOBtheROSS View Post
                      Hi,
                      It's me again with another bug.
                      It seems the Tip Velocity of the Leap Finger references gets the camera offset added in HMD mode. Please fix!
                      [Screenshot: TipVelocity.JPG]
                      The API for this will be changing completely in the upcoming plugin. It will likely not expose velocity values, to keep a smaller and more maintainable footprint; in return it should have better high-level interaction support. Out of curiosity, what use case are you using this particular function for?

                      Also if you're interested, PM me your github username to get preview access to the new plugin.
                      Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo



                        Originally posted by getnamo View Post
                        [...] Out of curiosity what use case are you using this particular function for?
                        I was building my own swipe gesture, because the built-in one was giving strength values only if the hand was moving within a certain velocity range. Too fast or too slow wouldn't produce any gestures from the Leap frame reference, and the interface event 'swipe gesture detected' either wouldn't execute or would freeze the values.

                        Is there any way to adjust the threshold of a built-in gesture?



                          Originally posted by BOBtheROSS View Post

                          I was building my own swipe gesture, because the built-in one was giving strength values only if the hand was moving within a certain velocity range. Too fast or too slow wouldn't produce any gestures from the Leap frame reference, and the interface event 'swipe gesture detected' either wouldn't execute or would freeze the values.

                          Is there any way to adjust the threshold of a built-in gesture?
                          There's nothing built in that provides that flexibility, so it looks like you'll have to go custom. In general it is recommended nowadays to use direct physical interaction with things rather than general gestures, but if you wish to use general gestures, swiping is pretty easy to do manually.

                          I recommend storing the finger tip position each tick and calculating the velocity manually as Velocity = Distance/DeltaTime. Then, if the velocity vector breaches a certain Vector.Size() threshold while being within e.g. <30 degrees of your desired swipe direction normal, consider it a swipe. To get the angle between two vectors, take the ACOS of their dot product divided by the product of their magnitudes, i.e.

                          Angle = ACOS((Vector1 dot Vector2) / (Vector1.Size() * Vector2.Size()))
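
                          To make that concrete, here's a minimal C++ sketch of the same math (the names and thresholds are illustrative, not plugin API; in blueprint this is just a Dot Product, Vector Length, and ACOSd):

                          Code:
                          // Minimal sketch of the manual swipe check described above.
                          // Illustrative names and thresholds, not plugin API.
                          #include "CoreMinimal.h"

                          bool IsSwipe(const FVector& TipPos, const FVector& PrevTipPos,
                                       float DeltaTime, const FVector& SwipeDirection,
                                       float SpeedThreshold = 100.f,   // cm/s, tune to taste
                                       float MaxAngleDegrees = 30.f)
                          {
                              // Velocity = Distance / DeltaTime, from the stored tip position
                              const FVector Velocity = (TipPos - PrevTipPos) / DeltaTime;
                              if (Velocity.Size() < SpeedThreshold)
                              {
                                  return false; // too slow to count as a swipe
                              }

                              // Angle = ACOS((V dot D) / (|V| * |D|)); using normalized
                              // vectors makes the divisor 1
                              const float CosAngle = FVector::DotProduct(
                                  Velocity.GetSafeNormal(), SwipeDirection.GetSafeNormal());
                              return FMath::RadiansToDegrees(FMath::Acos(CosAngle)) < MaxAngleDegrees;
                          }

                          Call it each tick with the tip position you stored on the previous tick, then update the stored position.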
                          Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo



                            Hey,
                            we're using the plugin to work with custom gestures in our project.
                            For one of the gestures I want to use the direction of the fingers, but somehow every finger reports the same direction as the index finger.
                            Do you know what the problem could be?
                            Thanks in advance.



                              Hi, I'm evaluating the use of Leap Motion for interactive product visualization. I wanted to ask how the Leap behaves with real objects (for creating an AR app) where the object itself occludes part of the hand: is it able to maintain hand tracking, or does the sensor need a clear view of every finger at all times?



                                Originally posted by littlewildwolf View Post
                                Hey,
                                we're using the plugin to work with custom gestures in our project.
                                For one of the gestures I want to use the direction of the fingers, but somehow every finger reports the same direction as the index finger.
                                Do you know what the problem could be?
                                Thanks in advance.
                                Do you have blueprint graphs of what you're trying to do? I believe the old plugin has correct directional vectors, so it may depend on how you're trying to achieve it.
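
                                If the per-finger direction values really do all come back identical, a generic fallback (a sketch, not the plugin's API) is to derive each finger's direction from two of its joint positions yourself:

                                Code:
                                // Generic fallback sketch: compute a finger direction from joint
                                // positions instead of a reported direction value.
                                #include "CoreMinimal.h"

                                FVector FingerDirection(const FVector& IntermediateJointPos,
                                                        const FVector& TipPos)
                                {
                                    // Points from the intermediate joint toward the tip;
                                    // GetSafeNormal avoids NaNs if the joints coincide.
                                    return (TipPos - IntermediateJointPos).GetSafeNormal();
                                }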

                                Originally posted by davide445 View Post
                                Hi, I'm evaluating the use of Leap Motion for interactive product visualization. I wanted to ask how the Leap behaves with real objects (for creating an AR app) where the object itself occludes part of the hand: is it able to maintain hand tracking, or does the sensor need a clear view of every finger at all times?
                                I'd recommend checking out the Blocks demo on Leap Motion's website. It's a good test of the sensor quality in VR.
                                Link: https://gallery.leapmotion.com/blocks/

                                While the demo is built in Unity, you'll have similar tracking quality in the new Unreal plugin.
                                Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

