[Plugin] Leap Motion - Event Driven

    Hi getnamo

    With the latest beta I can't manage to enable hand collisions (I'm using the low-poly hands). Do I have to do something special to enable them?

    Cheers.

    Comment


      Originally posted by MariuszWaclawek View Post
      Hi [MENTION=548]getnamo[/MENTION], first off, thanks for your work on this plug-in; it's a great start.

      We are working on a project that requires Leap hands to be replicated properly over the network. Out of the box, our own hands are mirrored to every player, and players can't see the tracked hands of others, just copies of their own everywhere.
      From what I understand, the new rewrite of the plug-in will have improved latency, which we are excited about. Does the rewrite also integrate well with Unreal replication? Will it be supported out of the box, or do we have to replicate all transforms by hand like [MENTION=14253]maffew[/MENTION] did? That does not sound like a good solution.
      AFAIK, there is no replication in getnamo's GitHub version. I am now also trying to get replication working here.

      Edit: no replication in the existing GitHub version. I have no idea about the rewrite.
      Last edited by Syed; 03-27-2017, 03:14 AM.

      Comment


        Originally posted by MariuszWaclawek View Post
        Hi [MENTION=548]getnamo[/MENTION], first off, thanks for your work on this plug-in; it's a great start.

        We are working on a project that requires Leap hands to be replicated properly over the network. Out of the box, our own hands are mirrored to every player, and players can't see the tracked hands of others, just copies of their own everywhere.
        From what I understand, the new rewrite of the plug-in will have improved latency, which we are excited about. Does the rewrite also integrate well with Unreal replication? Will it be supported out of the box, or do we have to replicate all transforms by hand like [MENTION=14253]maffew[/MENTION] did? That does not sound like a good solution.

        Do you have any blog on the progress of the rewrite, or even a short list of upcoming improvements and features?

        PS: If you are willing to share the GitHub repo, I would really appreciate access.

        Thanks,
        Mariusz
        Originally posted by Syed View Post
        AFAIK, there is no replication in getnamo's GitHub version. I am now also trying to get replication working here.

        Edit: no replication in the existing GitHub version. I have no idea about the rewrite.
        I'm still trying to figure out the best architecture for this. There are a few considerations for getting a good replication experience.

        For VR this is especially true: 2 players sending just 3 transforms each (e.g. HMD, left and right controllers) at 90 fps will saturate the 10 KB/s default bandwidth, so we will likely need to raise the cap, e.g. https://answers.unrealengine.com/que...-of-setti.html. Leap Motion hands have ~22 bones per hand, or nearly 15x the data of head + motion controllers.
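
        As a very rough sanity check of those numbers (back-of-the-envelope only; the transform size, update rate, and cap below are assumptions, not measurements):

[CODE]
// Rough bandwidth estimate for uncompressed transform replication.
// All constants are assumptions for illustration, not measured plugin values.
#include <cstdio>

int main()
{
    const double BytesPerFloat  = 4.0;
    const double FloatsPerXform = 7.0;   // location (3 floats) + quaternion (4 floats)
    const double UpdatesPerSec  = 90.0;  // VR frame rate
    const double DefaultCapKBps = 10.0;  // default client bandwidth cap, ~10 KB/s

    // HMD + two motion controllers: 3 transforms per remote player.
    const double ControllerKBps = 3.0 * FloatsPerXform * BytesPerFloat * UpdatesPerSec / 1024.0;

    // Leap hands: ~22 bones per hand, two hands per remote player.
    const double LeapKBps = 2.0 * 22.0 * FloatsPerXform * BytesPerFloat * UpdatesPerSec / 1024.0;

    std::printf("HMD + controllers: %.1f KB/s per remote player\n", ControllerKBps);
    std::printf("Two such players : %.1f KB/s vs %.1f KB/s default cap\n",
                2.0 * ControllerKBps, DefaultCapKBps);
    std::printf("Leap hands       : %.1f KB/s per remote player (~%.0fx the controllers)\n",
                LeapKBps, LeapKBps / ControllerKBps);
    return 0;
}
[/CODE]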

        The other consideration is how to keep a good experience with a varying number of people streaming their hand positions at these higher data rates. I'm thinking some kind of interpolation will be needed, with more aggressive smoothing used when the other person is farther away from you, while more accurate streaming happens when they're near (or maybe leave that as an option).
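
        One way that distance-based interpolation could look (a minimal sketch; the function and the distance/speed values are hypothetical, not plugin API):

[CODE]
// Sketch: smooth a remote palm position more aggressively the farther its owner
// is from the local viewer. All values are placeholders.
#include "CoreMinimal.h"

static FVector InterpolateRemotePalm(const FVector& Current,
                                     const FVector& LatestNetValue,
                                     float DistanceToViewerCm,
                                     float DeltaTime)
{
    // Map 1 m .. 20 m of viewer distance to an interp speed of 30 (snappy) .. 5 (heavy smoothing).
    const float InterpSpeed = FMath::GetMappedRangeValueClamped(
        FVector2D(100.f, 2000.f), FVector2D(30.f, 5.f), DistanceToViewerCm);

    return FMath::VInterpTo(Current, LatestNetValue, DeltaTime, InterpSpeed);
}
[/CODE]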

        Finally, what is the best way to have this working from a developer's point of view? Not all Leap experiences will be multiplayer, so there should be a toggle. You also need a way to determine ownership of tracked hands; perhaps check whether a rigged actor's parent is a pawn or player controller and determine ownership that way? This isn't fully settled yet, so if you have good ideas about how to expose the functionality in an intuitive way, please contribute thoughts, ideas, maybe even code.
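
        For the ownership part, one possible shape is to walk up from the tracked-hands actor until a pawn is found and ask whether it is locally controlled (a minimal sketch, not how the plugin currently does it; the function name is made up):

[CODE]
// Sketch: decide whether a tracked-hands actor belongs to the local player by
// walking its attachment/owner chain until a pawn is found.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "GameFramework/Pawn.h"

static bool AreHandsLocallyOwned(AActor* HandsActor)
{
    AActor* Current = HandsActor;
    while (Current)
    {
        if (APawn* Pawn = Cast<APawn>(Current))
        {
            // True only on the machine that controls this pawn.
            return Pawn->IsLocallyControlled();
        }
        AActor* Parent = Current->GetAttachParentActor();
        Current = Parent ? Parent : Current->GetOwner();
    }
    return false; // No owning pawn found; treat the hands as not ours.
}
[/CODE]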

        Originally posted by Juangea View Post
        Hi getnamo

        With the latest beta I can't manage to enable hand collisions (I'm using the low-poly hands). Do I have to do something special to enable them?

        Cheers.
        For the moment, please file preview plugin issues on the preview GitHub repository.
        Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

        Comment


          I think you can lower the data for the Leap hands if you eliminate the knuckles and just track the fingertip locations and the palm location. That would mean less bandwidth to broadcast, right? Or am I missing something?

          Comment


            Originally posted by getnamo View Post
            Finally, what is the best way to have this working from a developer's point of view? Not all Leap experiences will be multiplayer, so there should be a toggle. You also need a way to determine ownership of tracked hands; perhaps check whether a rigged actor's parent is a pawn or player controller and determine ownership that way? This isn't fully settled yet, so if you have good ideas about how to expose the functionality in an intuitive way, please contribute thoughts, ideas, maybe even code.
            A general rule I try to apply to software development is to give every feature a toggle. This allows for greater compatibility and customization.

            Comment


              Originally posted by ParagonVRAdmin View Post
              I think you can lower the data for the Leap hands if you eliminate the knuckles and just track the fingertip locations and the palm location. That would mean less bandwidth to broadcast, right? Or am I missing something?
              Yes, I agree with this. We just need the palm location + rotation, plus fingertip directions + positions. The rest can be extrapolated, though it could be difficult. And I don't think we need the other bones' (metacarpal etc.) positions, as their lengths are known in the editor (probably one has to enter the values?). But first things first: we need to get something working, i.e. just transmit everything and have all fingers work as they should. From there on, we will have a solid base of code to play around with.
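
              As a rough idea, a reduced per-hand payload along those lines might look like this (just a sketch; the struct name and fields are hypothetical, not anything the plugin ships):

[CODE]
// Sketch of a compact replicated hand state: palm transform plus five fingertip
// positions. Intermediate bones would be rebuilt locally from known bone lengths.
#include "CoreMinimal.h"
#include "Engine/NetSerialization.h"
#include "LeapHandNetState.generated.h"

USTRUCT()
struct FLeapHandNetState
{
    GENERATED_BODY()

    // The palm drives the whole hand; quantized vectors keep each update small.
    UPROPERTY()
    FVector_NetQuantize10 PalmLocation;

    UPROPERTY()
    FRotator PalmRotation;

    // One tip position per finger: thumb, index, middle, ring, pinky.
    UPROPERTY()
    FVector_NetQuantize10 FingerTips[5];
};
[/CODE]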

              Comment


                  Hello everyone,

                  I am very new at this. I am an archviz artist trying to step into VR and Unreal. I am using the Oculus Rift along with the Leap Motion and trying to add the Leapfloatinghands to the HMDlocomotion pawn from the Unreal VRtemplate, but I am not able to see my hands. Actually, I tried but I don't know how to do it. Could someone help me with a step-by-step guide to set it up, please?

                  Thank you very much [MENTION=548]getnamo[/MENTION]

                  Comment


                    Originally posted by Nicolas4 View Post
                    Hello everyone,

                    I am very new at this. I am an archviz artist trying to step into VR and Unreal. I am using the Oculus Rift along with the Leap Motion and trying to add the Leapfloatinghands to the HMDlocomotion pawn from the Unreal VRtemplate, but I am not able to see my hands. Actually, I tried but I don't know how to do it. Could someone help me with a step-by-step guide to set it up, please?

                    Thank you very much [MENTION=548]getnamo[/MENTION]
                    You can find detailed documentation on how to use the plugin at the GitHub repository https://github.com/getnamo/leap-ue4; it has sections for most use cases. The one you're looking for is here: https://github.com/getnamo/leap-ue4#...stom-character. In essence, you add a child actor to your pawn/character of choice, set it to LeapRiggedEchoHandsActor, and parent it to your camera component. Then modify any offsets to fit the expected Oculus origin.
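
                    The linked docs do this in Blueprint; purely as a reference, a rough C++ equivalent could look like the sketch below (the pawn class, how the hands class gets assigned, and the offset value are placeholders, not plugin API):

[CODE]
// Sketch of the Blueprint setup described above, expressed in C++: a child actor
// component attached to the camera, with its Child Actor Class set (in the editor)
// to LeapRiggedEchoHandsActor.
#include "CoreMinimal.h"
#include "Camera/CameraComponent.h"
#include "Components/ChildActorComponent.h"
#include "GameFramework/Pawn.h"
#include "MyVRPawn.generated.h"

UCLASS()
class AMyVRPawn : public APawn
{
    GENERATED_BODY()

public:
    AMyVRPawn()
    {
        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        SetRootComponent(Camera);

        // Child actor parented to the camera; set its Child Actor Class to
        // LeapRiggedEchoHandsActor in the editor (or assign a class here).
        LeapHands = CreateDefaultSubobject<UChildActorComponent>(TEXT("LeapHands"));
        LeapHands->SetupAttachment(Camera);

        // Adjust this offset so the hands match the expected Oculus origin
        // (the value below is only a placeholder).
        LeapHands->SetRelativeLocation(FVector(0.f, 0.f, -10.f));
    }

private:
    UPROPERTY(VisibleAnywhere)
    UCameraComponent* Camera;

    UPROPERTY(VisibleAnywhere)
    UChildActorComponent* LeapHands;
};
[/CODE]
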
                    Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

                    Comment


                      The current version of this plugin is pretty slow. If there is another version that people are working on, where could we find it?

                      Comment


                        Originally posted by Olivierus View Post
                        The current version of this plugin is pretty slow. If there is another version that people are working on, where could we find it?
                        It's currently in a closed alpha, being tested internally with a few members from the forum helping debug it. The new version has a lot of breaking changes, and I want to make sure it is solid and well documented before it is released to the wider public. If you're keen on testing an early release in exchange for feedback and raising potential issues with logs and reproduction steps, PM me your GitHub username.
                        Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

                        Comment


                          Hey, I'm using the built-in Leap plugin in UE4 4.15.1 for a VR hot air balloon project, and I'm getting a black screen on my monitor when playing in VR. It works in the VR goggles, but not being able to view it on screen is a problem. I had the same problem in the past and it somehow resolved itself; now the issue is back. Anyone have any ideas?

                          Comment


                            Originally posted by thibaultvdb View Post
                            Hey, I'm using the built-in Leap plugin in UE4 4.15.1 for a VR hot air balloon project, and I'm getting a black screen on my monitor when playing in VR. It works in the VR goggles, but not being able to view it on screen is a problem. I had the same problem in the past and it somehow resolved itself; now the issue is back. Anyone have any ideas?
                            Assuming it's related to using a passthrough character, try enabling images from the control panel, or change from the passthrough character to rigged.
                            Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

                            Comment


                              I'm using the floatinghandscharacter and have tried the other blueprints as the pawn, but the problem persists. I have enabled images in the Leap control panel.
                              [Attached image: black preview window.png]

                              Comment


                                Hi getnamo,

                                Thanks for your awesome code contributions and support on this thread!

                                I have some work to show on the 5th of May, notably a small feature based on your plugin. I'd prefer to finalize the feature on the new version to avoid duplicated work, so I have postponed any further improvements on it until now.
                                But the date is getting close. I have planned 2 days to migrate the feature and 1.5 days of testing/tweaking. Is it still worth waiting, or is the current state close enough to final that I could use it?

                                Thanks!

                                Comment
