[Plugin] Leap Motion - Event Driven

    This is awesome!!! :d

      I need particles to emit from specific sockets; what's the best way to access the socket in the updated plugin? Thanks for the awesome work, btw!

        Originally posted by Rolento View Post
        I need particles to emit from specific sockets; what's the best way to access the socket in the updated plugin? Thanks for the awesome work, btw!
        If you're using e.g. the example LeapHandsPawn, it has a child actor called BSLowPolyHand. This is the asset that links tracked skeletal meshes to an actor; you can find it in the plugin content folder. In the default example we use two skeletal meshes, one for each hand, and you can attach things to either hand, e.g. to emit particles from sockets.

        An easy way to adjust things is to simply clone the BSLowPolyHand asset, add your cloned asset to a pawn or other actor of your choice, and modify it to your needs, e.g. use a different skeletal mesh or add things to the skeletal mesh sockets in the usual way: https://docs.unrealengine.com/en-us/...Meshes/Sockets.
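        If you'd rather do the attachment from C++ than in the editor, here's a minimal sketch of the same idea. Everything in it (AMyHandActor, the HandMesh parameter, the "index_tip" socket) is a placeholder for your own cloned hand asset, not a name from the plugin:

        Code:
        // Sketch: spawn a particle emitter attached to a hand-mesh socket.
        // "index_tip" is a placeholder socket added to a cloned BSLowPolyHand
        // skeletal mesh; the spawned emitter then follows the tracked hand.
        #include "Kismet/GameplayStatics.h"
        #include "Components/SkeletalMeshComponent.h"
        #include "Particles/ParticleSystem.h"

        void AMyHandActor::AttachEmitterToSocket(USkeletalMeshComponent* HandMesh,
                                                 UParticleSystem* EmitterTemplate)
        {
            static const FName SocketName(TEXT("index_tip")); // placeholder
            if (HandMesh && HandMesh->DoesSocketExist(SocketName))
            {
                UGameplayStatics::SpawnEmitterAttached(
                    EmitterTemplate, HandMesh, SocketName,
                    FVector::ZeroVector, FRotator::ZeroRotator,
                    EAttachLocation::SnapToTarget);
            }
        }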
        Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

          I think I'm missing the obvious but let me ask anyway...

          Is there a simple way to use the Leap Motion plugin to "record" animations on the default UE4 mannequin rig and export them as FBX later on? As if using the LM plugin to record mocap data?

            Originally posted by BarisT View Post
            I think I'm missing the obvious but let me ask anyway...

            Is there a simple way to use the Leap Motion plugin to "record" animations on the default UE4 mannequin rig and export them as FBX later on? As if using the LM plugin to record mocap data?
            At the moment there is nothing out of the box that makes this easy in UE4, but it is possible to manually capture Leap data frames from the plugin each tick and then process them yourself.

            If you want something more complete, consider capturing this externally, e.g. https://gallery.leapmotion.com/hand-capture/ or http://blog.leapmotion.com/leap-moti...e-7-animation/

            It's a fairly popular feature request, so saving recordings for animations may be possible in a future release. Contributions are certainly welcome.
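            For the manual capture route, the sketch below shows the general shape. Treat the names as assumptions: ULeapComponent, its OnLeapTrackingData event, and FLeapFrameData are taken from the v3 plugin source, so verify them against the plugin version you have installed.

            Code:
            // Sketch: buffer tracking frames for offline processing. Assumes the
            // header declares:
            //   UFUNCTION() void OnTrackingFrame(const FLeapFrameData& Frame);
            //   TArray<FLeapFrameData> RecordedFrames;
            #include "LeapComponent.h"

            void AMyRecorderActor::BeginPlay()
            {
                Super::BeginPlay();
                if (ULeapComponent* Leap = FindComponentByClass<ULeapComponent>())
                {
                    // Fires once per tracking frame with the latest hand data.
                    Leap->OnLeapTrackingData.AddDynamic(
                        this, &AMyRecorderActor::OnTrackingFrame);
                }
            }

            void AMyRecorderActor::OnTrackingFrame(const FLeapFrameData& Frame)
            {
                // Store a copy; serialize RecordedFrames (e.g. to JSON) when the
                // take ends, then retarget the data to your rig offline.
                RecordedFrames.Add(Frame);
            }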
            Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

              Great plugin.
              Can someone please advise how to replace the low-poly hands model with another hand model? Thanks

                Originally posted by VR class View Post
                Great plugin.
                Can someone please advise how to replace the low-poly hands model with another hand model? Thanks
                Please check out the CustomRigHello.umap (https://github.com/leapmotion/LeapUnrealmodules#rigging) in https://github.com/leapmotion/LeapUnrealmodules, which contains examples of different rigs.
                Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

                  Hi everybody. I am a total newbie to this technology. What I cannot grasp, from both the comments here and the Leap Motion web site, is the system architecture underneath the overall solution. I mean, is the AR app served from a cloud server, hosted on the PC, or is no connection required at all (i.e. the AR application and processing run on board the AR headset)? Can someone explain this to me? Thanks

                    Originally posted by getnamo View Post

                    At the moment there is nothing out of the box that makes this easy in UE4, but it is possible to manually capture Leap data frames from the plugin each tick and then process them yourself.

                    If you want something more complete, consider capturing this externally, e.g. https://gallery.leapmotion.com/hand-capture/ or http://blog.leapmotion.com/leap-moti...e-7-animation/

                    It's a fairly popular feature request, so saving recordings for animations may be possible in a future release. Contributions are certainly welcome.
                    I'm not an avid UE4 user (just a 3D animator), but I thought it was possible using the Sequence Recorder. From my understanding, it can record any skeleton data, so I think it's possible. I've never tried it, but I plan to in the future.

                      Hi Getnamo.

                      I'm having an issue with the Unreal plugin. The problem is: when I use your modules example and the LeapDesktop pawn, the Leap Motion sensor gets optimized for head-mounted mode only after I hit Play and then ESC in the Unreal editor. After that, nothing works unless I restart the computer. I'm testing with just the raw module and the plugin and I'm still getting this. Is that right?

                      Ty!

                        Originally posted by LeMooNcs View Post
                        Hi Getnamo.

                        I'm having an issue with the Unreal plugin. The problem is: when I use your modules example and the LeapDesktop pawn, the Leap Motion sensor gets optimized for head-mounted mode only after I hit Play and then ESC in the Unreal editor. After that, nothing works unless I restart the computer. I'm testing with just the raw module and the plugin and I'm still getting this. Is that right?

                        Ty!
                        By default the Leap Motion is set to VR-optimized tracking (Leap Mode VR); if you want to use this mode, nothing needs to be done. The LeapDesktopActor expects the Leap Motion sensor to be used in the non-default desktop mode, where the sensor faces up. It prepares the sensor for this by changing the optimization mode to Desktop on BeginPlay and then resetting it back to VR mode on EndPlay. Because of how this works, if you want to use the Leap Motion in VR, don't use this asset anywhere in your map!

                        Instead you should be using LeapHandsPawn, or its child actor BSLowPolyHand in a pawn of your choice.
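                        For reference, the BeginPlay/EndPlay pattern described above looks roughly like this in C++. Treat it as a sketch only: the SetLeapMode call and the ELeapMode values are assumptions based on the v3 plugin's Blueprint function library, so check the plugin source for the exact names.

                        Code:
                        // Sketch of the LeapDesktopActor pattern: desktop-optimized
                        // tracking while playing, VR tracking restored afterwards.
                        void ADesktopModeActor::BeginPlay()
                        {
                            Super::BeginPlay();
                            // Optimize tracking for a sensor lying flat, facing up.
                            ULeapBlueprintFunctionLibrary::SetLeapMode(ELeapMode::LEAP_MODE_DESKTOP);
                        }

                        void ADesktopModeActor::EndPlay(const EEndPlayReason::Type Reason)
                        {
                            // Restore the default VR-optimized (head-mounted) tracking.
                            ULeapBlueprintFunctionLibrary::SetLeapMode(ELeapMode::LEAP_MODE_VR);
                            Super::EndPlay(Reason);
                        }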
                        Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

                          Hi guys!

                          So I'm really excited! I've got the plugin working in my project, in the sense that I've loaded the actor and I can see the hands in VR.

                          However, the next step, letting the hands interact with my products (grab), is something I'm struggling with. The tutorials I've found so far on getting the hands into VR were pretty easy, but I'm missing something to let the hands interact.

                          Could somebody help me? Of course, compensation is available, because as we like to say in Dutch: only the sun rises for free (so everything else should be paid for).

                          All the best,

                          Tim

                            Hi everyone,

                            I had a project in 4.18 using the v2 Leap Motion framework. Now that we're on 4.21 and the v4 framework, does that mean I have to rebuild all my Leap Motion-related blueprints?

                            I'm asking because it seems like a lot has changed in the plugin. Did anyone else have the same problem?

                            Thanks,

                              Originally posted by jakwarne View Post
                              Hi everyone,

                              I had a project in 4.18 using the v2 Leap Motion framework. Now that we're on 4.21 and the v4 framework, does that mean I have to rebuild all my Leap Motion-related blueprints?

                              I'm asking because it seems like a lot has changed in the plugin. Did anyone else have the same problem?

                              Thanks,
                              The v3 plugin (using v4 tracking) is a complete rewrite, so its API is fully breaking. That said, it should be easier to get things running, and you will find it much more performant, with lower latency. Significant API breaks like this are not expected in the future.

                              Check out https://github.com/leapmotion/leapunrealmodules for some basic blueprint examples.

                              Originally posted by timsuitcase View Post
                              Hi guys!

                              So I'm really excited! I've got the plugin working in my project, in the sense that I've loaded the actor and I can see the hands in VR.

                              However, the next step, letting the hands interact with my products (grab), is something I'm struggling with. The tutorials I've found so far on getting the hands into VR were pretty easy, but I'm missing something to let the hands interact.

                              Could somebody help me? Of course, compensation is available, because as we like to say in Dutch: only the sun rises for free (so everything else should be paid for).

                              All the best,

                              Tim

                              There is a basic pickup/drop button press example here: https://github.com/leapmotion/leapun...ickup-and-drop. Try out the map and explore the blueprints to see how it all comes together.
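                              If you'd rather roll your own than start from the example map, the core idea is: on a grab input, find a nearby physics actor and attach it to the hand mesh; on release, detach it and re-enable physics. A rough sketch follows; all names (AMyGrabPawn, PalmSphere, HandMesh, HeldActor, the "palm" socket) are placeholders, not the example's actual setup.

                              Code:
                              // Rough grab/release sketch. Assumed members declared in the
                              // header: USphereComponent* PalmSphere (overlap volume near
                              // the palm), USkeletalMeshComponent* HandMesh, AActor* HeldActor.
                              void AMyGrabPawn::Grab()
                              {
                                  TArray<AActor*> Overlapping;
                                  PalmSphere->GetOverlappingActors(Overlapping);
                                  for (AActor* Actor : Overlapping)
                                  {
                                      UPrimitiveComponent* Root =
                                          Cast<UPrimitiveComponent>(Actor->GetRootComponent());
                                      if (Root && Root->IsSimulatingPhysics())
                                      {
                                          Root->SetSimulatePhysics(false); // freeze while held
                                          Actor->AttachToComponent(HandMesh,
                                              FAttachmentTransformRules::KeepWorldTransform,
                                              TEXT("palm")); // placeholder socket name
                                          HeldActor = Actor;
                                          break;
                                      }
                                  }
                              }

                              void AMyGrabPawn::Release()
                              {
                                  if (!HeldActor)
                                  {
                                      return;
                                  }
                                  HeldActor->DetachFromActor(FDetachmentTransformRules::KeepWorldTransform);
                                  if (UPrimitiveComponent* Root =
                                          Cast<UPrimitiveComponent>(HeldActor->GetRootComponent()))
                                  {
                                      Root->SetSimulatePhysics(true); // hand physics back
                                  }
                                  HeldActor = nullptr;
                              }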


                              Plugins: GES - Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo

                                So... I have updated to the new plugin and SDK. Thanks for the work, getnamo. What is the situation now with gestures such as swipe? Do you know of an effective way to do that?

                                Thanks
