[Plugin] Leap Motion - Event Driven

I have a question about UTexture2D* FPrivateLeapImage::Texture32FromLeapImage (leapimage.h): how can I tell when the X points array goes out of memory?

Hello,

I tried deriving from the LeapController C++ file, and once the file was created, Visual Studio found errors within the official Leap Motion C++ files. A screenshot showing the errors has been attached. As for my project, I am using UE 4.17 and I am trying to create custom gestures for the Leap Motion controller. Any help will be greatly appreciated.

Would it be possible for me to get access to this? I've been using your plugin (thank you for your work!) in VR for a few years and would love to test the new version and experience lower latency, as in Unity. I PM'd you my username, in case this is possible. Thanks!

Hi there, I am having trouble trying to track a specific finger as well. What location info are you feeding into this function?

New Plugin Release! v3 plugin with v4 tracking

The preview plugin is now publicly available at
https://github.com/leapmotion/LeapUnreal

and you can find example assets that go with it here:
https://github.com/leapmotion/LeapUnrealmodules

If you've been using the engine plugin, I highly recommend you check the new plugin out!

Blog post about update: Hand Tracking | Gemini Is Here - Ultraleap documentation

This is awesome!!! :D

I need particles to emit from specific sockets. What's the best way to access a socket in the updated plugin? Thanks for the awesome work, btw!

If you're using e.g. the example LeapHandsPawn, it just has a child actor called BSLowPolyHand. This is the asset that links tracked skeletal meshes to an actor; you can find it in the plugin contents folder. In the default example we use two skeletal meshes, one for each hand, and you can attach things to either hand, e.g. to emit things from sockets.

An easy way to adjust things is to simply clone the **BSLowPolyHand** asset, add the cloned asset to a pawn or other actor of your choice, and modify it to your needs, e.g. to use a different skeletal mesh or to add things to the skeletal mesh sockets in the usual way: https://docs.unrealengine.com/en-us/Engine/Content/Types/SkeletalMeshes/Sockets.

I think I'm missing the obvious, but let me ask anyway…

Is there a simple way for me to use the Leap Motion plugin to "record" animations on the default UE4 mannequin rig and use them as FBX later on? As in using the LM plugin to record mocap data?

At the moment there is nothing out of the box that makes that easy for UE4, but it is possible to manually capture Leap data frames in the plugin each tick and then process them afterwards.

If you want something more complete, consider capturing this externally, e.g. Hand Capture – Ultraleap Gallery or Leap Motion + iClone 7 for Professional Animation – Leap Motion Blog

It's a fairly popular feature request, so saving recordings for animations may be possible in a future release. Contributions are certainly welcome :slight_smile:

Great plugin.
Can someone please advise how to replace the low-poly hands model with another hands model? Thanks

Please check out CustomRigHello.umap (https://github.com/leapmotion/LeapUnrealmodules#rigging) in https://github.com/leapmotion/LeapUnrealmodules, which contains examples of different rigs.

Hi everybody. I am a total newbie to this technology. What I cannot grasp, from both the comments here and the Leap Motion website, is the system architecture underneath the overall solution. I mean, is the AR app served by a cloud server, or just hosted on the PC, or is no connection required at all (namely, the AR application and processing run on board the AR headset)? Can someone explain this to me? Thanks

I'm not an avid UE4 user (just a 3D animator), but I thought it was possible using the Sequence Recorder. From my understanding, it can record any skeleton data, so I think it's possible. I've never tried it, but I plan on doing so in the future.

Hi.

I'm having an issue with the Unreal plugin. The problem is: when I use your modules example and the LeapDesktop pawn, the Leap Motion sensor gets optimized for head-mounted mode only after I hit Play and then ESC in the Unreal editor. After that, nothing works unless I restart the computer. I just have the raw module and plugin installed to test, and I'm still getting this. Is that right?

Ty!

By default the Leap Motion is set to VR-optimized tracking (Leap Mode VR); if you want to use this mode, nothing needs to be done. The *LeapDesktopActor* expects the Leap Motion sensor to be used in the non-default desktop mode, where it is facing up. It prepares the sensor for this by changing the optimization mode to Desktop on BeginPlay and then resetting it back to VR mode on EndPlay. Because of how this works, if you want to use the Leap Motion in VR, don't use this asset anywhere in your map!

Instead you should be using LeapHandsPawn, or its child actor *BSLowPolyHand* in a pawn of your choice.

Hi guys!

So I'm really excited! I've got the plugin working in my project, in the sense that I've loaded the actor and I can see the hands in VR.

However, the next step, letting the hand interact with my products (grab), is something I'm struggling with. The tutorials I found so far on getting the hands into VR were pretty easy, but I'm missing something to let the hands interact…

Could somebody help me? Of course, compensation is available, because as we like to say in Dutch: only the sun rises for free (so everything else should be paid).

All the best,

Hi everyone,

I had a project in 4.18 using the v2 Leap Motion framework. Now that we're on 4.21 and the v4 framework, does that mean I have to rebuild all my blueprints related to the Leap Motion?

I'm asking because it seems like a lot has changed with the plugin. Did anyone else have the same problem?

Thanks,

The v3 plugin (using v4 tracking) is a complete rewrite, so the API is fully breaking. That said, it should be easier to get things running, and you will find it much more performant, with lower latency. Significant API breaks like this are not expected in the future.

Check out https://github.com/leapmotion/leapunrealmodules for some basic blueprint examples.

There is a basic pickup/drop button-press example here: https://github.com/leapmotion/leapunrealmodules#vr-pickup-and-drop. Try out the map and explore the blueprints to see how it all comes together.

So… I have updated to the new plugin and SDK; thanks for the work. What is the situation now with gestures such as swipe? Do you know of an effective way to do that?

Thanks