[Plugin] Leap Motion - Event Driven

I found a bug concerning the Anim Body component. The function ‘Enable’ (used in LeapHandsActorBase in the function ShowHandsBasedOnTracking) for the Anim Hand references can sometimes report a reversed situation: it states that a left hand is being tracked when it is actually a right hand (which can be correctly determined from other values, like the hand reference from the ‘HandMoved’ interface event). The same happens with the variable ‘Alpha’ of the left and right hand references. This bug causes the skeletal mesh hands to freeze in these situations.

I’ve seen some discussion here about the lag in the tracking being caused by the use of time warping, but no clear solution being offered.
https://community.leapmotion.com/t/unreal-plugin-latency/5791/3
https://.com//leap-ue4/issues/16
https://.com//leap-ue4/issues/9

Is there a clear solution that I can use to get the same quality of tracking that I get in Unity? It would be great if I could get this working in Unreal.

Thanks!

There’s currently a private preview plugin available for people willing to give some feedback, which already has this fixed along with some other cool features :). PM me your username and I’ll add you to it.

Hi,
It’s me again with another bug.
It seems the Tip Velocity of the Leap Finger references gets the camera offset added in HMD mode. Please fix :wink:

The API for this will be changing completely in the upcoming plugin. It will likely not expose velocity values, to keep a smaller and more maintainable footprint. In return it should have better high-level interaction support. Out of curiosity, what use case are you using this particular function for?

Also if you’re interested, PM me your username to get preview access to the new plugin.

I was building my own swipe gesture, because the built-in one only gave strength values if the hand was moving within a certain velocity range. Too fast or too slow and the Leap frame reference wouldn’t report any gestures, and the interface event ‘swipe gesture detected’ wouldn’t execute, or its values would freeze.

Is there any way to adjust the threshold of a built-in gesture?

There’s nothing built in that provides that flexibility, so it looks like you’ll have to go custom. In general, direct physical interaction with objects is recommended nowadays over abstract gestures, but if you do want general gestures, swiping is pretty easy to do manually.

I recommend storing the fingertip position each tick and calculating the velocity manually as Velocity = Distance / DeltaTime. Then, if the velocity vector breaches a certain Vector.Size() threshold while being within, say, 30 degrees of your desired swipe direction normal, consider it a swipe. To get the angle between two vectors, take the dot product of the vectors divided by the product of their magnitudes, i.e.

Angle = ACOS((Vector1 dot Vector2) / (Vector1.Size() * Vector2.Size()))

Hey,
we’re using the plugin to work with custom gestures in our project.
For one of the gestures I want to use the direction of the fingers, but somehow every finger has the same direction as the index finger.
Do you know what the problem could be?
Thanks in advance.

Hi, I’m evaluating Leap Motion for interactive product visualization. I wanted to ask how it behaves with real objects (so as to create an AR app), where the object itself will occlude part of the hand. Is it able to maintain hand tracking, or does the sensor need a clear view of every finger at all times?

Do you have blueprint graphs of what you’re trying to do? I believe the old plugin has correct directional vectors so it may depend on how you’re trying to achieve it.

I’d recommend checking out the Blocks demo on Leap Motion’s website. It’s a good test for checking out the sensor quality in VR.
Link: Blocks – Ultraleap Gallery

While the demo is built in Unity, you’ll have similar tracking quality in the new Unreal plugin.

Thanks, the hand tracking precision and stability are really great.
Do you know if it will also work when the sensor needs to track a hand partially behind a real object?
I’m more interested in AR than VR applications.

The first Blueprint is our gesture; with the other one I tracked the x, y and z values of the vector from the index finger and thumb. Both vectors are nearly the same all the time, no matter which direction the fingers are pointing. That also happens if I use the other fingers.

Ok, I found the problem: I can get one finger at a time and store it in a variable, but as soon as I get a new finger the other variables are overridden. At least that’s what it looks like to me.
I don’t know if I am doing something wrong or if it’s not possible to access more than one finger at a time.

Hello, Community

I am pretty new to Leap Motion, so I’m asking for help, kinda desperate here. I am trying to attach a BP_Actor to LeapRiggedEchoHands (which is a child actor of the LeapFloatingHandsCharacter) on overlap, but I can’t make it work.

I also tried almost the same logic inside FloatingHandsCharacter, but with no success. “TestGrab” is the cube BP which I’m trying to pick up.

It seems only the AttachToActor part of the blueprint doesn’t work, because when I put my hand in the cube it generates overlap events, and everything else looks fine.

Is my logic right?

I would be so thankful if someone could explain to me how to make this work!

Regards!

You’d likely want to attach the DefaultSceneRoot in your TestGrab BP to the LeftHandMesh component instead of to your actor, using AttachToComponent. The actual hand actor doesn’t move, but the hand skeletal mesh component and its children do.

Looking at the code https://.com//leap-ue4/…erList.cpp#L70, that analysis does appear to be correct. Probably the best workaround is to fetch the direction and store it in a local variable like you said, then do the final comparison with the stored local variables instead of directly referencing the fingers. This may have been a case of me marking a blueprint function pure when it shouldn’t have been. This shouldn’t be an issue in the preview plugin, as it uses structs to hold all the data.

Thanks for the answer!

I guess you mean something like this? Sadly it’s still not working. Am I doing something wrong? Should I use AttachToComponent at all, or does it need a different approach?
I also tried attaching to a child of LeftHandMesh (a cube again, attached on BeginPlay to the hand and to the same socket), but the result was the same.

That does appear correct; maybe your socket name is wrong, so check the bone names. That said, if you’re looking for basic pickup/drop, it may be easier to use the preview plugin, which has a working example of this. PM me your username and I’ll send you an invite to the repo.

Hello!
I’m fairly new to developing with Leap, and I’d like to ask how to track the fingertips’ world positions. I’m currently using the TipLocation node, but it doesn’t give their global transform, and I can’t find the right way to offset them to get the accurate locations…

Hi, I have a project and I just want to use the Leap Motion to display my hands. So I followed your tutorial How to use it - Convenience Rigged Characters. I changed my game mode to use LeapRiggedCharacter or LeapFloatingHandsCharacter as my default pawn. But I found that if I use LeapRiggedCharacter or LeapFloatingHandsCharacter as my default pawn, the previous pawn in my project will not work. Is there any solution to make LeapFloatingHandsCharacter and my previous pawn work together?

I’m assuming you’re using something like https://.com//leap-ue4#how-to-use-it—blueprint—event-driven. If that is the case, you can just add your actor world position.

You can then append the actor position to convert the local positions to world positions via something like