[Plugin] Leap Motion - Event Driven

  • replied
    Originally posted by merouses View Post
    Hello!
I'm fairly new to developing with Leap, and I'd like to ask how to track the fingertips' world positions. I'm currently using the TipLocation node, but it doesn't give their global transform, and I can't find the right way to offset them to get the accurate locations...
    I'm assuming you're using something like https://github.com/getnamo/leap-ue4#...world position

You can then append the actor position to convert the local positions to world positions, for example:
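A rough C++ sketch of that idea (HandActor and LocalTip are placeholder names for illustration, not plugin identifiers):

Code:
#include "GameFramework/Actor.h"

// Convert a tip location given in the hand actor's local space to world space.
static FVector TipToWorld(const AActor* HandActor, const FVector& LocalTip)
{
    // Simplest case from the post above: append the actor's world position.
    // If the hand actor can also rotate, use the full transform instead:
    // return HandActor->GetActorTransform().TransformPosition(LocalTip);
    return HandActor->GetActorLocation() + LocalTip;
}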



  • replied
Hi, I have a project and I just want to use the Leap Motion to display my hands. So I followed your tutorial (How to use it - Convenience Rigged Characters) and changed my game mode to use LeapRiggedCharacter or LeapFloatingHandsCharacter as the default pawn. But I found that if I use LeapRiggedCharacter or LeapFloatingHandsCharacter as my default pawn, the previous pawn in my project will not work. Is there any solution to make LeapFloatingHandsCharacter and my previous pawn work together?



  • replied
    Hello!
I'm fairly new to developing with Leap, and I'd like to ask how to track the fingertips' world positions. I'm currently using the TipLocation node, but it doesn't give their global transform, and I can't find the right way to offset them to get the accurate locations...



  • replied
    Originally posted by D_Petrov88 View Post

    Thanks for the answer!

I guess you mean something like this? Sadly it's still not working, am I doing something wrong? Should I use AttachToComponent at all, or does it need a different approach?
I also tried attaching to a LeftHandMesh child, which was a cube again attached on BeginPlay to the hand and to the same socket, but the result was the same.
That does appear correct; maybe your socket name is wrong, so check the bone names. That said, if you're looking for basic pickup/drop, it may be easier for you to use the preview plugin, which has a working example of this. PM me your GitHub username and I'll send you an invite to the repo.
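If you want to verify the names, a quick C++ sketch could dump everything the hand mesh exposes (HandMesh is a placeholder for the hand's skeletal mesh component, not a plugin identifier):

Code:
// List every socket/bone name the skeletal mesh component exposes,
// so the FName used for attachment can be checked against it.
for (const FName& SocketName : HandMesh->GetAllSocketNames())
{
    UE_LOG(LogTemp, Log, TEXT("Socket/bone: %s"), *SocketName.ToString());
}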



  • replied
    Originally posted by getnamo View Post


    You'd likely want to attach the DefaultSceneRoot in your TestGrab bp to the LeftHandMesh component instead of your actor using AttachToComponent. The actual hand actor doesn't move, but the hand skeletal mesh component and its children do.
    Thanks for the answer!

I guess you mean something like this? Sadly it's still not working, am I doing something wrong? Should I use AttachToComponent at all, or does it need a different approach?
I also tried attaching to a LeftHandMesh child, which was a cube again attached on BeginPlay to the hand and to the same socket, but the result was the same.



  • replied

    Originally posted by D_Petrov88 View Post
    Hello, Community

I am pretty new to Leap Motion, so I'm asking for help, kinda desperate here. I am trying to attach a BP_Actor to LeapRiggedEchoHands, which is a child actor of the LeapFloatingHandsCharacter, on overlap, but I can't make it work.

I also tried almost the same logic inside FloatingHandsCharacter, but with no success. "TestGrab" is the cube BP which I'm trying to pick up.

It seems only the AttachToActor part of the blueprints doesn't work, because when I put my hand in the cube it generates overlap events and everything else looks fine.

Is my logic right?

I would be so thankful if someone could explain to me how to make this work!

    Regards!
    You'd likely want to attach the DefaultSceneRoot in your TestGrab bp to the LeftHandMesh component instead of your actor using AttachToComponent. The actual hand actor doesn't move, but the hand skeletal mesh component and its children do.
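In C++ terms, the suggested attach would look roughly like the sketch below (GrabRoot and LeftHandMesh are placeholder variable names for the cube's DefaultSceneRoot and the hand's skeletal mesh component; the socket name is just an example):

Code:
// Attach the grabbed actor's root to the hand's skeletal mesh component,
// snapping it onto a named socket/bone so it follows the hand animation.
GrabRoot->AttachToComponent(LeftHandMesh,
    FAttachmentTransformRules::SnapToTargetNotIncludingScale,
    TEXT("hand_l")); // must match a socket or bone on the hand mesh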


    Originally posted by littlewildwolf View Post

OK, I found the problem. I can get one finger at a time and store it in a variable, but as soon as I get a new finger, the other variables are overridden. At least that's what it looks like to me.
I don't know if I am doing something wrong or if it's not possible to access more than one finger at a time.

Looking at the code at https://github.com/getnamo/leap-ue4/...erList.cpp#L70, that analysis does appear to be correct. Probably the best workaround would be to fetch the direction and store it in a local variable like you said, then do the final comparison with the stored local variables instead of directly referencing the fingers. This may have been a case of me marking a blueprint function pure when it shouldn't have been. This shouldn't be an issue in the preview plugin, as it uses structs to hold all the data.
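A minimal C++ sketch of that workaround (the direction values would be fetched once from the plugin's finger nodes; all names here are placeholders):

Code:
#include "CoreMinimal.h"

// Cache the finger directions in locals, then compare the cached copies
// instead of re-reading the finger objects during the comparison.
static bool AreFingersAligned(const FVector& IndexDirection, const FVector& ThumbDirection)
{
    const FVector IndexDir = IndexDirection.GetSafeNormal();
    const FVector ThumbDir = ThumbDirection.GetSafeNormal();

    const float AngleDeg = FMath::RadiansToDegrees(
        FMath::Acos(FVector::DotProduct(IndexDir, ThumbDir)));
    return AngleDeg < 20.f; // tune the threshold to the gesture
}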
    Last edited by getnamo; 12-18-2017, 08:38 AM.



  • replied
    Hello, Community

I am pretty new to Leap Motion, so I'm asking for help, kinda desperate here. I am trying to attach a BP_Actor to LeapRiggedEchoHands, which is a child actor of the LeapFloatingHandsCharacter, on overlap, but I can't make it work.

I also tried almost the same logic inside FloatingHandsCharacter, but with no success. "TestGrab" is the cube BP which I'm trying to pick up.

It seems only the AttachToActor part of the blueprints doesn't work, because when I put my hand in the cube it generates overlap events and everything else looks fine.

Is my logic right?

I would be so thankful if someone could explain to me how to make this work!

    Regards!
    Last edited by MosPetrov; 12-14-2017, 09:13 AM.



  • replied
    Originally posted by getnamo View Post

    Do you have blueprint graphs of what you're trying to do? I believe the old plugin has correct directional vectors so it may depend on how you're trying to achieve it.
OK, I found the problem. I can get one finger at a time and store it in a variable, but as soon as I get a new finger, the other variables are overridden. At least that's what it looks like to me.
I don't know if I am doing something wrong or if it's not possible to access more than one finger at a time.



  • replied
    Originally posted by getnamo View Post

    Do you have blueprint graphs of what you're trying to do? I believe the old plugin has correct directional vectors so it may depend on how you're trying to achieve it.
The first Blueprint is our gesture; with the other one I tracked the x, y and z values of the direction vectors of the index finger and thumb. Both vectors are nearly the same all the time, no matter in which direction the fingers are pointing. That also happens if I use any other finger.



  • replied
Thanks, the hand tracking precision and stability are really great.
Do you know if it will also work when the sensor needs to track a hand partially behind a real object?
I'm more interested in AR than VR applications.



  • replied
    Originally posted by littlewildwolf View Post
    Hey,
    we're using the plugin to work with custom gestures in our project.
    For one of the gestures I want to use the direction of the fingers, but somehow every finger has the same direction as the index finger.
    Do you know what the problem could be?
    Thanks in advance.
    Do you have blueprint graphs of what you're trying to do? I believe the old plugin has correct directional vectors so it may depend on how you're trying to achieve it.

    Originally posted by davide445 View Post
Hi, I'm evaluating the use of Leap Motion for interactive product visualization, and wanted to ask how Leap Motion behaves with real objects (so as to create an AR app) where the object itself will occlude part of the hand. Is it able to maintain the hand tracking, or does the sensor need a free view of every finger at all times?
I'd recommend checking out the Blocks demo on Leap Motion's website. It's a good way to check the sensor quality in VR.
    Link: https://gallery.leapmotion.com/blocks/

While the demo is built in Unity, you'll have similar tracking quality in the new Unreal plugin.



  • replied
Hi, I'm evaluating the use of Leap Motion for interactive product visualization, and wanted to ask how Leap Motion behaves with real objects (so as to create an AR app) where the object itself will occlude part of the hand. Is it able to maintain the hand tracking, or does the sensor need a free view of every finger at all times?



  • replied
    Hey,
    we're using the plugin to work with custom gestures in our project.
    For one of the gestures I want to use the direction of the fingers, but somehow every finger has the same direction as the index finger.
    Do you know what the problem could be?
    Thanks in advance.



  • replied
    Originally posted by BOBtheROSS View Post

I was building my own swipe gesture, because the built-in one was giving strength values only if the hand was moving within a certain velocity range. Too fast or too slow wouldn't produce any gestures from the Leap frame reference, and the interface event 'swipe gesture detected' wouldn't execute or would freeze the values.

Is there any way to adjust the threshold of a built-in gesture?
There's nothing built in that provides that flexibility; it looks like you'll have to go custom. In general it is recommended nowadays to use direct physical interaction with things rather than general gestures, but if you wish to use general gestures, swiping is pretty easy to do manually.

I recommend taking the fingertip position, storing it each tick, and calculating the velocity manually as Velocity = Distance / DeltaTime. Then, if the velocity vector breaches a certain Vector.Size() threshold while being within e.g. <30 degrees of your desired swipe direction normal, consider it a swipe. To get the angle between two vectors you can use the dot product between your velocity normal and the direction you're interested in, divided by their magnitudes, i.e.

Angle = ACOS((Vector1 dot Vector2) / (Vector1.Size() * Vector2.Size()))
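Putting that together, a rough C++ sketch of the manual swipe check (names like PreviousTip and SwipeDirection are placeholders, not plugin API):

Code:
#include "CoreMinimal.h"

// Call once per tick with the current fingertip position.
// PreviousTip is cached between ticks; SwipeDirection is the desired swipe normal.
static bool IsSwipe(const FVector& CurrentTip, FVector& PreviousTip,
                    const FVector& SwipeDirection, float DeltaTime)
{
    const FVector Velocity = (CurrentTip - PreviousTip) / DeltaTime; // Distance / DeltaTime
    PreviousTip = CurrentTip;

    const float SpeedThreshold = 100.f; // cm/s, tune to taste
    if (Velocity.Size() < SpeedThreshold)
    {
        return false;
    }

    // Angle between the motion and the desired swipe direction.
    const float AngleDeg = FMath::RadiansToDegrees(FMath::Acos(
        FVector::DotProduct(Velocity.GetSafeNormal(), SwipeDirection.GetSafeNormal())));
    return AngleDeg < 30.f;
}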



  • replied
    Originally posted by getnamo View Post
    [...] Out of curiosity what use case are you using this particular function for?
I was building my own swipe gesture, because the built-in one was giving strength values only if the hand was moving within a certain velocity range. Too fast or too slow wouldn't produce any gestures from the Leap frame reference, and the interface event 'swipe gesture detected' wouldn't execute or would freeze the values.

Is there any way to adjust the threshold of a built-in gesture?

