[Plugin] Leap Motion - Event Driven

How do I detect whether my palm is up or down, and whether it is the left or right hand?
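For the title question: the hand data the plugin surfaces carries a palm normal, and the Leap SDK exposes handedness directly (`Hand::isLeft()`; I believe the plugin's `ULeapHand` surfaces an equivalent). Palm up/down is then just a dot product between the palm normal and world up. A minimal plain-C++ sketch of the check (`Vec3`, `IsPalmUp` and the 0.7 threshold are illustrative choices, not plugin API):

```cpp
struct Vec3 { float x, y, z; };

float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Palm is "up" when its normal points roughly along world up (UE is Z-up).
// A threshold of ~0.7 means within roughly 45 degrees of straight up/down.
bool IsPalmUp(Vec3 palmNormal, Vec3 worldUp = {0.f, 0.f, 1.f}) {
    return Dot(palmNormal, worldUp) > 0.7f;
}

bool IsPalmDown(Vec3 palmNormal, Vec3 worldUp = {0.f, 0.f, 1.f}) {
    return Dot(palmNormal, worldUp) < -0.7f;
}
```

Tune the threshold to taste; a smaller value makes the up/down detection more forgiving.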

Hi everyone.

First, thank you for this amazing plugin. I've tried it out and it works very well out of the box! This will open up new possibilities with VR and I can't wait to see future projects using it.

I'm currently working on a networked simulation and I'm trying to figure out how to replicate hands over the network.
I think the plugin is currently not set up to do so (correct me if I'm wrong), so I started to modify the plugin blueprints a bit.

I've managed to replicate a **LeapRiggedEchoHandsActor** and its sub-components/actors, but I'm stuck at animating it. For now, the Leap Motion client implements the HandMoved event and calls a server RPC. The issue is that on the server side the LeapHand reference is not valid, so I guess it is not replicated. To fix that I would have to start modifying the C++ plugin code, and this may be a bit harder than expected.

So before I dive into that, I would like to know if someone has already managed to replicate the hands over the network.

Best regards.

Great investigation, and I'm looking forward to whatever solution you come up with. Make sure to make a pull request when you get it working!

On a side note, has anyone else had this issue in packaged games: https://.com//leap-ue4/issues/12 ? If so, which version of the plugin/editor were you using when you saw it?

Hi everyone! This plugin is great and I'm learning to use it :slight_smile:

I'm using the plugin to read all the hand data on the first client and send it to a custom Node.js server. On the second client I'm trying to position the hand and its fingers according to the info received from the server, and I can't figure out how to accomplish this last part.

In LeapBasicRiggedCharacter's anim BP (BasicCharacter_AnimBlueprint), all the finger orientations are set in the SetLocalVariables function (e.g. Left Middle 1Orientation). As far as I know, the Leap API gives you the positions of the finger bones, so I can't understand why orientations are used in the anim BP. Can you explain this to me?

Also, should I apply the received values directly to variables like 'Left Middle 1Orientation', or can I simply use "Transform Bone" functions?
I also see that there is an intermediate step in your plugin, the AnimBody object; should I set the received values on it?

Sorry for the many questions, but I'm going crazy over this.
Thank you for any help you or anyone else can give me.

AnimBody is used to create a separation between Leap-specific input and the animation of the character mesh. But since you want to use the Leap Motion data directly, you may find it easier to use the custom BP approach (attaching a Leap controller to any blueprint of your choice and adding an interface to that class):
https://.com//leap-ue4#how-to-use-itā€”blueprint-without-convenience-content-quick-setup

and then just push the data you get from the hand moved event:
https://.com//leap-ue4#how-to-use-itā€”blueprintā€”event-driven

which is much closer to the raw data you get from leap (albeit converted to UE4 space and units).

Hope that helps!

Thank you for the answer!
I'm already using the hand moved event to get the data that I send from the client to the server. My problem is that when another client receives the hand data of the first client, I can't find a way to move the hands of the mesh representing the other player (my application is a sort of virtual room where two people can interact using their hands, moving them with the Leap).

On the second client I correctly receive the data about the other person's hands, but I can't move the mesh's hands correctly. I saw that you used some orientation variables to move the hands; can I instead use the elbow, wrist and finger component positions (from the Leap hand moved event on the first client) to move the elbow, wrist and finger components on the second client? When I tried this I got strange results, like deformed fingers.

What is the correct way of binding Leap data received from a server (so I can't use the hands moved event) to a BasicRiggedLeapCharacter?

Your help is much appreciated.
Thank you!

The animation setup aims for a close to 1:1 representation of your hand/finger positions. The way you do this with a skeletal mesh is to IK to the elbow/wrist position, which the Leap Motion estimates, and then use the orientations for everything else, which puts the fingers in the correct pose without distortion. The fingers might not match your real finger lengths, but the relative pose of the fingers in the mesh will line up correctly for nearly any type of skeletal hand mesh.
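To illustrate why orientations/directions avoid the distortion that raw joint positions cause: if you rebuild each finger chain using the tracked joint directions but the mesh's own bone lengths, the mesh's proportions are preserved no matter how the user's real finger lengths differ. A rough plain-C++ sketch of that idea (all names are illustrative, not plugin code):

```cpp
#include <cmath>
#include <vector>

struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
    float Length() const { return std::sqrt(x * x + y * y + z * z); }
};

// Rebuild a finger chain: start from the tracked knuckle position, then
// advance along the *tracked directions* using the *mesh's* bone lengths.
// The mesh's finger proportions are preserved, so nothing distorts even
// if the user's real finger lengths differ from the mesh.
std::vector<Vec3> RetargetChain(const std::vector<Vec3>& trackedJoints,
                                const std::vector<float>& meshBoneLengths) {
    std::vector<Vec3> out{trackedJoints.front()};
    for (size_t i = 0; i + 1 < trackedJoints.size(); ++i) {
        Vec3 dir = trackedJoints[i + 1] - trackedJoints[i];
        float len = dir.Length();
        if (len > 0.f) dir = dir * (1.f / len); // normalize the tracked direction
        out.push_back(out.back() + dir * meshBoneLengths[i]);
    }
    return out;
}
```

Feeding positions straight in would stretch or squash the mesh bones to the user's joint spacing; this keeps bone lengths fixed while matching the tracked pose.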

If you wish to use this animation setup, then the correct way of updating the position from an external source is to update the AnimBody which the animation blueprint uses.

AnimHand has a convenience function written in C++ that grabs the information directly from a Leap hand (https://.com//leap-ue4/blob/master/Plugins/LeapMotion/Source/LeapMotion/Public/AnimBody/AnimHand.h#L128), and this is called in blueprint (in the Leap connector, I believe). What you need to do is set the AnimBody data similarly to https://.com//leap-ue4/blob/master/Plugins/LeapMotion/Source/LeapMotion/Private/AnimBody/AnimHand.cpp#L91 (this can be done in blueprint); all the setter functions are publicly accessible. Another way would be to serialize the LeapHand data and pipe that over the network, but that might be a bit harder to do.
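For the "serialize the LeapHand data and pipe it over the network" route, the gist is to copy the values you need into a plain struct and ship its bytes. A minimal plain-C++ sketch (the struct layout and names are assumptions for illustration, not the plugin's types):

```cpp
#include <cstring>
#include <vector>

// Minimal hand snapshot for replication -- field names are illustrative.
// Quaternions stored as x,y,z,w floats.
struct HandSnapshot {
    float WristPos[3];
    float ElbowPos[3];
    float BoneOrientations[5 * 3][4]; // 5 fingers x 3 bones, quaternions
    bool bIsLeft;
};

// Serialize to a byte buffer. Fine for a POD struct on one platform;
// a real implementation should handle endianness and versioning.
std::vector<unsigned char> Serialize(const HandSnapshot& s) {
    std::vector<unsigned char> buf(sizeof(HandSnapshot));
    std::memcpy(buf.data(), &s, sizeof(HandSnapshot));
    return buf;
}

bool Deserialize(const std::vector<unsigned char>& buf, HandSnapshot& out) {
    if (buf.size() != sizeof(HandSnapshot)) return false; // reject malformed packets
    std::memcpy(&out, buf.data(), sizeof(HandSnapshot));
    return true;
}
```

In UE you would more likely use an FArchive or a replicated USTRUCT, but the round-trip idea is the same: snapshot on the sending client, reconstruct on the receiving one, then feed the values into the AnimBody setters.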

Optionally, set up your own skeletal mesh animation rig which uses raw data you pipe in a different way.

Finally, a last option is to fix the blueprints to work correctly in UE's networking system through replication, which is something [MENTION=425607][/MENTION] seems to be working through.

Hi - thanks for an amazing plugin!

I just wanted to mention here that I'm also trying to get network-replicated Leap Motion hands working. It seems myself and several others are in the same boat. I've created an AnswerHub question and an issue on the page for the Leap plugin.
[MENTION=425607][/MENTION], , - have any of you made any progress on this problem?

Thanks,

Hi 6ead2ebf, I tried to serialize the AnimBody object as JSON and send it to another client, but I didn't succeed.
So I simply sent the shoulder, elbow, wrist and all finger orientations, taken directly from the LeapRiggedCharacter, over the network as JSON and applied them to the bones of the mesh on the other client.

I just needed the mesh on the second client to follow the movements of the one on the first client (the one with the Leap).
I know this is neither the correct nor an elegant solution, but it works.

Greetings all.

I'm working on a Leap Motion project where as much game logic as possible is being kept in C++. The documentation implies that the same setup logic used in blueprint should be applicable in C++, but I've hit several stumbling blocks trying to do so. I'd just like to check that I'm not missing anything obvious in my approach.

First off, when using the in-engine plugin included with 4.12.5, trying to access any of the functions of the classes in the AnimBody folder results in LNK2019 errors. After some digging, I realized this was because those classes were all missing the LEAPMOTION_API specifier. Adding the specifier back in resolved the LNK2019 errors, but I've had problems instantiating the classes in C++. I've tried CreateDefaultSubobject and NewObject to make AnimHand objects, but the internal AnimFingers/AnimBones always end up null. Is this set of classes just not set up to work from C++? Or am I doing something wrong?

Also, I wanted to make use of the LeapEventInterface, but its functions are all declared as BlueprintImplementableEvents. This means I can't directly inherit the class and provide a C++ implementation, correct? Or am I missing something?

Hey there, thanks very much for the hard work you've put into this plugin. Echoing what some others have posted, I have had similar collision issues following your Jenga tutorial with the FloatingHandsCharacter. I can't seem to detect on-hit events in the pickup mesh's blueprint, where we check if it implements the interface. I made this work with a block component in the FloatingHandsCharacter BP (I made a "stick" and got a pickup cube to attach to the "stick" by toggling grab with a key press); this is also for a university project. I've tried setting the collision of the meshes (LeftHandMesh and RightHandMesh) in LeapRiggedEchoHandsActor to PhysicsActor, but that doesn't work: the hands are grabbing, but they are not being detected as touching the cube. If anyone has a solution to this it would be greatly appreciated! :smiley:

Please help me.
A few days of suffering…

I'm trying to attach an object to a socket.

Please check your question for the answer; I've added a reply.

This is a good catch; they should be BlueprintNativeEvent instead, which will allow your class implementing the interface to respond to the events in C++, e.g. in this format:


virtual void LeapRightHandMoved_Implementation(ULeapHand* Hand) override;

for each event.

I've added this as an issue for now, with a branch; I'll test this on the next plugin pass.

Check your character settings; they may be overriding the ability of your meshes to collide. Also keep in mind that the Jenga video is just one example of how to get something like that working and should not be considered the definitive way.

I've looked into this and the transition to body state in the next dev release should enable replication correctly. Stay tuned :slight_smile:

Thanks for the reply! I've figured out a workaround! I instead placed the blueprints inside LeapRiggedEchoHands, the child actor for LeapHands IIRC (I'll check this tomorrow to confirm, but I thought I'd write this up quickly for now). Once I did that, I got it working. So for anyone who has the same problem: place it in the child actor and cast your grab variable from your character to that BP to check if grabbing. I'll post a screenshot of my BP later. Apologies if I've explained this badly. :smiley:

EDIT: So just to clarify, instead of placing your BP functions and nodes (such as pickupIfEmpty) inside BP_FloatingHands, you move them to the LeapRiggedEchoHands actor BP. I know there are better ways to do these things, but I didn't have a lot of time :stuck_out_tongue:

Thanks for the reply! I'd actually poked around the plugin code myself and came to a similar conclusion.

One thing to note is that simply making all the functions BlueprintNativeEvents means that any C++ class that extends ILeapEventInterface must explicitly provide definitions for all the BlueprintNativeEvents in the interface. For me this means seventeen empty stub functions for the five or so events I actually care about. Could be worse, but still pretty messy to deal with. Trying to put stub definitions inside the interface doesn't work, as it results in a compiler error about being unable to instantiate abstract classes. I assume this is Unreal trying to enforce declaration-only interfaces.

I was thinking about other solutions, but I couldn't think of any that felt strictly 'better'. There's splitting up the LeapEventInterface into several smaller interfaces (hand, finger, gesture, passthrough?), but that would require some significant changes in LeapController, and it could be disruptive for existing projects. On the plus side, it could result in fewer extraneous event calls, depending on how it's implemented. Another possibility would be to use a regular C++ virtual function for the native implementation, since that can have an overridable implementation in the interface, and then wrap it and the BlueprintImplementableEvent in a single function:

.h


UFUNCTION(BlueprintNativeEvent, Category = "Leap Interface Event")
	void OnLeapHandMoved(ULeapHand* hand);
	virtual void OnLeapHandMoved_Native(ULeapHand* hand);
	void LeapHandMoved_Exec(ULeapHand* hand);

.cpp


// UMyClass stands in for whichever class implements the interface
void UMyClass::OnLeapHandMoved_Native(ULeapHand* hand) {
	// stub function so implementation is optional
}

void UMyClass::LeapHandMoved_Exec(ULeapHand* hand) {
	OnLeapHandMoved_Native(hand);
	ILeapEventInterface::Execute_OnLeapHandMoved(this, hand); // not sure if this would work, haven't tested
}

But then that removes the possibility of doing things like defining an event in C++ and then overriding it in a blueprint child.

I had the plugin working, but now with a new project in 4.11.2 I am getting this error:

"Missing or incompatible modules in LeapMotion plugin"

Any ideas what I am doing wrong?

EDIT - Never mind, I guess the plugin got built into the engine? Or maybe I put it there. Not sure.

Has anyone had issues in 4.13? It seems to be working fine…?

I think stubs are the correct way to go for now; it might be a bit annoying, but it's harmless. It may not be very clean, but being explicit is very C++ :stuck_out_tongue:

Yep, since 4.11 the community plugin is the official one and is now included in each engine release.

I'm looking for information on how to remap the coordinate system of the Orion hands.
Specifically, I'm looking for where I can control/modify the position relative to screen space.

Thank you.

The plugin automatically remaps the coordinate system from Leap space to UE space.

That said, if you use the convenience hands you can modify the coordinates by simply parenting a scene node; if you use the custom blueprint or C++ approach, you can modify the output you receive from the plugin by multiplying it by a transform. Optionally, you can modify this file: https://.com//leap-ue4/blob/master/Plugins/LeapMotion/Source/LeapMotion/Private/LeapInterfaceUtility.cpp to change the coordinate system transform globally for all output from the plugin.