Jenga Physics in VR - Full Tutorial Project Included

Hey VR developers! I just released a three-part tutorial series on how to use my unofficial Leap Motion plugin, with a focus on VR. If you've ever wondered how to do eye-cast raytraces or wanted an example of how to pick up and drop virtual objects, this series answers those questions.

Check out the latest Leap Motion blog entry for a rundown of plugin features, or look below for video links and a quick summary of what each part covers.

The three tutorial videos cover how to implement basic Jenga, Jenga with pick-up/drop capabilities, and finally telekinesis powers.

Part 1 - Setup and Collision

What you’ll learn from the first part:

- Drag-and-drop installation
- Selecting a Leap character and hitting Play to see interactive hands
- Building Jenga stacks and pushing them around with the Leap Collision Character


Full instructions are available in the video below.

Part 2 - Picking up and dropping objects
Dig into the second part and learn to extend functionality with input mapping in order to pick up and drop blocks.

Part 3 - Raytracing and Telekinesis
Learn how to do raytracing in UE4.

Enable your telekinetic powers to pick up blocks from the ground and place them back on top of the stack!

Resources
When you’ve finished you can compare your results to the final project zip, or use it as a reference!

If you just want to try the demo, grab the executable instead.

If you want to see an example of Leap-only locomotion, try the Mage Arena game.

For the latest documentation, please see the GitHub repo.

Ask questions at the main plugin thread.

Let me know if it has been useful; I'm always looking for any feedback you may have!

Thank you!

I would love to know how you got the Leap to be so smooth. I'm looking forward to really playing with this, but the Leap for me is so unreliable it's hardly even usable: fingers flip inside out like crazy, and hands come in and out of view, so each time I try this, blocks fly everywhere :slight_smile: I've been messing with it off and on for months trying to get this working correctly, but I never get close to how smooth it is in your videos. Nice work with what I can see, though!

Does this work with the official plugin?

You’re welcome!

A lot of the concepts covered here are agnostic to the input solution, so you could definitely implement the same setup. That said, certain functions are currently only exposed in the unofficial plugin (grabbing, circle gestures, etc.), and you would need to dig into C++ to expose those in the official plugin (largely recreating what this plugin does); eye raycasting and the physics logic downstream would be the same, however.

There are a number of things you can do to improve your tracking:

  1. Don't have extra IR sources, such as strong sunlight, directly in your FOV.
  2. If you have furniture really close to you, move away a bit and re-center your view.
  3. If you're seeing a low framerate on the hands, you may be using a USB socket that is contended; use a different socket that isn't connected in a chain so that you have the full bandwidth.
  4. Remember that the Leap Motion detects best when it views your hand flat-on and guesses more when you have a closed fist, etc. You can see this clearly in the videos when I grab blocks (video 2): the hand orientation has much more jitter than when the fingers are spread out.
  5. If it's guessing your hand incorrectly, move it out of view and bring it back. This will usually make it guess correctly.
  6. Looking down with furniture nearby is usually a recipe for bad tracking.

The Leap has also improved markedly since their 2.0 release; the state tracking is much more stable than it used to be. You can now, for example, easily tap your wrist, make a thumb gesture, and flip it from top to bottom without it losing tracking.

I implemented most of the blueprints (parts 1 and 2) in my project. I replaced the Jenga blocks with a conductor's baton, to drive an orchestra. Right now I can pause/unpause the music by pressing a key, but I would like to track the baton's velocity (once it has been grabbed) to replace the key. That way I could set a velocity threshold for the baton that drives the pause/unpause function. I tried for several hours, but it seems I'm not able to track the object's velocity. Any help?

I was wondering how you could have the Leap trigger a trigger box.

You can query an object's position and store it, then compare how much that position moved in one frame (v = d/t), and you will have a velocity vector to determine what you're looking for.

If you're using the collision character, you can just use OnBeginOverlap on a trigger volume and cast the other actor to, e.g., a LeapBasicRiggedCharacter.


I was wondering what else I would need to add to get the trigger volume to recognize the Leap collision actor.

Try the "simulation generates overlap" checkbox on the character's static mesh (I believe I do this in the tutorial videos). You can also attach custom trigger boxes as child components to your mesh and parent them to a bone, which may give you finer control over what triggers things.

That solved the problem, thank you very much.

Thanks for this. Does anyone know if it’s 4.9 compatible?

The method outlined in the video is the same for 4.9. The final project should also be compatible: just download the newer plugin, drag it over the old one, and switch engine versions to 4.9.

Got it working. Thank you once again!

The project file on Mega is gone, please re-upload.

May not be relevant anymore, but link has been updated.

for reference: MEGA

Hey, great tutorial. Thank you.

Although I am stuck with a modification I am trying to get working. I am running 4.13.2 with a Vive and trying to get floating hands to work instead of the rigged character. I have run through your steps, most things work, and I think I have found where things break down.

  1. I think the "Does Implement Interface" function is not working, and I cannot get past the Branch in the Jengablock Blueprint. I think I read that it may have something to do with the hands being a child or something. I have tried "Cast To LeapHandsActorBase" and plugging the "As Leap Hands Actor Base" output into the "Does Implement Interface" function. Hopefully someone has some input.

  2. I am also unclear on the Pickup Character custom event/function. Is this just to test the object type against the target pickup object?

  3. I think my next problem will be with the "Floating Hands Character" blueprint. I am concerned that my parent and socket names are incorrect for the floating hands. I am using LeapHands for both, but have tried some of the other meshes for the socket; any advice would be appreciated…

Thanks a ton for the work. I have learned a ton and may quit my mechanical engineering job and start making games… maybe…

I played around with Leap Motion and this UE4 plugin one day, and I'm sorry, but it really sucked. Not even the simplest gestures were detected reliably (about a 1-out-of-10 hit rate, and then it quickly fired dozens of events at once). But that's not a problem with the plugin (which looked well-engineered to me); the sensor just sucks. It's far too unstable, even under standard office lighting conditions. I am surprised that it worked in these videos. Leap Motion in its current form is basically the Kinect of hand recognition.