You’re welcome!
A lot of the concepts covered here are agnostic to the input solution, so you could definitely implement the same setup. That said, certain functions are currently only exposed in the unofficial plugin (grabbing, circle gestures, etc.), and you would need to dig into C++ to expose those in the official plugin (largely recreating what this plugin does); eye raycasting and the physics logic downstream would be the same, however.
There are a number of things you can do to improve your tracking:
- Don’t have extra IR sources, such as strong sunlight, directly in your FOV.
- If you have furniture really close to you, move away a bit and re-center your view.
- If you’re seeing a low frame rate on the hands, you may be using a USB port with contended bandwidth; use a different port that isn’t sharing a hub chain so that you have the full bandwidth.
- Remember that the Leap Motion detects best when it views your hand flat-on and guesses more when you have a closed fist, etc. You can see this clearly in the videos: when I grab blocks (video 2), the hand orientation has much more jitter than when the fingers are spread out.
- If it’s guessing your hand pose incorrectly, move the hand out of view and bring it back. This will usually make it re-detect correctly.
- Looking down with furniture nearby is usually a recipe for bad tracking.
The Leap has also improved quite markedly since even their 2.0 release; its state tracking is much more stable than it used to be. You can now, for example, easily tap your wrist, make a thumb gesture, and flip your hand from top to bottom without it losing tracking.