I figured I’d make a Gesture Tracker VR support thread so I can have a place for feedback, public support and feature requests.
Does this work with Oculus Touch, Leap Motion, Perception Neuron, etc.?
Gesture recognition depends only on the transform of the GestureTracker component itself and does not rely on any device-specific functionality or API, so it will work with any input device. As long as you attach it to some moving component that you want to track, it will work.
Is C++ supported?
C++ is supported but discouraged. The component was designed to be used in blueprints and is not as easy to set up in C++. To use the plugin in code, add "GestureTrackerVR" to your project's public/private module names and then include "GestureTracker.h". I'm willing to help anyone using C++ or BPs, but I highly encourage you to use BPs if possible. Make sure to keep your GestureTracker's parameters within the range of values they are clamped to in blueprints. Values outside the clamp range are not guaranteed to work and may cause a crash (e.g. Gesture Resolution <= 0).
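For reference, here's roughly what the module dependency looks like in a project's Build.cs file. "MyProject" and the other module names are just the standard UE template; the only line specific to this plugin is the "GestureTrackerVR" entry (note the constructor signature differs slightly on engine versions before 4.16):

```csharp
// MyProject.Build.cs - standard UE project build rules.
using UnrealBuildTool;

public class MyProject : ModuleRules
{
    public MyProject(ReadOnlyTargetRules Target) : base(Target)
    {
        PublicDependencyModuleNames.AddRange(new string[] {
            "Core", "CoreUObject", "Engine", "InputCore",
            "GestureTrackerVR" // the plugin module named above
        });
    }
}
```

After that, `#include "GestureTracker.h"` in any source file that uses the component.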
Are Mac, Android, iOS, etc. supported?
These platforms are not officially supported since I have not tested on them, but the component should work fine, and users have successfully used the plugin on all of them. If you're building for one of these platforms, check GestureTrackerVR.uplugin in the plugin's folder for the WhitelistPlatforms list at the bottom, and add your platform string (e.g. "Mac" or "IOS") if it is not already present.
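As an illustration, the module entry in GestureTrackerVR.uplugin might look something like this after adding Mac and iOS (the exact fields and platform strings in your copy of the file may differ; WhitelistPlatforms is the part that matters):

```json
"Modules": [
    {
        "Name": "GestureTrackerVR",
        "Type": "Runtime",
        "LoadingPhase": "Default",
        "WhitelistPlatforms": [ "Win64", "Win32", "Mac", "IOS", "Android" ]
    }
]
```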
How do I save/load gestures? Can I package default gestures with my game?
Gestures can be saved and loaded using the GestureTracker component's Save and Load functions. Gestures are stored in the Content folder, so the arguments to these functions should be paths relative to it. Gesture save files are just binary data files with no extension, but you can give them a stand-in extension if you wish. To ship a set of default gestures with your game, package the gesture save file with your game and load it when the game starts. To package the save file you'll need to add the directory containing it to 'Project Settings > Packaging > Additional Non-Asset Directories to Package'. For example, save all your gestures to a "Gestures" folder in the root of your Content folder and then add that Gestures folder to the list of Additional Non-Asset Directories to Package.
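Putting that together, a typical layout looks like this (the file name "DefaultGestures" is just an example):

```
Content/
  Gestures/
    DefaultGestures     <- gesture save file (no extension required)
```

With this layout you'd pass "Gestures/DefaultGestures" to Save and Load, and add "Gestures" to Additional Non-Asset Directories to Package so the file ships with the game.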
Can I perform multiple gestures at the same time?
Yes. You can add as many GestureTracker components as you'd like and they will all perform tracking and recognition independently. For two-handed combined gestures, just check that the GestureTracker on each hand recognized the desired gesture within a certain time frame of the other.
Does this have to be used with VR?
The system is designed with VR in mind but users have had success using it to track mouse and finger gestures (on mobile). Feel free to ask me about your use case if you’re unsure if this is right for you.
1.1 - 11/3/16 (4.13)
Added a baseMaterial input to the draw functions. This is an optional replacement for the draw mesh's default material that lets you easily apply dynamic materials to the gesture draw.
Implemented the GestureTracker Update Track/Predicted Draw Materials functions, as well as the Update Materials function for GestureMesh objects.
Fixed Draw by Id and Draw by Name functions.
Other minor fixes and comment improvements.
1.2 - 11/18/16 (4.14)
Made GestureTracker.h public for C++ access. I still recommend using blueprints.
Added GetNumGestures and GetIds functions to get the number of recorded gestures and an array of their ids respectively.
Added UpdateTrackIds and UpdateDrawIds functions to let you change the blacklist/whitelist of drawing/tracking in the middle of recognition.
Added RecognitionDuration float output to recognition functions and events which gives the time it took the user to perform the gesture in seconds.
1.3 - 2/22/2017 (4.14/4.15)
- Fixed an issue with tracked draw lines drawing incorrectly while moving and tracking by the world location.
- Fixed gesture disqualification tolerance so it's the same for all resolutions (previously a smaller resolution permitted a smaller off-gesture distance before disqualification).
- Added resetOnRecognized argument to continuous recognition. This lets you determine whether recognition for a gesture should restart as soon as it has been completed, or if it will only restart once the gesture has stopped being performed (e.g. If true, a single downward swipe gesture could be recognized multiple times by doing one long downward swipe. If false, you must stop performing the downward swipe and then start it again for it to be recognized a second time).
- Added GetGesturePath(int id) and GesturePredictedGesturePath() functions to get a TArray<FVector> describing the path of the gesture with the given id and the currently predicted gesture respectively.
- Removed unused EGestureTrackerMode enum.
- Minor naming and organization improvements.
1.4 - 6/2/2017 (4.16)
Added BodyLocationComponent, a reference which is null by default. If set to point at your VR "body" (typically the camera object), the vector between this component and the Tracker will be used to determine the casting rotation. This resolves issues where the gesture rotation is incorrectly interpreted (e.g. if the player is pointing the motion controller forward but with a 90 degree twist in the wrist), but may also be slightly less precise in determining the gesture rotation. Leaving this null leaves rotation tracking working the same as it has in the past. This may or may not give better recognition depending on what kind of gestures/recognition are being used.
Added trackFailDistance and continuousFailDistance parameters to determine how far off a given gesture’s path you must go to fail recognition of that gesture.
Added predictionPercentage parameter that determines what percent a gesture must be completed before it can be displayed as the predicted gesture.
Added rotationOffset parameter, which adds degrees to your gesturing rotation in case the GestureTracker is offset from the player’s actual aiming direction.
Added GetPredictedGestureId() and GetPercentageComplete(int id) functions.
Continuous recognition should now be more reliable. Expect more improvements in the future too.
All parameters are now readable/writable in blueprints. NOTE: Please make sure you stick to parameter values inside the ranges of the sliders in the details panel. Values outside these ranges (e.g. gesture resolution <= 0) are unsupported and could cause crashes or other issues.
Minor readability, naming, comments, and organizational improvements.
1.5 - 7/6/2017 (4.16)
Added a LengthRatio output to the FinishRecognition and GestureRecognized events, representing the ratio of the length of the tracked gesture to the length of the recognized gesture. So if the tracked gesture was twice as long as the recorded gesture it matched, LengthRatio will be 2.
Added NormalizePitchRotation to tracking params. Normally gestures are only normalized by yaw rotation. Enabling this will also normalize them by pitch so they rotate with you as you look up and down. This means if you define, for example, a forward punch gesture it will also be recognized if you punch up or down. Or if you draw a triangle in front of you it will also be recognized if you draw a downward facing triangle above you.
Also added UseBodyLocationForPitch: if true, the BodyLocationComponent is used for pitch normalization (when pitch normalization is enabled and the BodyLocationComponent has been set); otherwise the gesture tracker rotation is used as normal. This should usually be false since, if you're using your VR camera as the BodyLocationComponent, there will be a negative bias in the pitch (your hands rest about 40 degrees below the camera).
Moved GestureMeshComponent to the public folder to resolve an issue with blueprint nativization.
1.5.1 - 8/2/2017 (4.16)
- Added missing include whose absence could cause compile errors for shipping builds.
1.6 - 8/9/2017 (4.17)
- Updated to UE 4.17; no other meaningful changes.