Gesture Tracker VR Help and Feature Requests

I figured I’d make a Gesture Tracker VR support thread so I can have a place for feedback, public support and feature requests.

FAQ:

Does this work with Oculus Touch, Leap Motion, Perception Neuron, etc.?

Gesture recognition depends only on the transform of the GestureTracker component itself and does not depend on any device-specific functionality or API, so it will work with any input device. As long as you attach it to a moving component that you want to track, it will work.

Is C++ supported?

C++ is supported but discouraged. The component was designed to be used in blueprints and is not as easily set up in C++. To use the plugin in code, add “GestureTrackerVR” to your project’s public/private module names and then include “GestureTracker.h”. I’m willing to provide help to anyone using C++ or BPs, but I highly encourage you to use BPs if possible. Make sure to keep your GestureTracker’s parameters within the range of values they are clamped to in blueprints. Values outside of the clamp range are not guaranteed to work and may cause a crash (e.g. Gesture Resolution <= 0).
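For anyone wiring this up in code, here is a rough sketch of what attaching a tracker to a motion controller might look like. Treat it as an illustration only: it assumes “GestureTrackerVR” (and “HeadMountedDisplay” for the motion controller) are already in your game module’s Build.cs dependencies. Only UGestureTracker and “GestureTracker.h” come from the plugin; the pawn, file, and property names are made up for the example.

// Hedged sketch -- assumes "GestureTrackerVR" and "HeadMountedDisplay" are in
// your game module's Build.cs dependencies. UGestureTracker and
// "GestureTracker.h" come from the plugin; everything else is hypothetical.
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"
#include "GestureTracker.h"
#include "GesturePawn.generated.h"

UCLASS()
class AGesturePawn : public APawn
{
    GENERATED_BODY()

public:
    AGesturePawn()
    {
        Root = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
        RootComponent = Root;

        // The motion controller is the moving component whose transform we want to track.
        RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
        RightController->SetupAttachment(Root);

        // Attach the tracker to the controller so it follows that transform.
        RightTracker = CreateDefaultSubobject<UGestureTracker>(TEXT("RightGestureTracker"));
        RightTracker->SetupAttachment(RightController);
    }

    UPROPERTY(VisibleAnywhere)
    USceneComponent* Root;

    UPROPERTY(VisibleAnywhere)
    UMotionControllerComponent* RightController;

    UPROPERTY(VisibleAnywhere)
    UGestureTracker* RightTracker;
};

The same idea applies in blueprints: the tracker only cares about the transform of whatever you attach it to.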

Are Mac, Android, iOS, etc. supported?

These platforms are not officially supported since I have not tested for them, but the component should work fine and users have successfully used the plugin on all of them. If you’re building for one of these platforms, check GestureTrackerVR.uplugin in the plugin’s folder for the WhitelistPlatforms list at the bottom and add your platform string (e.g. “Mac” or “IOS”) if it is not already present.
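For reference, the edited module entry in GestureTrackerVR.uplugin ends up looking roughly like the snippet below. The surrounding field values are just typical descriptor values shown for context; the only thing you should need to touch in your actual file is the WhitelistPlatforms array.

"Modules": [
    {
        "Name": "GestureTrackerVR",
        "Type": "Runtime",
        "LoadingPhase": "Default",
        "WhitelistPlatforms": [ "Win64", "Win32", "Mac", "Android", "IOS" ]
    }
]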

How do I save/load gestures? Can I package default gestures with my game?

Gestures can be saved and loaded using the GestureTracker component’s Save and Load functions. Gestures are stored in the Content folder, so the arguments to these functions should be paths relative to the Content folder. Gesture save files are just binary data files without an extension, but you can give them a stand-in extension if you wish. To provide a set of default gestures with your game, package the gesture save file with your game and load it when the game starts. To package the save file you’ll need to add the directory containing it to ‘Project Settings > Packaging > Additional Non-Asset Directories to Package’. So you would, for example, save all your gestures to a “Gestures” folder in the root of your Content folder and then add the Gestures folder to the list of Additional Non-Asset Directories to Package.
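As a rough example of the load-on-start part, something like the snippet below in your pawn’s BeginPlay would pull in a packaged gesture set. The exact Save/Load signatures may differ from what is shown, so verify against GestureTracker.h; “Gestures/DefaultGestures” and the AGesturePawn/RightTracker names are placeholders carried over from the earlier sketch.

// Hedged sketch -- Save/Load exist per the answer above, but check the exact
// C++ signatures in GestureTracker.h. BeginPlay is assumed to be overridden
// in the hypothetical AGesturePawn from the earlier sketch.
void AGesturePawn::BeginPlay()
{
    Super::BeginPlay();

    // Load the default gesture set packaged via
    // Project Settings > Packaging > Additional Non-Asset Directories to Package.
    RightTracker->Load(TEXT("Gestures/DefaultGestures"));
}

// During development, after recording the gestures you want to ship with:
// RightTracker->Save(TEXT("Gestures/DefaultGestures"));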

Can I perform multiple gestures at the same time?

Yes. You can add as many GestureTracker components as you’d like and they will all perform tracking and recognition independently. If you would like to do two-handed combined gesturing, you can just check that the GestureTracker on each hand recognized the desired gesture within a certain time frame of the other (a rough sketch is below).
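In C++ terms, the two-handed check could be as simple as the sketch below, continuing the hypothetical AGesturePawn from the C++ answer above. The handler and member names are made up; wire them to whichever recognition event or callback you use for each hand, and tune the window to taste.

// Hedged sketch -- the member variables and handler names are hypothetical.
// Call each handler from the corresponding hand's recognition event.

// In the class declaration:
//     float LeftRecognizedTime = -1.f;
//     float RightRecognizedTime = -1.f;

static constexpr float TwoHandWindowSeconds = 0.3f; // tune to taste

void AGesturePawn::OnLeftGestureRecognized()
{
    LeftRecognizedTime = GetWorld()->GetTimeSeconds();
    TryCombinedGesture();
}

void AGesturePawn::OnRightGestureRecognized()
{
    RightRecognizedTime = GetWorld()->GetTimeSeconds();
    TryCombinedGesture();
}

void AGesturePawn::TryCombinedGesture()
{
    // Both hands recognized the desired gesture within the window of each other.
    if (LeftRecognizedTime >= 0.f && RightRecognizedTime >= 0.f &&
        FMath::Abs(LeftRecognizedTime - RightRecognizedTime) <= TwoHandWindowSeconds)
    {
        // Fire the combined two-handed gesture here.
        LeftRecognizedTime = RightRecognizedTime = -1.f;
    }
}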

Does this have to be used with VR?

The system is designed with VR in mind but users have had success using it to track mouse and finger gestures (on mobile). Feel free to ask me about your use case if you’re unsure if this is right for you.

Changelog:

1.1 - 11/3/16 (4.13)

  • Added baseMaterial input to draw functions. This is an optional replacement for the draw mesh’s default material that lets you easily apply dynamic materials to the gesture draw.

  • Implemented the GestureTracker Update Track/Predicted Draw Materials functions, as well as the Update Materials function for GestureMesh objects.

  • Fixed Draw by Id and Draw by Name functions.

  • Other minor fixes and comment improvements.

1.2 - 11/18/16 (4.14)

  • Made GestureTracker.h public for C++ access. I still recommend using blueprints.

  • Added GetNumGestures and GetIds functions to get the number of recorded gestures and an array of their ids respectively.

  • Added UpdateTrackIds and UpdateDrawIds functions to let you change the blacklist/whitelist of drawing/tracking in the middle of recognition.

  • Added RecognitionDuration float output to recognition functions and events which gives the time it took the user to perform the gesture in seconds.

1.3 - 2/22/2017 (4.14/4.15)

  • Fixed an issue with tracked draw lines drawing incorrectly while moving and tracking by the world location.
  • Fixed gesture disqualification tolerance so it’s the same for all resolutions (previously a smaller resolution permitted a smaller off-gesture distance before disqualification).
  • Added resetOnRecognized argument to continuous recognition. This lets you determine whether recognition for a gesture should restart as soon as it has been completed, or if it will only restart once the gesture has stopped being performed (e.g. If true, a single downward swipe gesture could be recognized multiple times by doing one long downward swipe. If false, you must stop performing the downward swipe and then start it again for it to be recognized a second time).
  • Added GetGesturePath(int id) and GetPredictedGesturePath() functions to get a TArray<FVector> describing the path of the gesture with the given id and the currently predicted gesture respectively.
  • Removed unused EGestureTrackerMode enum.
  • Minor naming and organization improvements.

1.4 - 6/2/2017 (4.16)

  • Added BodyLocationComponent, a reference which is null by default. If set to point at your VR “body” (typically the camera object), the vector between this component and the tracker will be used to determine the gesture rotation. This resolves issues where the gesture rotation is incorrectly interpreted (e.g. if the player is pointing the motion controller forward but with a 90 degree twist in the wrist), but it may also be slightly less precise about determining the gesture rotation. Leaving this null leaves rotation tracking working the same as it has in the past. This may or may not improve recognition depending on what kind of gestures/recognition are being used.

  • Added trackFailDistance and continuousFailDistance parameters to determine how far off a given gesture’s path you must go to fail recognition of that gesture.

  • Added predictionPercentage parameter that determines what percentage of a gesture must be completed before it can be displayed as the predicted gesture.

  • Added rotationOffset parameter, which adds degrees to your gesturing rotation in case the GestureTracker is offset from the player’s actual aiming direction.

  • Added GetPredictedGestureId() and GetPercentageComplete(int id) functions.

  • Continuous recognition should now be more reliable. Expect more improvements in the future too.

  • All parameters are now read/writeable in blueprints. NOTE: Please make sure you stick to parameter values inside the ranges of the sliders in the details panel. Values outside these ranges (e.g. gesture resolution <= 0) are unsupported and could cause crashes/other issues.

  • Minor readability, naming, comments, and organizational improvements.

1.5 - 7/6/2017 (4.16)

  • Added LengthRatio output to FinishRecognition and GestureRecognized events representing the ratio of the length of the tracked gesture to the length of the recognized gesture. So if the tracked gesture was twice as long as the recorded gesture it recognized then LengthRatio will be 2.

  • Added NormalizePitchRotation to tracking params. Normally gestures are only normalized by yaw rotation. Enabling this will also normalize them by pitch so they rotate with you as you look up and down. This means if you define, for example, a forward punch gesture it will also be recognized if you punch up or down. Or if you draw a triangle in front of you it will also be recognized if you draw a downward facing triangle above you.

  • Also added UseBodyLocationForPitch; if true we use the BodyLocationComponent for pitch normalization (if we’re normalizing pitch and the BodyLocationComponent has been set), otherwise we’ll use the gesture tracker rotation as normal. This should usually be false since, if you’re using your VR camera as the BodyLocationComponent, there will be a negative bias in the pitch (since your hands rest about 40 degrees below the camera).

  • Moved GestureMeshComponent to the public folder to resolve an issue with blueprint nativization.

1.5.1 - 8/2/2017 (4.16)

  • Added missing include whose absence could cause compile errors for shipping builds.

1.6 - 8/9/2017 (4.17)

  • Updated to UE 4.17 but no meaningful changes.

Hi Hunter,

As promised, I got my boss to get me your plugin, and I truly love it. Great job.

However, I seem to have trouble transferring gestures between levels.

I start continuous recognition at BeginPlay for both hands, but nothing gets recognized. I think it is because the gestures are not kept between the two levels. Maybe. I don’t know. And if I save and load a gesture, how can I assign the loaded gesture an ID?

EDIT:

Figured it out. Using Save and Load fixed the level issue for me. I guess they keep all properties including their IDs.

Though now I have a new problem.

In the new level the gestures don’t get recognized as well.
The pawn I’m using is traveling at a somewhat high speed through the level and also gets rotated depending on the controller positions relative to one another. Can this affect the consistency with which the gestures are being tracked?

Hi Investigator,

Sorry for the late response. You are correct about the Save/Load functions, glad you figured it out.

As long as you’re not tracking by the world location (and you won’t be by default) the speed you’re moving at should not make a difference.

However, the tracker uses its rotation to try to understand the direction you’re oriented and performing the gesture, but it does assume that you aren’t turning during the gesture (or if you are, it assumes the turn is part of the gesture). I think I’ll just have to add an option to use the relative rotation, but it could be more complicated than that. If I remember correctly you’re creating a game with a flight mechanic. Something like a flap downward while maintaining the same orientation is considered different from a flap downward while rotating your body (and arms) 90 degrees; I don’t know if that describes your problem well though. Can you describe specifically what kind of gestures you’re performing and the situations in which they fail?

You described my problem pretty well. However, flapping while turning rarely happens so far, luckily. It would make the whole thing a bit more complex.
Using the draw function I noticed that the drawn lines are somewhat off/odd.
Even though I disabled tracking by world location, the controllers seem to be constantly traveling forwards and the line is drawn around 50cm to 75cm in front of the actual controller position, while the controllers themselves are being tracked just fine.

The gesture I’m using is, for the sake of consistency in recognition, a simple downward pull on both controllers simultaneously. Before I was using a whole flap cycle which the testers failed to reproduce (even though they recorded their own cycles).
The main issue is resolved I think, just the draw is off somehow.

It looks like the drawing issue arises when the owning actor is moving at high speeds when drawing starts while tracking using the world position. I’m looking into a solution.

It might be interesting to track the orientation of the Daydream controller as gestures. Kind of like the hand signals used in the E.T. movie.

Hello there :slight_smile:

I bought your plugin but can’t manage to make it work in a packaged version (UE4 4.13, development build). By “doesn’t work” I mean that the gestures are not recognized in the packaged version; I believe they are not loaded (but I’m not sure).

What am I doing wrong?

My content folder: ReloadTest is a saved gesture file; I copied and pasted it into the Gesture folder.

This is the content of the Gesture Folder, there is only the copied saved gesture file.

These are my additional non-asset directories to package and I did add the gesture folder:

Thanks in advance for any answer :slight_smile:

Hi Elliot,

Is it possible you’re still trying to load the ReloadTest file in your root folder (which won’t be packaged but will work in the editor) instead of the one in your Gestures folder? I’ve done the exact same setup in my test project and it works on my end. If not that then maybe try saving directly to the Gestures folder instead of using a copy.

A snapshot of the relevant blueprint/code would be necessary to troubleshoot any further.

I was loading “ReloadTest” instead of “Gestures/ReloadTest”, silly mistake. Your answer put me back on track.

Everything is working as intended now, thank you for your support :slight_smile:

Hi, C++ question here…
Can you please elaborate on how I can get the tracked points from a predicted gesture? I am trying to understand your code but it doesn’t really make sense to me.
I would like to compare two predicted gestures made by the player and check how similar they are.

Hi Azarus,

The GestureTracker has a GestureLibrary object called gestureLib, which has a Get(int index) method to get RecordGesture objects, which are stored gestures. The index is just determined by the order they were recorded, but you can get it from the id using the GetIndexById(int id) method. RecordGesture objects have a Path() method which returns a TArray<FVector> of the gesture path. There’s no path comparison functionality built in, but I imagine you have some sort of distance metric in mind.

The gestureLib is private since I didn’t intend for gesture paths to be accessed by outside code, but you can drop the declaration and definition below into GestureTracker.h/.cpp if you need to do so. Make sure you compile in Visual Studio and not with the button in the Engine, since that will only recompile your game’s code and not the plugin.

In GestureTracker.h
TArray<FVector> GetPathById(int id) const;

In GestureTracker.cpp
TArray<FVector> UGestureTracker::GetPathById(int id) const
{
int index = gestureLib.GetIndexById(id);
return gestureLib.Get(index).Path();
}

Hi, thanks for your answer, but doesn’t that only return the recorded gestures? I would like to get the currently predicted ones, not the stored recorded ones. :slight_smile:

I also noticed the code has not been updated to 4.14 yet :rolleyes:

Edit:
After a while of digging through your code I got my stuff working, thank you :slight_smile:

The code has been updated for 4.14 for several months, are you sure you’ve downloaded the latest version?

I got a bWantsBeginPlay is deprecated warning.

You must have the old version then, this was fixed with version 1.2 back in November when 4.14 came out. When you downloaded the plugin you chose to have it installed for 4.14 right?

I downloaded it around last week I think, and it was 4.13 indeed :stuck_out_tongue:

I really enjoy your plugin. Would it be possible for you to save the gestures into data assets rather than in a separate binary file? That way we could adjust the gesture tracking information by hand in the data asset, delete individual gestures, and give them a new index or different names.

I might be doing something wrong, but it is just a pain that when I record a gesture and it’s bad, I have to delete the saved file and record all the gestures again.

If you record a gesture with the same id twice it will overwrite the first one, so you can just keep recording with the id you want until you get the gesture you like. Creating custom asset files with the ability to modify the gestures in editor is something I’m unfamiliar with and would probably take a while to implement, but I’ll definitely consider it for the future. I’ll add a function to delete the gesture with the specified id in the next update. For now you can always just start with a saved set of gestures you know you want, and if you add any undesired gestures you can load the saved set again and it’ll clear the newly added unwanted gestures.

Very good work! I bought the plugin…
I need some information about the implementation: what machine learning technique did you use to classify? Dynamic Time Warping? Hidden Markov Models? Others?
Could you give me more specific information about the plugin structure? I have to include this information in my master’s thesis… I would be grateful.

I used an algorithm I developed myself that doesn’t use any machine learning architectures. My technique doesn’t have any academic foundation; it’s just an idea I had that I tried and tweaked until it felt good to me. I was inspired by those children’s bead-and-wire toys. The wire represents the gesture path. As long as you’re pulling the bead in vaguely the same direction as the current part of the path, the bead will advance along the wire. If the bead makes it to the end of the wire, the gesture is completed. This doesn’t exactly describe the algorithm, but basically, if the tracked motion vector and the vector for the current part of the reference gesture where the “bead” is have a dot product greater than the Acceptable Similarity parameter, then the bead advances along the wire. Gestures are stored with their yaw rotation normalized around 0 so you can do the same gesture while facing any direction (I do my best to interpret the direction the user is facing using the rotations of the tracker component).

There’s a lot in the details of course, but if you want to go that far I’d just look through the source. It’s not as mathematically rigorous as other methods but it’s cheap; recognition is O(n) in the number of gestures. It also makes continuous recognition easy, since I just have to reset the “bead” back to the start of a gesture’s “wire” every time it’s determined the gesture was not being performed. Continuous recognition is somewhat more expensive though, since it uses additional memory Θ(n) in the number of gestures (realistically this will never be more than a few kilobytes) and no gesture can ever be ruled out (unlike during normal recognition, where most gestures are ruled out almost immediately), so its recognition is Θ(n).
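If it helps to see the bead idea in code form, here is a stripped-down toy version of the advance check described above. To be clear, this is not the plugin’s actual implementation; AcceptableSimilarity is the parameter mentioned above, and everything else (the struct, function, and parameter names) is made up for the illustration.

// Illustrative toy version of the "bead on a wire" check -- not the plugin's
// actual code. Path holds the recorded gesture points ("the wire"), Motion is
// the latest tracked movement delta, and AcceptableSimilarity is the
// dot-product threshold named above.
#include "CoreMinimal.h"

struct FBeadState
{
    int32 SegmentIndex = 0; // which wire segment the bead is currently on
};

// Returns true once the bead has reached the end of the wire (gesture complete).
bool AdvanceBead(const TArray<FVector>& Path, const FVector& Motion,
                 float AcceptableSimilarity, FBeadState& Bead)
{
    if (Bead.SegmentIndex + 1 >= Path.Num())
    {
        return true;
    }

    const FVector SegmentDir =
        (Path[Bead.SegmentIndex + 1] - Path[Bead.SegmentIndex]).GetSafeNormal();
    const FVector MotionDir = Motion.GetSafeNormal();

    // Advance only while the motion roughly follows the current segment;
    // continuous recognition would reset SegmentIndex to 0 whenever it decides
    // the gesture is not being performed.
    if (FVector::DotProduct(MotionDir, SegmentDir) > AcceptableSimilarity)
    {
        ++Bead.SegmentIndex;
    }

    return Bead.SegmentIndex + 1 >= Path.Num();
}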
