Gesture Tracker VR Help and Feature Requests

Thank you very much! That’s a good compromise… If you have any documentation about this or other related information, please contact me at :slight_smile:

Hey, hdelattre!

So you basically use a movable mesh on a spline to recognise gestures? How does it work with different scales, though?

Correct me if I’m wrong)

Hi Two-faced,

I’m not actually moving a bead mesh along a spline to do the recognition (although the draw functions effectively give this appearance); that’s just a metaphor to give a rough idea of how the recognition algorithm works. Essentially, each recognizable gesture internally keeps track of an index along its path that recognition has advanced to. This index represents where the bead would be.

When a user is performing a gesture, the algorithm looks at the last movement vector and moves the bead forward on a given gesture if that movement is similar enough to the gesture’s path where the “bead” index currently is. The maximum amount the bead can move forward is the distance the user moved multiplied by the recognition ratio parameter, which by default is 3 or 5 or something like that. This means that, by default, a performed gesture can be a third the size of a reference gesture and still be recognized, as long as the shape of the gesture is the same. I wanted the recognition to care more about the general shape of the performed gesture than its size, so if you record a big square gesture and then perform a small square, it will still be recognized by default. Of course, you can change this by setting the recognition ratio to 1.
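To make the bead metaphor a little more concrete, here’s a rough plain-C++ sketch of what a single recognition step could look like. This is not the plugin’s actual code; the types, the AdvanceBead name, and the similarity threshold are all made up for illustration:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative types for the sketch; the plugin's real code differs.
struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Length(const Vec3& v) { return std::sqrt(Dot(v, v)); }

struct Gesture {
    std::vector<Vec3> Path;   // recorded reference path
    float BeadDistance = 0.f; // how far along the path recognition has advanced
};

// One recognition step: if the user's last movement points roughly the same
// way as the path segment the "bead" currently sits on, advance the bead by
// the distance moved times the recognition ratio (the cap described above).
void AdvanceBead(Gesture& G, const Vec3& LastMove, float RecognitionRatio,
                 float SimilarityThreshold = 0.7f)
{
    float Accumulated = 0.f;
    for (size_t i = 0; i + 1 < G.Path.size(); ++i) {
        const Vec3 Seg{ G.Path[i + 1].x - G.Path[i].x,
                        G.Path[i + 1].y - G.Path[i].y,
                        G.Path[i + 1].z - G.Path[i].z };
        const float SegLen = Length(Seg);
        // Skip segments that lie entirely before the bead's current position.
        if (Accumulated + SegLen < G.BeadDistance) { Accumulated += SegLen; continue; }

        const float MoveLen = Length(LastMove);
        if (MoveLen <= 0.f || SegLen <= 0.f) return;
        // Cosine similarity between the user's movement and the path direction.
        const float Similarity = Dot(LastMove, Seg) / (MoveLen * SegLen);
        if (Similarity >= SimilarityThreshold)
            G.BeadDistance += MoveLen * RecognitionRatio;
        return;
    }
}
```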

It’s quite interesting. It’s sad that there is no way to make procedural templates.

Hey, hdelattre!

I purchased the GestureTrackerVR plugin. Now I want to interface 5DT data gloves with a virtual environment created in Unreal Engine. Please help me out and give me a brief overview of how to go about it.

Regards,
Mnrmja007

Having fun with this. I think being able to pull the percent complete of both recognized and continuous recognition gestures at will would be good. So if you have a gesture in progress, you would be able to get the info “gesture id 3 completed 58%”, for example. It would make it easy to build juicier feedback for the player even before a gesture is completed.

Thanks for a great plugin!

Hi Karmington,

Thanks for the suggestion, I’ll add functions to get the completion percentage for a given id/name and for the predicted gesture. Look out for them in the next update!
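For anyone curious, a completion percentage falls naturally out of the bead model from my earlier post: the distance the bead has advanced divided by the gesture’s total path length. A rough sketch reusing the illustrative Gesture type from above (the shipped function’s body may well differ):

```cpp
// Total length of the recorded path (the plugin exposes a PathLength()
// function on gestures; this is an illustrative stand-in).
float PathLength(const Gesture& G)
{
    float Total = 0.f;
    for (size_t i = 0; i + 1 < G.Path.size(); ++i) {
        const Vec3 Seg{ G.Path[i + 1].x - G.Path[i].x,
                        G.Path[i + 1].y - G.Path[i].y,
                        G.Path[i + 1].z - G.Path[i].z };
        Total += Length(Seg);
    }
    return Total;
}

// Completion is simply bead progress over total path length, as a percentage.
float GetCompletionPercentage(const Gesture& G)
{
    const float Total = PathLength(G);
    return Total > 0.f ? 100.f * G.BeadDistance / Total : 0.f;
}
```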

Hi hdelattre,

I have just bought your plugin. I set it up exactly the way you did in the YouTube video, but I can’t even get recognition to start during playtime. Can you provide a step-by-step setup? Appreciate it so much…

Hi Weihow,

I recommend downloading the demo project at dropbox.com/s/fy0y3vdpsdunbws/GestureTrackerVRDemo.zip so you can compare it to your project. Make sure you’ve set up the drawing functions so you can visualize if recognition is working, make sure you’ve set up your inputs properly, and make sure the GestureTracker is attached to the proper motion controller.

Hi Hunter,

Is this getting updated to 4.16 soon?

Thanks

Hi Darin,

It was actually just updated today, hopefully you’ve seen it already. I’ll update the OP with the change notes.

Hi @hdelattre
I’m the guy from YouTube who was asking you about implementing this into a 2d game.

So I attached the gesture component to another component which moves with the mouse position on tick.
But I couldn’t get the gesture to be drawn or recognized. The image below shows my setup.

I also tried to download the demo project to compare, but the link is invalid.

Thanks.

Edit: So I got the gestures recognized successfully by enabling “Use World Location” under Tracking. However, I currently can’t draw the gesture on screen. Any advice?

Hi @IronSuit,

I’m glad to hear you got recognition working. Sorry about the demo project being unavailable; I accidentally revoked the link a few days ago. You can download it here. Unfortunately it’s still using 4.13, but you should be able to upgrade it to 4.16 easily (or just check it out in 4.13 if you still have it).

As for the drawing, I think it has to do with the way you’re moving the gesture object. The draw occurs at the GestureTracker’s location. You’re setting its position to the mouse’s screen coordinates, so it’s always going to be near the world origin. I think it is drawing, just near the origin rather than on screen where you want it to be. You’ll need to move it to the actual world position wherever you click, not just the mouse coordinates. Try using the Convert Mouse Location to World Space node.
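In C++ the rough equivalent would be something like this; the fixed plane distance is just an assumption for keeping a 2D gesture in front of the camera:

```cpp
#include "GameFramework/PlayerController.h"

// Rough C++ equivalent of the Convert Mouse Location to World Space node.
// PlaneDistance is an illustrative parameter: how far in front of the camera
// the gesture plane should sit so the draw is actually visible.
FVector GetMouseWorldLocation(APlayerController* PC, float PlaneDistance = 500.f)
{
    FVector WorldOrigin, WorldDirection;
    if (PC && PC->DeprojectMousePositionToWorld(WorldOrigin, WorldDirection))
    {
        // Push the deprojected point out along the view ray so the tracker
        // (and therefore the drawn gesture) ends up somewhere on screen.
        return WorldOrigin + WorldDirection * PlaneDistance;
    }
    return FVector::ZeroVector;
}
```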

Hey there, I’ll try that and see what happens.
But to be honest, I’m pretty happy with what I’ve got right now.
I was able to save the circle, and it’s currently being recognized quite well. Thanks for your help.

Hi Hunter! Is there a way to compare the size of a recognized gesture to the one that was used for recognition? I know there are two recognition ratio parameters, one for tracked and one for continuous, but I need to know how much bigger or smaller a recognized gesture is than its reference gesture. Could you help me with that? Thank you in advance.

Hi @joycehorn,

That would be easy to implement. Gestures have a PathLength() function, so you could divide the tracked gesture path length by the recognized gesture path length to get a ratio.

I can add a length ratio output to FinishRecognition and the recognized events. Look for it in the next update!
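Continuing the earlier sketch, the ratio would be just a division (names illustrative):

```cpp
// Size ratio between the performed (tracked) gesture and its reference.
// > 1 means the performed gesture was larger; < 1 means it was smaller.
float SizeRatio(const Gesture& Tracked, const Gesture& Recognized)
{
    const float Ref = PathLength(Recognized);
    return Ref > 0.f ? PathLength(Tracked) / Ref : 0.f;
}
```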

Hi @hdelattre,

Great plugin, easy to install and setup.

My first gesture is a punch motion. Is it possible for the gesture system to pick up the punch gesture in any direction? For example, I have the punch gesture recorded directly in front of me. Can that gesture movement also be recognised if I punch towards the sky?

Hi @hdelattre,

I was wondering what would be the best way to check the progress of a gesture before the gesture is completed. I saw that you were working on implementing this for the plugin, but if you have any suggestions on how to go about doing this, that would be incredibly helpful.

Hi @Bino,

Currently only yaw rotation is normalized, so a punch forward would be considered different from a punch up. But I have good news! I’ve actually been planning on adding an option for normalizing pitch rotation as well for a game I’m working on where you happen to look up and down a lot. So it’ll work how you want soon, just look out for another update within a week or so!
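Conceptually, the normalization just rotates each movement vector by the inverse of the controller’s starting yaw (and, with the planned option, pitch) before comparing it against the recorded path. A rough sketch reusing the Vec3 type from earlier, not the plugin’s actual code:

```cpp
#include <cmath>

// Rotate a movement vector back by the starting yaw/pitch so that, e.g.,
// a punch toward the sky compares equal to a punch straight ahead.
Vec3 NormalizeDirection(const Vec3& Move, float YawRadians, float PitchRadians,
                        bool bNormalizePitch)
{
    // Undo yaw: rotate about the Z (up) axis by -Yaw.
    const float cy = std::cos(-YawRadians), sy = std::sin(-YawRadians);
    Vec3 V{ Move.x * cy - Move.y * sy,
            Move.x * sy + Move.y * cy,
            Move.z };

    if (bNormalizePitch) {
        // Undo pitch: rotate about the Y (right) axis by -Pitch.
        const float cp = std::cos(-PitchRadians), sp = std::sin(-PitchRadians);
        V = Vec3{ V.x * cp + V.z * sp, V.y, -V.x * sp + V.z * cp };
    }
    return V;
}
```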

Hi @drcook445,

Did you get the latest update for 4.16? As mentioned in the change notes, I’ve added a GetCompletionPercentage(int id) function. If you want to check the predicted gesture’s completion you can use the GetPredictedGestureId() function and plug the result into GetCompletionPercentage.
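In Blueprint that’s just the two nodes chained together; in C++ it would look something like this (the component type name and exact signatures are my approximation of the plugin’s API):

```cpp
// Sketch only: log the predicted gesture's progress, e.g. each tick or on input.
void PrintPredictedProgress(UGestureTrackerComponent* Tracker)
{
    const int32 PredictedId = Tracker->GetPredictedGestureId();
    const float Percent = Tracker->GetCompletionPercentage(PredictedId);
    UE_LOG(LogTemp, Log, TEXT("Gesture id %d completed %.0f%%"), PredictedId, Percent);
}
```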