Gesture Tracker VR Help and Feature Requests

Hi @joycehorn and @Bino,

I’ve just submitted the next update to Epic which includes the features you asked for. Check the changelog for v1.5 at the bottom of my first post for details. The update should be available within a day or two.

Thank you @hdelattre ! :slight_smile:

It seems that when drawing a gesture, the frame rate drops to about 50. Any pointers on where I could tweak the settings to gain performance?

Hi @Shin_ji,

You could increase the gesture resolution, which will reduce the number of drawn segments (sorry, I know the name is kind of confusing: resolution in this case means the length of a tracked segment of the gesture, so a higher resolution means coarser and faster tracking). I’m planning on decoupling the tracking resolution and the draw resolution in the future, so you’ll be able to have higher accuracy gestures without making the drawing more expensive.

You could also try using your own mesh and materials for drawing that are cheaper to render. I’ll look into more ways of improving the draw time.

Hi Hunter, how do I use my own mesh and materials for drawing gestures?

Hey @hdelattre, thanks for the input, I’ll look into it and let you guys know.

Hi @PredalienatorX,

The draw functions have mesh and material inputs. If they are left empty, the default/last set mesh and materials will be used, so you only have to plug in whatever you want to change. The base material input replaces the mesh’s material, so if the mesh already has a material you’re fine with, you don’t need to use it, although you can put a material instance into the base material input if you want to change material parameters at runtime. Any material you provide must have the Used with Spline Meshes option checked (in the material editor). The mesh should be a very small shape, since it represents one length of tracked gesture, which is generally only a few units long.

Start Draw Tracked Gesture is not using the specified StaticMesh and only uses the default mesh

Hiya,

Awesome work on this plugin. Thanks for your work.

I am attempting to pass a Plane StaticMesh to the Start Draw Tracked Gesture node because I want to change the scale of the plane according to motion controller tilt, etc.
I connected a Plane Static Mesh Component > Cast to StaticMesh node > the Start Draw Tracked Gesture’s Mesh input, but this does not seem to have any effect; it just uses the default mesh.
I am pretty new to Unreal and Blueprints, so please excuse me if I am missing something obvious, but I am hoping you could give me some pointers on what I am missing?

I am also wondering how I might change the total length of a drawn gesture before it disappears…

Thank you!

Hi @karmakat

Instead of connecting a pin to the static mesh input, just click the dropdown next to the pin and select the mesh asset that you’d like to use. The plugin doesn’t currently support dynamic scaling of individual segments of the draw path, but it’s something I’ll consider for future updates. Until then, a hacky way to do it yourself would be to start drawing but use an invisible mesh/material. On tick, get the gesture tracker’s TrackGestureMesh and/or PredictGestureMesh (whichever you want to draw), check that they’re valid, then use their spline points (they are both splines) to drive your own drawing code. You’d have to do some work to make this perform well, but it’s possible.

As for your second question, just select the GestureTracker and in the Details panel look under Parameters > Drawing. You’ll want to modify the track draw distance.

Howdy, I am having trouble installing the plugin. It’s not so much going to the plugin menu and activating it; the Install to Engine function itself doesn’t appear to be working. Just wanted to give you a heads up about that, and I was wondering if you’re able to contact Epic or something about it.

Just wanted to give an update… not sure why, but Install to Engine is now working, so you can disregard my last post. I was also wondering: if, say, I did the fireball gesture, would there be a way to hold the fireball and ignore other gestures until I released it? This would greatly help with aiming. (Unless I’m missing something, which is very well a possibility x.X I’m still reviewing your blueprints for everything.) Thank you for your time :slight_smile:

Hi @WolfyJowol,

Glad to hear the install issue is fixed. You can easily do that fireball setup yourself. If you’re using regular recognition, then you won’t keep looking for other gestures after making your fireball anyway, since FinishRecognition() ends tracking.

Set up your finish recognition handler so that if the recognized id is your fireball id (whatever you decide that is, let’s say 0), you spawn a fireball, attach it to your hand, and set a flag indicating that the fireball is in your hand. If you’re using the trigger to call StartRecognition(), then you could check whether you’re holding a fireball and throw it instead of starting recognition when you pull the trigger. You could also just start recognition on a different input, and then you wouldn’t have to worry about holding the fireball causing gestures to be recognized.

If you’re using continuous recognition then when ContinuousGestureRecognized is called you’ll just want to check if you’re holding a fireball and do nothing if you are (or you could end continuous recognition when you pick up the fireball and restart it when you let go).

My demo video shows the most basic setup possible; you can and should set up your recognized events so they only trigger actions when the game says they can.

Hello, I have one more question (for now, sorry x.X). Instead of going down a list of spells with your setup, is it possible to, say, if you have a row of hotkeys 1-10, assign gesture 1 to hotkey slot 1, gesture 2 to hotkey slot 2, etc.? (Sorry for the noobish questions x.X)

You can’t assign gestures directly to input actions (if that’s what you mean by hotkey), but you can have your gestures trigger whatever those hotkeys would trigger. If each hotkey is an event, then you just switch on the gesture id and trigger the different hotkey events based on the id.

Hi Hunter,
I just bought your plugin and I love it. Now I want to modify some code to fit my requirements. How can I do that?

Hi @TanRem, I responded on the marketplace page but I’ll duplicate my response here:

Make a Plugins folder in the root of your project folder. In your Unreal install, go to Plugins/Marketplace and move the GestureTrackerVR folder from there into the Plugins folder you just made. You can then modify the source and compile your project in Visual Studio to make changes to the plugin. As far as I know, you cannot recompile the plugin within the editor (it won’t hot reload), so you’ll need to close the editor and recompile in VS any time you want to make a change.

Hi Hunter, thank you for your plugin, it’s really nice, and thank you for your hard work! I have a question and I’m not sure how to achieve this: let’s say I want to launch a fireball at a bird or a flying creature, but when I try, it only travels at the level of my HMD? I hope I explained well enough!

Hi @jhonut,

I’m not certain I understand your question, but it sounds like you’re having a hard time getting your fireballs to fly upward instead of just horizontally when you do a gesture? You could use the forward vector of your hand when the gesture finishes to define the movement direction of the fireball. More details or pics of the relevant blueprints would be helpful!

Hey Hunter,

Your gesture implementation looks really promising! I’m currently working on a VR project on a macOS system. Would it be possible to get access to a Mac version of your plugin? I could compile it myself if you’d be willing to provide the source (which I can totally understand if you’d rather not!). Let me know if you’re interested and don’t have access to a Mac system. I’d love to give it a try!

What I’m hoping to achieve is (continuous) recognition of a fast flick gesture forward from the wrist. It sounded like your implementation doesn’t support any kind of rotational gesture recognition, so I am wondering whether the rather small and fast translation of the motion controller is likely to be picked up by your algorithm or not?

Hi @a_prototype,

I’m 99% sure the plugin will work fine on Mac; I just don’t have it listed as a supported platform since I can’t personally test it. If you end up using it on Mac, make sure to add “Mac” to the WhitelistPlatforms in GestureTrackerVR.uplugin, otherwise it won’t be packaged properly when you build for shipping.

The algorithm will handle a small flick fine, but given that such a gesture is performed naturally all the time while doing other things, you might end up frustrated if you use something like that with continuous recognition. It definitely works though; for example, at the end of my demo video I had a downward flick gesture set to push me upward for a basic flight simulation as a fun test. It worked great, but it also meant that any time you moved your arms downward you’d start flying up even if you didn’t mean to. Checking that the time taken was less than something like 0.3 seconds could help though, since it would ensure that your game only responded to fast, intentional flicks and not just passive movement.