Custom Gesture Recognition



A gesture recognition plugin for Unreal Engine 4 that lets you record, detect, and follow the progression of gestures from motion controllers in real time with high precision.

As a developer/designer, you only need to record one example per gesture. This can be done live in the game.
Once the template has been recorded, the listener will be able to detect when a gesture is started and follow its progression in real time.
You can freely add events based on the progression, such as triggering a spell once the gesture is 95% done, or ticking a spell effect for every 10% of the gesture completed.

The plugin is currently under development, but the following core features are already functional:

  • Simple component setup allowing fast integration in a project from BP or C++
  • Offline and online reference gesture recording
  • Only one reference gesture needed to teach the recognizer
  • Support for both 2D and 3D gestures
  • Real-time recognition with high accuracy and robustness
  • Possibility to run multiple instances of the recognizer in parallel

You can test an early version (Vive only) here:

You can download the example project files on GitHub: GitHub - Deams51/UE4-Gesture-Recognition-Plugin


In this video you can see a gesture being recorded and then used to cast a spell:

Here, you can see an example of actions bound at run-time to gestures (binding a square to spawn a cube, a circle for a sphere…):

Projects using gestures
If you created a new VR experience using gestures, let us know and we will add it here for all to see! :slight_smile:

This plugin would not have been possible without the previous work published by HCI researchers over the years, especially the following article by B. Caramiaux et al.:
B. Caramiaux, N. Montecchio, A. Tanaka, F. Bevilacqua. Adaptive Gesture Recognition with Variation Estimation for Interactive Systems. ACM Transactions on Interactive Intelligent Systems (TiiS), 4(4), 18-51. December 2014

**Ideas/Questions/Feedback?**
Feel free to post here or create an issue on GitHub.

It’s been a long time since I first posted, but I’m still working on it when I have a bit of time.

Here is a Vive demo of the plugin in action: GesturePluginExample-0.3.rar - Google Drive
I’m definitely interested in any feedback you have!

It works very well! I encourage anyone who is interested to try it!

Thanks Greggus!
Hopefully I will have the time to work on the release this weekend.

I can’t wait to test it :slight_smile:

Also really looking forward to this! :slight_smile: I sent you a PM too.

I noticed your system doesn’t properly accommodate casting the gesture while looking in a different direction than the one it was defined in. Do you have plans to fix this?

Generally, any update for this?

I’ve been thinking about how to address this issue and will be working on a fix ASAP.

Been quite busy here, but haven’t forgotten about it.

The plugin is now available for everybody to play with.
The example project files can be downloaded from GitHub:

Really impressive! I can easily see how this is super useful for VR devs

Thanks!
Looking forward to seeing what the community will come up with.

This is so awesome. Thank you so much Deams! I’m diving in now, can’t wait to start casting spells!

Thanks! :smiley:
Feel free to share with everybody here the projects you build with the plugin!

I get an error saying I need to install the LowEntryExtStdLib from the marketplace when trying to run the test project. Do I need that to be able to use the plugin properly in my own project?

Does the plugin work with Blueprints?

Keep up the good work btw! Highly appreciate it!

Alright, I got it working. I don’t know exactly what was wrong, but I fixed it.

As you stated in your README, you are going to add a tutorial, right?
I would love to have one as I’m still rather new to UE and don’t quite get how everything works.
So, could you give me some hints on how to use it?

My boss told me I could tell you a bit about my project, yay!
I’m developing a sort of playable teaser in VR for the upcoming movie Manou the Swift by LUXX Studios, which they are currently working on full-time. It is going to be a lot about flying; who could’ve guessed that.
The game is a one-man project and I only started getting into Unreal exactly two weeks ago, so please excuse my newbieness. I did actually put together a rotation-based movement component in less time than I thought it would take me to learn all these nodes in BPs.

I want to make it so that when you have your arms stretched out and flap them, you also flap your wings in-game and gain some extra height to continue gliding. However, I do not want the player to set up the gesture himself; it should be predefined. How should I do this? And should I do this in the level or the pawn blueprint? So far I have all the movement inside the pawn.

On the ‘record gesture’ node, do I need to set one of the controllers as the target? Can I mirror the gesture to also use it on the other hand/controller, or is the tolerance of the listen function big enough for it not to matter?
I would try and find out myself if I had an idea how to set it up.

I’m sorry if my questions are a bit on the clueless and newbie side of things, and I hope they are not too annoying to read through or answer.

Would love to get some advice!

Hey Investigator,

As it seems you might not have seen it, the current project is under the GPL license, which does not allow you to use it for anything commercial.
I have an MIT-licensed version of the code in development, which you will be able to use for commercial projects, but there’s no ETA as I have to focus on my paid work first.
A one man team for a movie game project? And a beginner in UE4 and VR?
Well at least you’re gonna have a lot of fun! :smiley:
But you’ll have to teach me how you got that job. :wink:

Concerning flying, I worked on such a system for a client, and based on that experience, I wouldn’t recommend using the gesture recognizer for such behaviors.
It can work, but it’s just not designed for that!

You can mirror a recorded gesture, but how depends on which data you used as input to the gesture.
I’d recommend doing it in the pawn BP directly.

So, is there a special reason why this is GPL? And why do you need another version in development under a different license? I didn’t see anything in your code that forces the GPL on it. Except, of course, you are going the same way as so many others and will make a non-GPL version to sell.

Anyway, I have to admit I didn’t particularly look at the license file; since you posted your plugin here and wanted to hear about projects using it, I was just expecting it to be something like the LGPL or the Apache License. As it stands now, I will have to remove gestures completely from my project since I cannot use this plugin anymore (I will sell the game once it’s finished).

The reason is that even though most of the code has been rewritten, a few lines here and there, and the architecture, are still similar to the code from the paper I quoted above, which is under the GPL. It sucks, but it’s not my decision. So no, I haven’t released it as GPL because I plan to make money out of a non-GPL version. :slight_smile: That’s in fact a bit insulting, since most of my work is pushed online under MIT. :frowning:

I still have in mind to rewrite the whole plugin with a new architecture, but I couldn’t find the time to finish it for now.
But you are free to write your own implementation; the code is pretty self-explanatory, and you can read the associated literature.

OK, apologies! I was just a bit… sad that I had to remove all the gesture mechanics from my game demo. I should have thought it over a bit more before writing something, or even looked at the license file before I started anything with it.