Gesture Tracker VR Help and Feature Requests


    #16
    I downloaded it around last week, I think, and it was 4.13 indeed :P
    Last edited by Azarus; 01-31-2017, 08:38 AM.



      #17
      I really enjoy your plugin. Would it be possible to save the gestures into data assets rather than a separate binary file? That way we could adjust the gesture tracking information by hand in the data asset, delete individual gestures, or give them a new index or different names.

      I might be doing something wrong, but it's frustrating that when I record a gesture and it's bad, I have to delete the saved file and record all the gestures again.



        #18
        Originally posted by Azarus View Post
        I really enjoy your plugin. Would it be possible to save the gestures into data assets rather than a separate binary file? That way we could adjust the gesture tracking information by hand in the data asset, delete individual gestures, or give them a new index or different names.

        I might be doing something wrong, but it's frustrating that when I record a gesture and it's bad, I have to delete the saved file and record all the gestures again.

        If you record a gesture with the same id twice, it will overwrite the first one, so you can just keep recording with the id you want until you get the gesture you like. Creating custom asset files with the ability to modify the gestures in the editor is something I'm unfamiliar with and would probably take a while to implement, but I'll definitely consider it for the future. I'll add a function to delete the gesture with a specified id in the next update.

        For now you can always start from a saved set of gestures you know you want; if you record any undesired gestures, you can load the saved set again and it'll clear the newly added unwanted ones.
        Gesture Tracker VR: A Gesture Recognition Plugin



          #19
          Very good work! I bought the plugin.
          I need some information about the implementation: what machine learning technique did you use to classify? Dynamic Time Warping? Hidden Markov Models? Something else?
          Could you give me more specific information about the plugin's structure? I have to include this information in my master's thesis. I would be grateful.



            #20
            Originally posted by PJ22 View Post
            Very good work! I bought the plugin.
            I need some information about the implementation: what machine learning technique did you use to classify? Dynamic Time Warping? Hidden Markov Models? Something else?
            Could you give me more specific information about the plugin's structure? I have to include this information in my master's thesis. I would be grateful.

            I used an algorithm I developed myself that doesn't use any machine learning architecture. My technique doesn't have an academic foundation; it's just an idea I had that I tried and tweaked until it felt good to me. I was inspired by those children's toys where you slide a bead along a bent wire. The wire represents the gesture path. As long as you're pulling the bead in vaguely the same direction as the current part of the path, the bead advances along the wire; if the bead makes it to the end of the wire, the gesture is complete.

            This doesn't exactly describe the algorithm, but essentially: if the tracked motion vector and the vector for the part of the reference gesture where the "bead" currently is have a dot product greater than the Acceptable Similarity parameter, the bead advances along the wire. Gestures are stored with their yaw rotation normalized around 0, so you can perform the same gesture while facing any direction (I do my best to interpret the direction the user is facing using the rotations of the tracker component).

            There's a lot in the details, of course, but if you want to go that far I'd just look through the source. It's not as mathematically rigorous as other methods, but it's cheap: recognition is O(n) in the number of gestures. It also makes continuous recognition easy, since I just reset the "bead" back to the start of a gesture's "wire" every time it's determined the gesture was not being performed. Continuous recognition is somewhat more expensive, though: it uses additional memory Θ(n) in the number of gestures (realistically this will never be more than a few kilobytes), and no gesture can ever be ruled out (unlike during normal recognition, where most gestures are ruled out almost immediately), so its recognition is Θ(n).
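            The wire-and-bead idea described above can be sketched in plain standalone C++. This is only a simplified illustration of the technique as explained in this post, not the plugin's actual code; every name here (Vec3, Gesture, tryAdvance, acceptableSimilarity) is made up for the example.

            ```cpp
            #include <cassert>
            #include <cmath>
            #include <cstddef>
            #include <vector>

            struct Vec3 { float x, y, z; };

            // Normalize a vector so only its direction matters in the comparison.
            static Vec3 normalize(Vec3 v) {
                float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
                if (len == 0.f) return Vec3{0.f, 0.f, 0.f};
                return Vec3{v.x / len, v.y / len, v.z / len};
            }

            static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

            struct Gesture {
                std::vector<Vec3> pathDirections; // unit direction of each segment of the recorded path (the "wire")
                std::size_t beadIndex = 0;        // how far recognition has advanced along the wire
            };

            // Advance the bead if the tracked motion points roughly the same way as the
            // current path segment; returns true once the bead reaches the end of the wire.
            bool tryAdvance(Gesture& g, Vec3 trackedMotion, float acceptableSimilarity) {
                if (g.beadIndex >= g.pathDirections.size()) return true;
                Vec3 motionDir = normalize(trackedMotion);
                if (dot(motionDir, g.pathDirections[g.beadIndex]) > acceptableSimilarity) {
                    ++g.beadIndex;
                }
                return g.beadIndex >= g.pathDirections.size();
            }
            ```

            In the real plugin the reference path would come from recorded tracker positions with the yaw normalization applied first, per the post above; this sketch only shows the dot-product advance step.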
            Last edited by hdelattre; 02-23-2017, 01:04 AM.



              #21
              Thank you very much! That's a good compromise. If you have any documentation or other information about this, please contact me at pulito.piergianni@gmail.com



                #22
                Hey, hdelattre!

                So you basically move a mesh along a spline to recognise gestures? How does it work with different scales, though?

                Correct me if I'm wrong)
                Available for contract hiring! Complex mechanics, quick game prototyping, VR, AI, Animation, Tools for designers.

                Check out my latest game! Last Joy - 2D RPG with unique combat system.



                  #23
                  Hi Two-faced,

                  I'm not actually moving a bead mesh along a spline to do the recognition (although the draw functions effectively give this appearance), that's just a metaphor to give a rough idea of how the recognition algorithm works. Essentially each recognizable gesture internally keeps track of an index along its path that recognition has advanced to. This index represents where the bead would be.

                  When a user is performing a gesture, the algorithm looks at the last movement vector and moves the bead forward on a given gesture if it is similar enough to that gesture's path where the "bead" index currently is. The max amount the bead can move forward is equal to the amount the user moved multiplied by the recognition ratio parameter, which by default is 3 or 5 or something like that. This means that, by default, a performed gesture can be a third the size of a reference gesture and still be recognized, as long as the shape of the gesture is the same. I wanted the recognition to care more about the general shape of the performed gesture than its size, so if you record a big square gesture and then perform a small square, it will still be recognized by default. Of course you can change this if you want by setting the recognition ratio to 1.
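                  The recognition ratio behaviour described above can be illustrated with a tiny standalone sketch (illustrative names, not the plugin's API): the bead's progress is an arc length along the recorded path, and each frame it can advance by at most the hand's movement distance times the ratio, so a small performance can complete a large recording.

                  ```cpp
                  #include <algorithm>
                  #include <cassert>

                  struct BeadState {
                      float position;    // arc length the bead has advanced along the recorded path
                      float pathLength;  // total length of the recorded gesture
                  };

                  // Each frame the bead may advance by at most the user's movement distance
                  // times the recognition ratio; returns true when the path is complete.
                  bool advanceBead(BeadState& bead, float userMoveDistance, float recognitionRatio) {
                      bead.position = std::min(bead.position + userMoveDistance * recognitionRatio,
                                               bead.pathLength);
                      return bead.position >= bead.pathLength;
                  }
                  ```

                  With a ratio of 3, ten units of hand motion (two frames of five) can complete a thirty-unit recorded path, which is the one-third scale tolerance mentioned above.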



                    #24
                    Originally posted by hdelattre View Post
                    Hi Two-faced,
                    It's quite interesting. It's a shame there's no way to create gesture templates procedurally.



                      #25
                      Hey, hdelattre!

                      I purchased the GestureTrackerVR plugin. Now I want to interface 5DT data gloves with a virtual environment created in Unreal Engine. Please help me out and give me a brief overview.

                      Regards,
                      Mnrmja007



                        #26
                        Having fun with this. I think being able to query the % complete of gestures at will, during both normal and continuous recognition, would be good. So if you have a gesture in progress, you could get the info "gesture id 3 is 58% complete", for example. That would make it easy to build juicier feedback for the player even before a gesture is completed.

                        Thanks for a great plugin!



                          #27
                          Hi Karmington,

                          Thanks for the suggestion, I'll add functions to get the completion percentage for a given id/name and for the predicted gesture. Look out for them in the next update!
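                          With the bead model described earlier in the thread, a completion query like the one requested could be as simple as the ratio of the bead's progress to the recorded path length. A hypothetical sketch, not the plugin's actual API:

                          ```cpp
                          #include <cassert>

                          // Completion percentage for a gesture in progress: how far the "bead"
                          // has advanced along the recorded path (illustrative function only).
                          float getGestureCompletionPercent(float beadPosition, float pathLength) {
                              return pathLength > 0.f ? 100.f * beadPosition / pathLength : 0.f;
                          }
                          ```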
                          Last edited by hdelattre; 04-12-2017, 06:24 PM.



                            #28
                            Hi hdelattre ,

                              I have just bought your plugin. I set it up exactly the way you did in the YouTube video, but I can't even get recognition to start during play. Can you provide a step-by-step setup? I'd appreciate it so much...



                              #29
                              Hi Weihow,

                              I recommend downloading the demo project at dropbox.com/s/fy0y3vdpsdunbws/GestureTrackerVRDemo.zip so you can compare it to your project. Make sure you've set up the drawing functions so you can visualize if recognition is working, make sure you've set up your inputs properly, and make sure the GestureTracker is attached to the proper motion controller.



                                #30
                                Hi Hunter,

                                Is this getting updated to 4.16 soon?

                                Thanks

