Gesture Tracker VR Help and Feature Requests

    [SUPPORT] Gesture Tracker VR Help and Feature Requests

    I figured I'd make a Gesture Tracker VR support thread so I can have a place for feedback, public support and feature requests.



    FAQ:

    Does this work with Oculus Touch, Leap Motion, Perception Neuron, etc.?

    Gesture recognition depends only on the transform of the GestureTracker component itself and does not rely on any device-specific functionality or API, so it will work with any input device. As long as you attach it to some moving component that you want to track, it will work.

    Is C++ supported?

    C++ is supported but discouraged. The component was designed to be used in blueprints and is not as easily set up in C++. To use the plugin in code, add "GestureTrackerVR" to your project's public/private module names and then include "GestureTracker.h". I'm willing to provide help to anyone using C++ or BPs, but I highly encourage you to use BPs if possible. Make sure to keep your GestureTracker's parameters within the range of values they are clamped to in blueprints. Values outside the clamp range are not guaranteed to work and may cause a crash (e.g. Gesture Resolution <= 0).
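    As a reference for the module-name step, here is a minimal sketch of the Build.cs change. The module name "MyProject" is a placeholder, and the constructor signature varies slightly by engine version:

```csharp
// MyProject.Build.cs -- "MyProject" stands in for your game's module name.
using UnrealBuildTool;

public class MyProject : ModuleRules
{
    public MyProject(ReadOnlyTargetRules Target) : base(Target)
    {
        PublicDependencyModuleNames.AddRange(new string[]
        {
            "Core", "CoreUObject", "Engine", "InputCore",
            "GestureTrackerVR" // makes #include "GestureTracker.h" available
        });
    }
}
```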

    Are Mac, Android, iOS, etc. supported?

    These platforms are not officially supported since I have not tested for them but the component should work fine and users have successfully used the plugin on all of them. If you're building for one of these platforms make sure to check GestureTrackerVR.uplugin in the plugin's folder for the WhitelistPlatforms list at the bottom. Add your platform string to this list (e.g. add "Mac" or "IOS") if it is not already present.
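    For reference, the module section of GestureTrackerVR.uplugin might look something like this after adding Mac and iOS. The surrounding fields here are illustrative and may differ in your copy; the WhitelistPlatforms array is the part the FAQ above refers to:

```json
"Modules": [
    {
        "Name": "GestureTrackerVR",
        "Type": "Runtime",
        "LoadingPhase": "Default",
        "WhitelistPlatforms": [ "Win64", "Win32", "Mac", "IOS" ]
    }
]
```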

    How do I save/load gestures? Can I package default gestures with my game?

    Gestures can be saved and loaded using the GestureTracker component's Save and Load functions. Gestures are stored in the Content folder, so the arguments to these functions should be paths relative to the Content folder. Gesture save files are just binary data files with no file type, but you can give them some stand-in extension if you wish. To provide a set of default gestures with your game, package the gesture save file with your game and load it when the game starts. To package the save file you'll need to add the directory containing it to 'Project Settings > Packaging > Additional Non-Asset Directories to Package'. So you would, for example, save all your gestures to a "Gestures" folder in the root of your Content folder and then add the Gestures folder to the list of Additional Non-Asset Directories to Package.

    Can I perform multiple gestures at the same time?

    Yes. You can add as many GestureTracker components as you'd like, and they will all perform tracking and recognition independently. If you would like to do two-handed combined gesturing, you can check that the GestureTracker on each hand recognized the desired gesture within a certain time frame of the other.
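    The two-handed check described above can be sketched like this. This is illustrative code, not part of the plugin; the HandRecognition struct and the timestamp source (e.g. GetWorld()->GetTimeSeconds() in each hand's recognition event) are assumptions:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical record of one hand's recognition event.
struct HandRecognition {
    int gestureId;      // id reported by that hand's GestureTracker
    double timeSeconds; // when the recognition event fired
};

// Treat a two-handed gesture as recognized when both hands report the same
// gesture id within `window` seconds of each other.
bool IsCombinedGesture(const HandRecognition& left,
                       const HandRecognition& right,
                       double window)
{
    return left.gestureId == right.gestureId &&
           std::fabs(left.timeSeconds - right.timeSeconds) <= window;
}
```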

    Does this have to be used with VR?

    The system is designed with VR in mind but users have had success using it to track mouse and finger gestures (on mobile). Feel free to ask me about your use case if you're unsure if this is right for you.


    Changelog:

    1.1 - 11/3/16 (4.13)

    * Added baseMaterial input to draw functions. This is an optional replacement for the draw mesh's default material that lets you easily apply dynamic materials to the gesture draw.

    * Implemented the GestureTracker Update Track/Predicted Draw Materials functions, as well as the Update Materials function for GestureMesh objects.

    * Fixed Draw by Id and Draw by Name functions.

    * Other minor fixes and comment improvements.


    1.2 - 11/18/16 (4.14)

    * Made GestureTracker.h public for C++ access. I still recommend using blueprints.

    * Added GetNumGestures and GetIds functions to get the number of recorded gestures and an array of their ids respectively.

    * Added UpdateTrackIds and UpdateDrawIds functions to let you change the blacklist/whitelist of drawing/tracking in the middle of recognition.

    * Added RecognitionDuration float output to recognition functions and events which gives the time it took the user to perform the gesture in seconds.


    1.3 - 2/22/2017 (4.14/4.15)

    * Fixed an issue with tracked draw lines drawing incorrectly while moving and tracking by the world location.
    * Fixed gesture disqualification tolerance so it's the same for all resolutions (previously a smaller resolution permitted a smaller off-gesture distance before disqualification).
    * Added resetOnRecognized argument to continuous recognition. This lets you determine whether recognition for a gesture should restart as soon as it has been completed, or if it will only restart once the gesture has stopped being performed (e.g. If true, a single downward swipe gesture could be recognized multiple times by doing one long downward swipe. If false, you must stop performing the downward swipe and then start it again for it to be recognized a second time).
    * Added GetGesturePath(int id) and GetPredictedGesturePath() functions to get a TArray<FVector> describing the path of the gesture with the given id and of the currently predicted gesture, respectively.
    * Removed unused EGestureTrackerMode enum.
    * Minor naming and organization improvements.


    1.4 - 6/2/2017 (4.16)

    * Added BodyLocationComponent, a reference which is null by default. If set to point at your VR "body" (typically the camera object), the vector between this component and the tracker will be used to determine the gesture's casting rotation. This resolves issues where the gesture rotation is incorrectly interpreted (e.g. if the player is pointing the motion controller forward but with a 90 degree twist in the wrist), but may also be slightly less precise about determining the gesture rotation. Leaving this null leaves rotation tracking working the same as it has in the past. This may or may not improve recognition depending on what kind of gestures/recognition are being used.

    * Added trackFailDistance and continuousFailDistance parameters to determine how far off a given gesture's path you must go to fail recognition of that gesture.

    * Added predictionPercentage parameter that determines what percent a gesture must be completed before it can be displayed as the predicted gesture.

    * Added rotationOffset parameter, which adds degrees to your gesturing rotation in case the GestureTracker is offset from the player's actual aiming direction.

    * Added GetPredictedGestureId() and GetPercentageComplete(int id) functions.

    * Continuous recognition should now be more reliable. Expect more improvements in the future too.

    * All parameters are now read/writeable in blueprints. NOTE: Please make sure you stick to parameter values inside the ranges of the sliders in the details panel. Values outside these ranges (e.g. gesture resolution <= 0) are unsupported and could cause crashes/other issues.

    * Minor readability, naming, comments, and organizational improvements.

    1.5 - 7/6/2017 (4.16)

    * Added LengthRatio output to FinishRecognition and GestureRecognized events, representing the ratio of the length of the tracked gesture to the length of the recognized gesture. So if the tracked gesture was twice as long as the recorded gesture it matched, LengthRatio will be 2.
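    The ratio described above amounts to comparing the arc lengths of the two paths, roughly like this. This is an illustrative sketch, not the plugin's internal code; Vec3 stands in for FVector:

```cpp
#include <cmath>
#include <vector>

// Minimal stand-in for FVector.
struct Vec3 { double x, y, z; };

// Arc length of a polyline: sum of distances between consecutive points.
double PathLength(const std::vector<Vec3>& path)
{
    double length = 0.0;
    for (size_t i = 1; i < path.size(); ++i) {
        double dx = path[i].x - path[i - 1].x;
        double dy = path[i].y - path[i - 1].y;
        double dz = path[i].z - path[i - 1].z;
        length += std::sqrt(dx * dx + dy * dy + dz * dz);
    }
    return length;
}

// LengthRatio as described: tracked length over recorded length.
double LengthRatio(const std::vector<Vec3>& tracked,
                   const std::vector<Vec3>& recorded)
{
    double recordedLen = PathLength(recorded);
    return recordedLen > 0.0 ? PathLength(tracked) / recordedLen : 0.0;
}
```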

    * Added NormalizePitchRotation to tracking params. Normally gestures are only normalized by yaw rotation. Enabling this will also normalize them by pitch so they rotate with you as you look up and down. This means if you define, for example, a forward punch gesture it will also be recognized if you punch up or down. Or if you draw a triangle in front of you it will also be recognized if you draw a downward facing triangle above you.

    * Also added UseBodyLocationForPitch; if true we use the BodyLocationComponent for pitch normalization (if we're normalizing pitch and the BodyLocationComponent has been set), otherwise we'll use the gesture tracker rotation as normal. This should usually be false since, if you're using your VR camera as the BodyLocationComponent, there will be a negative bias in the pitch (since your hands rest about 40 degrees below the camera).

    * Moved GestureMeshComponent to the public folder to resolve an issue with blueprint nativization.

    1.5.1 - 8/2/2017 (4.16)

    * Added missing include whose absence could cause compile errors for shipping builds.

    1.6 - 8/9/2017 (4.17)

    * Updated to UE 4.17; no other meaningful changes.
    Last edited by hdelattre; 02-19-2018, 04:03 AM.
    Gesture Tracker VR: A Gesture Recognition Plugin

    #2
    Hi Hunter,

    as promised I got my boss to get me your plugin, and I truly love it. Great job.

    However,
    I seem to have trouble transferring gestures between levels.

    I start continuous recognition at BeginPlay for both hands, but it doesn't recognize anything. I think it is due to the gestures not being kept between the two levels. Maybe. I don't know. And if I save and load a gesture, how can I assign the loaded gesture an ID?

    EDIT:

    Figured it out. Using Save and Load fixed the level issue for me. I guess they keep all properties, including the IDs.

    Though now I have a new problem.

    In the new level the gestures don't get recognized as well.
    The pawn I'm using travels at a somewhat high speed through the level and also gets rotated depending on the controller positions relative to one another. Can this affect the consistency with which the gestures are being tracked?
    Last edited by Investigator; 12-13-2016, 10:11 AM.



      #3
      Hi Investigator,

      Sorry for the late response. You are correct about the Save/Load functions, glad you figured it out.

      As long as you're not tracking by the world location (and you won't be by default) the speed you're moving at should not make a difference.

      However, the tracker uses its rotation to try to understand the direction you're oriented while performing the gesture, and it assumes that you aren't turning during the gesture (or, if you are, that the turn is part of the gesture). I think I'll just have to add an option to use the relative rotation, but it could be more complicated than that. If I remember correctly you're creating a game with a flight mechanic. Something like a flap downward while maintaining the same orientation is considered different from a flap downward while rotating your body (and arms) 90 degrees. I don't know if that describes your problem well, though. Can you describe specifically what kind of gestures you're performing and the situations in which they fail?
      Gesture Tracker VR: A Gesture Recognition Plugin



        #4
        You described my problem pretty well. However, flapping while turning rarely happens so far, luckily. It would make the whole thing a bit more complex.
        Using the draw function I noticed that the drawn lines are somewhat off/odd.
        Even though I disabled tracking by world location, the controllers seem to be constantly traveling forwards, and the line is drawn around 50 cm to 75 cm in front of the actual controller position while the controllers are being tracked just fine.

        The gesture I'm using is, for the sake of consistency in recognition, a simple downward pull on both controllers simultaneously. Before I was using a whole flap cycle which the testers failed to reproduce (even though they recorded their own cycles).
        The main issue is resolved I think, just the draw is off somehow.



          #5
          It looks like the drawing issue arises when the owning actor is moving at high speeds when drawing starts while tracking using the world position. I'm looking into a solution.
          Gesture Tracker VR: A Gesture Recognition Plugin



            #6
            It might be interesting to track the orientation of the Daydream controller as gestures. Kind of like the hand signals used in the E.T. movie.



              #7
              Hello there

              I bought your plugin but can't manage to make it work in a packaged version (UE4 4.13, development build). By "don't work" I mean that the gestures are not recognized in the packaged version; I believe they are not loaded (but I'm not sure).

              What am I doing wrong?

              My content folder: ReloadTest is a saved gesture file, I copied and pasted it in the Gesture folder.
              [Attachment: contentfolder.jpg]

              This is the content of the Gesture Folder, there is only the copied saved gesture file.
              [Attachment: gesturefolder.jpg]

              These are my additional non-asset directories to package and I did add the gesture folder:
              [Attachment: packagesettings.jpg]


              Thanks in advance for any answer
              [Released] Multiplayer Combat Editor
              A-RPG Sacred Swords
              Auto-Chess Live Development
              Youtube Tutorials



                #8
                Hi Elliot,

                Is it possible you're still trying to load the ReloadTest file in your root folder (which won't be packaged but will work in the editor) instead of the one in your Gestures folder? I've done the exact same setup in my test project and it works on my end. If not that then maybe try saving directly to the Gestures folder instead of using a copy.

                A snapshot of the relevant blueprint/code would be necessary to troubleshoot any further.
                Gesture Tracker VR: A Gesture Recognition Plugin



                  #9
                  I was loading "ReloadTest" instead of "Gestures/ReloadTest", silly mistake. Your answer put me back on track.

                  Everything is working as intended now, thank you for your support
                  [Released] Multiplayer Combat Editor
                  A-RPG Sacred Swords
                  Auto-Chess Live Development
                  Youtube Tutorials



                    #10
                    Hi, C++ question here.
                    Can you please elaborate on how I can get the tracked points from a predicted gesture? I am trying to understand your code, but it doesn't quite make sense to me.
                    I would like to compare two predicted gestures made by the player and check how similar they are.



                      #11
                      Hi Azarus,

                      The GestureTracker has a GestureLibrary object called gestureLib, which has a Get(int index) method that returns RecordGesture objects, the stored gestures. The index is just determined by the order they were recorded, but you can get it from the id using the GetIndexById(int id) method. RecordGestures have a Path() method which returns a TArray<FVector> of the gesture path. There's no path comparison functionality built in, but I imagine you have some sort of distance metric in mind.

                      The gestureLib is private since I didn't intend for gesture paths to be accessed by outside code, but you can drop the function below into GestureTracker.cpp if you need to do so. Make sure you compile in Visual Studio and not with the button in the editor, since that only recompiles your game's code and not the plugin.

                      In GestureTracker.h

                      TArray<FVector> GetPathById(int id) const;


                      In GestureTracker.cpp

                      TArray<FVector> UGestureTracker::GetPathById(int id) const
                      {
                          int index = gestureLib.GetIndexById(id);
                          return gestureLib.Get(index).Path();
                      }
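                      Since there's no built-in comparison, one simple metric for two paths with the same number of points (e.g. both resampled at the tracker's Gesture Resolution) is the mean point-to-point distance: 0 means identical paths, and larger values mean less similar. This is illustrative code, not part of the plugin; Vec3 stands in for FVector:

```cpp
#include <cmath>
#include <vector>

// Minimal stand-in for FVector.
struct Vec3 { double x, y, z; };

// Mean distance between corresponding points of two equal-length paths.
// Returns -1.0 when the paths can't be compared.
double MeanPointDistance(const std::vector<Vec3>& a, const std::vector<Vec3>& b)
{
    if (a.empty() || a.size() != b.size())
        return -1.0;
    double total = 0.0;
    for (size_t i = 0; i < a.size(); ++i) {
        double dx = a[i].x - b[i].x;
        double dy = a[i].y - b[i].y;
        double dz = a[i].z - b[i].z;
        total += std::sqrt(dx * dx + dy * dy + dz * dz);
    }
    return total / static_cast<double>(a.size());
}
```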
                      Last edited by hdelattre; 01-29-2017, 07:15 AM.
                      Gesture Tracker VR: A Gesture Recognition Plugin



                        #12
                        Originally posted by hdelattre View Post
                        Hi, thanks for your answer, but doesn't that only return the recorded gestures? I would like to get the currently predicted ones, not the stored recorded ones.

                        Also, I noticed the code is not updated for 4.14 yet.

                        Edit:
                        After a while of digging through your code I got my stuff working, thank you.
                        Last edited by Azarus; 01-30-2017, 06:09 AM.



                          #13
                          The code has been updated for 4.14 for several months, are you sure you've downloaded the latest version?
                          Gesture Tracker VR: A Gesture Recognition Plugin



                            #14
                            Originally posted by hdelattre View Post
                            The code has been updated for 4.14 for several months, are you sure you've downloaded the latest version?
                            I got a bWantsBeginPlay is deprecated warning.



                              #15
                              Originally posted by Azarus View Post
                              I got a bWantsBeginPlay is deprecated warning.
                              You must have the old version then; this was fixed in version 1.2 back in November when 4.14 came out. When you downloaded the plugin, you chose to have it installed for 4.14, right?
                              Gesture Tracker VR: A Gesture Recognition Plugin

