Razer Hydra Plugin


    #76
    With the 4.4.1 preview fixing DK2 compatibility, I decided to try updating my Hydra project. It seems to work fine EXCEPT that the movement linked to my IK is way off. The last UE4 version the project used was 4.3.

    Rotation works fine, but when I move the controllers the arms BARELY move instead of having the 1:1 motion they had before updating.

    Any idea why this is? Could it be related to the "FRotator angularVelocity" added to "HydraControllerMoved"? Or has the scale somehow changed since 0.5.2, the last version I used?

    Edit: For some reason I used to have to divide the position by 10 for it to be 1:1; now I don't anymore. That fixed it. Did the scale change from cm to mm or something?
    Last edited by Thoth_FN; 08-22-2014, 04:26 PM.

    Comment


      #77
      Originally posted by spire8989 View Post

      Edit: For some reason I used to have to divide the position by 10 for it to be 1:1; now I don't anymore. That fixed it. Did the scale change from cm to mm or something?
      I wonder if this is why I couldn't get the Hydra positions to work (for hand IK) in my 4.2.1 build. I'm really going for something like what Getnamo is doing in the gifs he put up, but I've put that on hold for the moment while I work on other stuff that's not so fiddly and that I know I can complete faster.
      Storyteller - An immersive VR audiobook player

      Dungeon Survival - WIP First person dungeon crawler with a focus on survival and environmental gameplay ala roguelikes

      Comment


        #78
        Originally posted by spire8989 View Post
        With the 4.4.1 preview fixing DK2 compatibility, I decided to try updating my Hydra project. It seems to work fine EXCEPT that the movement linked to my IK is way off. The last UE4 version the project used was 4.3.

        Rotation works fine, but when I move the controllers the arms BARELY move instead of having the 1:1 motion they had before updating.

        Any idea why this is? Could it be related to the "FRotator angularVelocity" added to "HydraControllerMoved"? Or has the scale somehow changed since 0.5.2, the last version I used?

        Edit: For some reason I used to have to divide the position by 10 for it to be 1:1; now I don't anymore. That fixed it. Did the scale change from cm to mm or something?
        The Hydra reports controller positions from the base in mm, while UE4's scale defaults to 1 cm = 1 uu. As you discovered, dividing by 10 does the conversion; since v0.6 (UE 4.3) this is handled by the plugin internally.
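        That conversion can be sketched standalone (plain C++; `Vec3` is a stand-in for UE4's FVector, and the function name is illustrative, not part of the plugin):

```cpp
#include <cassert>

// Minimal stand-in for UE4's FVector, just for this sketch.
struct Vec3 { float X, Y, Z; };

// The Hydra base reports positions in millimetres, while UE4 defaults to
// 1 uu = 1 cm, so dividing by 10 converts mm -> uu. Since plugin v0.6
// (UE 4.3) this conversion happens inside the plugin itself.
Vec3 HydraMmToUU(const Vec3& RawMm) {
    return { RawMm.X / 10.f, RawMm.Y / 10.f, RawMm.Z / 10.f };
}
```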

        Originally posted by n00854180t
        This is incredibly awesome.

        I mentioned this on Reddit (Titus there) as well, but I'll say again, if you're interested in help at all, I'm down!
        That's great. I'm currently still working on IMU-based integration, but there are a couple of things I need help with:
        1. How to swap from IK (direct position input, e.g. Hydra) to FK (IMU-derived input, e.g. smartphone/Myo) in the cleanest way possible.
        2. Template or plugin? (A template might have to be code based, whereas a plugin may come with a pre-compiled dll due to the hmd look separation code.)
        3. VR head model blending for a true first-person perspective. Do we create a separate head model, or do we use a masked material and vertex paint away the vertices we do not want to see? Which approach makes asset creation easier?
        4. What is the most convenient class to contain the IK? Controller? Pawn/Character? A mix of the two? Can we turn this into an interface? Currently most of the code is in the Character, since it forwards values to the animation template, but I wonder if it would be better, use-case-wise, to have it in the controller.
        5. From a plugin consumer's point of view, what other conveniences would you want? Keep it simple, or simplify some other common VR input tasks?


        For now a lot of these are design choices (important: they define ease of use), but once a first code release is made, any blueprint or code additions are very welcome; you may want to try some of the problems in isolation before then.
        Last edited by getnamo; 08-23-2014, 01:48 PM.
        Plugins: Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo - RealSense

        Comment


          #79
          Originally posted by getnamo View Post

          That's great. I'm currently still working on IMU-based integration, but there are a couple of things I need help with:


          For now a lot of these are design choices (important: they define ease of use), but once a first code release is made, any blueprint or code additions are very welcome; you may want to try some of the problems in isolation before then.
          Awesome!

          1. How to swap from IK (direct position input, e.g. Hydra) to FK (IMU-derived input, e.g. smartphone/Myo) in the cleanest way possible.
          Swapping from IK: you can set up a simple blend in/out in the event graph of the anim blueprint. I use this for turning single-frame poses into cheapo animations, like raising or lowering an arm for holding a torch up. With the layered blend per bone (or whatever it's called) you can blend only certain bones against a different pose, and also control it via the alpha.

          What I do there is have a variable for how long to blend in/out in seconds; in the tick I add the delta time, rescale it from the 0-X second blend time to 0-1.0, and feed that value into the alpha of the bone-based blend. The result is a fairly smooth interpolation between whatever types of animation you're using, over a configurable blend time. If you need it to kick in suddenly and stay off at all other times, a 0 alpha works fine, or a branch.
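          That tick-driven alpha ramp can be sketched in isolation (plain C++; the names are illustrative, not from the plugin or the engine):

```cpp
#include <algorithm>
#include <cassert>

// Accumulates tick delta-time and rescales a 0..BlendTime ramp into the
// 0..1 alpha fed into a layered-blend-per-bone node.
struct BlendRamp {
    float BlendTime;     // seconds until the blend is fully in
    float Elapsed = 0.f; // time accumulated so far

    // Call once per tick with the frame's delta seconds.
    float Tick(float DeltaSeconds) {
        Elapsed += DeltaSeconds;
        return std::clamp(Elapsed / BlendTime, 0.f, 1.f);
    }
};
```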

          2. Template or plugin? (A template might have to be code based, whereas a plugin may come with a pre-compiled dll due to the hmd look separation code.)
          This is a tricky one right now because of the issue with plugins not working in a packaged content-only game (you have to recompile with a dummy class, turning it into a source project, basically). Once that's fixed up, my vote would be for a plugin, but a template gives a lot of other benefits that might be useful.

          3. VR head model blending for a true first-person perspective. Do we create a separate head model, or do we use a masked material and vertex paint away the vertices we do not want to see? Which approach makes asset creation easier?
          My vote goes to masking the existing head, because oftentimes editing the art with a new head isn't a great option, whereas vertex painting is very doable even by non-artists.

          4. What is the most convenient class to contain the IK? Controller? Pawn/Character? A mix of the two? Can we turn this into an interface? Currently most of the code is in the Character, since it forwards values to the animation template, but I wonder if it would be better, use-case-wise, to have it in the controller.
          Not sure I have a strong opinion on this one; I think having it in the Character would be more in line with current example setups and what people typically do, though.

          5. From a plugin consumer's point of view, what other conveniences would you want? Keep it simple, or simplify some other common VR input tasks?
          I think keeping it focused, at least early on, is the way to go: hand and head tracking combined into the avatar in a very solid way, so that it can be set up with minimal fuss; other convenience features can be added afterwards.
          Storyteller - An immersive VR audiobook player

          Dungeon Survival - WIP First person dungeon crawler with a focus on survival and environmental gameplay ala roguelikes

          Comment


            #80
            Originally posted by getnamo View Post
            Updated to UE4.4

            and a little sneak peek at an upcoming IK based VRMotionInput plugin:
            Gfygur Gallery 1

            Gfygur Gallery 2
            Oh man that is awesome. Any idea when you'll be ready to release an update?

            Comment


              #81
              Hey Getnamo! First, thanks for your work!
              However, is there any way to get the "Event Hydra Controller Moved" values for each controller separately? It works with the debug sphere; however, I'm trying to move different components differently and can't find any way to get it working. It would be really nice to have events like "Event Right Hydra Controller Moved" and "Event Left Hydra Controller Moved".
              I think there's a way by getting axis values and such, but the event is much simpler and more compact.

              Comment


                #82
                Originally posted by MattOstgard View Post
                Oh man that is awesome. Any idea when you'll be ready to release an update?
                This will be a separate plugin.

                I'm currently still working on IMU-based controller integration for this plugin, which is a key input group in addition to direct position controllers such as the Hydra/STEM/Kinect. The plugin is meant to be a middle-point plugin that abstracts body position data away from the actual hardware providing it. Since a lot of new input devices are coming out in the near future, integrating each one directly would be counterproductive and not future proof.

                I believe most of these devices are trying to forward parts of a whole 'body position' data set: hands, fingers, limbs, and general skeletal information. The idea behind a plugin of this nature is to abstract that data into a structure developers can access directly, and to have the motion input plugin forward and merge the actual hardware input into that structure. This lets developers focus on the VR aspects of development rather than the input binding, and allows easy integration of future input technology without changing any game logic code. Think of it as akin to Input Mapping, but for a body position data set.
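                As a rough illustration only (every name here is hypothetical, not the actual plugin API), the abstracted data set might look something like:

```cpp
#include <array>
#include <cassert>

// Hypothetical sketch of a hardware-agnostic 'body position' frame.
// Each device (Hydra, STEM, Kinect, Myo, ...) merges whatever it knows
// into this structure; game logic reads only from the structure, never
// from the device, so new hardware needs no game-code changes.
struct Transform3 {
    float Pos[3];      // position in uu (cm)
    float RotEuler[3]; // rotation in degrees
};

struct BodyFrame {
    Transform3 Head{};
    std::array<Transform3, 2> Hands{};       // 0 = left, 1 = right
    std::array<bool, 2>       HandTracked{}; // false -> fall back to FK/animation
};
```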

                Downstream from that data structure will be a convenience character (pawn) blueprint with IK/FK handles attached to the UE default skeleton. If you use the body position data, you will be able to easily forward it to the skeleton fully or partially, using IK/FK, and fall back to animation where data is missing. Or, if you want to use the data set in a non-skeletal way, you can ignore the convenience character or build your own (say, if you wanted to control wings instead of limbs and may not want a 1:1 mapping).

                In the coming week I hope to put up an early GitHub repository for this and to get input from other VR developers regarding structure, needs, and overall design for a plugin of this sort; hopefully some help can be had, so that we can build something robust that we will all use.


                Originally posted by Darknoodles View Post
                Hey Getnamo! First, thanks for your work!
                However, is there any way to get the "Event Hydra Controller Moved" values for each controller separately? It works with the debug sphere; however, I'm trying to move different components differently and can't find any way to get it working. It would be really nice to have events like "Event Right Hydra Controller Moved" and "Event Left Hydra Controller Moved".
                I think there's a way by getting axis values and such, but the event is much simpler and more compact.
                This question was asked by PMBallisticDK earlier in the thread; the answer remains the same:

                Each blueprint event emits an integer called 'controller'. Simply add an IF statement (Branch) comparing it to the controller you want (typically 0 for left, 1 for right), and any statements after that branch will only run for that controller. Additionally, if you want to support people potentially holding the controllers in the wrong hands, you can call 'HydraWhichHand(int32 controller)', which determines which hand the controller is being held in (returning 0 for left, 1 for right). This is determined by where the controller was last docked (which side of the dock it was on).
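                In code form, that branch amounts to the following (plain C++ sketch; the handler signature is illustrative, not the plugin's actual delegate):

```cpp
#include <cassert>
#include <string>

// Filters a shared "controller moved" event down to one hand: the plugin
// emits an int 'controller' with every event, so compare it against the
// hand you care about (0 = left, 1 = right) and ignore everything else.
struct HandFilter {
    int WantedController;      // 0 = left, 1 = right
    std::string LastPosition;  // stand-in for whatever you do with the data

    void OnControllerMoved(int Controller, const std::string& Position) {
        if (Controller != WantedController)
            return; // event belongs to the other hand
        LastPosition = Position;
    }
};
```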
                Plugins: Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo - RealSense

                Comment


                  #83
                  Thanks! My bad; I did browse the previous page, but it was very late and I may have skipped it. Thanks again!

                  Comment


                    #84
                    Originally posted by getnamo View Post
                    In the coming week I hope to put up an early GitHub repository for this and to get input from other VR developers regarding structure, needs, and overall design for a plugin of this sort; hopefully some help can be had, so that we can build something robust that we will all use.
                    Cool I'll keep an eye out for it. Thanks!

                    Comment


                      #85
                      Originally posted by getnamo View Post

                      In the coming week I hope to put up an early GitHub repository for this and to get input from other VR developers regarding structure, needs, and overall design for a plugin of this sort; hopefully some help can be had, so that we can build something robust that we will all use.
                      Sounds awesome Getnamo!

                      I can't wait to take a look at the GitHub repo, test it, and help out. Are you going to set up an issue tracker so we can easily find current bugs/features to implement and submit fixes as pull requests? I think that would be great.
                      Storyteller - An immersive VR audiobook player

                      Dungeon Survival - WIP First person dungeon crawler with a focus on survival and environmental gameplay ala roguelikes

                      Comment


                        #86
                        Hey Getnamo! Thanks for all the work. I had a quick search and can't quite find what I want to do with the plugin: rather than explicitly setting the input mapping from the editor (which works fine), I would like to set a default state for one of my pawns, similar to:

                        UPlayerInput::AddEngineDefinedAxisMapping(FInputAxisKeyMapping("CinePawn_Yaw", EKeys::MouseX, 1.f));

                        However, when I try to replace EKeys::MouseX with, say, EKeysHydra::HydraLeftRotationYaw, I can't get it to compile because the required header FHydraPlugin.h cannot be found by my project. I don't have an issue finding the public headers such as HydraDelegate.h, though. What is the best way to expose EKeysHydra so I can create my own defined axis mapping in code? Thanks!

                        Comment


                          #87
                          Hi guys, I'm having some trouble getting the plugin to work. I set up the input as advised, but nothing seems to happen. If I turn on Sixense Motion Creator 2 I get some movement, but only as a generic joypad. Could someone please post an example project so I can see what I'm doing wrong?

                          Also, the VRMotionInput plugin looks incredible Getnamo, I can't wait to try it

                          Comment


                            #88
                            Originally posted by savantguarde View Post
                            Hey Getnamo! Thanks for all the work. I had a quick search and can't quite find what I want to do with the plugin: rather than explicitly setting the input mapping from the editor (which works fine), I would like to set a default state for one of my pawns, similar to:

                            UPlayerInput::AddEngineDefinedAxisMapping(FInputAxisKeyMapping("CinePawn_Yaw", EKeys::MouseX, 1.f));

                            However, when I try to replace EKeys::MouseX with, say, EKeysHydra::HydraLeftRotationYaw, I can't get it to compile because the required header FHydraPlugin.h cannot be found by my project. I don't have an issue finding the public headers such as HydraDelegate.h, though. What is the best way to expose EKeysHydra so I can create my own defined axis mapping in code? Thanks!
                            When I moved the EKeysHydra structure in 0.6.2 to clean up the dependencies, it hid the keys from C++ input mapping. With 0.6.5 these have been moved back to the delegate, along with their object code definition. This means your use case will now work; just make sure your project has a copy of HydraDelegate.cpp in the project source folder for it to compile (which you will want to remove when you compile for shipping, since shipping collapses the dll into a monolithic .exe).



                            Originally posted by davidmcclure
                            Hi guys, I'm having some trouble getting the plugin to work. I set up the input as advised, but nothing seems to happen. If I turn on Sixense Motion Creator 2 I get some movement, but only as a generic joypad. Could someone please post an example project so I can see what I'm doing wrong?

                            Also, the VRMotionInput plugin looks incredible Getnamo, I can't wait to try it
                            I'll need more information to help you out. How are you trying to use the plugin? The simplest way is to drop the HydraPluginActor into your scene and use the input mapping system. The second simplest is to use the blueprint events emitted inside HydraPluginActor to bind the received data to whatever you want, as shown in the video.


                            The VRMotionInput plugin is a bit delayed for now; I have a lot on my plate atm, but will hopefully get around to releasing the base code soonish (tm).
                            Plugins: Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo - RealSense

                            Comment


                              #89
                              Hello!

                              I've never used the Hydra before, and I'm currently attempting to follow along with the tutorial you posted back in April using the hydra-ue4-master folder. However, it appears that due to updates to Unreal (and possibly the plugin too), some of the nodes used in your video tutorial do not appear in my available event node lists. I am currently using Unreal 4.4.3. For example, the node "Event Hydra Undocked/Docked" does not appear, nor does "Hydra Controller Moved" or "Hydra Trigger Changed". In other words, I'm having trouble following along and could use some help! Is there a way to access the nodes you were using in the tutorial, or to substitute them with ones available in this version? Really, any information you could give me about how to learn to use the Hydra with UE4 would be awesome, and I would greatly appreciate your help. Thank you!

                              Comment


                                #90
                                Originally posted by aialexander View Post
                                Hello!

                                I've never used the Hydra before, and I'm currently attempting to follow along with the tutorial you posted back in April using the hydra-ue4-master folder. However, it appears that due to updates to Unreal (and possibly the plugin too), some of the nodes used in your video tutorial do not appear in my available event node lists. I am currently using Unreal 4.4.3. For example, the node "Event Hydra Undocked/Docked" does not appear, nor does "Hydra Controller Moved" or "Hydra Trigger Changed". In other words, I'm having trouble following along and could use some help! Is there a way to access the nodes you were using in the tutorial, or to substitute them with ones available in this version? Really, any information you could give me about how to learn to use the Hydra with UE4 would be awesome, and I would greatly appreciate your help. Thank you!
                                The nodes are the same; you need to have the plugin enabled first. Follow the video again and pay attention to how to install and enable the plugin. Once you've confirmed it's enabled, sub-class the HydraPluginActor in blueprint (use the class viewer to find it). After placing the new blueprint in the scene, you will receive all of those notifications inside your sub-classed blueprint actor.

                                This is all shown in the video and explained in the wiki/readme.

                                The only thing that differs is that input mapping isn't shown, but that is covered in both the wiki and the readme.
                                Plugins: Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo - RealSense

                                Comment
