Unreal Button Mapping names for Oculus Touch

    Is there a Rosetta Stone lying around with the mapping names used in UE4 that correspond to the button signals coming in from the Touch controller? You can see what they are called here, but UE4 has different names for everything, so it would be great to know what those are: https://developer3.oculus.com/docume...t-touch-touch/

    ovrTouch_A User is touching the A button on the right controller.
    ovrTouch_B User is touching the B button on the right controller.
    ovrTouch_RThumb User has a finger on the thumbstick of the right controller.
    ovrTouch_RThumbRest User has a finger on the textured thumb rest of the right controller.
    ovrTouch_RIndexTrigger User is touching the index finger trigger on the right controller.
    ovrTouch_X User is touching the X button on the left controller.
    ovrTouch_Y User is touching the Y button on the left controller.
    ovrTouch_LThumb User has a finger on the thumbstick of the left controller.
    ovrTouch_LThumbRest User has a finger on the textured thumb rest of the left controller.
    ovrTouch_LIndexTrigger User is touching the index finger trigger on the left controller.
    ovrTouch_RIndexPointing User's right index finger is pointing forward past the trigger.
    ovrTouch_RThumbUp User's right thumb is up and away from the buttons on the controller, a gesture that can be interpreted as a right thumbs-up.
    ovrTouch_LIndexPointing User's left index finger is pointing forward past the trigger.
    ovrTouch_LThumbUp User's left thumb is up and away from the buttons on the controller, a gesture that can be interpreted as a left thumbs-up.
    Ironbelly Studios- Your External UE4 Development Partner: AAA Quality Services at Indie Prices
    Request a quote today:
    https://ironbellystudios.com
    Follow us on Facebook: www.facebook.com/ironbellystudios

    #2
    I actually just mapped this out as well:

    Y = FaceButton2, X = FaceButton1
    B = FaceButton2, A = FaceButton1

    You can call the 'AnyKey' event from your character and use the 'Get Key Name' node to print the name of anything pressed.

    Now, a major annoyance is building a platform-specific switch that uses axis values from Touch and action values from Vive, since the Vive trackpad has very different ergonomic implications for axis values.
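    A cross-platform switch like the one described above essentially has to quantize the Touch thumbstick axes into the discrete directional actions a Vive trackpad press emits. Here's a minimal plain-C++ sketch of that quantization (the function name, action strings, and dead-zone value are illustrative only, not UE4 API):

```cpp
#include <cmath>
#include <string>

// Quantize a thumbstick axis pair into a discrete directional "action",
// mimicking d-pad style Thumbstick Left/Right/Up/Down events.
// Threshold and names are illustrative only, not engine bindings.
std::string ThumbstickToAction(float X, float Y, float DeadZone = 0.5f)
{
    if (std::fabs(X) < DeadZone && std::fabs(Y) < DeadZone)
        return "None"; // stick is inside the dead zone: no action fires
    if (std::fabs(X) >= std::fabs(Y))
        return X > 0 ? "Thumbstick Right" : "Thumbstick Left";
    return Y > 0 ? "Thumbstick Up" : "Thumbstick Down";
}
```

    The 0.5 dead zone is just a starting point; in practice you would tune it per device, since stick and trackpad ergonomics differ.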

    Hopefully this helps.
    Earthborn Interactive | Flutter Bombs | MageWorks

      #3
      Originally posted by Ironbelly View Post
      Is there a rosetta stone lying around with the mapping names used in UE4 to correspond to the button signals coming in from the Touch controller?

      For bindings, you can scroll down past the Gamepad section and find OculusTouch and OculusRemote way down under the MotionController ones.
      I wasn't sure what the grip button was at first... oh, it's MotionController (L) Grip1.

      Turning the Pawn using the Grip buttons (for when you are tired of turning your head too far):
      [Image: VR_Inputs.png]
      [Image: VR_tick.png]
      [Image: VR_turning.png]
      Last edited by tomofnz; 12-26-2016, 10:47 PM.

        #4
        Originally posted by Ironbelly View Post
        Is there a rosetta stone lying around with the mapping names used in UE4 to correspond to the button signals coming in from the Touch controller?
        For anyone else wanting one, I just made a diagram that shows all Touch inputs and their corresponding mappings in UE4:

        [Image: OculusTouchInputMappingUE4.png]
        UE4 VR Cookbook
        Mitch's VR Lab tutorial series
        VR Content Examples

          #5
          I was just gonna paste this, but you already have a nice image. Still, this might be handy for someone who wants to copy the text:

          A == MotionController (R) FaceButton1
          B == MotionController (R) FaceButton2
          Y == MotionController (L) FaceButton2
          X == MotionController (L) FaceButton1
          Hamburger == Gamepad Special Right
          MotionController (R) Thumbstick
          MotionController (L) Thumbstick
          MotionController (R) Trigger
          MotionController (L) Trigger
          MotionController (L) Grip1
          MotionController (R) Grip1

          Handy to map these to an enumerator: http://tomofnz.wixsite.com/tomofnz/s...PRESSES-IN-UE4
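          If you do map these to an enumerator, a plain-C++ sketch might look like this (the enum and function are hypothetical, not UE4 API; the key strings follow the list above), so gameplay code never compares raw key-name strings:

```cpp
#include <map>
#include <string>

// Controller-agnostic input categories; values are illustrative.
enum class ETouchInput { FaceButton1, FaceButton2, Thumbstick, Trigger, Grip, Special, Unknown };

// Map the UE4 key names from the list above onto the enum.
ETouchInput ParseTouchKey(const std::string& KeyName)
{
    static const std::map<std::string, ETouchInput> Table = {
        {"MotionController (R) FaceButton1", ETouchInput::FaceButton1}, // A
        {"MotionController (R) FaceButton2", ETouchInput::FaceButton2}, // B
        {"MotionController (L) FaceButton1", ETouchInput::FaceButton1}, // X
        {"MotionController (L) FaceButton2", ETouchInput::FaceButton2}, // Y
        {"Gamepad Special Right",            ETouchInput::Special},     // hamburger/menu
        {"MotionController (R) Thumbstick",  ETouchInput::Thumbstick},
        {"MotionController (L) Thumbstick",  ETouchInput::Thumbstick},
        {"MotionController (R) Trigger",     ETouchInput::Trigger},
        {"MotionController (L) Trigger",     ETouchInput::Trigger},
        {"MotionController (R) Grip1",       ETouchInput::Grip},
        {"MotionController (L) Grip1",       ETouchInput::Grip},
    };
    auto It = Table.find(KeyName);
    return It != Table.end() ? It->second : ETouchInput::Unknown;
}
```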

            #6
            OK, so I don't have the Touch to test this out, but why not just use the actual Oculus Touch key bindings?

            [Image: OculusTouch.JPG]

              #7
              I was just wondering if there was any chance of having this thread stickied? It would save people from searching for these Touch mappings.

                #8
                I don't see a mapping for FaceButton4. If you have a function bound to FaceButton4 on the Vive and can't change that, does anyone know how you would trigger that function on the Touch?

                  #9
                  Very good post! I was pointed here by someone from the AnswerHub, and I agree that this information should be more readily available.
                  Jungle Rock Pack - Available Now! - Unreal Marketplace
                  Wooden Floor Pack 4 - Available Now! - Unreal Marketplace
                  Medieval Fantasy Tavern Environment - Available Now! - Unreal Marketplace
                  Modern Table Pack - Available Now!

                    #10
                    Hi, if I want to use the Oculus Touch as a virtual joystick in, say, the flying template or the car template, how can I find the X, Y, and Z axis values? All I see here are button values and capacitive touch values, or maybe I'm approaching this the wrong way.

                      #11
                      Originally posted by olistreet View Post
                      Hi, If I want let's say to use the Oculus Touch as a Virtual Joystick in the flying template, or the car template, how can I find the x,y and z Axis values? all I see here is button values and capacitive touch values, or maybe I'm focusing this the wrong way...
                      Responded in your new thread for this question.

                      Regarding Vive/Touch cross-compatibility, is there any way to generate 'MotionController (L) Thumbstick Left' (or Right, Up, Down, like d-pad input) with Touch, or is this intended only for Vive input?

                        #12
                        Originally posted by paradoc View Post
                        responded in your new thread for this question

                        Regarding Vive/Touch cross compatibility, is there any way to generate 'MotionController(L) Thumbstick Left' (or right, up, down like d-pad input) with Touch, or is this intended only for vive input?
                        I believe this is Vive-only. You can see the inputs that the Oculus SDK provides here, so if you wanted it to act like a Vive controller you would need to do that in Blueprints. Also, just an FYI: Thumbstick Left/Up/Right/Down are not used for the Vive, at least they weren't originally; instead, the trackpad is mapped to the face buttons like an Xbox controller (you can see my other diagram for the Vive here).

                          #13
                          Tir Nan Og (#6)
                          Those buttons are for capacitive touch, which registers whether your finger is on (or nearly on) a button, rather than the actual value from pushing it.
                          They can be used to provide states that drive virtual hand poses. So if your finger is over the grip but not over the trigger, your virtual hand blends to a pointing pose, but if it's also on the trigger, your hand blends to a fist/grip pose.
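                          As a concrete illustration, the pose selection described above boils down to a small state function. A minimal plain-C++ sketch (pose names invented for the example, not any engine API):

```cpp
#include <string>

// Pick a virtual hand pose from capacitive touch states:
// on the grip but off the trigger -> pointing; on both -> fist/grip;
// otherwise an open hand. Pose names are invented for this example.
std::string SelectHandPose(bool bTouchingTrigger, bool bTouchingGrip)
{
    if (bTouchingGrip && bTouchingTrigger)
        return "Fist";     // full grip: hand closes
    if (bTouchingGrip)
        return "Pointing"; // index finger lifted off the trigger: point
    return "Open";
}
```

                          In a real project you would blend between pose animations rather than switch them discretely, but the state logic is the same.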

                            #14
                            I have an honest question for the developers at Epic, and maybe someone here could field it: why are these mapped the way they are? I am currently struggling to make any sense of the mappings, and I imagine I can't be the only one bordering on insanity due to my earlier assumptions of reality.

                            1. Why is there a button titled "Gamepad Special Right" on the left-hand controller? (Assuming the word 'right' also maps to the concept of the right hand, that turning right at a red light is not equivalent to turning left at a red light, and that every other button on the Touch controllers that claims to be on the 'right' is in fact on the right.)
                            2. Is it normal to map generic controls to a specific peripheral's buttons with the intention of making games cross-platform? (I assumed it would be the other way around.)
                            3. Why not expose all of the choices to the designers of the game, and let them handle designing a cross-platform experience? Give the people a choice, yeah?

                            I imagine the current setup would work well if every peripheral had the exact same spacing and comfort of reach on all of its buttons and there was next to no actual difference between the peripherals, but as it stands I am having trouble with how dramatically different the HTC Vive and Oculus Touch controllers are in their layout.

                              #15
                              Originally posted by Spiris View Post
                              I have an honest question for the developers at Epic, and maybe someone here could field it. Why are these mapped the way they are mapped, I am currently struggling to make any sense of the mappings and imagine I can't be the only one bordering on insanity due to my earlier assumptions of reality.

                              1. Why is there a button titled "Gamepad Special Right" on the left hand controller? (assuming the word 'right' is also map-able to the concept of the 'right hand', and that turning 'right' at a red light is not equivalent to turning left at a red light, and that every other button on the touch controllers that claims to be on the 'right' is in fact on the right)
                              2. Is it normal to map generic controls to a specific peripheral's buttons with the intention of making games cross -platform? (I assumed it would be the other way around)
                              3. Why not expose all of the choices to the designers of the game, then let them handle designing a cross-platform experience? Give the people a choice, yeah?

                              I imagine the current setup would work well in a situation where each peripheral had the exact same spacing and comfort-of-reach on all of its buttons, and there was next to no actual difference between the periperals, but as it stands I am having trouble with the way that the HTC Vive controllers and the Oculus Touch controllers are dramatically different in their layout.
                              1. That's typically the start button on a gamepad, so experiences on Rift that are designed to support both gamepad and Touch use this button. It makes more sense on the gamepad, but provides the same functionality on the Touch controller, I believe.

                              2. Yes, I think this is normal. UE4 has done this since the beginning to allow for cross-platform support, since for the most part gamepads offer very similar layouts with certain nuances per platform.

                              3. However, this doesn't quite work the same way for motion controllers. The slight differences in ergonomics cause a pretty big change in how control schemes are designed for each platform, which complicates gameplay design (even the 30-degree difference in controller handle orientation). It would have been nice to say MoveForward = Vive button 1 (action) or Touch Y (axis) without having to use platform checks and branch conditions (or switches and bools). But I'm guessing that as the VR industry matures, we will begin to see trends that justify the current system; A = jump, B = shoot became somewhat of a standard on traditional gamepads, and "The Lab" and "Robo Recall" are pretty much setting the standard for motion-control input, since they both meet the technical requirements of each platform.
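                              The platform checks mentioned in point 3 could be sketched like this (plain C++; the device names and the Touch axis key string are illustrative, following the MoveForward example above, not real UE4 bindings):

```cpp
#include <string>

// Resolve one logical action to the binding a given device uses:
// Vive gets a discrete action (trackpad press), Touch gets an axis.
// All names here are illustrative placeholders.
std::string KeyForAction(const std::string& Device, const std::string& Action)
{
    if (Action == "MoveForward")
    {
        if (Device == "Vive")
            return "MotionController (L) FaceButton1"; // trackpad press (action)
        return "MotionController (L) Thumbstick Y";    // Touch thumbstick (axis)
    }
    return "Unknown";
}
```

                              This is exactly the branch-per-platform boilerplate the post is complaining about; an input-abstraction layer would hide it behind one lookup.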

                              The VR editor, in my opinion, tackles this really well by using button modifiers to cover cross-platform differences. That way you can just check for a keydown of a certain type to validate input.

                              I'd be interested to hear others' approaches to this as well, or maybe a 'best practices for motion control' guide, as I'm still trying to figure out the underlying issues behind tester feedback about accidental button presses.
                              Last edited by paradoc; 04-14-2017, 12:55 PM.