Unreal Button Mapping names for Oculus Touch

Is there a Rosetta Stone lying around with the mapping names used in UE4 that correspond to the button signals coming from the Touch controller? You can see what they are called here, but UE4 has different names for everything, so it would be great to know the correspondence: https://developer3.oculus.com/documentation/pcsdk/latest/concepts/dg-input-touch-touch/

ovrTouch_A User is touching the A button on the right controller.
ovrTouch_B User is touching the B button on the right controller.
ovrTouch_RThumb User has a finger on the thumbstick of the right controller.
ovrTouch_RThumbRest User has a finger on the textured thumb rest of the right controller.
ovrTouch_RIndexTrigger User is touching the index finger trigger on the right controller.
ovrTouch_X User is touching the X button on the left controller.
ovrTouch_Y User is touching the Y button on the left controller.
ovrTouch_LThumb User has a finger on the thumbstick of the left controller.
ovrTouch_LThumbRest User has a finger on the textured thumb rest of the left controller.
ovrTouch_LIndexTrigger User is touching the index finger trigger on the left controller.
ovrTouch_RIndexPointing User's right index finger is pointing forward past the trigger.
ovrTouch_RThumbUp User's right thumb is up and away from the buttons on the controller, a gesture that can be interpreted as a right thumbs-up.
ovrTouch_LIndexPointing User's left index finger is pointing forward past the trigger.
ovrTouch_LThumbUp User's left thumb is up and away from the buttons on the controller, a gesture that can be interpreted as a left thumbs-up.
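
For reference, outside UE4 these flags are read by polling the PC SDK's input state. A minimal sketch, assuming an already-initialized ovrSession called Session:

```cpp
#include <OVR_CAPI.h>

// Poll the Touch capacitive-touch flags from the Oculus PC SDK.
// Assumes ovr_Initialize()/ovr_Create() have already produced Session.
ovrInputState InputState;
if (OVR_SUCCESS(ovr_GetInputState(Session, ovrControllerType_Touch, &InputState)))
{
    const bool bTouchingA      = (InputState.Touches & ovrTouch_A) != 0;
    const bool bRIndexPointing = (InputState.Touches & ovrTouch_RIndexPointing) != 0;
}
```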

I actually just mapped this out as well…

Y = face button 2, X = face button 1
B = face button 2, A = face button 1

You can call the 'AnyKey' event from your character and use the 'Get Key Name' node to print the name of anything pressed.
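
If you prefer C++, here is a rough equivalent of that Blueprint setup (the class and handler names are my own, and recent engine versions let the bound handler receive the FKey):

```cpp
// In a character class: log the name of any key or button pressed,
// the C++ equivalent of the "AnyKey" event plus the "Get Key Name" node.
void AMyCharacter::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    PlayerInputComponent->BindKey(EKeys::AnyKey, IE_Pressed, this, &AMyCharacter::OnAnyKey);
}

void AMyCharacter::OnAnyKey(FKey Key)
{
    // Prints e.g. "MotionController (R) FaceButton1" to the output log.
    UE_LOG(LogTemp, Log, TEXT("Pressed: %s"), *Key.GetDisplayName().ToString());
}
```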

Now a major annoyance is building a platform-specific switch that uses axis values from Touch and action values from Vive, since the Vive trackpad has very different ergonomic implications for axis values.

Hopefully this helps.

For bindings, you can scroll down the Gamepad list and find OculusTouch and OculusRemote way down under the MotionController ones.
I wasn't sure what the grip button is at first … it's MotionController (L) Grip1.

Turning the pawn using the grip buttons (for when you are tired of turning your head too far):
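
A minimal C++ sketch of that idea (the pawn class and the 45-degree snap angle are assumptions):

```cpp
// Snap-turn the pawn with the grip buttons instead of physically turning around.
void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    PlayerInputComponent->BindKey(EKeys::MotionController_Left_Grip1,  IE_Pressed, this, &AMyVRPawn::SnapTurnLeft);
    PlayerInputComponent->BindKey(EKeys::MotionController_Right_Grip1, IE_Pressed, this, &AMyVRPawn::SnapTurnRight);
}

void AMyVRPawn::SnapTurnLeft()  { AddActorLocalRotation(FRotator(0.f, -45.f, 0.f)); }
void AMyVRPawn::SnapTurnRight() { AddActorLocalRotation(FRotator(0.f,  45.f, 0.f)); }
```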



For anyone else wanting one, I just made a diagram that shows all Touch inputs and their corresponding mappings in UE4:


I was just gonna paste this but you already have a nice image :)
Well, this might be handy for someone who wants to copy text.

A == MotionController (R) FaceButton1
B == MotionController (R) FaceButton2
Y == MotionController (L) FaceButton2
X == MotionController (L) FaceButton1
Hamburger (menu) == Gamepad Special Right
Right thumbstick == MotionController (R) Thumbstick
Left thumbstick == MotionController (L) Thumbstick
Right trigger == MotionController (R) Trigger
Left trigger == MotionController (L) Trigger
Left grip == MotionController (L) Grip1
Right grip == MotionController (R) Grip1

Handy to map these to an enumerator: http://tomofnz.wixsite.com/tomofnz/single-post/2017/01/06/OCULUS-VR-BUTTON-PRESSES-IN-UE4
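
In the spirit of that tutorial, a hypothetical enum for routing these inputs might look like:

```cpp
// Hypothetical enum covering the Touch inputs listed above, so gameplay
// code can switch on one value instead of comparing raw key names.
UENUM(BlueprintType)
enum class ETouchInput : uint8
{
    A, B, X, Y,
    Menu,           // the "Hamburger" / Gamepad Special Right button
    LeftThumbstick, RightThumbstick,
    LeftTrigger,    RightTrigger,
    LeftGrip,       RightGrip
};
```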

OK, so I don't have the Touch to test this out, but why not just use the actual Oculus Touch key bindings?

I was just wondering if there was any chance of having this thread stickied…? It will save people from searching for these Touch mappings. :)

I don't see a mapping for face button 4. If you're using a function bound to face button 4 on the Vive and you can't change that, does anyone know how you would trigger that function on the Touch?

Very good post! I was pointed here by someone from the AnswerHub. I agree that this information should be more readily available.

Hi, if I want to use the Oculus Touch as a virtual joystick in, say, the flying template or the car template, how can I find the X, Y, and Z axis values? All I see here are button values and capacitive touch values. Or maybe I'm approaching this the wrong way… :)

Responded in your new thread for this question. :)

Regarding Vive/Touch cross-compatibility, is there any way to generate 'MotionController (L) Thumbstick Left' (or Right/Up/Down, like d-pad input) with Touch, or is this intended only for Vive input?

I believe this is Vive-only. You can see the inputs the Oculus SDK provides here, so if you wanted the Touch to act like a Vive controller you would need to do that yourself in Blueprints. Also, just an FYI: Thumbstick Left/Up/Right/Down are not used for the Vive, at least they weren't originally; instead the trackpad is mapped to the face buttons like an Xbox controller (you can see my other diagram for the Vive here).
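
A rough sketch of emulating those d-pad style events from the Touch thumbstick (the pawn class, dead-zone value, and handler names are assumptions):

```cpp
// Convert the analog thumbstick into discrete left/right/up/down "presses",
// mimicking the Vive-style Thumbstick Left/Up/Right/Down keys.
void AMyVRPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    const float X = GetInputAxisKeyValue(EKeys::MotionController_Left_Thumbstick_X);
    const float Y = GetInputAxisKeyValue(EKeys::MotionController_Left_Thumbstick_Y);
    const float DeadZone = 0.7f; // only a strong deflection counts as a press

    if      (X < -DeadZone) HandleThumbstickLeft();
    else if (X >  DeadZone) HandleThumbstickRight();
    if      (Y >  DeadZone) HandleThumbstickUp();
    else if (Y < -DeadZone) HandleThumbstickDown();
}
```

In practice you would also track the previous state so each direction fires once per deflection rather than every tick.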

Those buttons are for capacitive touch, which registers when your finger is on (or nearly on) a button, rather than the value from actually pushing it.
It can be used to provide states that drive virtual hand poses. So if your finger is on the grip but not on the trigger, your virtual hand will blend to a pointing pose, but if it's also on the trigger, your hand will blend to a fist/grip pose.
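
As a sketch of that state logic (the pose enum and pawn class are my own, and the CapTouch key name is modeled on the diagram's labels; exact key names vary by engine version):

```cpp
// Pick a target hand pose from grip/trigger touch state, as described above.
void AMyVRPawn::UpdateRightHandPose()
{
    APlayerController* PC = Cast<APlayerController>(GetController());
    if (!PC) return;

    const bool bOnGrip    = PC->IsInputKeyDown(EKeys::MotionController_Right_Grip1);
    // Key name is illustrative; check the key list in your engine version.
    const bool bOnTrigger = PC->IsInputKeyDown(FKey(TEXT("OculusTouch_Right_Trigger_CapTouch")));

    if      (bOnGrip &&  bOnTrigger) TargetPose = EHandPose::Fist;     // full grip
    else if (bOnGrip && !bOnTrigger) TargetPose = EHandPose::Pointing; // index finger lifted
    else                             TargetPose = EHandPose::Open;
}
```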

I have an honest question for the developers at Epic, and maybe someone here can field it. Why are these mapped the way they are? I am currently struggling to make any sense of the mappings, and I imagine I can't be the only one bordering on insanity due to my earlier assumptions about reality.

  1. Why is there a button titled "Gamepad Special Right" on the left-hand controller? (assuming the word 'right' also maps to the concept of the 'right hand', that turning right at a red light is not equivalent to turning left at a red light, and that every other button on the Touch controllers that claims to be on the 'right' is in fact on the right)
  2. Is it normal to map generic controls to a specific peripheral's buttons with the intention of making games cross-platform? (I assumed it would be the other way around.)
  3. Why not expose all of the choices to the designers of the game, then let them handle designing a cross-platform experience? Give the people a choice, yeah?

I imagine the current setup would work well if every peripheral had the exact same spacing and comfort of reach on all of its buttons, with next to no actual difference between the peripherals, but as it stands I am having trouble with how dramatically different the HTC Vive and Oculus Touch controllers are in their layout.

  1. That's typically the Start button on a gamepad, so experiences on Rift that are designed to support both gamepad and Touch use this button. It makes more sense on the gamepad, but provides the same functionality on the Touch controller, I believe.

  2. Yes, I think this is normal. UE4 has done this since the beginning to allow for cross-platform support, since for the most part gamepads offer very similar layouts with certain nuances per platform.

  3. However, yes, this doesn't quite work the same way for motion controllers. The slight differences in ergonomics cause a pretty big change in the way control schemes are designed for each platform, which complicates gameplay design (even the 30-degree difference in controller handle orientation). So it would have been nice to say MoveForward = Vive button 1 (action) or Touch Y (axis) without having to use platform checks and branch conditions (or switches and bools). But I'm guessing that as the VR industry matures we will begin to see trends that justify why the current system works (e.g. A = jump, B = shoot became somewhat of a standard on traditional gamepads; I'm also starting to see that "The Lab" and "Robo Recall" are pretty much setting the standard for motion-control input, since they both meet the technical requirements of each platform).

The VR Editor, in my opinion, tackles this really well by using button modifiers to cover cross-platform differences. That way you can just check for a key-down of a certain type to validate input.

I'd be interested to hear others' approaches to this as well, or maybe a 'best practices for motion controls' guide, as I'm still trying to figure out the underlying issues behind tester feedback about accidental button presses.

Thanks for the enlightenment. It is always disheartening to see your assumptions proven wholly wrong, but pleasantly reassuring to be on the same page as the majority.

Currently I am probably 'doing it wrong', but I have basically been branching VR inputs using the action and axis mappings exposed in the editor.

I'm not suggesting anyone do this, but for example's sake: while 'move forward' would be Motion Controller 1 (top of the trackpad on Vive; X or A on Oculus Touch), it can be mapped to an action like 'MoveForwardVive', which is then bound on an input component and handled with the 'move forward' logic. Next, Motion Controller Thumbstick Y can be axis-mapped to something like 'OculusTouch_Left/RightStickY', watching for potential forward movement based on a bit of math, and firing an event when the conditions are met, bound on an input component to the same 'move forward' logic.

It is important to make sure you unbind (or never bind) the Vive inputs when using the Oculus Touch, and vice versa; otherwise you get a bonus 'move forward' binding on your A/X button on Oculus Touch, which may not be desired.
Admittedly, this feels like, and may actually be, the worst approach. I would love to hear some others as well.
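
For what it's worth, a sketch of that kind of platform branch in C++ (the device check uses the pre-4.18 IHeadMountedDisplay API; the pawn class, mapping names, and handlers are assumptions):

```cpp
#include "IHeadMountedDisplay.h"

// Bind movement differently per headset: an action mapping for the Vive
// trackpad, an axis mapping for the Touch thumbstick.
void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    const bool bIsOculus = GEngine->HMDDevice.IsValid()
        && GEngine->HMDDevice->GetHMDDeviceType() == EHMDDeviceType::DT_OculusRift;

    if (bIsOculus)
    {
        // Touch: treat the thumbstick as an analog movement axis.
        PlayerInputComponent->BindAxis("OculusTouchLeftStickY", this, &AMyVRPawn::MoveForwardAxis);
    }
    else
    {
        // Vive: treat the top of the trackpad as a discrete action.
        PlayerInputComponent->BindAction("MoveForwardVive", IE_Pressed, this, &AMyVRPawn::MoveForwardPressed);
    }
}
```

Since only one platform's mapping ever gets bound, this avoids the bonus 'move forward' on the A/X buttons described above.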

A couple more useful images to add here:


Life Saver

Really nice!

However, for the last note, I believe it should be "Pointing is the opposite of Trigger CapTouch…"?

THANK YOU! Very clean, pro look too!