Is there a Rosetta Stone lying around with the mapping names UE4 uses for the button signals coming in from the Touch controller? You can see what they are called here, but UE4 has different names for everything, so it would be great to know the correspondence: https://developer3.oculus.com/documentation/pcsdk/latest/concepts/dg-input-touch-touch/
ovrTouch_A: User is touching the A button on the right controller.
ovrTouch_B: User is touching the B button on the right controller.
ovrTouch_RThumb: User has a finger on the thumb stick of the right controller.
ovrTouch_RThumbRest: User has a finger on the textured thumb rest of the right controller.
ovrTouch_RIndexTrigger: User is touching the index finger trigger on the right controller.
ovrTouch_X: User is touching the X button on the left controller.
ovrTouch_Y: User is touching the Y button on the left controller.
ovrTouch_LThumb: User has a finger on the thumb stick of the left controller.
ovrTouch_LThumbRest: User has a finger on the textured thumb rest of the left controller.
ovrTouch_LIndexTrigger: User is touching the index finger trigger on the left controller.
ovrTouch_RIndexPointing: User's right index finger is pointing forward past the trigger.
ovrTouch_RThumbUp: User's right thumb is up and away from buttons on the controller, a gesture that can be interpreted as a right thumbs-up.
ovrTouch_LIndexPointing: User's left index finger is pointing forward past the trigger.
ovrTouch_LThumbUp: User's left thumb is up and away from buttons on the controller, a gesture that can be interpreted as a left thumbs-up.
Y = Face Button 2, X = Face Button 1
B = Face Button 2, A = Face Button 1
You can call the event "AnyKey" from your character, and use the node "Get Key Name" to print the name of anything pressed.
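If you would rather do the same thing in C++, something along these lines should work. This is only a sketch: AMyVRCharacter and OnAnyKeyPressed are placeholder names, and it assumes your engine version has the BindKey overload whose handler receives the FKey that triggered it.

    // In your character class. Requires "Components/InputComponent.h" and "Engine/Engine.h".
    void AMyVRCharacter::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);

        // EKeys::AnyKey fires for every key/button, like the Blueprint "AnyKey" event.
        PlayerInputComponent->BindKey(EKeys::AnyKey, IE_Pressed, this, &AMyVRCharacter::OnAnyKeyPressed);
    }

    // Declared in the header as: void OnAnyKeyPressed(FKey Key);
    void AMyVRCharacter::OnAnyKeyPressed(FKey Key)
    {
        if (GEngine)
        {
            // Prints the display name of whatever was pressed, on screen for two seconds.
            GEngine->AddOnScreenDebugMessage(-1, 2.0f, FColor::Green, Key.GetDisplayName().ToString());
        }
    }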
Now, a major annoyance is building a platform-specific switch that uses axis values from Touch and action values from Vive, since the Vive trackpad has very different ergonomic implications for axis values.
For bindings, you can scroll down the Gamepad list and find OculusTouch and OculusRemote way down under the MotionController ones.
I'm not sure what the grip button is though … oh, MotionController (L) Grip1.
Turning the Pawn using the grip buttons (for when you are tired of turning your head too far).
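Just as an illustration (not tested, and the key names plus the 30-degree step are my own assumptions), a grip-button snap turn can be as simple as:

    // Rotate the pawn by a fixed yaw step when a grip button is pressed.
    void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);
        PlayerInputComponent->BindKey(EKeys::MotionController_Left_Grip1, IE_Pressed, this, &AMyVRPawn::SnapTurnLeft);
        PlayerInputComponent->BindKey(EKeys::MotionController_Right_Grip1, IE_Pressed, this, &AMyVRPawn::SnapTurnRight);
    }

    // Note this rotates around the pawn's origin, not around the HMD position.
    void AMyVRPawn::SnapTurnLeft()  { AddActorWorldRotation(FRotator(0.0f, -30.0f, 0.0f)); }
    void AMyVRPawn::SnapTurnRight() { AddActorWorldRotation(FRotator(0.0f,  30.0f, 0.0f)); }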
I don't see a mapping for Face Button 4. Does anyone know, if you are using a function on Face Button 4 on the Vive and you can't change that, how you would trigger that function on the Touch?
Hi, if I want, let's say, to use the Oculus Touch as a virtual joystick in the flying template or the car template, how can I find the X, Y, and Z axis values? All I see here are button values and capacitive touch values, or maybe I'm approaching this the wrong way…
Regarding Vive/Touch cross-compatibility, is there any way to generate "MotionController (L) Thumbstick Left" (or right, up, down, like d-pad input) with Touch, or is this intended only for Vive input?
I believe this is Vive-only. You can see the inputs the Oculus SDK provides here, so if you wanted it to act like a Vive controller you would need to do that in Blueprints. Also, just an FYI: Thumbstick Left/Up/Right/Down are not used for the Vive, at least they weren't originally; instead the trackpad is mapped to the face buttons like an Xbox controller (you can see my other diagram for the Vive here).
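If you do want Touch to fake the Vive-style Thumbstick Left/Up/Right/Down events, the logic (whether you build it in Blueprints or C++) boils down to thresholding the stick axis and firing your own events. A rough sketch, where the axis mapping name "TouchLeftStickX", the 0.7 threshold, and OnThumbstickLeft are all placeholders:

    // Assumes an axis mapping "TouchLeftStickX" bound to MotionController (L) Thumbstick X
    // in Project Settings > Input, and a bool member bStickHeldLeft on the pawn.
    void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);
        PlayerInputComponent->BindAxis("TouchLeftStickX", this, &AMyVRPawn::OnLeftStickX);
    }

    void AMyVRPawn::OnLeftStickX(float Value)
    {
        const float Threshold = 0.7f;
        if (Value < -Threshold && !bStickHeldLeft)
        {
            bStickHeldLeft = true;
            OnThumbstickLeft(); // run whatever "Thumbstick Left" would do on Vive
        }
        else if (Value > -Threshold)
        {
            bStickHeldLeft = false; // re-arm once the stick returns toward center
        }
    }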
(in reply to Tir Nan Og, #6)
Those buttons are for capacitive touch, which registers if your finger is on or nearly on a button (rather than the actual value from pushing it).
It can be used to provide states to drive virtual hand poses. So if your finger is over the grip but not over the trigger, your virtual hand will blend to a pointing pose, but if it's also on the trigger, your hand will blend to a fist/grip pose.
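As a loose illustration (the pose values and member names here are invented, not from any particular template), the pose selection can be as simple as:

    // Pick a blend target for the right hand from the capacitive touch states.
    // bGripHeldR / bTriggerTouchedR are bool members updated from the grip and
    // trigger-capacitive inputs; the pose values are blend alphas fed to the anim blueprint.
    float AMyVRPawn::GetRightHandPoseTarget() const
    {
        if (bGripHeldR && !bTriggerTouchedR)
        {
            return PointingPose; // finger lifted off the trigger -> point
        }
        if (bGripHeldR && bTriggerTouchedR)
        {
            return FistPose;     // finger on the trigger too -> fist/grip
        }
        return OpenHandPose;     // relaxed hand otherwise
    }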
I have an honest question for the developers at Epic, and maybe someone here could field it. Why are these mapped the way they are? I am currently struggling to make any sense of the mappings, and I imagine I can't be the only one bordering on insanity because of my earlier assumptions about how this works.
Why is there a button titled "Gamepad Special Right" on the left-hand controller? (Assuming the word "right" also maps to the concept of the "right hand", that turning right at a red light is not equivalent to turning left at a red light, and that every other button on the Touch controllers that claims to be on the "right" is in fact on the right.)
Is it normal to map generic controls to a specific peripheral's buttons with the intention of making games cross-platform? (I assumed it would be the other way around.)
Why not expose all of the choices to the designers of the game, then let them handle designing a cross-platform experience? Give the people a choice, yeah?
I imagine the current setup would work well in a situation where each peripheral had exactly the same spacing and comfort of reach on all of its buttons, and there was next to no actual difference between the peripherals, but as it stands I am having trouble with how dramatically different the HTC Vive controllers and the Oculus Touch controllers are in their layout.
That's typically the Start button on a gamepad, so experiences on Rift that are designed to support both gamepad and Touch use this button. It makes more sense on a gamepad, but it provides the same functionality on the Touch controller, I believe.
Yes, I think this is normal. UE4 has done this since the beginning to allow for cross-platform support, since for the most part gamepads offer very similar layouts with certain nuances per platform.
However, yes, this doesn't quite work the same way for motion controllers. The slight differences in ergonomics cause a pretty big change in the way control schemes are designed for each platform, which complicates gameplay design (even the 30-degree difference in controller handle orientation). So it would have been nice to say MoveForward = Vive button 1 (action) or Touch Y (axis) without having to use platform checks and branch conditions (or switches and bools), but I'm guessing that as the VR industry starts to mature, we will begin to see certain trends that justify why the current system works. (For example, A = jump, B = shoot became somewhat of a standard on traditional gamepads, and I'm also starting to see that "The Lab" and "Robo Recall" are pretty much setting the standard for motion control input, since they both meet the technical requirements of each platform.)
The VR editor, in my opinion, tackles this really well by using button modifiers to cover cross-platform differences. This way you can just check for a keydown of a certain type to validate input.
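For example, APlayerController::IsInputKeyDown lets you poll a specific key as a modifier regardless of which action or axis mappings you set up (the grip key checked here is just an example):

    // Treat the left grip as a modifier: e.g. only teleport while it is held.
    bool AMyVRPawn::IsGripModifierHeld() const
    {
        const APlayerController* PC = Cast<APlayerController>(GetController());
        return PC && PC->IsInputKeyDown(EKeys::MotionController_Left_Grip1);
    }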
I would be interested to hear others' approaches to this as well, or maybe a "best practices for motion control", as I'm still trying to figure out the underlying issues behind tester feedback about accidental button presses.
Thanks for the enlightenment. It is always disheartening to see your assumptions proven wholly wrong, but pleasantly reassuring to be on the same page as the majority.
Currently I am probably "doing it wrong", but I have basically been practicing branching VR inputs using the action and axis mappings exposed in the editor.
Not suggesting anyone do this, but for example's sake: "move forward" would be MotionController FaceButton1 (top of the touchpad on Vive, X or A on Oculus Touch), so it can be action-mapped to something like "MoveForwardVive", bound on an input component, and handled with the "move forward" logic. Then MotionController Thumbstick Y can be axis-mapped to something like "OculusTouch_Left/RightStickY", watched for forward movement based on a simple threshold, and used to fire an event (bound on the same input component) that runs the same "move forward" logic.
It is important to unbind, or never bind, the Vive inputs when the Oculus Touch is in use, and vice versa; otherwise you get a bonus "move forward" binding on your A/X button on Touch, which may not be desired.
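In rough C++ terms, the branching looks something like this. It is only a sketch: MoveForwardVive and TouchLeftStickY are mapping names from my own setup, and the device name returned by GetHMDDeviceName ("SteamVR", "OculusRift", etc.) varies between engine versions, so check what yours reports.

    #include "HeadMountedDisplayFunctionLibrary.h"

    void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);

        // Only bind the inputs for the HMD actually in use, so Touch does not
        // inherit the Vive face-button binding (and vice versa).
        const FName HMDName = UHeadMountedDisplayFunctionLibrary::GetHMDDeviceName();
        if (HMDName == FName(TEXT("SteamVR")))
        {
            // Vive: top of the trackpad arrives as an action mapping.
            PlayerInputComponent->BindAction("MoveForwardVive", IE_Pressed, this, &AMyVRPawn::StartMoveForward);
            PlayerInputComponent->BindAction("MoveForwardVive", IE_Released, this, &AMyVRPawn::StopMoveForward);
        }
        else
        {
            // Touch: thumbstick Y arrives as an axis mapping; derive "forward" from it.
            PlayerInputComponent->BindAxis("TouchLeftStickY", this, &AMyVRPawn::OnTouchStickY);
        }
    }

    void AMyVRPawn::OnTouchStickY(float Value)
    {
        // Treat a mostly-forward stick as the same "move forward" intent.
        if (Value > 0.5f) { StartMoveForward(); }
        else              { StopMoveForward(); }
    }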
Admittedly, this feels like (and may actually be) the worst approach. I would love to hear some other approaches as well.