And it does not work. What hoop did I miss to jump through? CC @VictorLerp
I just need to reiterate that, compared to what we had before, this is frankly terrible f-ing UX… completely and utterly terrible. Blueprints were a great reason for people to start out using Unreal, and now Epic has ruined that with this! Nobody can get something running out of the box anymore without spending an hour following a YouTube guide (Epic's documentation is, as always, difficult to find, and often an auto-generated sorry excuse of a joke if you do find it), or without using a template, and templates come with dozens of things you don't want, set up in ways you possibly don't want.
The reason you can add modifiers both to an Input Action and to each binding in the Input Mapping Context is that a modifier on your Input Action applies to all bindings, while a modifier on a binding in your Input Mapping Context is only applied to that binding. An example of this is in the VR Template:
IA_Grab_Left/Right has a Dead Zone modifier applied, but if you look at IMC_Default → Grab Left/Right → Oculus Touch Grip Axis, you can see that I've added a Dead Zone modifier with a different Threshold value. The reason for this is that releasing an object feels better with a higher Lower Threshold on Oculus Touch. This is just one of the benefits of using Enhanced Input: you don't have to write custom gameplay logic to handle such a scenario.
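For anyone who prefers to see the action-level vs. binding-level distinction in code, here's a minimal C++ sketch of the same idea, assuming the Enhanced Input plugin. The function name, the GripAxisKey parameter, and the threshold values are placeholders, not what the template actually uses; the point is only that a modifier in UInputAction::Modifiers affects every binding, while a modifier on an individual FEnhancedActionKeyMapping affects that binding alone.

```cpp
// Minimal sketch, not the template's actual setup: GripAxisKey and the
// threshold values below are placeholders.
#include "InputAction.h"
#include "InputMappingContext.h"
#include "InputModifiers.h"

void SetUpGrabMapping(UInputAction* GrabAction, UInputMappingContext* Context, FKey GripAxisKey)
{
	// Modifier on the Input Action itself: applies to every binding of this action.
	UInputModifierDeadZone* ActionDeadZone = NewObject<UInputModifierDeadZone>(GrabAction);
	ActionDeadZone->LowerThreshold = 0.2f; // placeholder value
	GrabAction->Modifiers.Add(ActionDeadZone);

	// Modifier on one specific key mapping: applies only to this binding.
	FEnhancedActionKeyMapping& Mapping = Context->MapKey(GrabAction, GripAxisKey);
	UInputModifierDeadZone* MappingDeadZone = NewObject<UInputModifierDeadZone>(Context);
	MappingDeadZone->LowerThreshold = 0.35f; // placeholder: higher release threshold for this device
	Mapping.Modifiers.Add(MappingDeadZone);
}
```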
The two steps it seems you've missed to get inputs working are:
Add your Input Mapping Context to your Player Mappable Input Config
Add the Input Mapping Context at runtime (look at VR Pawn Begin Play for reference)
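For step 2, here's a minimal C++ sketch of what the VR Pawn does in Blueprint on Begin Play (AMyVRPawn is a hypothetical Pawn subclass with a DefaultMappingContext property pointing at your Input Mapping Context; the Blueprint equivalent is Add Mapping Context on the Enhanced Input Local Player Subsystem):

```cpp
// Minimal sketch, assuming a hypothetical AMyVRPawn with a
// UInputMappingContext* DefaultMappingContext property pointing at your IMC.
#include "EnhancedInputSubsystems.h"
#include "Engine/LocalPlayer.h"
#include "GameFramework/PlayerController.h"

void AMyVRPawn::BeginPlay()
{
	Super::BeginPlay();

	if (APlayerController* PC = Cast<APlayerController>(GetController()))
	{
		if (UEnhancedInputLocalPlayerSubsystem* Subsystem =
				ULocalPlayer::GetSubsystem<UEnhancedInputLocalPlayerSubsystem>(PC->GetLocalPlayer()))
		{
			// Without this call at runtime, none of the mappings in the context are active.
			Subsystem->AddMappingContext(DefaultMappingContext, /*Priority=*/0);
		}
	}
}
```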
I should note that you don't have to use Enhanced Input; we haven't removed the legacy input system. However, Enhanced Input provides significant improvements, some of which you can see if you compare the VR Template between 5.0 and 5.1. I was able to remove some gameplay logic, and Blueprints that use Enhanced Input Actions can now be migrated to other projects without having to set up the Input.ini. As an example, this enables Plugins that contain Actors with input to work out of the box.
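As an illustration of that last point (a sketch only; AMyVRPawn, GrabAction, OnGrab, and OnRelease are placeholder names), an Actor or Pawn can bind its Input Action assets directly, so a plugin carries everything it needs and nothing has to be declared in Input.ini:

```cpp
// Sketch only: AMyVRPawn, GrabAction, OnGrab and OnRelease are placeholder names.
#include "EnhancedInputComponent.h"

void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
	Super::SetupPlayerInputComponent(PlayerInputComponent);

	if (UEnhancedInputComponent* Input = Cast<UEnhancedInputComponent>(PlayerInputComponent))
	{
		// The UInputAction asset is referenced by the Pawn itself, so the plugin
		// ships with everything it needs; no Input.ini entries are required.
		Input->BindAction(GrabAction, ETriggerEvent::Triggered, this, &AMyVRPawn::OnGrab);
		Input->BindAction(GrabAction, ETriggerEvent::Completed, this, &AMyVRPawn::OnRelease);
	}
}
```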
I had forgotten to put the first item you mentioned in my list, but I completely missed the second item (despite quite large comments in the template as well).
However, using what you see in the screenshots above, I have axis inputs from the HP motion controllers mapped to two different actions, one for hold and one for release. What I'm seeing is that the hold never triggers, and the release triggers immediately on press… oh, but only once the positive axis returns to 0; if I pull left on the stick, it doesn't react at all. Technically this may be correct, I don't know, but the UX is in the flipping basement. Guess I just need to copy the template, whereas before it was pretty intuitive to figure out yourself.
Also, to be clear, I'm not doing this because I wanted to. The inputs stopped working when moving from 4.27 to 5.1, so I assumed the old system really was deprecated (at least for VR)… and I see I still can't get any controller information, because this no longer ever returns valid:
This is a limitation in OpenXR. Depending on the runtime, not all of the data will be available until the application is in the focused state, with the headset worn and any input-consuming overlays closed. At least with the SteamVR runtime, it will always take a few frames before this happens.
You see that "Set Timer by Event" node? That basically runs in a loop, so what you're describing isn't the issue here; I already compensated for it (and this method has worked well in 4.27).
The timer will retry getting the data from "any controller" until it returns valid. Trouble is, if you only have the left controller (HP Reverb G2) powered on, it never returns valid.
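For reference, here's a rough C++ sketch of that retry pattern, written on the assumption that the node in question is Get Motion Controller Data (UHeadMountedDisplayFunctionLibrary::GetMotionControllerData). Querying each hand separately, rather than relying on an "any controller" check, is one way to cope with only one controller being powered on. AMyVRPawn, PollControllerData, and ControllerPollTimer are placeholder names.

```cpp
// Rough sketch of a looping retry that queries each hand separately instead of
// "any controller". ControllerPollTimer would be an FTimerHandle member set
// elsewhere with a looping SetTimer.
#include "HeadMountedDisplayFunctionLibrary.h"
#include "TimerManager.h"

void AMyVRPawn::PollControllerData()
{
	FXRMotionControllerData LeftData, RightData;
	UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, EControllerHand::Left, LeftData);
	UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, EControllerHand::Right, RightData);

	if (LeftData.bValid || RightData.bValid)
	{
		// Data showed up once the runtime reached the focused state; stop retrying
		// and use whichever hand reported valid data.
		GetWorldTimerManager().ClearTimer(ControllerPollTimer);
	}
	// Otherwise the looping timer fires again and we try once more.
}
```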