OpenXR VR Template | Feedback and Discussion

When I change the AxisDeadzoneThreshold value to something else and then bring it back to 0.7 as it was, teleport stops working. Why?

Are there any nodes to discern if a Reverb or G2 is currently being used?

“Get HMDDevice Name”


Thank you.

This, unfortunately, led me here, which isn’t too great:

I don’t have both headsets here to test, so I can’t confirm what they’re named… Also, am I supposed to do string matching on this, or how is it meant to work? (I guess I need to look in the template to see how they solved it.)

I’d imagine that mainly exists for debug purposes. When using OpenXR it retrieves the system name from the XrSystemProperties struct, which has no standardized representation in the spec, so the result may vary between implementations.

OpenXR seems to be designed to hide all this information from the developer, specifically so that it can’t be used for configuration purposes. I’m guessing you want to get the device name to set up controller input. The intended way seems to be to just set up action bindings for each controller type and let the runtime handle everything else.

Handling different types of controllers seems to be a huge limitation in OpenXR at the moment, for example configuring inputs for controllers that might have a touchpad, a thumbstick, or both. There doesn’t seem to be any guide or best practice on how this is supposed to be implemented in either Unreal or OpenXR in general.

It seems the only way to properly handle disparate controls is to create separate actions for each type of control and only bind them to the controllers that support them. It would be a good idea on Epic’s side to add documentation and examples on the proper way to handle this.


Yes, and also to show the correct controller. We run both HP Reverb Pro and HP Reverb G2 in our studio and they use different controllers.

As we may show customers models in VR who have never used VR before, we don’t want to have to teach them every time, so it’s much easier to have the correct controller in VR with a highlight showing them where to push.

Unfortunately, even professional CAD software uses custom controller graphics that just confuse people: because what they see in VR doesn’t correspond to what they’re holding in their hands, they can’t find the buttons they’re asked to press.

For that kind of controlled use case, it might be enough to find out what the runtime outputs from “Get HMDDevice Name” for the different headsets and do the string comparison.

This only holds true as long as you can control what runtime and hardware the user has, though. As an example, it’s relatively popular to run the G2 through SteamVR together with Valve Index controllers and sync the tracking spaces with a third-party app.

Another option seems to be the “Get Motion Controller Data” node. The name field returns the interaction profile path for the controller, so if the runtime reports a different path for the G2 controller (probably needs the G2 extension set up), it might be used to differentiate them. The paths are defined in the OpenXR spec, so they should be consistent across runtimes.

Yes, string matching is how you’d use the function. The runtimes return names of devices, we don’t have control over that.

Yes and if you enable the HP Reverb controller plugin you’ll have access to input bindings for their controllers, and device visualization will work for HP Reverb G2. We don’t have controller models in /OpenXR Content/Devices for the Reverb Pro, however.

Yes, for some input types separate Action events are the only way to “catch all” right now. We’re aware of this and want to make it better.

This is how we identify which controllers are used (motion controller device visualization uses the interaction profile to determine which controllers to display). But if there’s no interaction profile for your controller, OpenXR will pick the one that’s closest. This is the way we’re doing it until the OpenXR specification details how to deal with controller models.

That’s how you’d do it!

In the past I made a function library function that I would call to execute SetWorldToMeters and scale the motion controller models, plus game logic where that’s relevant.

We’re not going to make that something that “just works”, as there are more differences than similarities in how developers would want to handle that.


Yes, you’d use VRInteractionBPI if you want to pass input from buttons to held actors, and you’d set up action input events for those in VRPawn just like the existing ones for Trigger.


On the topic, there seems to be a bug where this world scaling freaks out if multiple editor tabs are open. There are tons of online posts describing it going back pretty far, and it’s still happening on the latest launcher build. Is this something that you’re aware of and that might get addressed at some point?

I might have missed if you commented on the link I quoted earlier, but both the HP Reverb and HP Reverb G2 return “Windows Mixed Reality”.

Hmm… mine just reports “None”… but then again, I now can’t seem to make any input mappings work for some reason (just updated to 4.27.1).

Oh right, I forgot to mention that. I was testing it yesterday, and for some reason the motion controller data wouldn’t populate properly until a few frames after pressing Play.

Hi, I was experimenting with smooth VR locomotion based on various VR YouTube tutorials in Unreal 4.26 and earlier engine versions. Now, with OpenXR in 4.27, using the template with an Oculus Quest 2 via Link, I had a problem: for some reason, once in VR, the HMD had a delay in its movement, so the headset shows a swaying motion like a drunk VR experience. The problem comes and goes depending on whether the OculusVR or OculusXR plugin is activated. It works for one or two tests and then starts to wobble again. For clarification, this occurs in various cases, like when importing VR controllers from 4.26 versions, or when opening a VR locomotion setup in a separate project using the 4.26 template. I don’t really understand much about OpenXR, but I think some VR classes like the controllers interfere with hidden operations. Does anyone have the same problem? Thanks!!

Which runtime are you using? That’s unfortunately out of our control, but a problem nonetheless - I will log it and see what can be done about it.

This issue is related to having an active SceneCaptureComponent in the scene (the VR Spectator is the default Blueprint that contains it in the VR Template). It’s known, but I have no ETA on a fix as of now.


Thanks for your answer @VictorLerp, the more you know… Now I understand it’s not only my case. Unfortunately, I can’t track down where I used a SceneCaptureComponent, or where it hides if not in the VR Spectator, to force a cleanup and make the HMD work normally again. If anyone can point me in the right direction I’ll be grateful. Oh, thanks again, excellent work by the way.

Hmm… I still get “None” from the node setup below; however, I don’t know what to plug into the World Context pin…

Well, being an industrial designer, I unfortunately don’t understand what a “runtime” is in this context. I use Unreal Engine 4.27 with the OpenXR and HP Controller plugins enabled (and Datasmith, although that shouldn’t have anything to do with VR).

Testing it again, the same setup you have works for me, but I noticed it can’t read the data if the application isn’t focused. For me the SteamVR overlay was automatically opened on launch, and it wouldn’t get the data until I closed it.

It seems this behavior is caused by the xrGetCurrentInteractionProfile() function, called by FOpenXRHMD::GetMotionControllerData().

Testing this with a custom C++ application interfacing directly with OpenXR, I’m getting consistent results. For the first two frames before the session transitions into XR_SESSION_STATE_VISIBLE, xrGetCurrentInteractionProfile() will return XR_NULL_PATH instead of the set interaction profile path. If the SteamVR dashboard is open at application start, it will return XR_NULL_PATH until one frame after the session transitions into XR_SESSION_STATE_FOCUSED. If the dashboard is opened while the application is running, it is still able to get the correct path.

The OpenXR spec is unclear on whether this is the correct behavior or not.

I’m testing this with the OpenXR runtime in SteamVR beta 1.20.4, using Windows 10. It would be interesting to know if other runtimes behave the same way.