OpenXR VR Template | Feedback and Discussion

I submitted separate pull requests for the motion range and animgraph node:
https://github.com/EpicGames/UnrealEngine/pull/9747
https://github.com/EpicGames/UnrealEngine/pull/9748

I tried to make everything as clean and backwards-compatible as possible, although I’m not sure about the best practices for writing animgraph nodes.

The bone rotations from the hand tracking plugin also don’t work directly with the hand model, so I added an adjustment for the joint axis. I’m not sure if there is a clean way of setting up the rotations, since OpenXR and Unreal have different axis handedness.
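For context, here’s a minimal sketch of the kind of per-joint correction I mean, assuming the tracked rotation has already been converted into Unreal’s coordinate space. The function name and the correction angle are illustrative only, not the actual code from the pull request:

```cpp
#include "Math/Quat.h"
#include "Math/Rotator.h"

// Hypothetical helper: align a tracked OpenXR joint rotation with the hand
// model's bone axes by applying a fixed local correction. The exact angles
// depend on how the hand skeleton was authored; 90 degrees of roll is just a
// placeholder value.
FQuat AdjustJointOrientation(const FQuat& TrackedJointRotation)
{
    static const FQuat JointAxisCorrection =
        FRotator(/*Pitch*/ 0.f, /*Yaw*/ 0.f, /*Roll*/ 90.f).Quaternion();
    return TrackedJointRotation * JointAxisCorrection;
}
```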

I created a new VR Template project using the 5.1 Preview, and it crashes right after clicking VR Preview. This doesn’t happen with 5.0:

Assertion failed: DepthSwapchainDesc.Extent.X == SizeX && DepthSwapchainDesc.Extent.Y == SizeY

This has been fixed for the 5.1 release, but you can use this workaround for now if you want to continue experimenting with the Preview: UE 5.1.0 p1: VRtemplate crash on start on Quest - null pointer dereference - #16 by Spcarso

Last year Meta released this Quest demo app to show different methods and optimizations for increasing render quality in 2D panels / text / UI.

2D Panels Demo

Do you think this level of quality is currently achievable with out-of-the-box (Blueprints) Unreal?

Hi. Do you guys have any updates on hand tracking in the 5.1 VR Template? The hand components aren’t available anymore without the Oculus plugin. Even with the deprecated plugin, the hands don’t seem to work for me. Is there a different way of doing that in Blueprints now, or are you guys still working on that? Is there any guide available where one can set up hand tracking “manually”? Lol, a pun.

Natively we support OpenXR extensions (such as XR_EXT_hand_tracking), but Oculus devices use a more complex implementation. Such features are developed by the device developers and can be released as extension plugins. The MetaXR plugin works with vanilla UE 5.0.3!
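For example, the native path goes through the engine’s built-in OpenXRHandTracking plugin; a minimal .uproject excerpt (project name and the rest of the file omitted, so treat this as a sketch) would look roughly like this, and should expose hand tracking data on runtimes that support the extension:

```json
{
  "Plugins": [
    { "Name": "OpenXR", "Enabled": true },
    { "Name": "OpenXRHandTracking", "Enabled": true }
  ]
}
```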

@VictorLerp What does the prioritization between Vulkan and OpenGL look like?
In general it seems like Bytedance and Meta are putting more of an emphasis on Vulkan, but that puts things in a very awkward place given that OpenGL still handles some essential situations better than Vulkan. So it feels like we are in a no man’s land between a lack of OpenGL support and proper Vulkan support.

Is the ultimate goal to enable both to work well, or to eventually deprecate OpenGL in favor of Vulkan?

The Android-based systems are all moving to Vulkan.
I think John Carmack is now behind Vulkan and has stopped development on OpenGL;
he was its main force. I could be wrong.
I don’t think it really affects the OpenXR template, though.
The template seems to work with Meta for me,
but I only have Vulkan on.
Someone said something about disabling “multiview” and it would work on Quest with OpenGL (see the sketch below).
I haven’t had a chance to test; it was just a day ago.
So apparently he built an OpenGL-only project to the Quest 2 by disabling that setting.
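If “multiview” refers to the Mobile Multi-View project setting (an assumption on my part), the equivalent DefaultEngine.ini change would look roughly like this:

```ini
; Sketch: disable Mobile Multi-View, assuming that's the setting being referred to.
; Goes in Config/DefaultEngine.ini, or toggle it under Project Settings > Rendering > VR.
[/Script/Engine.RendererSettings]
vr.MobileMultiView=False
```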

OpenGL support isn’t there at all for Meta and Pico in 5.1.

And in 5.0 you can get OpenGL to work on the non-OpenXR paths with forward shading off and a very narrow set of render settings.

And Vulkan wouldn’t be that big of a deal if high-resolution video playback (2K+) didn’t have a large performance impact. Even assuming all else is well with Vulkan, the video playback performance alone is enough to stop certain projects in their tracks.

I agree.
I have a lot of stereo panos I use for backgrounds,
and the video streams are definitely chunky in Vulkan.
I think OpenGL particles are much better too.
I keep 4.27 and 5.0.3 around just in case.
Maybe the video streaming will get an overhaul in Vulkan.
I can’t even get 5.1 to compile for Quest yet after the NDK change.
As for the template itself,
I like the new template. The hands are nice.

Vulkan is what our focus and feature development are targeting.

I have to say that the UX of Unreal just went completely down the toilet! Why did Epic raise the threshold for getting up and running to such a degree? Before, we had one list, one place to easily create input actions, and now this???

Before, we were up and running within minutes of switching from Vive to HP Reverb. One list, easy to select the input from menus. Done. And now there’s not a single tutorial shorter than 10 minutes on YouTube, and lots of assets that need to be created?

Ack.
I hadn’t even looked at the new Enhanced Input,
I’ve been so busy converting and migrating.
It doesn’t look fun.
That being said, I’m still using 5.0.3 for main production, as I use a Quest 1 for APK testing and 5.1 won’t compile with SDK 29 / NDK 21, so it won’t run on Quest 1.

Once you get a chance to work with Enhanced Input a bit, I think you’ll like it. While the initial asset creation is an extra step, I’ve found it to be a preferable system to the prior one (for Quest dev as well).
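For anyone mapping the old setup onto the new one, here’s a rough C++ sketch of the flow the new input assets feed into; the pawn class, member names, and handler are hypothetical, and the same two steps exist in Blueprint (Add Mapping Context, then bind events to the Input Action assets):

```cpp
#include "EnhancedInputComponent.h"
#include "EnhancedInputSubsystems.h"
#include "InputAction.h"
#include "InputMappingContext.h"
#include "GameFramework/PlayerController.h"
#include "Engine/LocalPlayer.h"

// Assumes the (hypothetical) AMyVRPawn header declares UPROPERTY members
// DefaultMappingContext (UInputMappingContext*) and GrabAction (UInputAction*),
// plus a handler: void OnGrab(const FInputActionValue& Value);
void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // 1) Register the mapping context asset (the part that replaces the old input list).
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        if (auto* Subsystem = ULocalPlayer::GetSubsystem<UEnhancedInputLocalPlayerSubsystem>(PC->GetLocalPlayer()))
        {
            Subsystem->AddMappingContext(DefaultMappingContext, /*Priority*/ 0);
        }
    }

    // 2) Bind an Input Action asset to a member function.
    if (auto* EnhancedInput = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        EnhancedInput->BindAction(GrabAction, ETriggerEvent::Triggered, this, &AMyVRPawn::OnGrab);
    }
}
```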

Based on everything else I’ve seen with UE5, I’m gonna say they either hate low-end platforms or they genuinely have no idea how badly they’re messing up.

Think about this, though: Fortnite doesn’t have a new 5.1 build for mobile, and likely won’t have one, from what I’ve heard.

UE actually scales pretty well to low-end hardware,
depending on your cook settings.
The XR template runs nicely on a Quest,
and a Quest is potato hardware.

The template doesn’t have anything special in it, which is why it can also run on a potato. So I wouldn’t use the template as a reference to say UE 5.x runs well on Quest 2, because it doesn’t.

@VictorLerp
On the template VRPawn, in the InputAction hand gestures (“Hand Animation Input and data passed to Animation Blueprint”), every input casts to the particular hand’s AnimInstance on each input event. Is there a reason the right- and left-hand AnimInstances aren’t cast once on BeginPlay and cached to be referenced here, rather than being cast to repeatedly? Any thoughts on why doing that would or wouldn’t be preferable?

For a complete project I’d probably declare a variable, cast on BeginPlay, and use the cached reference. For the template, the casts are just a little bit easier to understand, as there’s less BP logic involved.
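In Blueprint that just means promoting the cast result to a variable in BeginPlay and reading that variable from the input events. As a rough C++ analogue of the same pattern (the pawn, mesh, and anim class names here are hypothetical stand-ins for the template’s Blueprint assets):

```cpp
#include "Animation/AnimInstance.h"
#include "Components/SkeletalMeshComponent.h"
#include "InputActionValue.h"

// Assumes the (hypothetical) AMyVRPawn header declares UPROPERTY members
// CachedLeftHandAnim / CachedRightHandAnim (UHandAnimInstance*) and the
// skeletal mesh components LeftHandMesh / RightHandMesh.
void AMyVRPawn::BeginPlay()
{
    Super::BeginPlay();

    // Cast once and cache the hand AnimInstances...
    CachedLeftHandAnim  = Cast<UHandAnimInstance>(LeftHandMesh->GetAnimInstance());
    CachedRightHandAnim = Cast<UHandAnimInstance>(RightHandMesh->GetAnimInstance());
}

void AMyVRPawn::OnGraspLeft(const FInputActionValue& Value)
{
    // ...then reuse the cached reference in each input handler instead of re-casting.
    if (CachedLeftHandAnim)
    {
        // e.g. CachedLeftHandAnim->SetGraspState(Value.Get<float>());  // hypothetical setter
    }
}
```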

1 Like

Understandable. I would like to suggest adding notes, via a node tag or comments, that point out the best practice in cases like this where ease of understanding is prioritized over it. That can also serve as a good lesson, even if it isn’t fully implemented in the sample.

“For a complete project, I’d probably declare a variable, cast on BeginPlay, and use the cached reference” is actually a pretty solid example of what the note could say.