The 2D Custom mode looking worse would mean it’s not able to properly rectify the image. Both modes looking the same likely means it’s not able to calculate depth and falling back to the 2D projection.
The headset doesn’t seem to provide the camera distortion parameters to correctly calculate the rectification. It would be up to HTC (and possibly Valve) to update the firmware and drivers to provide the correct parameters. Not sure that is possible without factory calibration either.
I could try adding support for using the stock pre-rectified camera frames for 3D reconstruction, but it doesn’t work well for the Index at least.
Hey, how can I enable VST in my game? I am using the Varjo XR-4. I have enabled MR and everything else works fine. The alpha blend works perfectly in Varjo Base, but I think the VST layer is applied after the Unreal render is complete. Hence, it shows black where alpha blend is enabled. I want it to show the real-world feed wherever the alpha blend is enabled. I am trying to achieve an MR world, which I have, but it only shows the real-world feed in Varjo Base, not in the editor or game itself.
Varjo has native support for passthrough when you are using their own OpenXR runtime, so make sure you are using that instead of SteamVR. My applications are only for when using SteamVR, and I don’t know if they work with Varjo headsets, since there is no documentation on whether the camera feed is available through SteamVR.
As you say, the passthrough is applied after the render, inside the runtime compositor. Unreal renders scenes with an inverted alpha channel, so when the unrendered parts of the scene are used as the source of the cutouts, they end up inverted by default. Unreal 5.7 added support for the new XR_EXT_composition_layer_inverted_alpha OpenXR extension, which tells the runtime to treat the alpha as inverted. I don’t know if the Varjo runtime supports it yet.
You can enable it with the “Invert scene alpha for passthrough” setting in the project settings for the OpenXR plugin, and check whether it’s working with the “Is Composition Layer Inverted Alpha Enabled” blueprint node.
If the runtime doesn’t support it (or you are using an older Unreal version), you will have to add a postprocess material to manually invert the alpha. Varjo has a tutorial on how to set it up (you don’t need the Varjo plugin for 5.7 unless you want to use additional features like depth composition):
If you want to support other runtimes as well, you can either always have the postprocess material on, or use the output of the “Is Composition Layer Inverted Alpha Enabled” node to toggle it off.
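For reference, the inversion itself is trivial: the postprocess material just needs to output one minus the scene alpha. A conceptual sketch of the same per-pixel operation on a CPU-side buffer (`invert_alpha` is a hypothetical helper, not Unreal API; in the material it is a one-line expression):

```c
#include <stddef.h>

/* Invert the alpha channel of an 8-bit RGBA buffer in place.
   This mirrors what the postprocess material does per pixel:
   output alpha = 1 - scene alpha. RGB is left untouched. */
static void invert_alpha(unsigned char *rgba, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i)
        rgba[i * 4 + 3] = (unsigned char)(255 - rgba[i * 4 + 3]);
}
```

Fully transparent pixels (alpha 0) become opaque and vice versa, which matches how the runtime expects the cutouts.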
Epic has a bit more documentation on the 5.7 passthrough feature here:
I already have passthrough (VST) working correctly inside the HMD using Varjo OpenXR (alpha blend / MR).
The problem is the desktop game window / spectator screen: wherever my scene outputs alpha (cutout/mask), the monitor view shows black, while the HMD shows the real-world camera feed (as expected).
I assume this is because the VST video is composited by the Varjo runtime compositor after Unreal renders, so Unreal’s desktop backbuffer never contains the camera pixels. (Attached screenshots: Varjo headset view vs Unreal window.)
My question: Is it possible to show the actual VST camera feed in the Unreal spectator/monitor output as well (i.e., have the composited passthrough visible on the desktop), or does the app need access to the camera frames as a texture to composite it itself?
If it’s not possible with OpenXR passthrough, what’s the recommended approach?
The app would need access to the camera frame textures.
It’s not possible to do with any passthrough method provided by OpenXR itself. They are all designed to only allow the compositor access to the camera frames. This is done for several reasons: mainly to keep camera latency low, but also for privacy and simplicity.
I don’t think there are any official solutions for compositing the cameras in-engine. My old UE4SteamVRPassthrough plugin does it, but it only works for HMDs that expose the camera streams to the SteamVR driver API, and even then without a stereo-correct projection. I highly doubt Varjo has added support for this. Valve never properly published the driver-side API, and it’s practically hardcoded to only work with their first-party HMDs.
Varjo does seem to provide access to the raw distorted camera frames in their own native SDK at least. See the Varjo_datastream.h and Varjo_types_datastream.h headers. The headers suggest you would need a Varjo Base Pro license to access the streams though.