I am looking for information on how to build an AR application that runs on a PC and displays on the Meta Quest 3. While I am familiar with building standalone apps for the Quest, I haven’t found much documentation regarding PCVR applications that support Passthrough.
My goal is to visualize high-fidelity, complex 3D models that exceed the Quest’s standalone processing power. I need to use Passthrough to see the real environment while having the ability to manipulate the position of these models.
I noticed that the Unreal Engine VR preview supports something similar, but I would appreciate any links to official documentation, tutorials, or project templates for Unreal Engine that cover PCVR Passthrough.
Meta only supports passthrough on PCVR for development purposes, not for released applications. Enabling it requires a developer account and an explicit opt-in in the Meta Quest Link settings before passthrough will work at all.
AFAIK their PCVR runtime only exposes passthrough through Meta’s XR_FB_passthrough OpenXR vendor extension, which in Unreal is surfaced via the MetaXR plugin.
Most VR vendors offer very limited or no passthrough support on PC, especially for consumer headsets.
Part of the reason is likely that passthrough is hard to implement well for streamed headsets. To implement it properly, the streaming software would have to encode and transmit an alpha channel, and optionally a depth map, along with each frame to the headset. I don’t know of any readily available video encoders that support alpha channels, so there may be a significant development effort required to make this perform well.
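To get a feel for why the extra channels matter, here is a rough, illustrative estimate of the raw (pre-compression) data rate with and without alpha and depth. All numbers (per-eye resolution, refresh rate, bit depths) are assumptions for the sake of the arithmetic, not measured Quest 3 specs:

```python
# Rough estimate of the extra raw data that alpha + depth add per frame.
# All figures below are assumptions, not measured Quest 3 specs.
W, H = 2064, 2208    # assumed per-eye resolution
EYES, FPS = 2, 90    # assumed stereo stream at 90 Hz

pixels_per_frame = W * H * EYES

def gbits_per_s(bytes_per_pixel):
    """Raw bandwidth in Gbit/s for a given bytes-per-pixel layout."""
    return pixels_per_frame * bytes_per_pixel * FPS * 8 / 1e9

rgb = gbits_per_s(3)                  # color only
rgba_depth = gbits_per_s(3 + 1 + 2)   # color + 8-bit alpha + 16-bit depth

print(f"raw RGB:         {rgb:.1f} Gbit/s")
print(f"raw RGB+A+depth: {rgba_depth:.1f} Gbit/s")
print(f"overhead:        {rgba_depth / rgb:.2f}x")
```

Under these assumptions the raw payload doubles, and since mainstream hardware encoders compress only the color planes, the alpha and depth channels would need a separate (likely custom) encoding path.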
Meta has opted instead to transmit the camera feed from the headset to the PC and do the composition there rather than on the headset. This results in significant latency, since the images have to make a round trip from the headset to the PC and back, which might be a reason Meta doesn’t want to support it in released applications.
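A back-of-the-envelope latency budget shows why the round trip hurts. Every stage figure below is an assumed, illustrative number; real values depend on the codec, link (Wi-Fi vs. USB), and runtime:

```python
# Hypothetical camera-to-display latency budget when the passthrough feed
# is composited on the PC. All stage timings are illustrative assumptions.
stages_ms = {
    "camera capture + headset encode": 10,
    "uplink to PC (Wi-Fi/USB)":         5,
    "PC decode + composite + encode":  10,
    "downlink to headset":              5,
    "headset decode + display":        10,
}

round_trip = sum(stages_ms.values())
print(f"camera-to-display latency ~ {round_trip} ms")
```

Even with these optimistic per-stage numbers the camera feed lags the real world by tens of milliseconds, whereas on-headset composition avoids the two network hops entirely.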
This is unlikely to change in the future: Meta has shown little interest in supporting PCVR, instead letting their Link software rot.