Image tracking in VR passthrough

No, you can’t do that with any engine or any plugin if you want to work only with Quest devices and no extra accessories, and it isn’t really about the engine at all.

The Quest device family doesn’t expose raw camera buffer access.
Even if you integrate ZXing, OpenCV, or TensorFlow, you can’t read the camera data, so you can’t process it to decode anything.
Meta forbids that access for what it calls privacy reasons.

A couple of days ago I talked with the Meta team behind the “Quest 3 CAD Parts in MR / CAD Viewer” video and gave them some suggestions about CAD viewing.
After the main topic, I said a couple of words about a QR reader API. They said they could look into it, but it was not an official meeting, so there is no guarantee.

To partially work around this, I can suggest three options.

  1. I developed a cross-platform Intel RealSense wrapper for UE5. If you attach a D435i or D455 to the Quest over USB-C, you can grab the RGB data and process it with ZXing in an FRunnableThread to get the decoded QR (see the first sketch after this list). But that sensor has no auto-focus, so it can’t detect QR codes unless they are big enough, and it is highly sensitive to lighting conditions and camera angles.
    (Btw. distance calculation works well.)

  2. Developing a LibUVC wrapper for Unreal Engine 5.
    I will work on that in a couple of months. It will allow you to attach normal webcams to the Quest 3; I suggest high-end cameras like the Logitech Brio or Insta360 Link. Such a plugin could also open up Epson Moverio and Vuzix development possibilities (see the second sketch after this list).

  3. Using a phone to read/detect the real-world object (see the third sketch after this list):

  • integrate an HTTP server into the Quest app (there are plugins for that),
  • read the QR code on the phone,
  • send the QR string to the Quest with a POST request,
  • optionally you can use Bluetooth instead, but it is harder because of the characteristic handling.
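
For the first option, here is a minimal sketch of the kind of worker thread I mean: an FRunnable that polls RGB frames and feeds them to zxing-cpp. `GetLatestRGBFrame()` is a hypothetical accessor standing in for whatever frame API your RealSense wrapper exposes; the rest is standard UE5 and zxing-cpp usage.

```cpp
// Minimal sketch, assuming zxing-cpp is linked into the module.
// GetLatestRGBFrame() is a placeholder for the RealSense wrapper's frame accessor.
#include "CoreMinimal.h"
#include "HAL/Runnable.h"
#include "HAL/RunnableThread.h"
#include "HAL/ThreadSafeBool.h"
#include "Async/Async.h"
#include "ZXing/ReadBarcode.h"

class FQrDecodeWorker : public FRunnable
{
public:
    FQrDecodeWorker() { Thread = FRunnableThread::Create(this, TEXT("QrDecodeWorker")); }
    virtual ~FQrDecodeWorker() override { if (Thread) { Thread->Kill(true); delete Thread; } }

    virtual bool Init() override { bRun = true; return true; }

    virtual uint32 Run() override
    {
        while (bRun)
        {
            TArray<uint8> Rgb;
            int32 Width = 0, Height = 0;
            if (GetLatestRGBFrame(Rgb, Width, Height))
            {
                // Feed the interleaved 8-bit RGB frame straight into ZXing.
                ZXing::ImageView Image(Rgb.GetData(), Width, Height, ZXing::ImageFormat::RGB);
                ZXing::ReaderOptions Options; // called DecodeHints in older zxing-cpp releases
                Options.setFormats(ZXing::BarcodeFormat::QRCode);

                const ZXing::Result Result = ZXing::ReadBarcode(Image, Options);
                if (Result.isValid())
                {
                    const FString Text = UTF8_TO_TCHAR(Result.text().c_str());
                    // Hand the decoded string back to the game thread.
                    AsyncTask(ENamedThreads::GameThread, [Text]()
                    {
                        UE_LOG(LogTemp, Log, TEXT("Decoded QR: %s"), *Text);
                    });
                }
            }
            FPlatformProcess::Sleep(0.05f); // ~20 Hz polling is plenty for QR codes
        }
        return 0;
    }

    virtual void Stop() override { bRun = false; }

private:
    // Hypothetical: copies the newest RGB8 frame from the RealSense wrapper, if any.
    bool GetLatestRGBFrame(TArray<uint8>& OutRgb, int32& OutWidth, int32& OutHeight);

    FRunnableThread* Thread = nullptr;
    FThreadSafeBool bRun = false;
};
```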
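
For the second option, the plugin doesn’t exist yet, but its core would simply wrap libuvc’s streaming callback. Below is a rough desktop-style sketch of that loop; on the Quest (Android) you would additionally have to go through the USB permission flow before the device can be opened, and the 640×480 @ 30 fps YUYV mode is just an example.

```cpp
// Rough sketch of the libuvc capture loop such a wrapper would be built around.
#include <cstdio>
#include <libuvc/libuvc.h>

// Called by libuvc on its own thread for every frame.
static void OnFrame(uvc_frame_t* Frame, void* /*UserPtr*/)
{
    // Convert whatever the camera delivers (YUYV, MJPEG, ...) to packed RGB.
    uvc_frame_t* Rgb = uvc_allocate_frame(Frame->width * Frame->height * 3);
    if (uvc_any2rgb(Frame, Rgb) == UVC_SUCCESS)
    {
        // Here the wrapper would hand Rgb->data to the engine (or to ZXing).
        std::printf("frame %ux%u\n", Rgb->width, Rgb->height);
    }
    uvc_free_frame(Rgb);
}

int main()
{
    uvc_context_t* Ctx = nullptr;
    uvc_device_t* Dev = nullptr;
    uvc_device_handle_t* DevHandle = nullptr;
    uvc_stream_ctrl_t Ctrl;

    if (uvc_init(&Ctx, nullptr) < 0) return 1;
    // 0/0/NULL = first UVC camera found (e.g. a Logitech Brio or Insta360 Link).
    if (uvc_find_device(Ctx, &Dev, 0, 0, nullptr) < 0) return 1;
    if (uvc_open(Dev, &DevHandle) < 0) return 1;

    // Example mode; a real wrapper would enumerate the supported formats instead.
    uvc_get_stream_ctrl_format_size(DevHandle, &Ctrl, UVC_FRAME_FORMAT_YUYV, 640, 480, 30);
    uvc_start_streaming(DevHandle, &Ctrl, OnFrame, nullptr, 0);

    std::getchar(); // stream until a key is pressed

    uvc_stop_streaming(DevHandle);
    uvc_close(DevHandle);
    uvc_unref_device(Dev);
    uvc_exit(Ctx);
    return 0;
}
```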
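
For the phone option, the Quest side only needs a single POST route. Besides the marketplace plugins, UE ships an HTTPServer module that can do this; here is a minimal sketch assuming UE 5.1+ (where BindRoute takes an FHttpRequestHandler delegate). The `/qr` route and port 8080 are arbitrary choices, and you need "HTTPServer" in your module’s dependency list.

```cpp
// Minimal sketch using UE's built-in HTTPServer module
// (add "HTTPServer" to PrivateDependencyModuleNames).
#include "CoreMinimal.h"
#include "HttpServerModule.h"
#include "IHttpRouter.h"
#include "HttpPath.h"
#include "HttpRequestHandler.h"
#include "HttpServerRequest.h"
#include "HttpServerResponse.h"

void StartQrListener()
{
    const uint32 Port = 8080; // the phone posts to http://<quest-ip>:8080/qr

    FHttpServerModule& HttpServer = FHttpServerModule::Get();
    TSharedPtr<IHttpRouter> Router = HttpServer.GetHttpRouter(Port);

    Router->BindRoute(
        FHttpPath(TEXT("/qr")),
        EHttpServerRequestVerbs::VERB_POST,
        // UE 5.1+ delegate form; older engine versions take a plain lambda here.
        FHttpRequestHandler::CreateLambda(
            [](const FHttpServerRequest& Request, const FHttpResultCallback& OnComplete)
            {
                // The phone sends the decoded QR string as the raw UTF-8 request body.
                TArray<uint8> Body = Request.Body;
                Body.Add(0); // null-terminate for the conversion below
                const FString QrText = UTF8_TO_TCHAR(reinterpret_cast<const char*>(Body.GetData()));

                UE_LOG(LogTemp, Log, TEXT("Received QR from phone: %s"), *QrText);

                OnComplete(FHttpServerResponse::Create(TEXT("ok"), TEXT("text/plain")));
                return true; // request handled
            }));

    HttpServer.StartAllListeners();
}
```

For a quick test you can stand in for the phone with `curl -X POST -d "HELLO" http://<quest-ip>:8080/qr`.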

//
Or you can use a Pico Enterprise or an HTC Vive headset instead.
Even the Apple Vision Pro won’t let us access that buffer. (But they will probably expose a detection API through ARKit.)
