Support for Apple Vision Pro in Unreal ??

I also encountered the same problem. How did you solve this problem?

In addition to what @martin.fink mentioned, check the error and make sure the plugins are loaded in the packaged app when running. In my case the OpenXR plugins were not packaged properly the first time around, because the guide by @VictorLerp says SupportedPlatformTarget should be set to ‘visionOS’, but it should actually be ‘VisionOS’.

Just strictly follow this tutorial: Unreal Engine 5.4 VisionOS Build on Apple Vision Pro | by SpatialBiggs | Apr, 2024 | Medium

Has anyone tried the 5.4.1 launcher version?

Still not working on the 5.4.1 launcher version…

It seems to be even less supported in 5.4.1. Build log now includes:

UATHelper: Packaging (VisionOS): The VisionOS platform is not supported from this engine distribution.

It was present on UE 5.4 too. Only the GitHub source version works.

visionOS is unfortunately not supported with the binary version of 5.4, and I’ve updated the Quick Start Guide with this information. We’re working on a fix and I’ll update this thread when it’s live.


Thank you! Looks like a fair number of us are excited to try it out!

You can still use the source version of 5.4 or Main if you want to test visionOS.

I’m still learning Unreal Engine (Unity refugee). Adding “compile from source and keep it up to date manually” to my plate seems excessive, unless the binary fixes are going to be delayed for months or somesuch.

Only VR mode? Any chance of Mixed Reality?

Curious about

  • AR passthrough
  • pinch gestures
  • gaze-based interactions (selection/trace)

Is there any chance to access these functionalities?

  • Passthrough: not supported. The scene is fully immersive and rendered in Metal. There is no access to the cameras, AFAIK.
  • Pinch gestures: this video shows them working, but I’m not sure how it is implemented. The OpenXR plugins for visionOS have limited support for gestures.
  • No gaze: there is no support for eye gaze in OpenXR for visionOS. However, people sometimes say “gaze” when they mean head direction, and head tracking is supported.

Hoping so much for real integration like Unity PolySpatial …


The gesture API is not supported. Pinch is done in Unreal by measuring the distance between the thumb-tip and index-tip bone transforms, and the same implementation can be used on other OpenXR-supported devices.


Thanks for your response.
I was wondering how it was achieved in the video. I noticed that the hands are always visible (unlike in a RealityKit app, where you can perform indirect gestures with your hands on your lap). I guessed it would be based on video recognition of hand gestures and positions.

I need to test this feature in AVP. Could you let me know if there is any guide for this?

Just use MotionController components pointed at the Thumb and Index tips.


Hi! I currently have the same issue you had. However, enabling “Enable IOS Simulator Support (Experimental)” and disabling the “OpenXRVisionOS” plugin doesn’t seem to fix the issue. I’m still getting:

[UE] Fatal error: [File:./Runtime/Apple/MetalRHI/Private/MetalRHI.cpp] [Line: 249]

This device does not supports the Apple A8x or above feature set which is the minimum for this build. Please check the Support Apple A8 checkbox in the IOS Project Settings.

I am also using visionOS 1.1 and Unreal 5.4.1, and I followed all the steps in the guide: https://medium.com/@SpatialBiggs/unreal-engine-5-4-visionos-build-on-apple-vision-pro-9f256eadc651

Any other ideas? Should I somehow compile something so that the two changes I made get reflected?

OpenXRVisionOS should be enabled.
Are you on Apple Silicon? Are you working with UE5 built from source? Did you strictly follow the steps in the tutorial? And did you check Metal for Desktop rather than for Mobile?

Has anyone noticed that sound comes from the SwiftUI view? Is there any way to fix this?
And has anyone noticed that if you close the SwiftUI window, the game crashes after a while?
