Mixed Immersion Mode Disabled When Enabling Hand Recognition on Apple Vision Pro (UE5.5)

Hello,

I’m currently developing for Apple Vision Pro using Unreal Engine 5.5 (Launcher version) and have encountered an issue. When enabling hand recognition (as shown in this post), the Mixed Immersion Mode gets disabled.

Development Environment:

  • Unreal Engine Version: 5.5 (Launcher version)
  • Xcode Version: 16.1
  • visionOS Version: 2.0.1
  • Settings: Configured following the official quick start guide.

At this point, I’m unsure whether this behavior is by design in visionOS, a limitation of Unreal Engine, or an issue in how the two interact.

If anyone has experienced a similar issue or has insights into possible solutions or workarounds, I’d greatly appreciate your input.
I will also continue my own investigation and report back if I find any relevant information. Thank you for your help!

I’m encountering the same problem as well!

According to a post by Alex Coulombe,
“There is a bug in the Epic Games Launcher version of UE 5.5 where the VisionOS ueswift.swift file in the engine folder is not utilized. To use mixed mode, you need to use a source build.”

This might be the cause.

I’m facing the same problem. After removing the sky sphere and walls, all I can see is black surroundings. What is meant by using a source build?

Download and compile Unreal Engine from GitHub: https://www.unrealengine.com/en-US/ue-on-github


Vision Pro interactions in the VR Template do not work. Do we need to use a source build for this as well, or do we have to set up an OpenXR hand tracking Blueprint?

Input in the VR Template is not configured for Vision Pro. You’ll have to use OpenXR hand tracking data to implement your own custom gestures.
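In case it helps anyone getting started, here is a minimal C++ sketch of the kind of gesture logic that implies: it polls the OpenXR hand-tracking keypoints each tick via UHeadMountedDisplayFunctionLibrary::GetMotionControllerData and treats a small thumb-tip/index-tip distance as a pinch. It assumes the OpenXR Hand Tracking plugin is enabled; the class name APinchDetectorPawn and the PinchThreshold value are purely illustrative, and the same data is also reachable in Blueprints via the Get Motion Controller Data node.

```cpp
// Minimal sketch: detect a simple right-hand "pinch" from OpenXR hand-tracking data.
// Assumes the OpenXR Hand Tracking plugin is enabled; APinchDetectorPawn and
// PinchThreshold are hypothetical names used only for this example.

#include "GameFramework/Pawn.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "HeadMountedDisplayTypes.h"
#include "PinchDetectorPawn.generated.h"

UCLASS()
class APinchDetectorPawn : public APawn
{
    GENERATED_BODY()

public:
    APinchDetectorPawn()
    {
        // Make sure Tick runs so we can poll hand tracking every frame.
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Hand-tracking keypoints are exposed through the motion controller data.
        FXRMotionControllerData HandData;
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
            this, EControllerHand::Right, HandData);

        // HandKeyPositions is only populated while the hand is actually tracked.
        if (!HandData.bValid ||
            HandData.HandKeyPositions.Num() <= static_cast<int32>(EHandKeypoint::IndexTip))
        {
            return;
        }

        const FVector ThumbTip =
            HandData.HandKeyPositions[static_cast<int32>(EHandKeypoint::ThumbTip)];
        const FVector IndexTip =
            HandData.HandKeyPositions[static_cast<int32>(EHandKeypoint::IndexTip)];

        // Threshold in Unreal units (cm); tune for your project.
        const float PinchThreshold = 2.5f;
        if (FVector::Dist(ThumbTip, IndexTip) < PinchThreshold)
        {
            UE_LOG(LogTemp, Log, TEXT("Right-hand pinch detected"));
        }
    }
};
```

From there you can broadcast a delegate or feed an Enhanced Input injected event instead of logging, and mirror the same check for EControllerHand::Left.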