[Vision Pro] State of RemoteImmersiveSpace

Hello,

The visionOS 26 beta has introduced RemoteImmersiveSpace alongside CompositorServices. This new set of APIs allows rendering an immersive space on macOS and streaming it to a Vision Pro device. It requires a running app on macOS and a companion app on Vision Pro that can display the space. Gestures and tracking data are shared back to macOS, so some degree of real-time interactivity appears possible (there are probably limitations we don’t know about yet, and since it is in beta, it may change).

We are interested in using UE for our apps. There are several advantages to using UE over RealityKit; one of the main ones for us is clothing simulation.

After testing, we found that it can work, but performance on Vision Pro lags for more complex models. Using RemoteImmersiveSpace therefore seems like a logical next step. My understanding is that the UE renderer already uses its own Compositor Services renderer, so perhaps Unreal has some interest in adding RemoteImmersiveSpace support in the near future. I have looked in the code base and the forums but could not find anything.

It would be great to have some input from UE engineers about this.

Best regards,
Pablo


Hey Pablo! visionOS 26 support, including Remote Rendering, Progressive Immersion, Dynamic Render Quality, Hover Effects, and Controller Support, is on our roadmap to be investigated. However, we currently don’t have a timeline, and I would not expect it prior to UE 5.8.
