I am using ue5-main and I am able to run on visionOS 2.0 on my device; it’s still experimental.
Any info about the UE 5.5 Preview 1 Launcher version? Is the visionOS SDK in?
The Unreal Engine 5.5 VisionOS Build on Apple Vision Pro guide has been updated from 5.4:
A) Path for Blueprint Epic Launcher project - Fast builds
B) Path for previous C++ Source Code project - Slower
Do I need a developer account and a package to run the project?
Hello, I’ve seen your video. I would like to ask you two questions about it.
What are your suggestions for building a gesture system on Vision Pro like Meta’s?
And do you have any suggestions about Get Hand Tracking() crashing in 5.5 after I add it?
Meta uses a pose system, I believe, meaning it has pre-registered certain hand configurations that each execute a given action.
It might be worth reproducing that path. I have not gone this route personally, as my exact needs are different.
I find the pose system limiting, but effective for ease of use.
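If you want to prototype that idea, here is a purely illustrative plain C++ sketch (not Meta’s or Epic’s API; every type and helper here is hypothetical): register a set of hand configurations as joint offsets relative to the wrist, then each frame pick whichever registered pose the live hand matches within a tolerance and fire its action.

```cpp
// Hypothetical pose-matching sketch: compare live hand joint offsets
// (relative to the wrist, so matching is translation-invariant) against
// pre-registered pose templates and return the closest match.
#include <string>
#include <vector>

struct Vec3 { float X, Y, Z; };

// A registered pose: a name plus the expected per-joint offsets.
struct HandPose {
    std::string Name;
    std::vector<Vec3> JointOffsets;
};

static float DistSq(const Vec3& A, const Vec3& B) {
    const float dx = A.X - B.X, dy = A.Y - B.Y, dz = A.Z - B.Z;
    return dx * dx + dy * dy + dz * dz;
}

// Returns the name of the best-matching registered pose, or an empty string
// if no pose is within Tolerance (root-mean-square per-joint distance, in
// the same units as the joint offsets).
std::string MatchPose(const std::vector<Vec3>& LiveJointOffsets,
                      const std::vector<HandPose>& Registered,
                      float Tolerance = 2.0f) {
    std::string Best;
    float BestMeanSq = Tolerance * Tolerance;
    for (const HandPose& Pose : Registered) {
        if (Pose.JointOffsets.size() != LiveJointOffsets.size()) continue;
        float MeanSq = 0.f;
        for (size_t i = 0; i < LiveJointOffsets.size(); ++i)
            MeanSq += DistSq(LiveJointOffsets[i], Pose.JointOffsets[i]);
        MeanSq /= static_cast<float>(LiveJointOffsets.size());
        if (MeanSq < BestMeanSq) { BestMeanSq = MeanSq; Best = Pose.Name; }
    }
    return Best;
}
```

Whatever hand-tracking source you use, you would feed its joint positions into something like MatchPose each frame and map each returned pose name to a gameplay action.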
Good luck in your projects
Thank you for your reply. I think I’ve fixed it, thank you very much.
Hey folks. Just had a go building the template Blueprint VR project for the Vision Pro. I’m using Xcode 16.2 from the store, visionOS 2.2 non-beta, and Unreal 5.5.1-38445549+++UE5+Release-5.5 from the launcher. Surprisingly, everything went well: the app built with no errors and I have a workspace I can open from Xcode. Unfortunately, after deployment the app launches and I see the window with the Swift button, but then it crashes. The debugger is reporting death in the RenderThread, possibly because of a vertex descriptor error?
validateRenderPassDescriptor:782: failed assertion `RenderPass Descriptor Validation
Memoryless attachment content cannot be stored in memory.
Memoryless attachment content cannot be stored in memory.’
where previously: LogRHI: Error: Failed to create graphics pipeline, hashes: Vertex: 732444CC1C02AC34928CF205707DCEC59CFB9C4A, Pixel: 3CBAB3F783DDA419393ADBEBF65F05FF57F625A8, Pipeline: 9395D2DD3F14DB7C04CBC17C6BC6EF081416030F.
Does anyone know what I might have done wrong here? Thanks!
Hah! My bad. This was the second time I tried, and because I could already see the option to build for Vision Pro, I didn’t realize I had to enable the Vision Pro plugin for each project. With that on, I can now build and deploy, and can see the template VR environment on device!
FYI, just in case it helps anyone: I was getting code signing errors on the first attempt, and that was because I accepted the default project location inside the Documents directory. It turns out there are some automated macOS processes that use the resource fork in there, and that was preventing code signing. Just starting the project over in a non-Documents folder got me past that.
So now the next issue is that with the stock VR template, no modifications, I’m getting an out of memory crash within a minute or so of launching. Is that expected?
There were a few log entries along the lines of:
[UE] [2024.12.31-17.09.24:391][406]LogInit: Low Memory Warning Triggered
[UE] [2024.12.31-17.09.24:391][406]LogInit: Free Memory at Startup: 4872 MB
[UE] [2024.12.31-17.09.24:391][406]LogInit: Free Memory Now : 1023 MB
[UE] EngineMemoryWarningHandler: Mem Used 606.31 MB, Streaming Texture Memory 35.31 MB, Non Streaming Texture Memory 630.02 MB, OS Free 1021.90 MB
Also maybe notable: there are still shader errors as well, including LogMetal: Error: Failed to generate a render pipeline state object: Vertex attribute 1 is not defined in the vertex descriptor.
Is the VR blueprint maybe out of date and no longer a best practice example I should be starting from?
Hey, I’m trying to get Get Gaze Data to work, but I’m always getting the Return Value as false, as if the eye tracking feature is not supported. I’m using Unreal 5.5.2 from the launcher on Mac with visionOS 2, and the OpenXR eye tracking and OpenXR VisionOS plugins are enabled.
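For reference, my understanding is that the Get Gaze Data Blueprint node wraps the engine’s EyeTracker function library, which simply returns false when no eye-tracking provider is supplying data. Here is a minimal C++ sketch of the same check, assuming that module (header names and struct fields are from memory, so treat them as assumptions):

```cpp
// Minimal sketch, assuming UE's EyeTracker module. A false return value
// means no eye-tracking provider is currently supplying gaze data.
#include "EyeTrackerFunctionLibrary.h"
#include "EyeTrackerTypes.h"

void LogGazeIfAvailable()
{
    // Bail out early if no eye tracker is reported as connected.
    if (!UEyeTrackerFunctionLibrary::IsEyeTrackerConnected())
    {
        UE_LOG(LogTemp, Warning, TEXT("No eye tracker connected"));
        return;
    }

    FEyeTrackerGazeData GazeData;
    if (UEyeTrackerFunctionLibrary::GetGazeData(GazeData))
    {
        UE_LOG(LogTemp, Log, TEXT("Gaze origin: %s, direction: %s"),
               *GazeData.GazeOrigin.ToString(),
               *GazeData.GazeDirection.ToString());
    }
    else
    {
        // Same situation as the Blueprint node returning false.
        UE_LOG(LogTemp, Warning, TEXT("GetGazeData returned false"));
    }
}
```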
Is there a specific plist key that I should add? I added the ones below:
- NSHandsTrackingUsageDescription: Hand Usage
- NSMicrophoneUsageDescription: Mic Needed
- NSWorldSensingUsageDescription: World Sensing
- NSUserTrackingUsageDescription: User tracking needed
- NSCameraUsageDescription: User camera needed
anyone having this issue?
Sadly, eye-tracking and many other features are locked to Enterprise-approved folks only. So unless you’re a major company or a large research firm, that’s not gonna happen.
Eye-tracking is not supported whether Enterprise-approved or not. There are many parts of the API that we don’t have access to when using Metal as the renderer, but it’s our intention to support them if/when they become available to us.
Thank you for your answer @VictorLerp. I really hope we get the APIs soon; I’m confident it’s a feature the entire community is looking for.
Hi Victor, sorry, I have a quick question: is there any way to access RealityKit functionality inside Unreal? I can build, run, and package the project fine for AVP, but I would like the look that RealityKit provides when using Mixed immersion, specifically the IBL updates as you move around the environment. Is there something similar I can do inside Unreal, or is only baked lighting supported for now? @VictorLerp
Following a lot of the advice in here, I have the VR Template running on an AVP (UE 5.5.3 Blueprint, Xcode 16.2, macOS 15.3.1). I followed @Biggs’s Medium post, but I’m having two issues:
- I can’t seem to grab anything (I have the additional plist entry added)
- If I move more than ~2ft (in the VR template one big floor tile) in any direction, passthrough starts and I drop out of immersion. I can return to the origin and go back into full immersion.
I moved to a larger space, but the same thing happens there.
Help?
- There’s no “grab” functionality built in for hand tracking in the VR Template; you’d have to implement that yourself (see the sketch after this list). The template is recommended because it comes with pre-configured rendering settings that work better than the other templates for AVP.
- That’s how Apple’s Full Immersion Mode works: it has a boundary of 1.5 m. You can switch to Mixed Immersion Mode, which lets you move further; the quick start guide outlines how to do that, but in 5.5 and earlier versions you can only change it with a source build. In 5.6 and later versions we default to Mixed Immersion.
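For the grab point above, one common starting point (a hypothetical plain C++ sketch, not the template’s or Epic’s code) is to treat a thumb-tip/index-tip pinch as the grab input, with a little hysteresis so the grip doesn’t flicker right at the threshold:

```cpp
// Hypothetical pinch-grab state machine. Feed it thumb-tip and index-tip
// positions each frame from your hand-tracking data; it reports grab and
// release edges, with hysteresis between the two thresholds.
#include <cmath>

struct Vec3 { float X, Y, Z; };

static float Distance(const Vec3& A, const Vec3& B)
{
    const float dx = A.X - B.X, dy = A.Y - B.Y, dz = A.Z - B.Z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

class PinchGrabDetector
{
public:
    // Thresholds in centimeters; GrabThreshold < ReleaseThreshold gives hysteresis.
    explicit PinchGrabDetector(float GrabThreshold = 2.0f, float ReleaseThreshold = 4.0f)
        : Grab(GrabThreshold), Release(ReleaseThreshold) {}

    // Call once per frame. Returns +1 on the frame a grab starts,
    // -1 on the frame it ends, and 0 otherwise.
    int Update(const Vec3& ThumbTip, const Vec3& IndexTip)
    {
        const float PinchDistance = Distance(ThumbTip, IndexTip);
        if (!bGrabbing && PinchDistance < Grab)
        {
            bGrabbing = true;   // pinch closed: attach the nearest grabbable here
            return +1;
        }
        if (bGrabbing && PinchDistance > Release)
        {
            bGrabbing = false;  // pinch opened: detach or drop the held object
            return -1;
        }
        return 0;
    }

    bool IsGrabbing() const { return bGrabbing; }

private:
    float Grab;
    float Release;
    bool bGrabbing = false;
};
```

On a grab edge you would attach the overlapped grabbable actor to the hand’s transform, and detach it again on the release edge.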
Thanks for the info. I assumed that it was an Apple thing, but my google-fu was weak, as I couldn’t find it.
I did a source build earlier, but the built app threw a BAD_ACCESS violation (in GetRenderingXXXX (?), I don’t remember exactly). I’ll go back and try again. I was using the “release” branch; should I try a different one?
Hi, can Apple Vision Pro work with Unreal for AR features like QR scanning / image tracking? I found that Unreal hasn’t updated its ARKit support since 4.0.