Ok, some good and some bad…
Good: People Occlusion does work in UE 5.2.
Bad: UE 5.3 has the bug below, and I don't think it is usable for AR. https://forums.unrealengine.com/t/ue5-3-ar-actor-location-not-positioning-correctly-ios/1326768
So a few things to define or get straight.
1st, These images are an example of Person Occlusion, not “object occlusion.” The thing to understand is that Person Occlusion only works on “people,” or at least on whatever ARKit thinks is a person. It is usually pretty good for people within about 20 feet (6 meters) of the device, unless they are wearing unusual clothing, a backpack, or something like that.
2nd, People Occlusion does not use the Lidar sensor. It is based on the AI person-segmentation and depth-estimation models built into ARKit, which means it can run on any iOS device from the last 2-3 years.
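For reference, the native ARKit feature behind this is the person-segmentation frame semantic. This is a minimal Swift sketch of that underlying API (not Unreal code; UE enables the equivalent flag internally when person occlusion is turned on):

```swift
import ARKit

// Sketch: build a tracking configuration with person occlusion enabled.
// Support is gated by the chip (A12 or newer), not by Lidar.
func makePersonOcclusionConfig() -> ARWorldTrackingConfiguration? {
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
        return nil // older device: no person segmentation available
    }
    let config = ARWorldTrackingConfiguration()
    config.frameSemantics.insert(.personSegmentationWithDepth)
    return config
}
```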
3rd, It is possible to build object-occlusion geometry from what the Lidar sensor sees, but that does not actually work on people, since the Lidar seems to ignore them. The workflow to set up object occlusion is quite a bit more involved than Person Occlusion, and I believe there is at least one bug still in there from UE 5.1.
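The Lidar side of this is ARKit's scene reconstruction, which produces the mesh that object occlusion geometry is built from. A minimal native sketch (again plain ARKit/Swift, not the Unreal setup itself):

```swift
import ARKit

// Sketch: request a Lidar-based scene mesh. Only Lidar-equipped devices
// report support here, unlike person segmentation above.
func makeSceneMeshConfig() -> ARWorldTrackingConfiguration? {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return nil // no Lidar: no reconstructed mesh
    }
    let config = ARWorldTrackingConfiguration()
    config.sceneReconstruction = .mesh
    // People are largely absent from this mesh, which is why the
    // Lidar path does not occlude virtual content behind a person.
    return config
}
```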
4th, It’s unclear how much ARKit actually uses the Lidar for tracking. It is certainly using the Lidar to find the ground and assist with mapping the world, but the tracking feels like it is still mostly image-based planar feature tracking plus the IMU (accelerometers and gyros). To put it another way, I don’t see any difference in tracking quality between Lidar and non-Lidar devices, though Lidar devices are faster at scene mapping. In any case, all of that is handled inside ARKit before it hands the camera position over to Unreal, so there is nothing you do differently.
5th, Epic doesn’t seem to give AR workflows a lot of love, so don’t expect tutorials on them. But maybe with the eventual Vision Pro support they will get around to it.
-e