ARKit 4 in UE4.27 - How to enable lidar-based object occlusion?

I am trying to get real-world object occlusion working. I am testing on an iPad Pro (with lidar).

This works out of the box when viewing a USDZ on the device, but how do we enable this feature in Unreal? I can't seem to get it working.

Julian

Hi,
This was broken for a long time, but it works now; I tested it yesterday.


I tried that on an M1 Mac with Unreal 5.3 building for iOS, and it does not work. Any new insights? Thanks.

Will check it some time!

Hi guys,
I can't seem to figure out how to use Lidar with ARKit for the best tracking. I really hope there is a detailed tutorial somewhere, as I couldn't find one through a quick search online.

Ok, some good and some bad…

Good, is that People Occlusion does work in UE5.2

Bad, is that UE5.3 has the bug below and I don't think it is usable for AR. https://forums.unrealengine.com/t/ue5-3-ar-actor-location-not-positioning-correctly-ios/1326768

So a few things to define or get straight.

1st, These images are an example of Person Occlusion, not "object occlusion." The thing to understand is that Person Occlusion only works with "people," or at least what ARKit thinks is a person, which is usually pretty good for people within about 20 feet (6 meters) of the device, unless they are wearing funny clothing or a backpack or something.

2nd thing is that People Occlusion does not use the Lidar sensor. People Occlusion is based on the AI person segmentation and depth estimation built into ARKit. This means that it can run on any iOS device from the last 2-3 years.

3rd, It is possible to build object occlusion geometry based on what the Lidar sensor sees, but that actually doesn't work with people, as the Lidar seems to ignore people. The workflow to set up object occlusion is quite a bit more involved than doing Person Occlusion, and I believe there is at least one bug still in there from UE5.1.

4th, It's unclear how much ARKit actually uses the Lidar for tracking. It is certainly using the Lidar to find the ground and assist with mapping the world, but it feels like the tracking is still mostly image-based planar feature tracking plus the IMU (accelerometers and gyros). To put it another way, I don't see any difference in tracking quality between Lidar and non-Lidar-equipped devices. However, Lidar devices are faster at scene mapping. But in any case, all that is handled in ARKit before it hands the camera position over to Unreal, so there is nothing you need to do differently.

5th, Epic doesn't seem to give a lot of love to AR workflows, so don't expect tutorials on it. But maybe with the eventual Vision Pro support they will get around to it.

-e


Ok so here is some more info on getting Person Occlusion working.

The first thing to say is that it can be a bit finicky and will probably take some trial and error.

2nd thing to say is that People Occlusion is implemented via a PostProcess Material that is automatically applied. Since it's a material, it is affected by the rendering and shader settings in your Project Settings. Different settings will give you different results, so you'll have to try different things, which will be a bit of a pain.
Unfortunately, I can't say to just use a certain set of settings, because I've modified the PostProcess material and changed the way the People Occlusion is implemented. So what works for me may not work in the base AR template project; I can't remember.

In theory, it's pretty simple: find the AR Session Config data asset. In the AR Template this file is in Content > HandheldAR > D_ARSessionConfig. Make sure Use Person Segmentation for Depth is ticked on, and make sure the enabled Session Tracking Feature is set to Person Segmentation with Depth. Build the app, deploy to your device, and test it out. Hopefully, it works.
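If you're curious what this looks like from C++, it boils down to starting the session with that config asset; the asset, not code, carries the occlusion flags. A minimal sketch, assuming the AugmentedReality module is in your Build.cs dependencies and the asset path from the AR Template layout (your path may differ):

```cpp
#include "ARBlueprintLibrary.h"
#include "ARSessionConfig.h"

void StartPersonOcclusionSession()
{
    // Load the session config asset, which should have
    // "Use Person Segmentation for Depth" ticked and its
    // Session Tracking Feature set to Person Segmentation with Depth.
    UARSessionConfig* Config = LoadObject<UARSessionConfig>(
        nullptr, TEXT("/Game/HandheldAR/D_ARSessionConfig.D_ARSessionConfig"));
    if (Config != nullptr)
    {
        UARBlueprintLibrary::StartARSession(Config);
    }
}
```

The Blueprint equivalent is the Start AR Session node the AR Template already calls; either way, the settings live in the config asset.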

If it doesn't work, or gives you weird results, then here are a few things to try… In Project Settings > iOS, try adjusting the Metal renderer and shader version.

In the Mobile section try adjusting the Mobile Shading option.
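If you prefer editing config files, both of these settings end up in your project's Config/DefaultEngine.ini. A sketch of the relevant entries, with key names from memory of the iOS runtime and renderer settings (verify against your engine version; the values shown are just examples):

```ini
[/Script/IOSRuntimeSettings.IOSRuntimeSettings]
; Metal shader language version (0 = auto-pick the newest supported)
MetalLanguageVersion=0

[/Script/Engine.RendererSettings]
; Mobile shading path: 0 = forward, 1 = deferred
r.Mobile.ShadingPath=0
```

Editing the ini directly makes it easier to diff which combination of settings you tested between builds.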

As I mentioned, since this is a Post Process material effect, changes to the rendering and shading method can give different results, so you might need to try a bunch of different options. That's a pain, because you have to build and deploy to your device to test each one.

The last thing you can look at is the Post Process material itself. This is located in the AppleARKit plugin. You might be able to modify the material in the Engine plugin… not sure about that. Most folks will copy the plugin into their project and use it from there, which also gives you the ability to modify the actual C++ files, which you are going to need to do if you want to create a shipping build…

There is a Discord Server that some folks use that can be helpful.
https://discord.gg/w4vAEnjbaq

good luck
-e