How/why exactly does the amount of light in a room affect the tracking on a virtual camera (iPad)?

I received an iPad as a Christmas gift and have been absolutely loving it for my Unreal projects. For a while I was having issues with lag/rubber-banding when trying to record, and ultimately after multiple tests I found out it was simply caused by my room being too dark. As soon as I turned all my lights on, the tracking became perfect.

So I guess as my title says, why exactly is this the case? How does that work?

Something that’s important to note is that I’m using a standard iPad model, not the Pro, which has LiDAR.

Without LiDAR, the device has to rely on camera-based SLAM (visual-inertial tracking) to work out where it is. The tracker looks for high-contrast feature points (corners, edges, texture) in each camera frame and follows them from frame to frame. If the room is too dark, the camera essentially returns a black, noisy image with almost no contrast, so the tracker can’t find enough points to lock onto and you lose tracking. Turning the lights on gives the camera a well-exposed image full of trackable detail, which is why it suddenly worked.
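To make that concrete, here’s a toy numpy sketch (my own illustration, not Apple’s actual tracker: the Harris-style corner score, the threshold, and the synthetic images are all made up) showing how an underexposed frame yields far fewer detectable feature points than a well-lit one:

```python
import numpy as np

def box3(a):
    """Sum each pixel's 3x3 neighborhood (zero-padded at the borders)."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def count_corner_features(img, threshold=1e6):
    """Count pixels whose Harris corner response exceeds a threshold.

    A toy stand-in for the feature detector inside a SLAM pipeline --
    not what ARKit actually runs, just the general principle.
    """
    gy, gx = np.gradient(img.astype(np.float64))
    # Sum gradient products over a small window (the Harris structure tensor).
    sxx, syy, sxy = box3(gx * gx), box3(gy * gy), box3(gx * gy)
    # Harris response: det(M) - k * trace(M)^2, with the usual k = 0.04.
    response = (sxx * syy - sxy * sxy) - 0.04 * (sxx + syy) ** 2
    return int(np.sum(response > threshold))

rng = np.random.default_rng(0)
# A well-lit "scene": high-contrast texture spanning the full 0-255 range.
bright = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)
# The same scene underexposed: pixel values crushed into a narrow dark band
# and re-quantized to 8 bits, so almost all of the contrast is gone.
dark = (bright * 0.03).astype(np.uint8)

print("bright features:", count_corner_features(bright))
print("dark features:  ", count_corner_features(dark))
```

The dark frame is the same scene, just scaled down in brightness, yet nearly all of its corner responses fall below the detection threshold. That’s roughly what happens inside the tracker when your room lights are off.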

Very interesting, now I’m kind of wishing I got a Pro instead for better tracking. I’ve been getting decent results with my standard model though so I won’t kick myself too much over it :sweat_smile: thanks so much for the insight!