Environment: macOS 10.14, Xcode 11, iPhone XS Max & iPad Air with iOS 13.1.3
The screen tearing issue seems exclusive to 4.23 (and 4.23.1): the camera feed looks as if it is split into individual transparent boxes that don’t sync with each other (a bit like when vsync is not enabled).
stat unit shows the game running at 60 fps, however the camera feed is nowhere near 60 fps (more like 30 fps, even with the default AR template). Meanwhile this issue does not affect 4.22 and 4.21, so I wonder if it is something introduced with person segmentation or depth recognition in ARKit 3?
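One way to sanity-check the "game at 60 fps, camera at ~30 fps" observation is to count how many *distinct* camera images arrive per second versus how many render ticks happen. This is a generic sketch outside the engine (not UE4 API); a repeated frame timestamp means the engine re-used the previous camera image for that tick:

```cpp
#include <cassert>
#include <vector>

// Toy check: the game can tick at 60 Hz while the camera feed only
// delivers a new image every other tick (i.e. ~30 fps). Counting
// distinct per-tick frame timestamps separates the two rates.
int CountDistinctFrames(const std::vector<double>& frameTimestamps) {
    int distinct = 0;
    double last = -1.0;
    for (double t : frameTimestamps) {
        if (t != last) {          // new timestamp => a fresh camera frame
            ++distinct;
            last = t;
        }
        // equal timestamp => the old camera image was shown again
    }
    return distinct;
}
```

If 60 ticks in one second only ever see 30 distinct timestamps, the feed itself is running at half the render rate, which matches what stat unit cannot tell you on its own.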
Good to know I’m not alone. I suspect something is messed up with vsync, the tick sync, or depth segmentation. Turning on tick sync in the session config also helps a little. But most of the time the tearing seems to depend heavily on the surrounding brightness and scene complexity (which is why I was wondering if it’s related to the depth API). Anyway, I’ll submit a bug report later.
I dived into this issue earlier and it might be caused by something complicated related to the Metal shader fast-math option (although I might be totally wrong here). Long story short: if anyone is looking to use ARKit in UE, the current best practice would be to stay with 4.21 / 4.22 and Xcode 10.x until there are further updates on the issue tracker.
4.23 / 4.24 are not totally unusable, though. The final result oddly varies from device to device and between compilation runs, i.e. compile the same project 10 times and pick the build with the fewest artifacts (this might sound like total nonsense, but I guess it’s nothing new for people who have worked with Java before).
As for ARCore, I never managed to make it work since 4.20. For targeting Android, I sincerely think ARFoundation or Vuforia would be the better choice (for the foreseeable future), or directly using the source build from Google.
Thank you for your effort and sharing this.
However I’m afraid there has been a misunderstanding.
So your app is VR and does not use camera footage; that is an unusual case. Naturally, when talking about an Augmented Reality framework, everyone means the “real world” camera feed from the iOS device, not the virtual camera in the UE4 world that renders actors in a scene.
Oh, I’m sorry if I misunderstood. My app uses camera footage in the background for its tracking, like the Oculus Quest does.
Can you check this video to see whether it shows any of the problems you mentioned above?
I built and recorded it just now.
Thank you for the video.
Yes, the problem is visible in the video as well, for example at playtime 17-20 seconds (see the attached image, taken at 19 sec).
In the video (using an iPhone) there is sometimes a single tear in the picture, but on iPads and iPad Pros the camera image looks fractured like a broken chess-board the whole time. End users would send many complaints and leave bad reviews, so it is not usable for production.
Mobile hardware does not render in full frames, but in small tiles. The rendered tiles are then composed into one full image. It seems the engine has a problem in that composition step.
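To make the tile-composition idea concrete, here is a toy model (invented names, not engine or GPU code) of how composing tiles from two different camera frames produces exactly the "broken chess-board" look, while a properly synced compositor produces one coherent image:

```cpp
#include <cassert>
#include <vector>

// Toy model of tile-based rendering on mobile GPUs: the screen is
// rendered as small tiles that are composed into one full image.
struct Frame { int id; };   // stands in for one camera image

// Compose a full image from tiles. When `synced` is false, every other
// tile is taken from the previous (stale) frame -- the chess-board
// tearing artifact described above.
std::vector<int> ComposeTiles(const Frame& prev, const Frame& curr,
                              int tileCount, bool synced) {
    std::vector<int> image(tileCount);
    for (int t = 0; t < tileCount; ++t) {
        bool stale = !synced && (t % 2 == 1);
        image[t] = stale ? prev.id : curr.id;
    }
    return image;
}
```

With `synced = true` every tile comes from the current frame; with `synced = false` tiles alternate between two frames, so any motion between those frames shows up as mismatched blocks, which is what the videos appear to show.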
By now the issue has 19 votes and the target fix has been updated to 4.24.2, but as we learned last time, that is not reliable information, and Epic remains silent as always.
We are holding back production on an AR project because of this.
Even if this issue gets fixed, AR Environment Capture Probes are broken too.
AR - Environment Capture Probe Type: Automatic crash
Screen Renders Black in a Packaged AREnvProbe Project
AREnvProbe - App crash on iPad if automatic environment capture is enabled
In the future I will no longer trust release notes without testing whether the features are actually usable for production. This is very sad; so disappointed.
Here is a video with a comparison. On an iPad Pro 11 the problem is easier to see.
Besides the blocky tearing, the frame rate also does not seem as smooth as it should be.
The video compression makes it a bit blurry (the effect is stronger on an actual Retina device), but it can still be seen.