I have recently updated an AR (iOS) project from UE 5.1 to UE 5.3. It is now causing an issue where spawned actors end up close to the camera/origin instead of being positioned correctly.
The app works by:
Spawning a trace (using the AR Blueprint library)
Getting the trace's local-to-world transform
Spawning an actor at the location of the trace transform (see the C++ sketch below)
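Roughly, in C++ terms (names like AMyARPawn and PlacedActorClass are just placeholders, not my actual classes, and the trace options here are only assumptions):

// Minimal sketch of the Blueprint flow above. AMyARPawn and PlacedActorClass
// are hypothetical names.
#include "ARBlueprintLibrary.h"

void AMyARPawn::PlaceActorAtScreenPoint(const FVector2D& ScreenCoord)
{
	// 1) Spawn a trace against tracked AR geometry (AR Blueprint library).
	TArray<FARTraceResult> Hits = UARBlueprintLibrary::LineTraceTrackedObjects(
		ScreenCoord,
		/*bTestFeaturePoints=*/true,
		/*bTestGroundPlane=*/true,
		/*bTestPlaneExtents=*/true,
		/*bTestPlaneBoundaryPolygon=*/true);
	if (Hits.Num() == 0)
	{
		return;
	}

	// 2) Get the trace's local-to-world transform.
	const FTransform HitTransform = Hits[0].GetLocalToWorldTransform();

	// 3) Spawn the actor at the location of the trace transform.
	AActor* SpawnedActor = GetWorld()->SpawnActor<AActor>(PlacedActorClass,
		HitTransform.GetLocation(), HitTransform.Rotator());
}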
This was working in 5.1, but has stopped working in 5.3. Outputting the transform location gives correct values (ex. Wall 1 {-200, -50, 50}, Wall Opposite {+200, +50, 50}). The spawned actors have some sense of world, since I can move around and they'll become slightly smaller or larger (nowhere near as much as they used to), and their positions are correct relative to each other. However, instead of being placed on the wall, for example, they are placed closer to the camera origin.
The numbers on the left are the trace location and the numbers on the right are the spawned actor's location (from GetLocation after it is spawned).
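In placeholder terms (continuing the hypothetical names from the sketch above), the comparison looks like this:

// Compare the trace location with the location the actor reports after
// it has been spawned.
if (SpawnedActor != nullptr)
{
	UE_LOG(LogTemp, Log, TEXT("Trace: %s | Spawned: %s"),
		*HitTransform.GetLocation().ToString(),
		*SpawnedActor->GetActorLocation().ToString());
}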
Hi, I don't have a solution, but I wanted to confirm that I am also dealing with the same problem. I've tested multiple projects (including the handheld AR demo) on multiple devices, but tracking never works properly.
Same here on 5.3.2! No problem on Android; on iOS, however, the detected planes seem to be really close to the camera origin. Still trying to make sense of things. I looked at the issue tracker and didn't find anything as specific as this thread, so it doesn't seem like an official bug report has come through.
Looking for a workaround, or I'll be telling my team to stick to Android.
Looking at the source to see what might have introduced the issue.
@JeffreyFisher I noticed you've dealt with some commits for ARKit and tracking. Have you tested the AR template in UE 5.3, and does everything seem fine on your end?
I did a test using the default AR Template that comes with Unreal. Here it is in UE 5.2 and UE 5.3. You can see in UE 5.3 how the object floats around and doesn't "stick" properly.
This is on iOS 16
I'm on an iPhone 12 (iOS 16.6), and it refuses to place the object on any surface, just placing it near the camera origin because that is the only place the tracked plane shows up.
Yeah, my results are also much worse: in 5.3, tracking is completely broken and models are placed right in front of the camera and move around with it. Tested on an iPad Pro and an iPhone XR.
I have not tested anything with 5.3 in a while, but I did do some iOS HandheldAR template testing today with what will eventually become 5.4 (currently very similar to UE5/Main) and looked into this problem a bit.
I think the collision tests are working for me. However, with all of the options enabled on "Line Trace Tracked Objects" (Feature Points, Ground Plane, Plane Extents, Plane Boundary Polygon) I do get a confusing surplus of hits, and if I just use the first hit I'll usually get something closer to the camera than I wanted. In particular, Feature Points are perhaps not terribly useful for application developers: they are the points ARKit uses to orient itself, and the documentation only says that they "usually" are on real-world object surfaces.

After scanning around for a while I'm also ending up with multiple ground planes. That may be a bug; perhaps we need to throw away old ground planes when a new one is found?

If I cut it down to only "Test Plane Boundary Polygon" I get more useful results: a hit on the seat of the chair and a hit on the floor below the chair. We sort the results in LineTraceTrackedObjects, so the first hit is the hit nearest the phone and therefore usually the desired hit.
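In case it helps anyone reading along, a minimal sketch of that narrowed-down trace; ScreenCoord and the surrounding code are placeholders of mine:

// Sketch: test plane boundary polygons only and take the nearest hit.
// LineTraceTrackedObjects returns its hits sorted nearest-first.
TArray<FARTraceResult> Hits = UARBlueprintLibrary::LineTraceTrackedObjects(
	ScreenCoord,
	/*bTestFeaturePoints=*/false,        // ARKit-internal points, only "usually" on surfaces
	/*bTestGroundPlane=*/false,          // stale ground planes can accumulate
	/*bTestPlaneExtents=*/false,
	/*bTestPlaneBoundaryPolygon=*/true); // "Test Plane Boundary Polygon" only
if (Hits.Num() > 0)
{
	// Nearest hit to the phone; usually the desired surface.
	const FTransform Nearest = Hits[0].GetLocalToWorldTransform();
	// ... place content at Nearest ...
}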
I did attempt to play around with Line Trace Tracked Objects, as that was my go-to suspect for the issue I was experiencing.
If I recall correctly, my phone would print out only one hit from the array of hits, so I ruled out the possibility that there were more options to pick from (I too was hoping that maybe the hit array contained a good plane to choose).
I also tried filtering the options, unchecking each one and trying different combinations, but it didn't seem to make a difference.
But at least you too are noticing some potential bugs.
IDK if the iPhone device itself is playing a role here or not; LiDAR vs. vision-only or something like that.
In the end, for an iOS build that works, I just remade my project in 5.2 using the AR Template there, and tracking was just fine. I was lucky that I could do that, but I'm hoping the bugs get sorted out in a future UE version.
Hi everyone,
It is good (and not good) to see others facing the same issue. The positioning is not fixed at all. I ran the AR example test from Xcode and it is amazing, but in UE the positioning is terrible (and scene depth didn't work at all, then all of a sudden after a few days it started working!).
Hope there is some progress on your end.
I've done a few tests on this: 5.3 tracking is unusable. 5.2 tracking works well; however, when you create a build for TestFlight/App Store, the camera feed is black (attached image below). Does anyone know of a workaround for this issue?
Dang, was hoping it would get fixed in 5.4… Looks like we are either a whole engine version or a few hot patches away from being able to do iOS AR work outside of 5.2.
And for those who are wondering: there is no backward compatibility; going back from 5.3 to 5.2 doesn't go well. You either have to restart your project from scratch or put it on hold until a patch is released.
I did reproduce the issue where objects just weren't staying fixed to the real world. The following is a work-in-progress C++ fix that one could apply locally to 5.3 or 5.4. The problem is that the projection matrix FOVs were incorrect. They would have been correct on an older 16:9 iPhone in landscape, but are way off in portrait and somewhat off in landscape on a recent iPhone, which has a 19.5:9 aspect ratio, because we would always end up using a 16:9 aspect ratio from the CameraComponent (unless one modified the CameraComponent aspect ratio in Blueprint).
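As a rough illustration of the scale of the error (the numbers here are purely illustrative, not from the engine):

// Illustrative arithmetic only: for the same horizontal FOV, the vertical
// FOV implied by a 16:9 aspect ratio vs the 19.5:9 of a recent iPhone
// differs by several degrees. Uses tan(vFOV/2) = tan(hFOV/2) / aspect.
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main()
{
	const double Pi = 3.14159265358979323846;
	const double HFov = 60.0 * Pi / 180.0; // assumed 60-degree horizontal FOV
	for (double Aspect : {16.0 / 9.0, 19.5 / 9.0})
	{
		const double VFov = 2.0 * std::atan(std::tan(HFov * 0.5) / Aspect);
		std::printf("aspect %.2f -> vertical FOV %.1f degrees\n",
			Aspect, VFov * 180.0 / Pi);
	}
	return 0;
}

For a 60-degree horizontal FOV that comes out to roughly 36 vs 30 degrees of vertical FOV, so a projection built with the wrong aspect ratio visibly disagrees with the camera image.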
This fix has ARKit build the projection matrix itself (like ARCore or VR stereo rendering) rather than relying on the FOV/aspect ratio/letterboxing system used for non-AR/VR applications. For AR, we need the projection matrix to match the passthrough camera image.
AppleARKitCamera.h
Add:
void GetViewProjectionMatrix(EDeviceScreenOrientation DeviceOrientation, FSceneViewProjectionData& InOutProjectionData) const;
AppleARKitCamera.cpp
Add:
void FAppleARKitCamera::GetViewProjectionMatrix(EDeviceScreenOrientation DeviceOrientation, FSceneViewProjectionData& InOutProjectionData) const
{
	// Use the global viewport size as the screen size
	FVector2D ViewportSize;
	GEngine->GameViewport->GetViewportSize(ViewportSize);

	// Pick the FOV that matches the current device orientation, using the
	// Fill fit mode so the projection matches the passthrough camera image.
	float FOV = 0.0f;
	if (DeviceOrientation == EDeviceScreenOrientation::Portrait || DeviceOrientation == EDeviceScreenOrientation::PortraitUpsideDown)
	{
		// Portrait
		FOV = GetVerticalFieldOfViewForScreen(EAppleARKitBackgroundFitMode::Fill);
	}
	else
	{
		// Landscape
		FOV = GetHorizontalFieldOfViewForScreen(EAppleARKitBackgroundFitMode::Fill);
	}

	// Build a reversed-Z perspective matrix from the half-FOV (in radians),
	// the viewport dimensions, and a near clip plane at 10 units.
	InOutProjectionData.ProjectionMatrix = FReversedZPerspectiveMatrix(FMath::DegreesToRadians(FOV) * 0.5f, ViewportSize.X, ViewportSize.Y, 10.0f);
}
AppleARKitSystem.cpp
Update the following function:
virtual void SetupViewProjectionMatrix(FSceneViewProjectionData& InOutProjectionData) override
{
	// Have ARKit supply the projection matrix directly so it matches the
	// camera image, instead of the engine's default FOV/aspect-ratio path.
	if (ARKitSystem.GameThreadFrame.IsValid())
	{
		ARKitSystem.GameThreadFrame->Camera.GetViewProjectionMatrix(ARKitSystem.DeviceOrientation, InOutProjectionData);
	}
}
I tried out this fix and it looks to be working for me.
Your description of the base issue makes a lot of sense. I was able to make the modifications you describe here and build the HandheldAR project. The default asset seems to stick properly to the ground. I'll do some A/B testing with a 5.2 build of HandheldAR to see if there is any performance difference, but it looks like you've solved it.