UE5.3 AR Actor Location Not Positioning Correctly (iOS)

I have recently updated an AR (iOS) project from UE5.1 to UE5.3. This now causes an issue where spawned actors end up close to the camera/origin instead of being positioned correctly.

The app works by:

  1. Perform a line trace (using the AR Blueprint Library)
  2. Get the trace's local-to-world transform
  3. Spawn an actor at the location of that transform
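For reference, the C++ side follows the same three steps. Here is a minimal sketch (engine-dependent, so it won't compile outside a UE project; `AMyWallActor` is a hypothetical stand-in for my actual actor class):

```cpp
#include "ARBlueprintLibrary.h"

void SpawnAtTrace(UWorld* World, const FVector2D& ScreenPos)
{
	// 1. Line trace against tracked AR geometry (same call the Blueprint node uses)
	TArray<FARTraceResult> Hits = UARBlueprintLibrary::LineTraceTrackedObjects(
		ScreenPos,
		/*bTestFeaturePoints=*/false,
		/*bTestGroundPlane=*/false,
		/*bTestPlaneExtents=*/false,
		/*bTestPlaneBoundaryPolygon=*/true);
	if (Hits.Num() == 0)
	{
		return;
	}

	// 2. Local-to-world transform of the nearest hit (results come back sorted by distance)
	const FTransform HitTransform = Hits[0].GetLocalToWorldTransform();

	// 3. Spawn the actor at the trace location
	World->SpawnActor<AMyWallActor>(HitTransform.GetLocation(), FRotator::ZeroRotator);
}
```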

This was working in 5.1, but has stopped in 5.3. Outputting the transform location gives correct values (e.g. Wall 1 {-200, -50, 50}, Wall Opposite {+200, +50, 50}). The spawned actors have some sense of world space, since I can move around and they'll become slightly smaller or larger (nowhere near as much as they used to), and their positions are correct relative to each other. However, instead of being placed on the wall, for example, they are placed closer to the camera origin.

The numbers on the left are the trace locations and the numbers on the right are the spawned actors' locations (from GetLocation after the actor is spawned).

I also have similar C++ code that does not work either; it updates the actor's position after it has been spawned, with no success.

Any help would be appreciated!

AR on iOS in 5.3 has become needlessly difficult.


Hi, I don't have a solution, but I wanted to confirm that I am dealing with the same problem. I've tested multiple projects (including the Handheld AR demo) on multiple devices, but tracking never works properly.


Seeing the same problem in UE5.3.2…


Confirming that I'm seeing the same issue in my project, which I upgraded from 5.2.
-e

Oh wow, finally a sanity-check thread.

Same here on 5.3.2! Android has no problem; on iOS, however, detected planes seem to end up very close to the camera origin. Still trying to make sense of things. I looked at the issue tracker and didn't find anything as specific as this thread, so it doesn't seem like an official bug has been filed yet.

Looking for a workaround, or I'll be telling my team to stick to Android.

Looking at source to see what might have introduced the issue.

https://github.com/EpicGames/UnrealEngine/commit/210e0a879c6b28078431972d41e85fb8fc17eb2f

@JeffreyFisher Noticed you've dealt with some commits for ARKit and tracking. Have you tested the AR template in UE 5.3, and does everything seem fine on your end?


I did a test using the default AR Template that comes with Unreal. Here it is in UE 5.2 and UE 5.3. You can see how in UE 5.3 the object floats around and doesn't "stick" properly.
This is on iOS 16.

AR Template UE 5.2
[Dropbox - AR Template UE5.2_low.mp4]

AR Template UE 5.3
[Dropbox - AR Template UE5.3_low.mp4]


Your 5.3 results are way better than mine.

I'm on an iPhone 12 (iOS 16.6), and it refuses to place the object on any surface, just putting it near the camera origin, because that is the only place the tracked plane shows up.


Yeah, my results are also much worse. In 5.3, tracking is completely broken and models are placed right in front of the camera and move around with it. Tested on an iPad Pro and an iPhone XR.

I am doing a remote build from Windows, by the way.

Hello,

I have not tested anything with 5.3 in a while, but I did do some iOS HandheldAR template testing today with what will eventually become 5.4 (currently very similar to UE5/Main) and looked into this problem a bit.

I think the collision tests are working for me; however, with all of the options enabled on "Line Trace Tracked Objects" (Feature Points, Ground Plane, Plane Extents, Plane Boundary Polygon) I do get a confusing surplus of hits. If I just use the first hit, I'll usually get something closer to the camera than I wanted.

In particular, Feature Points are perhaps not terribly useful for application developers. They are the points ARKit uses to orient itself, and the documentation only says that they 'usually' are on real-world object surfaces. After scanning around for a while, I'm also ending up with multiple ground planes. That may be a bug; perhaps we need to throw away old ground planes when a new one is found?

If I cut it down to only "Test Plane Boundary Polygon" I get more useful results: a hit on the seat of the chair, and a hit on the floor below the chair. We sort the results in LineTraceTrackedObjects, so the first hit is the one nearest the phone and therefore usually the desired hit.
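In plain C++ terms, the filtering idea looks something like this (`TraceHit` and its fields are hypothetical stand-ins for illustration, not the engine's FARTraceResult API):

```cpp
#include <vector>

// Hypothetical stand-ins for the engine's trace-result types.
enum class HitKind { FeaturePoint, GroundPlane, PlaneExtents, PlaneBoundaryPolygon };

struct TraceHit {
	HitKind Kind;
	float DistanceCm;  // distance from the camera to the hit
};

// Keep only plane-boundary-polygon hits and return the one nearest the camera:
// ignore feature points entirely, then take the closest remaining hit.
const TraceHit* NearestBoundaryHit(const std::vector<TraceHit>& Hits)
{
	const TraceHit* Best = nullptr;
	for (const TraceHit& Hit : Hits)
	{
		if (Hit.Kind != HitKind::PlaneBoundaryPolygon)
		{
			continue;
		}
		if (!Best || Hit.DistanceCm < Best->DistanceCm)
		{
			Best = &Hit;
		}
	}
	return Best;
}
```

With a stray feature point at 40 cm, the chair seat at 120 cm, and the floor at 180 cm, this returns the 120 cm hit — the seat, not the feature point near the camera.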


Hey thanks for responding.

I did attempt to play around with Line Trace Tracked Objects, as that was my go-to suspect for the issue I was experiencing.

If I recall correctly, my phone would print out only one hit from the array of hits, so that made me rule out that there were more options to pick from (because I too was hoping that maybe the hit array contained a good plane to choose).

I also tried filtering there as well, unchecking each option and trying different combinations, but it didn't seem to make a difference.

But at least you too are noticing some potential bugs.

I don't know if the iPhone model itself plays a role here or not (LiDAR vs. vision-only, or something like that).

In the end, for an iOS build that works, I just remade my project in 5.2 using the AR Template there, and tracking was fine. I was lucky that I could do that, but I'm hoping the bugs get sorted out in a future UE version.

Hi everyone,
It is good (and not good) to see others facing a similar issue. The positioning is not fixed at all. I tried the AR example from Xcode and it is amazing, but in UE the positioning is terrible. (Of note: scene depth didn't work at all, but all of a sudden after a few days it started working!)
Hope there is some progress at your end.

Done a few tests on this: 5.3 tracking is unusable, while 5.2 tracking works well. However, when you create a 5.2 build for TestFlight/App Store, the camera feed is black (image attached below). Does anyone know a workaround for this issue?

There is a fix in 5.3 that can be backported into 5.2.

-e

Can confirm the actor location issue still persists in UE5.4 & UE5.3, while working in UE5.2…

Dang, I was hoping it would get fixed in 5.4… Looks like we are either a whole engine version or a few hotfixes away from being able to do iOS AR work outside of 5.2.

Curious: does this affect Vision Pro?

Same issue in 5.4.
Does anybody know if there is a UE bug report filed for this frustrating problem?

And for those who are wondering: there is no backward compatibility, so going back from 5.3 to 5.2 doesn't go well. You either have to restart your project from scratch or put it on hold until a patch is released.

This has finally shown up in the Epic Issue Tracker.

Go over and vote it up.

-e


Hello,

I did reproduce the issue where objects just weren't staying fixed to the real world. The following is a work-in-progress C++ fix that one could apply locally to 5.3 or 5.4. The problem is that the projection matrix FOVs were incorrect. They would have been correct on an older 16:9 iPhone in landscape, but they are way off in portrait and somewhat off in landscape on a recent iPhone with its 19.5:9 aspect ratio, because we would always end up using a 16:9 aspect ratio from the CameraComponent (unless one modified the camera component's aspect ratio in Blueprint).

This fix has ARKit build the projection matrix itself (like ARCore or VR stereo rendering does) rather than relying on the FOV/aspect-ratio/letterboxing system used for non-AR/VR applications. For AR, the projection matrix needs to match the passthrough camera image.
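The size of the error is easy to check with standalone math (illustrative numbers, not engine code): for a fixed horizontal FOV, the implied vertical FOV satisfies tan(vFOV/2) = tan(hFOV/2) / aspect.

```cpp
#include <cmath>

// Vertical FOV (in degrees) implied by a horizontal FOV at a given
// width:height aspect ratio: tan(vFOV/2) = tan(hFOV/2) / aspect.
double VerticalFovDeg(double HFovDeg, double Aspect)
{
	const double Pi = std::acos(-1.0);
	const double HalfHRad = HFovDeg * Pi / 360.0;  // half of hFOV, in radians
	return 2.0 * std::atan(std::tan(HalfHRad) / Aspect) * 180.0 / Pi;
}
```

With an illustrative 60° horizontal FOV, assuming 16:9 gives about 36.0° of vertical FOV, while the phone's actual 19.5:9 aspect gives about 29.9° — roughly a six-degree mismatch, which is plenty to make placed objects drift off their real-world anchors.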

AppleARKitCamera.h
Add:
	void GetViewProjectionMatrix(EDeviceScreenOrientation DeviceOrientation, FSceneViewProjectionData& InOutProjectionData) const;


AppleARKitCamera.cpp
Add:

void FAppleARKitCamera::GetViewProjectionMatrix(EDeviceScreenOrientation DeviceOrientation, FSceneViewProjectionData& InOutProjectionData) const
{
	// Use the global viewport size as the screen size
	FVector2D ViewportSize;
	GEngine->GameViewport->GetViewportSize( ViewportSize );
	
	float FOV = 0.0f;
	if (DeviceOrientation == EDeviceScreenOrientation::Portrait || DeviceOrientation == EDeviceScreenOrientation::PortraitUpsideDown)
	{
		// Portrait
		FOV = GetVerticalFieldOfViewForScreen(EAppleARKitBackgroundFitMode::Fill);
	}
	else
	{
		// Landscape
		FOV = GetHorizontalFieldOfViewForScreen(EAppleARKitBackgroundFitMode::Fill);
	}
	
	// FReversedZPerspectiveMatrix takes the half-FOV in radians, the viewport width and height, and the near-plane distance (10 cm here)
	InOutProjectionData.ProjectionMatrix = FReversedZPerspectiveMatrix(FMath::DegreesToRadians(FOV) * 0.5f, ViewportSize.X, ViewportSize.Y, 10.0f);
}

AppleARKitSystem.cpp
Update the following function:

	virtual void SetupViewProjectionMatrix(FSceneViewProjectionData& InOutProjectionData) override
	{
		if (ARKitSystem.GameThreadFrame.IsValid()) 
		{
			ARKitSystem.GameThreadFrame->Camera.GetViewProjectionMatrix(ARKitSystem.DeviceOrientation, InOutProjectionData);
		}
	}

I tried out this fix and it looks to be working for me. :star_struck:

Your description of the base issue makes a lot of sense. I was able to make the modifications you describe here and build the Handheld AR project. The default asset seems to stick properly to the ground. I'll do some A/B testing with a 5.2 build of HandheldAR to see if there is any performance difference, but it looks like you've solved it.

thanks so much!
-e
