I’m working on image recognition in AR, and I got it working with no problems using ARCore on Android devices. Unfortunately, on ARKit / iOS I don’t get any ARTrackedImage objects back at all.
Screenshot below shows a basic testing setup.
The nodes at the top, connected to Event BeginPlay, start the AR session with my ARSessionConfig and confirm that the candidate images are present.
The nodes below, connected to Event Tick, debug the image tracking on ARKit.
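In case it helps with debugging, here’s a rough C++ equivalent of that graph. This is only a sketch: the actor class, property name, and log messages are my own, and it assumes the AugmentedReality module is listed in your Build.cs dependencies.

```cpp
// Minimal sketch of the Blueprint above in C++ (UE 4.27, AugmentedReality module).
// "SessionConfig" is a placeholder UPROPERTY pointing at the ARSessionConfig asset.
#include "GameFramework/Actor.h"
#include "ARBlueprintLibrary.h"
#include "ARSessionConfig.h"
#include "ARTrackable.h"
#include "ImageTrackingDebugActor.generated.h"

UCLASS()
class AImageTrackingDebugActor : public AActor
{
    GENERATED_BODY()

public:
    AImageTrackingDebugActor() { PrimaryActorTick.bCanEverTick = true; }

    // Assign the ARSessionConfig data asset in the editor.
    UPROPERTY(EditAnywhere, Category = "AR")
    UARSessionConfig* SessionConfig = nullptr;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (SessionConfig)
        {
            // Equivalent of the BeginPlay nodes: start the session and
            // confirm the candidate images made it into the config.
            UARBlueprintLibrary::StartARSession(SessionConfig);
            UE_LOG(LogTemp, Log, TEXT("Candidate images in config: %d"),
                   SessionConfig->GetCandidateImageList().Num());
        }
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // Equivalent of the Tick nodes: poll all tracked geometries and
        // log any tracked images the session has detected.
        for (UARTrackedGeometry* Geometry : UARBlueprintLibrary::GetAllGeometries())
        {
            if (UARTrackedImage* Image = Cast<UARTrackedImage>(Geometry))
            {
                const UARCandidateImage* Candidate = Image->GetDetectedImage();
                UE_LOG(LogTemp, Log, TEXT("Tracked image: %s"),
                       Candidate ? *Candidate->GetFriendlyName() : TEXT("unknown"));
            }
        }
    }
};
```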
Does anyone know why I would not get any output at all from the ARTrackedImage class?
I was on iOS 14.x; I updated to 15.4 and it’s still not working.
I use it on an iPad 6th gen, and someone tested it on an iPhone 12 (I don’t know which iOS version).
Everything in the ARSessionConfig data asset is set to default except:
Track Scene Objects
Use Person Segmentation for Occlusion
Candidate Images (I have 3; see the sketch after this list for a way to double-check them)
I use the World session type (but it worked with the Image session type too)
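One thing worth double-checking on the ARKit side: reference images need a sensible real-world physical size, and candidates with a bad or zero size can silently fail to match. Here’s a small sketch for dumping what each candidate actually reports (the function name is mine, not from the project):

```cpp
// Sketch: dump each candidate image's name and physical size (cm).
// ARKit needs a real physical size on reference images, so zeros here
// would be one explanation for detections never firing.
#include "ARSessionConfig.h"
#include "ARTypes.h"

void DumpCandidateImages(const UARSessionConfig* Config)
{
    for (const UARCandidateImage* Candidate : Config->GetCandidateImageList())
    {
        UE_LOG(LogTemp, Log, TEXT("%s: %.1f x %.1f cm"),
               *Candidate->GetFriendlyName(),
               Candidate->GetPhysicalWidth(),
               Candidate->GetPhysicalHeight());
    }
}
```

As far as I understand, in a World session the candidates end up as ARKit detection images, while the Image session type uses ARKit’s dedicated image tracking configuration, so it’s worth testing both.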
Thanks for your responses @Pierre.W - Much appreciated.
Finally, I fixed my issue by rebuilding the project from scratch, as documented here:
Then I used the same Blueprint I posted above (as the Level Blueprint) to verify that image tracking was working.
I’m not sure what caused the issue before, but I was working from the AR Template that ships with 4.27. Since creating the project from scratch, I haven’t had any issues.