We have an app that uses Ogre3D for rendering and the native ARKit API on iOS. We’re porting it to UE4 and have run into a few issues:
I’m not using the LiveLink-based implementation from the sample app, since I need access to the full vertex buffer with all the points coming from ARKit. Instead I’m calling UARBlueprintLibrary::GetAllGeometries() and simply taking the first valid geometry, roughly as in the sketch below.
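A minimal sketch of what I’m doing (I’m assuming the geometry I want is the ARKit face mesh, hence the UARFaceGeometry cast; the exact getter names may differ by engine version):

```cpp
#include "ARBlueprintLibrary.h"
#include "ARTrackable.h"

// Find the first tracked ARKit face geometry in the current session.
UARFaceGeometry* GetFirstTrackedFaceGeometry()
{
    TArray<UARTrackedGeometry*> Geometries = UARBlueprintLibrary::GetAllGeometries();
    for (UARTrackedGeometry* Geometry : Geometries)
    {
        if (UARFaceGeometry* Face = Cast<UARFaceGeometry>(Geometry))
        {
            if (Face->GetTrackingState() == EARTrackingState::Tracking)
            {
                return Face;
            }
        }
    }
    return nullptr;
}

// Pull the raw vertex/index buffers out of the tracked geometry each frame.
void UpdateFromFaceGeometry(UARFaceGeometry* Face)
{
    if (!Face)
    {
        return;
    }
    const TArray<FVector>& Vertices = Face->GetVertexBuffer(); // local-space ARKit face vertices
    const TArray<int32>& Indices   = Face->GetIndexBuffer();
    // ...feed these into our own mesh component here...
}
```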
- There seems to be a lag issue where the tracking appears to be smoothed out in Unreal. Is UE4 applying noise correction/smoothing to the tracking data coming from iOS? If so, is there a way to access the original, non-smoothed vertex buffer?
- I can’t seem to find a way to access the camera matrix via the UE4 API. I need it because I have to map the original camera texture onto the mesh. I tried recreating the iPhone camera manually, but I can’t line it back up, since the scaling and world-space conventions in UE4 seem to be different (see the sketch after this list for what I’m trying to reproduce).
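To make the second point concrete, this is roughly the projection I’m trying to do on the UE4 side. Fx/Fy/Cx/Cy and the image resolution are placeholders for the values ARKit’s camera intrinsics matrix provides natively, which is exactly what I can’t find through the UE4 API. The sketch assumes the vertex has already been transformed into ARKit’s camera frame (X right, Y up, camera looking down -Z, in metres), and that conversion from UE4 world space is where things stop lining up for me:

```cpp
#include "CoreMinimal.h"

// Pinhole projection of a camera-space vertex into normalized UVs of the
// captured camera image. Fx/Fy/Cx/Cy are the intrinsics in pixels.
bool ProjectToCameraUV(const FVector& VertexInCameraSpace,
                       float Fx, float Fy, float Cx, float Cy,
                       float ImageWidth, float ImageHeight,
                       FVector2D& OutUV)
{
    // ARKit's camera looks down -Z; points behind it can't be textured.
    const float Depth = -VertexInCameraSpace.Z;
    if (Depth <= KINDA_SMALL_NUMBER)
    {
        return false;
    }

    // Standard pinhole projection into pixel coordinates of the camera image.
    const float PixelX = Fx * (VertexInCameraSpace.X / Depth) + Cx;
    const float PixelY = Fy * (-VertexInCameraSpace.Y / Depth) + Cy;

    // Normalise against the camera image resolution to get 0..1 UVs.
    OutUV = FVector2D(PixelX / ImageWidth, PixelY / ImageHeight);
    return true;
}
```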