ARKit Facetracking lag and Camera Matrix

We have an app that uses Ogre3D for rendering and uses the native ARKit API on iOS. We’re porting that to UE4 and have run into a few issues:

I’m not using the LiveLink-based implementation from the sample app, since I need access to the full vertex buffer with all the points coming from ARKit. I’m calling UARBlueprintLibrary::GetAllGeometries() and simply taking the first valid one.
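For reference, this is roughly how I’m grabbing the geometry. A minimal sketch — `GetAllGeometries()` and `GetTrackingState()` are the real UE4 API, but the exact validity check you want may differ:

```cpp
#include "ARBlueprintLibrary.h"
#include "ARTrackable.h"

// Return the first geometry ARKit currently reports as actively tracked.
UARTrackedGeometry* GetFirstValidGeometry()
{
    for (UARTrackedGeometry* Geo : UARBlueprintLibrary::GetAllGeometries())
    {
        if (Geo && Geo->GetTrackingState() == EARTrackingState::Tracking)
        {
            return Geo;
        }
    }
    return nullptr; // nothing tracked this frame
}
```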

  1. There seems to be a lag issue where the tracking appears to be smoothed out in Unreal. Is UE4 applying noise correction to the tracking data coming from iOS? If so, is there a way to access the original, non-laggy vertex buffer?
  2. I can’t seem to find a way to access the camera matrix via the UE4 API. I need this because I need to map the original camera texture onto the mesh. I tried recreating the iPhone camera manually but can’t line it back up, since the scaling and world-space conventions seem to be different in UE4.
  1. No smoothing. We get frame updates from ARKit and submit those to the game thread.
  2. There’s a material node to sample the pass-through camera. Otherwise you can use WorldToScreen on the Canvas (iirc) to figure that out.
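If the Canvas route works for you, one concrete way to get a world-to-screen projection without the Canvas is `APlayerController::ProjectWorldLocationToScreen`. A sketch of projecting the face-mesh vertices into screen space to build UVs for the camera texture — the viewport-size normalization and the helper name are my assumptions, not an official recipe:

```cpp
#include "GameFramework/PlayerController.h"

// Project world-space face vertices to normalized screen UVs so the
// pass-through camera texture can be mapped back onto the mesh.
void BuildScreenUVs(APlayerController* PC,
                    const TArray<FVector>& WorldVerts,
                    FVector2D ViewportSize,
                    TArray<FVector2D>& OutUVs)
{
    OutUVs.Reset(WorldVerts.Num());
    for (const FVector& V : WorldVerts)
    {
        FVector2D ScreenPos;
        if (PC->ProjectWorldLocationToScreen(V, ScreenPos))
        {
            OutUVs.Add(ScreenPos / ViewportSize); // normalize to 0..1 UV space
        }
        else
        {
            OutUVs.Add(FVector2D::ZeroVector); // vertex behind the camera
        }
    }
}
```

Note the projected UVs are in UE4’s viewport space, so you may still need to account for the camera image’s aspect ratio and orientation relative to the viewport.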