Unreal support using Antilatency for mixed reality

Can someone please put together a thorough tutorial on implementing Antilatency with Unreal for mixed reality filmmaking (using 1-3 camera feeds and a green screen)? There are a few good tutorials out there, but most or all of them end with Unreal dropping the Antilatency tracking and falling back to a PIE camera that isn't driven by the Antilatency BP_AltSimplePawn. The only decent tutorials I've been able to find were made by people using either a Vive or an iPhone/iPad for tracking.
Another issue I've run into: the SDK provided by Antilatency is great, a drag-and-drop setup that only takes minutes to get going, and the tracking is hyper-accurate, but the virtual camera that the plugin's pawn controls doesn't expose the settings a Cine Camera Actor provides. Settings like sensor size and focal length are crucial for matching the real lens. It seems like the Antilatency team fell a little short on the Unreal integration.
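For what it's worth, one workaround I've seen suggested (untested sketch, so treat it as a starting point, not a confirmed fix) is to stop rendering through the pawn's built-in camera entirely: spawn a stock CineCameraActor, attach it to the tracked BP_AltSimplePawn so it inherits the Antilatency transform, and force the player controller to view through it. That gets you the full filmback/focal length settings while keeping the tracking, and Set View Target also stops PIE from grabbing its own default camera. Roughly, in the Level Blueprint:

```
// Level Blueprint sketch (assumes BP_AltSimplePawn is already placed in the level)
Event BeginPlay
  -> Spawn Actor from Class (class: CineCameraActor)      // gives you Filmback / Focal Length settings
  -> AttachToActor (target: BP_AltSimplePawn ref,
                    Location/Rotation Rule: Snap to Target) // camera now follows the Antilatency tracking
  -> Get Player Controller (index 0)
  -> Set View Target with Blend (new target: the spawned CineCameraActor)
```

Then set sensor size and focal length on the CineCameraActor to match your physical camera. No idea if the Antilatency folks have an official way to do this, which is exactly why a proper tutorial would help.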