Unreal support using Antilatency for mixed reality

Unreal / Antilatency … Can someone please create a thorough tutorial on how to implement Antilatency with Unreal for mixed reality filmmaking (using 1-3 camera feeds and a greenscreen)? There are a few good tutorials out there, but most or all of them end up with Unreal dropping the Antilatency tracking and falling back to a PIE camera that is not driven by the Antilatency BP_AltSimplePawn. The only decent tutorials I’ve been able to find were made by people using either a Vive or an iPhone/iPad for tracking.
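In case it helps anyone debugging the PIE symptom: it usually means Play-In-Editor is spawning the project’s default pawn (and its camera) instead of possessing the tracked pawn. An untested sketch of one fix in UE4 C++ is below; the content path to BP_AltSimplePawn is a placeholder for wherever the SDK put it in your project. (The Blueprint-only equivalent is setting the pawn’s Auto Possess Player to Player 0, or picking it as the Default Pawn Class in your GameMode.)

```cpp
#include "CoreMinimal.h"
#include "GameFramework/GameModeBase.h"
#include "GameFramework/Pawn.h"
#include "UObject/ConstructorHelpers.h"
#include "MRTrackingGameMode.generated.h"

UCLASS()
class AMRTrackingGameMode : public AGameModeBase
{
	GENERATED_BODY()

public:
	AMRTrackingGameMode()
	{
		// Make PIE spawn and possess the Antilatency sample pawn instead of
		// the engine's DefaultPawn. The content path below is a placeholder;
		// point it at wherever BP_AltSimplePawn lives in your project.
		static ConstructorHelpers::FClassFinder<APawn> AltPawn(
			TEXT("/AntilatencySDK/BP_AltSimplePawn"));
		if (AltPawn.Succeeded())
		{
			DefaultPawnClass = AltPawn.Class;
		}
	}
};
```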
Another issue I’ve found: the SDK provided by Antilatency is great, a drag-and-drop setup that only takes minutes to get going, and the tracking is hyper-accurate. But the virtual camera the plugin’s pawn controls does not expose the settings a CineCameraActor provides - settings like sensor size and focal length (lens) are crucial. It seems the Antilatency team fell a little short on the Unreal integration.
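One workaround that should be possible (untested sketch, UE4 C++): keep the Antilatency pawn for tracking, and copy its transform onto a real CineCameraActor every frame, so you get the full filmback/lens settings. UFollowTrackedPawn and its TrackedPawn property are made-up names here, and the filmback numbers are just example Super 35 values.

```cpp
#include "CoreMinimal.h"
#include "CineCameraActor.h"
#include "CineCameraComponent.h"
#include "Components/ActorComponent.h"
#include "FollowTrackedPawn.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UFollowTrackedPawn : public UActorComponent
{
	GENERATED_BODY()

public:
	// Assign the Antilatency-tracked pawn (e.g. BP_AltSimplePawn) in the editor.
	UPROPERTY(EditAnywhere, Category="Tracking")
	AActor* TrackedPawn = nullptr;

	UFollowTrackedPawn()
	{
		PrimaryComponentTick.bCanEverTick = true;
		// Tick late so the pose is copied after the tracker has updated it.
		PrimaryComponentTick.TickGroup = TG_PostUpdateWork;
	}

	virtual void BeginPlay() override
	{
		Super::BeginPlay();
		// Cinema settings live on the CineCameraComponent, not on the pawn.
		if (ACineCameraActor* Cine = Cast<ACineCameraActor>(GetOwner()))
		{
			UCineCameraComponent* Cam = Cine->GetCineCameraComponent();
			// Named FilmbackSettings on engine versions before 4.24.
			Cam->Filmback.SensorWidth = 23.76f;   // example: Super 35 (mm)
			Cam->Filmback.SensorHeight = 13.365f; // example: Super 35 (mm)
			Cam->SetCurrentFocalLength(35.f);     // example 35mm lens
		}
	}

	virtual void TickComponent(float DeltaTime, ELevelTick TickType,
	                           FActorComponentTickFunction* ThisTickFunction) override
	{
		Super::TickComponent(DeltaTime, TickType, ThisTickFunction);
		if (TrackedPawn)
		{
			// Owner is expected to be the CineCameraActor placed in the level.
			GetOwner()->SetActorTransform(TrackedPawn->GetActorTransform());
		}
	}
};
```

Attach the component to a CineCameraActor, point TrackedPawn at the BP_AltSimplePawn instance, and set that cine camera as the view target (or pilot it from Composure).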

Hey there,

Sorry I can’t answer your question, but I have the exact same issue. I am looking to composite my virtual world and the real-life camera feed together in UE4 and then send it out via an SDI cable.

Did you manage to get any answers?

Hi @anonymous_user_b6d88ace! Is this still a valid request? I could break it up into parts and answer if you like.

Hi, I would love a breakdown of using Antilatency, if you don’t mind. Thank you.

Apologies for the delay - I am caught up with something myself these days.
Unfortunately I am not set up to make a video tutorial, but from your post I understand you have a good background, so at the risk of giving a silly answer to a reasonable question: have you tried compiling the SDK as a VRPN subject and adding that as a LiveLinkController component to a CineCameraActor?
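To flesh that out a little: assuming the Antilatency side is already publishing a transform subject into LiveLink (via VRPN or a custom source), the Unreal side could look roughly like this untested sketch. “AntilatencyTracker” is a placeholder subject name, and the exact controller API varies between UE4 versions (older versions expose a ComponentToControl property instead of SetControlledComponent()).

```cpp
#include "CineCameraActor.h"
#include "LiveLinkComponentController.h"
#include "Roles/LiveLinkTransformRole.h"

// Attach a LiveLink controller to a CineCameraActor at runtime and point it
// at the tracker's transform subject, so the cine camera (with its sensor
// size and focal length settings) follows the Antilatency pose.
void AttachLiveLinkToCineCamera(ACineCameraActor* CineCam)
{
	ULiveLinkComponentController* Controller =
		NewObject<ULiveLinkComponentController>(CineCam, TEXT("AltLiveLink"));
	Controller->RegisterComponent();

	// "AntilatencyTracker" is a placeholder -- use whatever subject name
	// your VRPN/LiveLink source actually publishes.
	FLiveLinkSubjectRepresentation Subject;
	Subject.Subject = FName(TEXT("AntilatencyTracker"));
	Subject.Role = ULiveLinkTransformRole::StaticClass();
	Controller->SubjectRepresentation = Subject;

	// Drive the actor's root so the whole CineCameraActor follows the tracker.
	// (On 4.23-4.24 set the ComponentToControl property instead.)
	Controller->SetControlledComponent(CineCam->GetRootComponent());
}
```

You’d also need the LiveLink plugin enabled and "LiveLinkComponents" in your module’s build dependencies; the same setup can be done entirely in the editor by adding a LiveLinkComponentController to the camera in the Details panel.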