Does anyone know what motion-capture/camera-tracking software or solutions are compatible with [or designed to work with] any of the Blackmagic Cinema Cameras?
The application would be the first step in a production workflow to achieve real-time live compositing of live action shot against a green screen, using Unreal Engine or similar 3D software, rendered on the fly to an HDMI output.
I'd also mention that many of these systems seem to be aimed at "VR" applications, but that gets confusing because it often means 360-degree VR output for game use, Oculus headsets, and the like.
My output needs are 2D.
I'm looking for an off-the-shelf system that takes the live-action green-screen image, syncs it to a 3D engine's internal camera [Unreal Engine - the virtual set], then composites and renders it in real time to a 2D HDMI output, for both live monitoring and broadcast-quality, editable footage.
Basically I'm looking for the live-action compositing scenario, but with the camera-side motion tracking as part of the application. Composure looks like it could be part of the workflow, but many elements of that workflow just aren't made clear. I've looked through the Composure quick start and the tutorial videos, and I'm not finding the components needed for my specific application.
I am trying to find out if any users on the forum have been able to accomplish this workflow with the whole chain included: live camera [which camera and lens] - motion capture [which software] - into Unreal Engine - out to a keyer [Composure?] - then HDMI output to both monitors and an HD recording device.
The live camera shoots the live action against a green screen, the motion-capture system tracks the camera's movement so it matches the 3D engine's internal camera, and a keyer then composites the two, so the live action lines up with the 3D virtual set.
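For what it's worth, the "tracking data into the engine camera" step in chains like this is often done over the FreeD protocol, a simple UDP format that many tracking systems can emit and that Unreal's Live Link can ingest. As a rough illustration of what that data actually carries (this is a hedged sketch based on the published FreeD "D1" message layout, not any vendor's implementation - verify field order and scaling against your tracker's docs):

```python
def _s24(b: bytes) -> int:
    """Interpret 3 bytes as a big-endian signed 24-bit integer."""
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & 0x800000 else v

def parse_freed_d1(pkt: bytes) -> dict:
    """Parse a 29-byte FreeD type-D1 packet into angles (degrees) and position (mm).

    Assumed layout: 0xD1, camera ID, pan/tilt/roll (24-bit signed,
    15-bit fraction), X/Y/Z position (24-bit signed, 6-bit fraction),
    zoom/focus (raw encoder counts), 2 spare bytes, checksum.
    """
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    # Checksum: 0x40 minus the sum of the first 28 bytes, modulo 256.
    if (0x40 - sum(pkt[:28])) % 256 != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        "pan":   _s24(pkt[2:5])   / 32768.0,  # degrees
        "tilt":  _s24(pkt[5:8])   / 32768.0,
        "roll":  _s24(pkt[8:11])  / 32768.0,
        "x":     _s24(pkt[11:14]) / 64.0,     # millimetres
        "y":     _s24(pkt[14:17]) / 64.0,
        "z":     _s24(pkt[17:20]) / 64.0,
        "zoom":  int.from_bytes(pkt[20:23], "big"),
        "focus": int.from_bytes(pkt[23:26], "big"),
    }
```

The point being: the tracker only sends pose plus lens encoder values per frame; the genlock/timecode sync between that stream, the camera's SDI/HDMI feed, and the engine render is the part that off-the-shelf systems package up for you.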