How to use Composure with VIVE tracker to move the camera?

A summary of what I’m trying to accomplish, my setup, where I am so far, and what I could use help with:

Background: I’m a filmmaker/VFX artist trying to use virtual production via my Vive to track the camera, for virtual scouting, and for real-time comping of actors on a green-screen stage as a reference on set. I then plan on rendering out frames and doing more polished compositing in AE/Nuke later, since I’m not confident in my ability to pull a quality key/composite via Blueprints. I’m hoping to do the indie DIY version of virtual production, with no budget, for a web series/passion project. I’m a UE newbie, so bear with me please.

Setup: I’m using a Blackmagic Production Camera 4K through a Blackmagic Mini Recorder 4K via an SDI cable. I’m also using a Vive, and I currently have one tracker puck mounted to my camera.

Where I’m at in this process: I’ve managed to get a Composure setup working with a camera tracked via the Vive tracker. The green key is pretty rough; for some reason my camera feed is very noisy in UE4 despite looking crystal clear in Blackmagic Media Express. This is fine for me, since the key is just a reference for my purposes, but is this normal?

I’m also bringing in timecode and genlocking UE’s output frames to the BM camera, which is great. I’m not jam-syncing timecode via any other hardware, so I’m stuck with the Time of Day timecode my BM camera produces, but I’m mostly hoping this smooths out the tracking discrepancies between UE and the camera footage. Because I’m rendering frames out later, I’m recording the camera movement via Take Recorder and then rendering it out after the shot has been recorded IRL. But this causes some issues (see below). I also tried to calibrate my setup via the MRCalibration tool, but it doesn’t recognize my BM camera (through the BM Mini Recorder) as a viable input. I haven’t gotten to lens undistortion yet. Which leads me to…

Obstacles:

  • How do you calibrate without the MRCal tool, and how do you account for rig offsets and align everything? I’m currently using an Empty Actor that takes the transform data from the Vive tracker via BPs, and I child my Comp camera to that, then manually tweak the offset transform until my controllers roughly line up with my camera feed. It works, but it’s very rough and tedious as hell. Any tips here for faster and better results?
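
    For what it’s worth, my understanding is that the offset I’m dialing in by hand is just one fixed rigid transform between the tracker puck and the camera body, so in principle it could be solved from a single lined-up pose instead of eyeballed. A minimal sketch of that math in plain Python (4×4 row-major matrices; all function names here are mine for illustration, not UE API calls):

    ```python
    # Sketch of tracker->camera offset calibration. Assumes you can sample the
    # tracker's world transform and the camera's world transform once, at a
    # moment when the virtual camera is manually lined up with the real one.

    def mat_mul(a, b):
        """Multiply two 4x4 matrices given as nested lists."""
        return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
                for i in range(4)]

    def mat_inv_rigid(m):
        """Invert a rigid transform (rotation + translation only)."""
        r = [[m[j][i] for j in range(3)] for i in range(3)]  # transpose rotation
        t = [m[i][3] for i in range(3)]
        inv_t = [-sum(r[i][j] * t[j] for j in range(3)) for i in range(3)]
        return [r[0] + [inv_t[0]],
                r[1] + [inv_t[1]],
                r[2] + [inv_t[2]],
                [0.0, 0.0, 0.0, 1.0]]

    def solve_offset(tracker_world, camera_world):
        """Fixed rig offset such that camera_world == tracker_world * offset."""
        return mat_mul(mat_inv_rigid(tracker_world), camera_world)

    def apply_offset(tracker_world, offset):
        """Recover the camera pose each frame from the live tracker pose."""
        return mat_mul(tracker_world, offset)
    ```

    Once `solve_offset` has been run once against a good lined-up pose, `apply_offset` gives the camera pose from the live tracker every frame, which is what childing the camera to the Empty Actor is effectively doing, just with the offset found numerically instead of by trial and error.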

  • How do you prevent trackers from disconnecting all the time? I have to restart Unreal for my Motion Controller BP to recognize the tracker as ‘Special_1’ every time it disconnects, and it’s a real pain. I’ve tried the

  • Is it possible for Take Recorder to record the timecode the virtual camera is syncing to, so that when I play a take back later it keeps that original TC?

  • How can I render out a camera that’s offset by a recorded Empty Actor? In my current setup I can tie the Vive tracker transform data to my camera, but it isn’t offset properly to match the real camera, so when I render that camera it’s staring at the sky, for instance. If I record the offset Empty Actor instead, the camera is static. I’m not sure how to render out the combination of these two things, since they appear to be completely separate in Sequencer/Take Recorder. Is there a better way to do this?
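
    Related to the above: I think the combined result I’m after per frame is just the recorded Empty Actor transform composed with that fixed rig offset, so maybe it could be baked out offline into a single camera track. A rough sketch of what I mean (plain Python 4×4 matrices; names are mine for illustration, not an actual Sequencer API):

    ```python
    # Baking the final camera path: compose each recorded Empty Actor (tracker)
    # frame with the fixed rig offset found during calibration, producing the
    # per-frame world transform the render camera should follow.

    def mat_mul(a, b):
        """Multiply two 4x4 matrices given as nested lists."""
        return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
                for i in range(4)]

    def bake_camera_track(recorded_tracker_frames, rig_offset):
        """One camera world transform per recorded tracker frame."""
        return [mat_mul(frame, rig_offset) for frame in recorded_tracker_frames]
    ```

    If that logic is right, the baked transforms could be keyed back onto a plain Cine Camera in Sequencer, so the render no longer depends on the live tracker or the parenting setup at all. I don’t know whether there’s a built-in way to do this in Take Recorder, which is really what I’m asking.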

  • @.Corson I’ve tried to use your latest VP project, but for some reason I can’t even get a BM Media Bundle to display in it. I used the same settings in the BM plugin on several other projects, but this one is giving me trouble. I only fired it up for a few minutes, so I’ll try again soon; I’m probably missing something obvious.

Thanks again to everyone in these related threads; it’s been incredibly helpful. If anyone has any insights on the problems above, I’d be grateful.