Virtual Production and Virtual Set samples really needed!

Tim,

Right now my biggest question is how to delay the Vive Tracker data so it syncs up with the video capture, which arrives several frames behind.
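To make that concrete, what I have in mind is basically a fixed-length FIFO that holds back each tracker pose for however many frames the video is late. A minimal sketch (the class and method names are my own invention, not an Unreal Engine or SteamVR API):

```python
from collections import deque

class TrackerDelayBuffer:
    """Holds back incoming tracker poses by a fixed number of frames so
    they line up with a video feed that arrives late. Hypothetical
    helper for illustration only."""

    def __init__(self, delay_frames):
        # Keep delay_frames old poses plus the newest one.
        self._buffer = deque(maxlen=delay_frames + 1)

    def push(self, pose):
        """Record the pose sampled this frame and return the pose from
        delay_frames ago (or the oldest available while warming up)."""
        self._buffer.append(pose)
        return self._buffer[0]

# Usage: with the video running (say) 3 frames behind the tracker,
# delay the tracker by 3 frames before driving the virtual camera.
delay = TrackerDelayBuffer(delay_frames=3)
for frame in range(6):
    delayed = delay.push({"frame": frame})
# After pushing frame 5, 'delayed' is the pose from frame 2.
```

The delay count would presumably need to be tunable per camera, since different webcams have different capture latency.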

A second request would be recommendations for how to create a properly rigged object in UE that accounts for the difference in position/orientation between the camera and the tracker. I currently use a rig like the one in the attached picture, but I haven’t tried compensating for the couple of inches of offset between the two devices yet. I’ve also seen posts where people attach the VIVE tracker directly to a DSLR’s hot shoe, but they are also having trouble adjusting the coordinate system correctly.
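As I understand it, the usual fix is to measure a fixed tracker-to-camera transform once and compose it with the live tracker pose every frame, so the offset rotates along with the rig. A sketch of that composition with plain rotation-matrix/translation pairs (all the numbers here are made up for illustration; you would measure your own rig):

```python
def mat_vec(m, v):
    # 3x3 matrix times 3-vector.
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def mat_mul(a, b):
    # 3x3 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

def compose(parent, child):
    """Compose two (rotation, translation) transforms: the result maps
    points from the child frame into the parent's world frame."""
    r1, t1 = parent
    r2, t2 = child
    return (mat_mul(r1, r2),
            [a + b for a, b in zip(mat_vec(r1, t2), t1)])

# Hypothetical rig: tracker yawed 90 degrees about the up axis at
# (1, 2, 1.5) m; lens assumed 3 cm forward of and 5 cm below the mount.
yaw90 = [[0.0, -1.0, 0.0],
         [1.0,  0.0, 0.0],
         [0.0,  0.0, 1.0]]
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
tracker_pose = (yaw90, [1.0, 2.0, 1.5])
tracker_to_camera = (identity, [0.03, 0.0, -0.05])

cam_rot, cam_pos = compose(tracker_pose, tracker_to_camera)
# The 3 cm "forward" offset ends up pointing along world Y because the
# whole rig is yawed 90 degrees: cam_pos is (1.0, 2.03, 1.45).
```

In UE terms I imagine this is just parenting the cine camera under the tracked component with a fixed relative transform, but a sample showing the measured offsets would settle the coordinate-system confusion.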

It would also be very nice to have cine camera presets for the Logitech C920 Pro webcam and the different GoPro models. I think I’ve got the parameters right, but it would be nice if these common cameras were built in.

For doing virtual sets, I didn’t find the Composure sample or the virtual studio sample very helpful, because neither one included a live tracked/movable camera, and that was where I had the most problems.

A really simple example using a Logitech C920 Pro webcam (the one you recommend for mixed reality) and a VIVE tracker would be great. I’d be very happy with something that just inserted a few CG-rendered cubes into the live video feed from the camera so that they stay on the same “real world” spot when the camera is moved. In other words, the CG object/character is the foreground and the video feed of the room is the background (and the camera moves).
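My understanding is that the “stays on the same real-world spot” behavior falls out of ordinary camera projection once the virtual camera follows the (delayed, offset-corrected) tracker pose: every CG vertex is reprojected each frame, so a fixed world point slides across the image exactly opposite to the camera’s motion. A toy sketch with placeholder intrinsics (not measured C920 values) and a translation-only camera to keep it short:

```python
def project(point_world, cam_pos, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Pinhole projection of a world point into a camera that translates
    but does not rotate. Intrinsics are placeholders for illustration."""
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    return (cx + fx * x / z, cy + fy * y / z)

cube_corner = (0.0, 0.0, 2.0)  # a fixed "real world" spot 2 m ahead

p0 = project(cube_corner, (0.0, 0.0, 0.0))  # camera at origin
p1 = project(cube_corner, (0.1, 0.0, 0.0))  # camera steps 10 cm right
# p0 is the image center (640.0, 360.0); p1 shifts left to (600.0, 360.0),
# so the cube appears pinned in place as the camera moves.
```

The sample I’m asking for would do exactly this, except with the real camera feed as the Composure background layer and the cubes rendered by the tracked cine camera on top.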

I’m sure someone who knows what they are doing could put a sample like this together really quickly. Just a basic example; save the hard stuff like color correction, lighting matching, chroma keying, and other more involved things for a follow-on example.