I’m trying to build a real-time virtual studio for an upcoming music video project.
I’m using an HTC Vive and its Vive Tracker mounted on my Blackmagic URSA Mini Pro for camera tracking, then SDI out to a Blackmagic DeckLink Duo into Unreal Engine, with Composure for keying.
Everything is working well except for a few milliseconds of latency between the engine and the live video feed.
Everything is genlocked, with timecode provided over SDI from the URSA.
I thought that when the engine and camera are genlocked and on the same timecode, they should match up?
So my question is: could someone point me in the right direction for adding a slight delay to either the engine or the video feed?
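To illustrate what I mean by delaying the engine: conceptually I want the CG camera to render with the tracker pose from a few frames ago, so it matches the latency of the video chain. Here is a minimal sketch of that idea in plain C++ (this is not actual Unreal or Live Link API; the Pose struct and the frame count are placeholders I made up):

```cpp
#include <cstddef>
#include <cstdio>
#include <deque>

struct Pose {             // hypothetical tracker sample
    float x, y, z;        // position
    float qx, qy, qz, qw; // orientation quaternion
};

class PoseDelayBuffer {
public:
    explicit PoseDelayBuffer(std::size_t framesOfDelay)
        : FramesOfDelay(framesOfDelay) {}

    // Push the newest tracker pose; once the buffer has filled,
    // return the pose from FramesOfDelay frames ago.
    Pose Push(const Pose& newest) {
        Buffer.push_back(newest);
        if (Buffer.size() > FramesOfDelay) {
            Pose delayed = Buffer.front();
            Buffer.pop_front();
            return delayed;
        }
        return newest; // buffer still filling: pass through
    }

private:
    std::size_t FramesOfDelay;
    std::deque<Pose> Buffer;
};

int main() {
    PoseDelayBuffer delay(3); // e.g. 3 frames at 25 fps = 120 ms of video latency
    for (int frame = 0; frame < 8; ++frame) {
        Pose p{float(frame), 0, 0, 0, 0, 0, 1};
        Pose out = delay.Push(p);
        std::printf("frame %d -> rendering with pose from x=%.0f\n", frame, out.x);
    }
    return 0;
}
```

In practice I’d tune the frame count on a test pan until the real and virtual camera moves line up; if Unreal or Live Link already exposes an equivalent buffer/offset setting, pointers to that would be just as welcome.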
Could you describe your workflow? Does the camera go into Unreal, or directly into a video mixer? If you use the Unreal chroma key, how do you find the quality? And have you solved the delay problem? Which side is behind, the video from the camera?
Thanks