Somehow I got the impression Live Link could be used with VIVE hand controllers and tracking pucks, but now that I have 4.23 it looks like it cannot.
When you want to use VIVE (or other) hand controllers and pucks for motion capture or camera tracking, it would be really useful to be able to bring them in through Live Link.
As I understand it, Live Link data carries timecode and can be buffered and interpolated to match the timecode of the rendering engine. This feature would be VERY helpful when you are trying to do virtual production, where an incoming video source has timecode and the graphics engine is genlocked and synced to it. If I understand it correctly, Live Link would automatically delay the tracking data to line up with the video.
Right now it’s harder to use VIVE trackers in a virtual-set scenario because there is no easy way to sync them up with the incoming video.
Specifically, it would be nice to have the option to stamp all incoming VIVE tracking data with the current timecode (from the engine or a video timecode source), so that when a frame is finally rendered, the tracking data corresponding to that frame's timecode is automatically used. That way the tracker positions, and the camera and other objects they control, would stay in sync with the video.
Right now I have to use a Blueprint that grabs the motion controller data every frame, pushes it into a FIFO, and reads back the data from several frames earlier. It works, but it's cumbersome.
Being able to treat hand controllers and tracking pucks just like data from a mo-cap system would be a big help for live Virtual Production and Mixed Reality filming. Not all of us have access to a high-end mo-cap stage; these VIVE and other tracking devices are a good alternative.
P.S. I know the mixed reality plugin can delay tracking data, but that only seems to work if you use that plugin. It would also be great if the “tracker delay” function were always available, and if there were a way to sync rendering and tracker data to the frame start of a video source that has no timecode.
Hope this makes sense.