It would be really handy if devices like VIVE controllers and trackers could be supported as Live Link sources. In Virtual Production and mixed reality work it is much harder to synchronize them with live video and Unreal-rendered CG otherwise, particularly for things like time syncing, interpolation, and filtering. It would be great to have a way to bring them in through Live Link so they could be synced up easily with other Live Link sources, like the iPhone face capture.
I think it would be fine if there were just a mode switch that let you route VIVE (or other VR) motion tracker inputs through either the normal input system or through Live Link. It might be handy to have both the normal and Live Link inputs from the trackers available at the same time, but for most Virtual Production applications I think sending them through Live Link would be preferable.
Devices like VIVE trackers may not be ideal for every Virtual Production setup, but they are cheap and very easy to set up, and they work well enough for scene creation, testing, pre-vis, and other tasks. Using Live Link would also make it easier to switch from the VIVE gear in one studio to a full mo-cap setup at another location.