Valve's SteamVR plugin for Unity lets a non-VR, in-game camera be tracked to a third Vive controller attached to a real-world camera, which is the basis for shooting mixed-reality trailers:
https://github.com/ValveSoftware/openvr/tree/master/unity_package/Assets/SteamVR
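Under the hood, the idea is simply that the extra controller's pose is read every frame and applied to the external in-game camera. Below is a minimal sketch of that against the OpenVR C++ API from the repo linked above. It is illustrative only (inside Unity the plugin does this for you), and picking the third connected controller by enumeration order is an assumption for the sketch, not necessarily how the plugin identifies it.

```cpp
// find_third_controller.cpp -- illustrative sketch; build against openvr_api.
#include <openvr.h>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    // Connect to a running SteamVR instance as a background application.
    vr::IVRSystem *sys = vr::VR_Init(&err, vr::VRApplication_Other);
    if (err != vr::VRInitError_None) {
        std::printf("VR_Init failed: %d\n", static_cast<int>(err));
        return 1;
    }

    // Poses for every possible tracked-device slot.
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    sys->GetDeviceToAbsoluteTrackingPose(
        vr::TrackingUniverseStanding, 0.0f, poses, vr::k_unMaxTrackedDeviceCount);

    // Collect connected controller indices in enumeration order.
    std::vector<uint32_t> controllers;
    for (uint32_t i = 0; i < vr::k_unMaxTrackedDeviceCount; ++i) {
        if (sys->GetTrackedDeviceClass(i) == vr::TrackedDeviceClass_Controller &&
            poses[i].bDeviceIsConnected) {
            controllers.push_back(i);
        }
    }

    if (controllers.size() >= 3) {
        // Assumption: the third controller is the one strapped to the camera.
        const vr::TrackedDevicePose_t &cam = poses[controllers[2]];
        if (cam.bPoseIsValid) {
            const vr::HmdMatrix34_t &m = cam.mDeviceToAbsoluteTracking;
            // Translation is the last column of the 3x4 row-major matrix;
            // feed this (plus the rotation part) to the in-game camera.
            std::printf("camera rig at (%.3f, %.3f, %.3f)\n",
                        m.m[0][3], m.m[1][3], m.m[2][3]);
        }
    }

    vr::VR_Shutdown();
    return 0;
}
```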
A good explanation of the workflow is here:
How is Epic making mixed reality trailers for its projects? What's the recommended workflow? The Unity plugin produces four different images so you can composite foreground and background elements with the green-screen footage. Networked multiplayer, with the non-VR machines running the standard camera, is what I'm using at the moment, and it's terrible: you need to run a copy of the engine (player) per layer.
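For context, the compositing those four images enable is a plain alpha-over stack: the background layer at the bottom, the chroma-keyed green-screen footage of the player over that, and the foreground layer on top, masked by the foreground-alpha image. The sketch below is a hypothetical per-pixel version of that stack; the Pixel struct, the RGBA8 buffers, and reading the matte from the alpha image's red channel are assumptions for illustration, not anything the plugin specifies.

```cpp
#include <cstddef>
#include <cstdint>

// One RGBA8 pixel (assumed layer format for this sketch).
struct Pixel { uint8_t r, g, b, a; };

// Standard "over" operator: src composited over dst, using src alpha.
static Pixel over(Pixel src, Pixel dst) {
    const uint32_t a = src.a, ia = 255 - a;
    return {
        static_cast<uint8_t>((src.r * a + dst.r * ia) / 255),
        static_cast<uint8_t>((src.g * a + dst.g * ia) / 255),
        static_cast<uint8_t>((src.b * a + dst.b * ia) / 255),
        255,
    };
}

// Composite one frame: background, then keyed camera footage, then the
// game's foreground masked by its alpha matte.
void compositeFrame(const Pixel *background,      // game, behind the player
                    const Pixel *keyedFootage,    // green screen, already keyed
                    const Pixel *foreground,      // game, in front of the player
                    const Pixel *foregroundAlpha, // matte for the foreground
                    Pixel *out, std::size_t pixelCount) {
    for (std::size_t i = 0; i < pixelCount; ++i) {
        Pixel base = over(keyedFootage[i], background[i]);
        Pixel fg = foreground[i];
        fg.a = foregroundAlpha[i].r; // assumption: matte stored as luminance
        out[i] = over(fg, base);
    }
}
```

A stack like this can run in an external compositing step against captured layers, which is exactly why needing a separate networked engine instance per layer, as described above, feels so heavyweight.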