Mixed Reality and Spectator View

I’ve been looking into how to do this with what we have at the moment, and it looks like the easiest option is to record the orientation / position of the controllers and the camera and then replay that. Has anyone else tried this approach? I’m planning to shoot the trailer on a sound stage / green screen, handheld. What are the results of doing match moving with something like Nuke or 3DEqualizer versus using a controller to track the camera? I assume the advantage of match moving is that you get the correct camera parameters (zoom / FOV) and that they can change during a shot.
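
For what it’s worth, here’s a rough sketch of the record-and-replay idea: sample the camera and controller poses every frame with a timestamp, dump them to disk, then step back through them when compositing. The `get_tracked_pose()` call, the device names, and the JSON layout are all placeholders, not any particular runtime’s API, so treat this as an illustration of the data flow rather than a working tool.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Pose:
    t: float          # seconds since recording started
    device: str       # "camera", "left_controller", "right_controller"
    position: tuple   # (x, y, z) in tracking space
    rotation: tuple   # quaternion (x, y, z, w)

def get_tracked_pose(device: str) -> tuple:
    """Hypothetical stand-in for whatever pose query your VR runtime provides."""
    return (0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0)

def record(devices, duration_s=10.0, rate_hz=90.0, path="take_01.json"):
    """Sample each tracked device at a fixed rate and write the poses to disk."""
    samples, start = [], time.monotonic()
    while (now := time.monotonic() - start) < duration_s:
        for dev in devices:
            pos, rot = get_tracked_pose(dev)
            samples.append(Pose(now, dev, pos, rot))
        time.sleep(1.0 / rate_hz)
    with open(path, "w") as f:
        json.dump([asdict(s) for s in samples], f)

def replay(path="take_01.json"):
    """Yield recorded poses in timestamp order for the in-engine spectator camera."""
    with open(path) as f:
        samples = [Pose(**s) for s in json.load(f)]
    for sample in sorted(samples, key=lambda s: s.t):
        yield sample

if __name__ == "__main__":
    record(["camera", "left_controller", "right_controller"], duration_s=1.0)
    for pose in replay():
        print(pose.device, pose.t, pose.position)
```

In practice you’d want the samples locked to the physical camera’s frame rate (or timecode) so the replayed virtual camera stays in sync with the footage, which is exactly where match moving might win out if the lens is zooming during a shot.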