Does the ScreenPosition UV-Input node work in VR, and if it does, how does it work?

I understand how the ScreenPosition node works in flatscreen: the UV it outputs is simply where the pixel being shaded sits on the screen. However, how does that work in VR, where there are effectively two screens and the mesh is offset differently depending on which eye it's rendered to? Also, does Multi-View/Instanced Stereo affect this, since both eyes are now rendered into one large render target at the same time?
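To make the question concrete: here's how I picture the remapping that would have to happen if instanced stereo packs both eyes side by side into one double-wide target. This is just a sketch of the math, assuming the left eye occupies x in [0, 0.5) and the right eye x in [0.5, 1); the function names are mine, not engine API.

```python
# Illustrative only: mapping between "full buffer" UVs of a
# double-wide stereo render target and per-eye "viewport" UVs.
# Assumes left eye = x in [0, 0.5), right eye = x in [0.5, 1).

def buffer_uv_to_eye_uv(buffer_u: float, eye_index: int) -> float:
    """Map a U coordinate in the combined target to 0..1 within one eye."""
    return buffer_u * 2.0 - eye_index

def eye_uv_to_buffer_uv(eye_u: float, eye_index: int) -> float:
    """Map a per-eye 0..1 U coordinate back into the combined target."""
    return (eye_u + eye_index) * 0.5

# The center of the right eye's view (eye_index = 1) sits at
# u = 0.75 in the combined buffer, and maps back to 0.5 per-eye:
print(eye_uv_to_buffer_uv(0.5, 1))   # 0.75
print(buffer_uv_to_eye_uv(0.75, 1))  # 0.5
```

So my question boils down to: does ScreenPosition give me the per-eye UV (left side of this mapping) or the combined-buffer UV (right side), and does that answer change when instanced stereo is on?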

Bump, wondering about this as well. Thanks!