Syncing/matching virtual to real environments

So I have modelled an exact replica of my room.

I used a Leica laser scanner to capture a point cloud and imported it into Blender. Because the scanned mesh was poor quality and the textures didn’t look great, I created a clean model by overlaying objects in Blender that aligned with the point cloud surfaces.

I have imported my room from Blender into UE and adjusted the transform of the room to align virtual with real. The result is quite amazing: it’s really something to be able to reach out in virtual space and have the walls and door frames align across both worlds.

My question is: rather than the time-consuming “test and adjust” method of tweaking the room’s transform (which I’m afraid will go out of sync if I ever have to run SteamVR Room Setup again), is there a smarter way to align the UE coordinate system with the real-world coordinate system, using the base station locations, a VIVE tracker puck, or something similar?

My setup:
VIVE Pro Eye w/ wireless adaptor
4× SteamVR Base Station 2.0
UE 5.2

How about using the room-scale origin?

The component in the player pawn that the camera is parented to will always have the transform of the room-scale origin in SteamVR. Aligning the player pawn so that this component sits on the virtual room’s origin in the game environment will align it with the real world, as long as SteamVR maintains its room-scale calibration.
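A minimal sketch of that idea, assuming the standard VR-template setup where the camera is attached to the pawn’s root scene component; the class name AMyVRPawn and the RoomOriginInWorld property are mine for illustration, not engine API:

```cpp
// Minimal sketch (header and body condensed into one listing). Assumes
// the camera is attached to the pawn's root component, as in the
// standard VR template.
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "MyVRPawn.generated.h"

UCLASS()
class AMyVRPawn : public APawn
{
	GENERATED_BODY()

public:
	// Hypothetical property: set this in the level to the spot in the
	// UE map where the SteamVR room-scale origin should sit.
	UPROPERTY(EditAnywhere, Category = "VR")
	FTransform RoomOriginInWorld;

protected:
	virtual void BeginPlay() override
	{
		Super::BeginPlay();

		// Floor-level tracking: the component the camera is parented to
		// now carries the transform of the SteamVR room-scale origin.
		UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);

		// Drop that origin onto the virtual room's origin. No manual
		// test-and-adjust of the room mesh transform is needed.
		SetActorTransform(RoomOriginInWorld);
	}
};
```

The alignment then only needs redoing in UE if the SteamVR room calibration itself changes.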

It’s also possible to get the transforms of the base stations with the deprecated SteamVR plugin, but not with OpenXR.
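For reference, a sketch of that query using the legacy plugin’s USteamVRFunctionLibrary (the library calls are the plugin’s own; LogBaseStationTransforms is just an illustrative wrapper):

```cpp
// Sketch: enumerate the base stations (tracking references) through the
// deprecated SteamVR plugin. Requires the SteamVR plugin to be enabled
// instead of OpenXR; OpenXR does not expose base station poses.
#include "CoreMinimal.h"
#include "SteamVRFunctionLibrary.h"

void LogBaseStationTransforms()
{
	TArray<int32> DeviceIds;
	USteamVRFunctionLibrary::GetValidTrackedDeviceIds(
		ESteamVRTrackedDeviceType::TrackingReference, DeviceIds);

	for (const int32 Id : DeviceIds)
	{
		FVector Position;
		FRotator Orientation;
		// Poses come back in tracking space, i.e. relative to the same
		// room-scale origin the camera's parent component represents.
		if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(
				Id, Position, Orientation))
		{
			UE_LOG(LogTemp, Log, TEXT("Base station %d: %s / %s"),
				Id, *Position.ToString(), *Orientation.ToString());
		}
	}
}
```

Passing ESteamVRTrackedDeviceType::Other instead returns tracker pucks, so the same call could feed the tracker-based alignment you asked about: place the puck at a known reference point in the real room and solve for the room actor’s transform from the difference.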