I don’t think there is anything like a calibration direction defined during setup. At least, to my knowledge, there is nothing in the Oculus API hinting at that (like a struct to hold that information or a method to retrieve it).
It is more likely that Oculus Home uses an eye-level tracking origin, which defaults to 1 meter in front of the sensor, and detects when the HMD is put on to re-center the player on the fly.
Someone has partially reverse engineered Oculus Home (which is built with Unity, by the way) and found code that detects whether the user is standing, sitting or even crouching. I’m not sure, however, what this code looks like or even whether it is currently in use. They may have implemented some advanced auto-orientation method that uses the sensor location, for example. Have you tried starting Oculus Home while facing a completely different direction (e.g. away from the sensors)?
You may achieve something similar in your own experiences by offering a VR re-center function, either semi-automatically or on demand. A trick is to start the experience by placing the user inside a sphere (or cube) parented to the VR camera and displaying some titles or a disclaimer inside it. By asking the player to place themselves comfortably, face forward and press a key to continue/accept the disclaimer, you can take that pose as their new preferred orientation and call a Reset Orientation and Position to set it.
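A minimal Unity sketch of that trick could look like the following. The component name, the `splashSphere` field and the key binding are my own assumptions for illustration; the actual re-center call depends on your setup (Unity’s built-in `InputTracking.Recenter()`, or `OVRManager.display.RecenterPose()` if you use the Oculus Utilities):

```csharp
using UnityEngine;
using UnityEngine.VR; // Unity 5.x; in newer versions this namespace is UnityEngine.XR

// Hypothetical sketch: a splash sphere parented to the VR camera shows the
// disclaimer; when the player confirms, their current pose becomes the new
// forward/origin by recentering the tracking space.
public class SplashRecenter : MonoBehaviour
{
    [SerializeField]
    private GameObject splashSphere; // sphere/cube parented to the VR camera

    void Update()
    {
        // Wait until the player has settled comfortably, faces forward
        // and presses the confirm key.
        if (splashSphere.activeSelf && Input.GetKeyDown(KeyCode.Space))
        {
            // Resets the position and orientation of the tracked HMD so
            // that "forward" becomes wherever the player is facing now.
            InputTracking.Recenter();

            // With the Oculus Utilities package, the equivalent would be:
            // OVRManager.display.RecenterPose();

            splashSphere.SetActive(false); // dismiss the disclaimer
        }
    }
}
```

Since the sphere is parented to the camera, the disclaimer stays readable no matter which way the player is initially facing, which is what makes the confirm press a reliable moment to capture their preferred orientation.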