Well, I’ve just experienced it! First of all, well done on the intro menu for choosing between on-screen and VR headset viewing (with EULA and exit options as well). Also well done on having a second menu for selecting which kind of HMD is connected. I use a Blueprint flow that autodetects the connected HMD and puts the correct motion controller meshes in the hands, and that also detects whether an HMD is connected at all to decide between entering VR mode or desktop mode.
I always ask myself which is better: two independent apps, or one app with two choices. Having built similar experiences, I think two different apps is better. Why? We have two rendering paths (Deferred + Temporal AA vs. Forward + MSAA); the first has more capabilities, but the second gets better performance and so suits VR, so the answer is obvious. For customers, though, it is much nicer to have only one app. I would like to know which rendering path was used here. In some experiences I built, I got good VR performance on the Deferred + Temporal AA path by disabling a few features. It would be good to hear Epic Games' opinion on what is better from an enterprise point of view. Since the church branch has several levels/sublevels and probably all static actors with fully baked lighting, deferred may be the chosen path; but in the motorcycle branch, where a more refined product viewer template is used and all actors are movable, is deferred used as well?
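For reference, switching a project to the Forward + MSAA path mentioned above is typically a matter of a few renderer settings in `DefaultEngine.ini`; this is a sketch of the commonly used console variables (exact values depend on the project and engine version):

```ini
; DefaultEngine.ini - typical settings for the Forward + MSAA VR path
[/Script/Engine.RendererSettings]
r.ForwardShading=1
; Anti-aliasing method: 2 = Temporal AA (deferred default), 3 = MSAA
r.DefaultFeature.AntiAliasing=3
r.MSAACount=4
vr.InstancedStereo=True
```

The deferred path keeps these at their defaults (forward shading off, Temporal AA on), which is part of why shipping both paths in a single app is awkward: the choice is baked in at startup, not toggled at runtime.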
Last but not least, I do not understand what HP does in this “kit”. It is a mystery.