I just figured out how to get stereo layers rendering some UMG widgets in my HUD.
I'm using world locked layers, and as long as I (the player) stand still, they look great.
As soon as I use the joysticks to move my character, or make him jump, the stereo layers render offset from where they're supposed to be, in the direction I'm moving. It's like the opposite of them lagging behind. It depends on the framerate too: if the framerate is low, they drift even farther, since the character's movement delta within the frame is bigger. It's almost as if the layer needs the character's negative velocity applied to its transform to be correct.
I also tried the other stereo layer types, and with those I can't even see the widgets anywhere in the world. I even tried putting my stereo layer component in different tick groups; the component's Tick is where it sets everything up to render.
That setup basically grabs the world transform of the camera, and maybe that transform is ahead of the current frame? Or maybe it lags a frame behind, and due to the inverse math the layer renders in front of where it should be rather than where it was last frame? Either way, something is off by a frame.
Until this is fixed, I may be able to do a hacky workaround: offset the stereo layer by the delta of the character's world transform since last frame.
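A minimal sketch of that idea, using a stand-in vector type instead of UE's FVector (the `CompensateLayerPosition` helper and its names are hypothetical): each tick, subtract the character's movement delta since last frame from the layer's intended world position.

```cpp
#include <cassert>

// Stand-in for UE's FVector; sketch only.
struct Vec3 {
    double x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
};

// Hypothetical workaround: the stereo layer renders with a transform that is
// one frame stale, so shift it back by the character's movement delta this
// frame to cancel the error.
Vec3 CompensateLayerPosition(const Vec3& intendedWorldPos,
                             const Vec3& characterPosThisFrame,
                             const Vec3& characterPosLastFrame) {
    Vec3 frameDelta = characterPosThisFrame - characterPosLastFrame;
    return intendedWorldPos - frameDelta;
}
```

In the actual component you'd cache the character's position at the end of each Tick and feed last frame's value back in on the next one.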
I filed a bug report, but I also found an issue that was already fixed for Oculus. After a little experiment, I think even moving my head side to side reproduces the issue with OpenXR as well. In fact, I noticed that if I place my controllers on the floor and look at them, there's always a very subtle motion: they appear to wave around the debug axes I'm drawing at the motion controllers' actual positions. Face locked stereo layers, by contrast, are positioned precisely.
So another workaround seems to be to use face locked stereo layers and update their transform relative to the camera instead. I'm going to at least try that for now.
That fix is in the Oculus runtime, whereas I'm using the OpenXR runtime. If possible, maybe I'll submit a similar fix for OpenXR. It basically recomputes the world transform at the moment the stereo layers are rendered instead of using a previously cached value, so it can't be stale by then. (So simple!)
Never mind, not so simple: the two functions run on different threads, one on the game thread and one on the render thread…
I made an attempt to fix the engine's Late Update code, but that didn't work: Late Update only handles the HMD itself moving, not the in-game camera.
I got good results with a temporary workaround. I made the stereo layers TRACKER_LOCKED and positioned them relative to a VR Origin scene component in my character's component hierarchy; that component is normally the parent of my motion controller components.
Each tick, the stereo layer sets its transform to the widget component's transform relative to the origin component, and now it tracks accurately whether I'm moving through the world or not. This is essentially the same math as WORLD_LOCKED, but without the erroneous transform.
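The relative-transform math here is just expressing the widget's world transform in the origin component's frame (what UE's FTransform::GetRelativeTransform does). A minimal sketch, with a 2D translation-plus-yaw transform standing in for FTransform:

```cpp
#include <cassert>
#include <cmath>

// Tiny 2D rigid transform (translation + yaw in radians) standing in for
// UE's FTransform; sketch only. In-engine this would be something like
// Widget->GetComponentTransform().GetRelativeTransform(Origin->GetComponentTransform()).
struct Xform {
    double x, y, yaw;
};

// Express `world` in `base`'s local frame: rel = inverse(base) * world.
Xform RelativeTo(const Xform& world, const Xform& base) {
    double dx = world.x - base.x;
    double dy = world.y - base.y;
    double c = std::cos(-base.yaw), s = std::sin(-base.yaw);
    return { c * dx - s * dy, s * dx + c * dy, world.yaw - base.yaw };
}
```

Feeding that relative transform to a TRACKER_LOCKED layer lets the runtime compose it with the tracking-space origin itself, so the stale world-transform path never gets involved.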
FACE_LOCKED didn't work out: the positions didn't update properly as I turned my head, and things lagged behind a bit. I had tried FACE_LOCKED relative to the camera component, which should yield the same math.