Mover 2.0 to VR Integration

Hi everyone, I’m currently working on a custom VR locomotion system using Mover 2.0 in UE5.7, and I’d like to open a discussion on best practices for handling physical (room-scale) player movement within this framework. My current implementation is functional, but I’m struggling with performance and visual polish, specifically when offsetting the player as they physically walk into walls or obstacles. I’ve been using “rubberband” logic that snaps the player back by the accumulated offset. It works, but it causes noticeable jitter whenever the player also moves with the thumbstick.
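For reference, the kind of fix I’ve been experimenting with is replacing the hard snap with a frame-rate-independent exponential blend of the correction offset. This is a minimal, engine-agnostic sketch (plain C++, no Mover/UE types; all names are illustrative), not my actual component code:

```cpp
#include <cmath>

struct Vec2 { double x = 0.0, y = 0.0; };

// Blend `current` toward `target` with a rate-based weight so the
// correction speed is independent of frame rate (dt). A hard snap is
// the limit rate -> infinity; lower rates trade latency for smoothness.
Vec2 SmoothOffset(const Vec2& current, const Vec2& target,
                  double rate, double dt)
{
    const double alpha = 1.0 - std::exp(-rate * dt); // blend weight in [0, 1)
    return { current.x + (target.x - current.x) * alpha,
             current.y + (target.y - current.y) * alpha };
}
// With rate = 8 at 60 Hz, roughly exp(-8) (~0.03%) of the original
// offset remains after one second, with no single-frame discontinuity.
```

The point is that the correction becomes a continuous curve instead of a one-frame teleport, which is what reads as jitter when it fights the thumbstick input.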

The real challenge lies in integrating physical movement into Mover’s InputProducer, GenerateMove, and SimulationTick flow without breaking the “Mover way” of doing things. When a player physically walks into a wall or obstacle, I want to trigger a “push-back” feeling where the actor stops at the collision while the camera potentially desyncs or offsets.

I suspect the jitter is a conflict between the engine’s VR “Late Update” and Mover’s internal state synchronization. I’m curious if anyone here has successfully treated the HMD’s delta position as a high-priority movement intent within the FMoverInputCmd to let the SimulationTick validate the move before it’s applied. How are you handling the offset when a player walks through a mesh? Are you using a dedicated movement layer for physical movements, or is there a better way to decouple the render state from the kinematic state to eliminate that frame stuttering?

I’d love to hear how others are structuring their VR-specific Mover components.


I briefly considered using LayeredMove for this, but it felt like the wrong tool for the job. LayeredMoves are great for discrete, additive forces (like a knockback or a jump pad), but they don’t provide that frame-to-frame, continuous ‘pushback’ feeling you need when a player is physically leaning against a collider.
So for now I’m leaning toward building a dedicated Custom Movement Mode specifically for ‘VR Walking.’