Hi,
I have an issue with a VRCharacter BP. I had rescaled its VRRootReference to test different body mesh sizes. Maybe rescaling the VRCharacter’s capsule is a bad idea; it seems to work, but with odd behaviors in some of the more GPU-demanding areas of my test map.
More precisely, the issue is that my controllers, and their child components, exhibit sporadic “flickering”: those components seem to quickly transition, at times, between two different scales (VRRootReference’s scale and the square of that scale). I observe this seemingly erratic behavior in demanding areas (e.g. my test map has a room with 2 planar reflections, and at some locations there I get flickering).
Here is an example with a UI Widget attached to the left controller. What can be seen, at times (but always at the same places), is some “flickering” of the UI Text (and, even if it can’t be seen, of the controllers themselves). –> UE4 4.24.3 + VR Expansion Plugin - Test scene and capsule scale question - YouTube
I must say the problem is repeatable (at least in my test map) with any VRCharacter whose VRRootReference scale is different from (1,1,1). If I reduce the complexity of the scene (e.g. if I remove one of the two planar reflections), the flickering does not occur. (Also note that the test map, even in those more GPU-demanding areas, generates frames roughly in the 11.0-11.2 ms envelope.)
One obvious workaround is NOT to rescale VRRootReference. I suppose it’s not good practice to change the scale anyway.
But beyond that, I was wondering whether the cause of the issue isn’t deeper. I would be grateful for any feedback on it. Is it a known consequence of tampering with the capsule scale, or do you think my issue goes beyond that, with a cause of another nature? I hope it’s sufficiently clear; thanks for your comments.
It doesn’t have anything to do with the capsule itself; that is the controller reprojection for frame loss bugging out with the different scales, it is reprojecting incorrectly. If you turn off motion smoothing it’s likely that it will be corrected.
I’ll note that world scale changes and (in some engine versions) root component scaling have historically been problem areas for the low latency updates on the controllers as well.
Thanks for the explanation about incorrect reprojection. I’ll just mention that, regarding my previous post, motion smoothing was off (BTW, turning it on did not improve the behavior). I’ll set the issue aside for the time being, since I’m not knowledgeable enough about reprojection so far. But I see that it’s not caused by VRE and that indeed my problem is larger than that. So thanks for the direction.
Yeah it shouldn’t have anything to do with VRE, though you can test outside of it to be sure.
Right, which is why I feel I must be missing something painfully obvious. I’ve been trying to move the character using various functions, but using “show Collision” shows that I’m still able to simply walk away from the capsule, which of course presents a lot of problems. Is there some obvious movement method or setting that I’m missing here?
Are you using the VRCharacter? VRBaseCharacter is a common set of functions for the two actual characters, not a usable VR character in itself.
I looked over the example plugin and created a shortened version of the grip code. But for some reason I cannot get the secondary grip to grip in the right spot, and I can’t move across the spline once it’s copied into my project. Do I have the grip script wrong, or did I mess something up with the code?
You’re completely right, I was using VRBaseCharacter! I was right in that it was painfully obvious. Thanks!
I assume for melee, since you mention the spline?
I have some extra components in the base class for that to help people align hands, and it has an override for the GetClosestGripSocketInRange function that changes its default behavior to use the spline as a snap center instead of the normal socket search.
Having an issue again: tracking is fine on the user’s side, but gradually, from another client’s end, the head and hands will stop replicating and will stick. This only happens on one of my levels, it’s very weird. Is there a way to have the head and hands always replicate whether or not the headset is on?
Thanks,
They do already; the problem is that Oculus auto-pauses the game when the headset is off. You’ll have to change that behavior, as the plugin doesn’t have data to work with then.
I made a quick little trick to find out when the headset is auto-paused:
The client sends a ping to the server at a regular interval, which triggers a retriggerable delay. If the server doesn’t receive the trigger, it considers the headset to be off.
For me, the tick interval is 1 second and the retriggerable delay is 3 seconds.
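For anyone who prefers C++ over the Blueprint nodes, here’s a minimal sketch of that heartbeat idea. The class and member names below are made up for illustration, not part of VRE or the engine:

```cpp
// MyVRPawn.h -- minimal heartbeat sketch (hypothetical class, not part of VRE).
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "TimerManager.h"
#include "MyVRPawn.generated.h"

UCLASS()
class AMyVRPawn : public ACharacter // or your VR character subclass
{
	GENERATED_BODY()

public:
	virtual void BeginPlay() override
	{
		Super::BeginPlay();
		if (IsLocallyControlled())
		{
			// Client side: ping the server once per second.
			GetWorldTimerManager().SetTimer(PingTimer, this, &AMyVRPawn::SendPing, 1.0f, true);
		}
	}

	void SendPing() { ServerHeartbeat(); }

	// Server side: every ping restarts a 3 second timeout, like a retriggerable delay.
	UFUNCTION(Server, Unreliable, WithValidation)
	void ServerHeartbeat();
	void ServerHeartbeat_Implementation()
	{
		bHeadsetPaused = false;
		GetWorldTimerManager().SetTimer(TimeoutTimer, this, &AMyVRPawn::OnHeartbeatTimeout, 3.0f, false);
	}
	bool ServerHeartbeat_Validate() { return true; }

	// No ping for 3 seconds -> treat the headset as off / the client as auto-paused.
	void OnHeartbeatTimeout() { bHeadsetPaused = true; }

private:
	FTimerHandle PingTimer;
	FTimerHandle TimeoutTimer;
	bool bHeadsetPaused = false;
};
```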
Hello!
Long-time user here, since 2018. I’ve currently run into some roadblocks in my thinking about performance in my VR game. Hope you can share some ideas.
My game is an FPS VR shooter; currently I’m trying to reduce movement costs on player characters.
Character hierarchy is:
SimpleVRCharacter
- Capsule
- Camera
- ParentRelative
- BodyMesh
- Holsters
- WeaponsHolstered
- MotionControllerComp
- WeaponHeld
From profiling, a lot of the movement cost comes from repeatedly moving things deep in the hierarchy each frame. For example, every frame, CharacterMovement, MotionController, and the Weapon itself all move the weapon’s capsule collision because of the hierarchy (UpdateChildTransforms and UpdateOverlaps). Tick order is also CharacterMovement -> MotionController -> Weapon.
I also noticed you can set RelativeLocation and RelativeRotation on a USceneComponent without incurring a movement update; that value should come into effect the next time the parent updates its movement, in a deferred manner.
So my idea is to reverse the tick order, so that the weapon sets its “desired” relative transform deferred, then the motion controller sets its relative transform deferred, and finally the character movement applies everything in the scoped movement update (EndScopedMovementUpdate). In that case, the weapon and the motion controller each only update movement once per frame.
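To illustrate the deferral mechanism I mean, here’s a rough sketch using the engine’s FScopedMovementUpdate (the thing behind EndScopedMovementUpdate); the function and component names are just placeholders:

```cpp
// Batch moves with FScopedMovementUpdate so overlaps and child transforms
// are only resolved once, when the scope closes.
#include "Components/SceneComponent.h"
#include "Components/PrimitiveComponent.h"

void ApplyDeferredControllerMove(UPrimitiveComponent* ControllerComp,
                                 const FVector& NewRelLoc, const FRotator& NewRelRot)
{
	// While this scope is alive, moves on ControllerComp are deferred.
	FScopedMovementUpdate ScopedUpdate(ControllerComp, EScopedUpdate::DeferredUpdates);

	ControllerComp->SetRelativeLocationAndRotation(NewRelLoc, NewRelRot);
	// ...any other per-frame adjustments go here...
}	// <- UpdateChildTransforms / UpdateOverlaps run once here
```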
Does that sound like a good plan?
Also, in general, is it OK to tick the motion controller before the character movement? I see that Unreal tends to add the parent as the attached component’s tick prerequisite. But is it fair to say that, as long as children only care about relative transforms, they don’t really need to know their parents’ world transform?
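For reference, this is the sort of explicit ordering I have in mind (component names are placeholders):

```cpp
// Tick prerequisites decide ordering explicitly, independent of attachment.
// This sets up the reversed order described above: Weapon -> MotionController -> CMC.
#include "Components/ActorComponent.h"

void SetupReversedTickOrder(UActorComponent* CharacterMovement,
                            UActorComponent* MotionController,
                            UActorComponent* Weapon)
{
	MotionController->AddTickPrerequisiteComponent(Weapon);
	CharacterMovement->AddTickPrerequisiteComponent(MotionController);
}
```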
Thanks,
Most of the movement cost is in UpdateOverlaps, so your first step should be managing the collision of attached objects to reduce it as much as possible. You should read into the lengths that Epic went to in order to reduce it for Robo Recall, and some oversights they had originally that were causing performance issues.
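A minimal sketch of what I mean by managing that collision: turn off overlap generation (and collision entirely, where possible) on attached objects. The function name and tag here are just placeholders:

```cpp
// Strip overlap work from purely cosmetic attached components so
// UpdateOverlaps has less to do each frame. Names here are placeholders.
#include "GameFramework/Actor.h"
#include "Components/PrimitiveComponent.h"

static void StripOverlapWorkFromAttachment(AActor* HeldActor)
{
	TInlineComponentArray<UPrimitiveComponent*> Prims(HeldActor);
	for (UPrimitiveComponent* Prim : Prims)
	{
		// Overlap events are the expensive part of moving attached components.
		Prim->SetGenerateOverlapEvents(false);

		// If the component is purely visual, skip collision queries entirely.
		if (!Prim->ComponentHasTag(FName("NeedsCollision"))) // hypothetical tag
		{
			Prim->SetCollisionEnabled(ECollisionEnabled::NoCollision);
		}
	}
}
```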
As far as the motion controller is concerned, no, you cannot have it move before the character movement with most of the grip types; only the attachment grip would work correctly like that. Every other grip is in world space and the held object is not actually attached or in the parent-child hierarchy. Motion controllers need their final world-space location post movement in order to manage the held object’s final target position.
Also I’ll note that the VR character actually makes fewer deferred movements than the standard engine pawn; I went and cut out two updates per frame that happen on the normal character. Collision updating is the most expensive part of characters engine-wide. I also defer the collision capsule’s physics-thread update until the character movement if character movement is being performed, so the capsule’s tick is just setting a target.
Parent relative attachment is a thought though; I could move its update into the movement component if there is one active. The component is supposed to be usable inside of non-plugin pawns, but there’s no reason not to have a case for when it is inside of one of my classes.
Thanks for pointing out the Robo Recall case, I’ll look into resources for that.
I’m actually only using the attachment grip; with that constraint, are there any other concerns about the tick order?
I ended up putting all the transform updates from the character in the movement comp’s tick, so the character only ticks movement once. The problem is the motion controller’s tick, and the weapon also ticks its transform for some swaying animation while being held.
Side topic on the VRCharacter: it sounds like it is done well and is performant. It was actually working great for us until our designer pointed out he wanted to separate the head from the capsule so that leaning against a table would be possible. I couldn’t figure out a way to decouple them for the VRCharacter, so I ended up downgrading to VRSimpleCharacter and added some custom logic to do it inside SimpleMovementComponent’s Tick.
I actually just moved the parent relative update into the CMC tick just now so that it could be deferred, but I split it up between the actual grip tick and the component movement for the motion controller.
As for the VRCharacter, you could set a waist-tracking parent for the capsule and parent relative components and directly control the capsule at will with it; that is the point of that interface addition.
Curious how you decided to de-couple; I haven’t liked anyone’s solution for that yet, as it tends to just be a given zone where the player can free-walk if the head isn’t colliding.
I’m on 4.22, so my plugin version is also old. Did you just move the code into the CMC tick, or did you change the parent relative comp’s tick group? It’s in DuringPhysics, like the camera, in my version. I just deferred it to the next frame’s CMC updates, but removing that one-frame lag would be great.
I use a two-step method: first, sweep-move the root to the camera; second, reverse the root’s movement on the parent of the camera. When the sweep hits anything, that’s when they decouple.
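Roughly, in code form (the component names are placeholders, not the actual plugin members):

```cpp
// Two-step sync/decouple sketch: sweep the root toward the camera, then
// reverse whatever actually moved on the camera's parent so the HMD view
// stays on the player's head. If the sweep is blocked, they decouple.
#include "Components/PrimitiveComponent.h"
#include "Camera/CameraComponent.h"

static void SyncCapsuleToCamera(UPrimitiveComponent* CapsuleComp,
                                USceneComponent* CameraParent,
                                UCameraComponent* Camera)
{
	// Step 1: sweep the root toward the camera's horizontal position.
	FVector Delta = Camera->GetComponentLocation() - CapsuleComp->GetComponentLocation();
	Delta.Z = 0.f;

	FHitResult Hit;
	CapsuleComp->MoveComponent(Delta, CapsuleComp->GetComponentQuat(), /*bSweep=*/true, &Hit);

	// Step 2: reverse only the amount that was actually applied, on the camera's parent.
	const FVector Applied = Hit.bBlockingHit ? Delta * Hit.Time : Delta;
	CameraParent->AddWorldOffset(-Applied);

	// If the sweep was blocked, the capsule stays at the wall while the camera
	// keeps tracking the head -> decoupled.
}
```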
Yeah, that is the grace area I was talking about. The issue I always had with it is that it doesn’t actually require the user to lean; if your game is designed around it, though, it would be fine.
I moved the actual update into the CMC’s TickComponent and added a new deferred definition to the beginning of the tick. That way it applies after all of the CMC movement is done, but on the same frame. I moved the controllers there as well when the CMC is active and exists (I have to support non-VR character owners).
Something Half-Life: Alyx does is allow the VR camera to interpenetrate objects from head movement, but not from walking or teleporting.
For example: If the player is up against the wall and presses the analog stick to move in the direction of the wall, they won’t move. If they put their head into the wall, then their vision will turn orange.
The way that seems to work is:
1. Navmesh (analog stick) movement will move the player’s camera, but not into walls.
2. The player’s navmesh position is constantly updated to match the position of their camera after applying navmesh and playspace movement, while their camera is not interpenetrating objects.
3. If a player’s camera starts interpenetrating objects, their navmesh position stays at the last good position, and their vision turns orange.
4. If the player tries navmesh movement while their camera is interpenetrating geometry, their camera is re-positioned back at the last good position and their vision returns to normal.
5. As a result of 4, the player camera cannot intersect with geometry while using navmesh movement.
This seems to be an overall comfortable approach.
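For context, here’s a rough, plugin-agnostic sketch of the “last good position” idea from the list above (the struct and names are made up, just to illustrate):

```cpp
// Track the last camera position that wasn't inside geometry; while the
// head penetrates, keep the old position (and fade the screen); on stick
// movement, snap the playspace back to it. Names here are illustrative only.
#include "Camera/CameraComponent.h"
#include "Engine/World.h"

struct FHeadClipState
{
	FVector LastGoodCameraLocation = FVector::ZeroVector;
	bool    bCameraPenetrating = false;
};

static void UpdateHeadClipState(FHeadClipState& State, UCameraComponent* Camera, float HeadRadius = 12.f)
{
	const FVector CamLoc = Camera->GetComponentLocation();

	// Is a small sphere around the camera overlapping blocking geometry?
	const FCollisionShape Head = FCollisionShape::MakeSphere(HeadRadius);
	State.bCameraPenetrating = Camera->GetWorld()->OverlapBlockingTestByChannel(
		CamLoc, FQuat::Identity, ECC_WorldStatic, Head);

	if (!State.bCameraPenetrating)
	{
		// Head is clear: remember this as the last good position.
		State.LastGoodCameraLocation = CamLoc;
	}
	// else: leave LastGoodCameraLocation alone, turn on the blinders, and on
	// the next stick input reposition the playspace so the camera returns here.
}
```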
Is something like this possible with the plugin? If so, how would one go about implementing it?
Set the bUseWalkingCollision override in the settings and assign a set of custom collision settings for it. Those will be the collision settings used when not locomoting, and when locomotion IS running it will use the standard collision settings. Then handle your blinders however you wish with a camera collision body.
I’ll note that, in general, keeping pushback but darkening the screen during it has the same effect as Valve’s approach, but doesn’t require handling depenetration angles. Their solution is actually rather lacking, as you can clip through walls and blind-interact with things.