VR Expansion Plugin


    Originally posted by RazkingZ View Post





    Having an issue with this again: tracking is fine on the user's side, but gradually, from another client's end, the head and hands will stop replicating and stick. This only happens on one of my levels, which is very weird. Is there a way to have the head and hands always replicate whether or not the headset is on?

    Thanks,
    They do already; the problem is that Oculus auto-pauses the game when the headset is off. You'll have to change that behavior, as the plugin doesn't have any data to work with then.


    Consider supporting me on patreon

    My Open source tools and plugins
    Advanced Sessions Plugin
    VR Expansion Plugin

    Comment


      I made this quick little trick to find out when the headset is auto-paused:

      The client sends a ping to the server at a regular interval, which triggers a retriggerable delay. If the server doesn't receive the trigger in time, it considers the headset to be off.

      For me, the tick interval is 1 second and the retriggerable delay is 3 seconds.

      [Attached image: SleepMode.PNG]
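
      For anyone doing this in C++ instead of Blueprint, here is a minimal sketch of the same idea (the class, function, and variable names are placeholders, not plugin API):

      Code:
      // MyVRCharacter.h (hypothetical class; any ACharacter subclass works)
      #include "CoreMinimal.h"
      #include "GameFramework/Character.h"
      #include "MyVRCharacter.generated.h"

      UCLASS()
      class AMyVRCharacter : public ACharacter
      {
          GENERATED_BODY()

      public:
          virtual void BeginPlay() override;

          // Client -> server heartbeat, the equivalent of the ping in the Blueprint graph.
          UFUNCTION(Server, Reliable, WithValidation)
          void ServerHeadsetPing();

      protected:
          // Server side: called when no ping arrived within the timeout window.
          void OnHeadsetConsideredOff();

          FTimerHandle PingTimerHandle;     // client: fires every PingInterval
          FTimerHandle TimeoutTimerHandle;  // server: the "retriggerable delay"

          float PingInterval = 1.0f;        // seconds between pings
          float PingTimeout = 3.0f;         // seconds of silence before the headset counts as off
          bool bHeadsetConsideredOff = false;
      };

      // MyVRCharacter.cpp
      #include "MyVRCharacter.h"
      #include "TimerManager.h"

      void AMyVRCharacter::BeginPlay()
      {
          Super::BeginPlay();

          // Only the owning client sends the heartbeat.
          if (IsLocallyControlled())
          {
              GetWorldTimerManager().SetTimer(PingTimerHandle, this,
                  &AMyVRCharacter::ServerHeadsetPing, PingInterval, /*bLoop=*/true);
          }
      }

      bool AMyVRCharacter::ServerHeadsetPing_Validate() { return true; }

      void AMyVRCharacter::ServerHeadsetPing_Implementation()
      {
          bHeadsetConsideredOff = false;

          // Restart the timeout each time a ping arrives (the retriggerable delay).
          GetWorldTimerManager().SetTimer(TimeoutTimerHandle, this,
              &AMyVRCharacter::OnHeadsetConsideredOff, PingTimeout, /*bLoop=*/false);
      }

      void AMyVRCharacter::OnHeadsetConsideredOff()
      {
          // No ping for PingTimeout seconds: assume Oculus auto-paused the game client.
          bHeadsetConsideredOff = true;
      }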

      Comment


        Hello Mordentral!

        Long-time user here since 2018. I've run into some roadblocks while thinking about performance in my VR game. Hope you can share some ideas.

        My game is a VR FPS shooter, and currently I'm trying to reduce movement costs on the player characters.

        Character hierarchy is:
        SimpleVRCharacter
        - Capsule
        - Camera
        - ParentRelative
        - BodyMesh
        - Holsters
        - WeaponsHolstered
        - MotionControllerComp
        - WeaponHeld

        From profiling, a lot of the movement cost comes from repeatedly moving things deep in the hierarchy each frame. For example, every frame CharacterMovement, MotionController, and the Weapon itself all move the weapon's capsule collision because of the hierarchy (UpdateChildTransforms and UpdateOverlaps). The tick order is also CharacterMovement -> MotionController -> Weapon.

        I also noticed you can set RelativeLocation and RelativeRotation on a USceneComponent without incurring a movement update; that value should take effect the next time the parent updates its movement, in a deferred manner.
        So my idea is to reverse the tick order, so that the weapon sets its "desired" relative transform deferred, then the motion controller sets its relative transform deferred, and finally the character movement updates the movement in the EndScopedMovementUpdate. In that case the weapon and the motion controller each only update the movement once per frame.
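
        To illustrate the kind of deferred write I mean, here's a rough sketch (hypothetical sway component, not our actual code):

        Code:
        // Hypothetical USceneComponent subclass that sways the held weapon.
        // Writing the relative members directly does NOT call
        // UpdateComponentToWorld / UpdateOverlaps; the child's world transform is
        // only recomputed the next time the parent moves and updates its children.
        void UWeaponSwayComponent::TickComponent(float DeltaTime, ELevelTick TickType,
                                                 FActorComponentTickFunction* ThisTickFunction)
        {
            Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

            // Hypothetical sway offset.
            const FVector  DesiredRelLoc(0.f, 0.f, FMath::Sin(GetWorld()->GetTimeSeconds()) * 0.5f);
            const FRotator DesiredRelRot = FRotator::ZeroRotator;

            // UE 4.22-era: the relative members are public UPROPERTYs.
            RelativeLocation = DesiredRelLoc;
            RelativeRotation = DesiredRelRot;

            // On 4.24+ the equivalent direct setters are:
            // SetRelativeLocation_Direct(DesiredRelLoc);
            // SetRelativeRotation_Direct(DesiredRelRot);
        }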

        Does this sound like a good plan?
        Also, in general, is it okay to tick the motion controller before character movement? I see that Unreal tends to add the parent as the attached component's tick prerequisite. But is it fair to say that, as long as children only care about relative transforms, they don't really need to know their parent's world transform?

        Thanks,



        Comment


          Originally posted by ChaosBB View Post
          Hello Mordentral!

          Long-time user here since 2018. I've run into some roadblocks while thinking about performance in my VR game. Hope you can share some ideas.

          My game is a VR FPS shooter, and currently I'm trying to reduce movement costs on the player characters.

          Character hierarchy is:
          SimpleVRCharacter
          - Capsule
          - Camera
          - ParentRelative
          - BodyMesh
          - Holsters
          - WeaponsHolstered
          - MotionControllerComp
          - WeaponHeld

          From profiling, a lot of the movement cost comes from repeatedly moving things deep in the hierarchy each frame. For example, every frame CharacterMovement, MotionController, and the Weapon itself all move the weapon's capsule collision because of the hierarchy (UpdateChildTransforms and UpdateOverlaps). The tick order is also CharacterMovement -> MotionController -> Weapon.

          I also noticed you can set RelativeLocation and RelativeRotation on a USceneComponent without incurring a movement update; that value should take effect the next time the parent updates its movement, in a deferred manner.
          So my idea is to reverse the tick order, so that the weapon sets its "desired" relative transform deferred, then the motion controller sets its relative transform deferred, and finally the character movement updates the movement in the EndScopedMovementUpdate. In that case the weapon and the motion controller each only update the movement once per frame.

          Does this sound like a good plan?
          Also, in general, is it okay to tick the motion controller before character movement? I see that Unreal tends to add the parent as the attached component's tick prerequisite. But is it fair to say that, as long as children only care about relative transforms, they don't really need to know their parent's world transform?

          Thanks,


          Most of the movement cost is in UpdateOverlaps; your first step should be managing the collision of attached objects to reduce it as much as possible. You should read up on the lengths that Epic went to in order to reduce it for Robo Recall, and the oversights they originally had that were causing performance issues.
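
          Rough example of what managing collision on attached objects can look like (generic engine calls; the function and its name are just for illustration):

          Code:
          #include "Components/PrimitiveComponent.h"

          // Trim overlap work on an attached (held/holstered) weapon mesh.
          // UpdateOverlaps is only expensive when components actually generate
          // overlaps or keep full collision, so turn that off where it isn't needed.
          void ConfigureAttachedWeaponCollision(UPrimitiveComponent* WeaponMesh, bool bHeld)
          {
              if (!WeaponMesh)
              {
                  return;
              }

              // Attached visual meshes rarely need overlap events of their own.
              WeaponMesh->SetGenerateOverlapEvents(false);

              if (bHeld)
              {
                  // Held: keep query-only collision for traces / pickup checks.
                  WeaponMesh->SetCollisionEnabled(ECollisionEnabled::QueryOnly);
              }
              else
              {
                  // Holstered: no collision at all, so moving the character each
                  // frame doesn't overlap-test this component.
                  WeaponMesh->SetCollisionEnabled(ECollisionEnabled::NoCollision);
              }
          }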

          As far as the motion controller is concerned, no, you cannot have it move before the character movement for most of the grip types. Only the attachment grip would work correctly like that; every other grip is in world space, and the held object is not actually attached or in the parent/child hierarchy. Motion controllers need their final world-space location post-movement in order to manage the held object's final target position.

          Also, I'll note that the VR character actually makes fewer deferred movements than the standard engine pawn; I went and cut out two updates per frame that happen on the normal character. Collision updating is the most expensive part of characters engine-wide. I also defer the collision capsule's physics-thread update until the character movement runs, if character movement is being performed, so the capsule's tick is just setting a target.

          Parent relative attachment is a thought though; I could move its update into the movement component if there is one active. The component is supposed to be usable inside non-plugin pawns, but there's no reason not to have a case for when it is inside one of my classes.
          Last edited by mordentral; 03-18-2020, 08:14 PM.


          Consider supporting me on patreon

          My Open source tools and plugins
          Advanced Sessions Plugin
          VR Expansion Plugin

          Comment


            Originally posted by mordentral View Post

            Most of the movement cost is in UpdateOverlaps; your first step should be managing the collision of attached objects to reduce it as much as possible. You should read up on the lengths that Epic went to in order to reduce it for Robo Recall, and the oversights they originally had that were causing performance issues.

            As far as the motion controller is concerned, no, you cannot have it move before the character movement for most of the grip types. Only the attachment grip would work correctly like that; every other grip is in world space, and the held object is not actually attached or in the parent/child hierarchy. Motion controllers need their final world-space location post-movement in order to manage the held object's final target position.

            Also, I'll note that the VR character actually makes fewer deferred movements than the standard engine pawn; I went and cut out two updates per frame that happen on the normal character. Collision updating is the most expensive part of characters engine-wide. I also defer the collision capsule's physics-thread update until the character movement runs, if character movement is being performed, so the capsule's tick is just setting a target.

            Thanks for pointing out the Robo Recall case, I'll look into resources for that.

            I'm actually only using the attachment grip; with that limitation, are there any other concerns about the tick order?
            I ended up putting all the transform updates from the character into the movement component's tick, so the character only ticks movement once. The remaining problems are the motion controller's tick, and the weapon, which also ticks its transform for some swaying animation while being held.

            Side topic on the VRCharacter: it sounds like it is well done and performant. It was actually working great for us until our designer pointed out that he wanted to separate the head from the capsule so that leaning against a table would be possible. I couldn't figure out a way to decouple them for VRCharacter, so I ended up downgrading to VRSimpleCharacter and added some custom logic to do it inside the SimpleMovementComponent's tick.

            Comment


              Originally posted by ChaosBB View Post


              Thanks for pointing out the Robo Recall case, I'll look into resources for that.

              I'm actually only using the attachment grip; with that limitation, are there any other concerns about the tick order?
              I ended up putting all the transform updates from the character into the movement component's tick, so the character only ticks movement once. The remaining problems are the motion controller's tick, and the weapon, which also ticks its transform for some swaying animation while being held.

              Side topic on the VRCharacter: it sounds like it is well done and performant. It was actually working great for us until our designer pointed out that he wanted to separate the head from the capsule so that leaning against a table would be possible. I couldn't figure out a way to decouple them for VRCharacter, so I ended up downgrading to VRSimpleCharacter and added some custom logic to do it inside the SimpleMovementComponent's tick.
              I actually moved the parent relative into the CMC tick just now so that it can be deferred, but I split it up between the actual grip tick and the component movement for the motion controller.

              As for the VRCharacter, you could set a waist-tracking parent for the capsule and parent relative components and directly control the capsule at will with it; that is the point of that interface addition.

              Curious how you decided to decouple; I haven't liked anyone's solution for that yet, as it tends to just be a given zone that the player can freely walk in when the head isn't colliding.
              Last edited by mordentral; 03-18-2020, 09:33 PM.


              Consider supporting me on patreon

              My Open source tools and plugins
              Advanced Sessions Plugin
              VR Expansion Plugin

              Comment


                Originally posted by mordentral View Post

                I actually moved the parent relative into the CMC tick just now so that it can be deferred.

                As for the VRCharacter, you could set a waist-tracking parent for the capsule and parent relative components and directly control the capsule at will with it; that is the point of that interface addition.

                Curious how you decided to decouple; I haven't liked anyone's solution for that yet, as it tends to just be a given zone that the player can freely walk in when the head isn't colliding.
                I'm on 4.22, so my plugin version is also old. Did you just move the code into the CMC tick, or did you change the parent relative component's tick group? It's in DuringPhysics, the same as the camera, in my version. I just deferred it to the next frame's CMC update, but removing that one-frame lag would be great.

                I use a two-step method: first, sweep-move the root to the camera; second, reverse the movement of the root on the camera's parent. When the sweep hits anything, that's when they decouple.
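
                Roughly like this (a simplified sketch with generic component names, not our exact code):

                Code:
                // Two-step head/capsule decouple, run from the movement component's tick.
                // RootCapsule  = the character's collision capsule (root component)
                // CameraParent = the scene component the HMD camera is attached to (VR origin)
                // CameraComp   = the HMD camera component
                void DecoupleHeadFromCapsule(UCapsuleComponent* RootCapsule,
                                             USceneComponent* CameraParent,
                                             UCameraComponent* CameraComp)
                {
                    const FVector RootStart    = RootCapsule->GetComponentLocation();
                    const FVector CameraTarget = CameraComp->GetComponentLocation();

                    // Step 1: sweep the root toward the camera's world position (keep height).
                    FVector Target = CameraTarget;
                    Target.Z = RootStart.Z;
                    FHitResult Hit;
                    RootCapsule->SetWorldLocation(Target, /*bSweep=*/true, &Hit);

                    // Step 2: reverse whatever the root actually moved on the camera's
                    // parent, so the camera stays where the HMD tracking put it.
                    const FVector ActualDelta = RootCapsule->GetComponentLocation() - RootStart;
                    CameraParent->AddWorldOffset(-ActualDelta);

                    // If the sweep hit something, the root stopped short of the camera:
                    // the head is now decoupled from the capsule until the player moves back.
                }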

                Comment


                  Originally posted by ChaosBB View Post

                  I'm on 4.22, so my plugin version is also old. Did you just move the code into the CMC tick, or did you change the parent relative component's tick group? It's in DuringPhysics, the same as the camera, in my version. I just deferred it to the next frame's CMC update, but removing that one-frame lag would be great.

                  I use a two-step method: first, sweep-move the root to the camera; second, reverse the movement of the root on the camera's parent. When the sweep hits anything, that's when they decouple.
                  Yeah, that is the grace area I was talking about. The issue I always had with it is that it doesn't actually require the user to lean; if your game is designed around it, though, it would be fine.

                  I moved the actual update into the CMC's TickComponent and added a new deferred definition at the beginning of the tick. That way it applies after all of the CMC movement is done, but in the same frame. I moved the controllers there as well, when the CMC is active and exists (I have to support non-VR-character owners).


                  Consider supporting me on patreon

                  My Open source tools and plugins
                  Advanced Sessions Plugin
                  VR Expansion Plugin

                  Comment


                    Something that Half-Life: Alyx does is allow the VR camera to interpenetrate objects from head movement, but not from walking or teleporting.

                    For example: if the player is up against a wall and presses the analog stick to move in the direction of the wall, they won't move. If they put their head into the wall, their vision turns orange.

                    The way this seems to work is:
                    1. Navmesh (analog stick) movement will move the player's camera, but not into walls.
                    2. The player's navmesh position is constantly updated to match the position of their camera after applying navmesh and playspace movement, while their camera is not interpenetrating objects.
                    3. If the player's camera starts interpenetrating objects, their navmesh position stays at the last good position, and their vision turns orange.
                    4. If the player tries navmesh movement while their camera is interpenetrating geometry, their camera is repositioned back at the last good position and their vision returns to normal.
                    5. As a result of 4, the player's camera cannot intersect with geometry while using navmesh movement.

                    This seems to be an overall comfortable approach.
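
                    In rough pseudo-engine terms, I imagine it as something like this (illustrative only; the names and parameters are placeholders, not engine or plugin API apart from the UWorld overlap test):

                    Code:
                    // Illustrative sketch of the "last good position" logic described above.
                    void UpdateCameraPenetration(UWorld* World,
                                                 const FVector& CameraLoc,
                                                 bool bHasLocomotionInput,
                                                 FVector& InOutLastGoodLocation,
                                                 bool& OutShowBlinders,
                                                 bool& OutRecenterToLastGood)
                    {
                        // Is the head currently inside blocking geometry?
                        const bool bPenetrating = World->OverlapBlockingTestByChannel(
                            CameraLoc, FQuat::Identity, ECC_Visibility,
                            FCollisionShape::MakeSphere(/*HeadRadius=*/15.f));

                        OutShowBlinders = bPenetrating;   // 3. "orange vision" while penetrating
                        OutRecenterToLastGood = false;

                        if (!bPenetrating)
                        {
                            // 2. While clear, the tracked position follows the camera.
                            InOutLastGoodLocation = CameraLoc;
                        }
                        else if (bHasLocomotionInput)
                        {
                            // 4. Stick movement while penetrating snaps the view back.
                            OutRecenterToLastGood = true;
                            OutShowBlinders = false;
                        }
                    }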

                    Is something like this possible with this plugin? If so, how would one go about implementing it?
                    Last edited by Ghamazh; 03-24-2020, 11:05 PM.

                    Comment


                      Originally posted by Ghamazh View Post
                      Something that Half-Life: Alyx does is allow the VR camera to interpenetrate objects from head movement, but not from walking or teleporting.

                      For example: if the player is up against a wall and presses the analog stick to move in the direction of the wall, they won't move. If they put their head into the wall, their vision turns orange.

                      The way this seems to work is:
                      1. Navmesh (analog stick) movement will move the player's camera, but not into walls.
                      2. The player's navmesh position is constantly updated to match the position of their camera after applying navmesh and playspace movement, while their camera is not interpenetrating objects.
                      3. If the player's camera starts interpenetrating objects, their navmesh position stays at the last good position, and their vision turns orange.
                      4. If the player tries navmesh movement while their camera is interpenetrating geometry, their camera is repositioned back at the last good position and their vision returns to normal.
                      5. As a result of 4, the player's camera cannot intersect with geometry while using navmesh movement.

                      This seems to be an overall comfortable approach.

                      Is something like this possible with this plugin? If so, how would one go about implementing it?
                      Set the bUseWalkingCollision override in the settings and assign a set of custom collision settings for it. Those will be the collision settings used when not locomoting; when locomotion IS running, it will use the standard collision settings. Then handle your blinders however you wish with a camera collision body.

                      I'll note that, in general, keeping pushback but darkening the screen during it gives the same effect as Valve's, without requiring you to handle depenetration angles. Their solution is rather lacking, actually, as you can clip through walls and blind-interact with things.


                      Consider supporting me on patreon

                      My Open source tools and plugins
                      Advanced Sessions Plugin
                      VR Expansion Plugin

                      Comment


                        Originally posted by mordentral View Post

                        Set the bUseWalkingCollision override in the settings and assign a set of custom collision settings for it. Those will be the collision settings used when not locomoting; when locomotion IS running, it will use the standard collision settings. Then handle your blinders however you wish with a camera collision body.

                        I'll note that, in general, keeping pushback but darkening the screen during it gives the same effect as Valve's, without requiring you to handle depenetration angles. Their solution is rather lacking, actually, as you can clip through walls and blind-interact with things.
                        Interesting. Is there a delegate for pushback start and stop, or a way of getting the pushback vector?

                        Comment


                          Originally posted by Ghamazh View Post

                          Interesting. Is there a delegate for pushback start and stop, or a way of getting the pushback vector?
                          Yes, there are events:

                          Code:
                          AVRBaseCharacter::OnBeginWallPushback(FHitResult HitResultOfImpact, bool bHadLocomotionInput, FVector HmdInput)
                          AVRBaseCharacter::OnEndWallPushback()


                          Consider supporting me on patreon

                          My Open source tools and plugins
                          Advanced Sessions Plugin
                          VR Expansion Plugin

                          Comment


                            Originally posted by mordentral View Post

                            Yes, there are events:

                            Code:
                            AVRBaseCharacter::OnBeginWallPushback(FHitResult HitResultOfImpact, bool bHadLocomotionInput, FVector HmdInput)
                            AVRBaseCharacter::OnEndWallPushback()
                            Very helpful, thank you. For the purposes of blinders, it might be helpful to also cache the pushback vector, so that said blinders can be properly oriented. Might dig into it and see if I can implement something along those lines.
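
                            Something along these lines, presumably (a sketch only: it assumes the two events can be overridden in an AVRBaseCharacter subclass, and the blinder handling and helper below are entirely hypothetical; if the events are BlueprintNativeEvents, the _Implementation versions would need to be overridden instead):

                            Code:
                            // Sketch: drive blinders from the wall-pushback events listed above
                            // and cache the pushback direction for orienting them.
                            UCLASS()
                            class AMyVRCharacter : public AVRBaseCharacter
                            {
                                GENERATED_BODY()

                            public:
                                virtual void OnBeginWallPushback(FHitResult HitResultOfImpact,
                                                                 bool bHadLocomotionInput,
                                                                 FVector HmdInput) override
                                {
                                    Super::OnBeginWallPushback(HitResultOfImpact, bHadLocomotionInput, HmdInput);

                                    // Cache the pushback direction so the blinders can face it.
                                    CachedPushbackNormal = HitResultOfImpact.ImpactNormal;
                                    SetBlindersEnabled(true, CachedPushbackNormal);
                                }

                                virtual void OnEndWallPushback() override
                                {
                                    Super::OnEndWallPushback();
                                    SetBlindersEnabled(false, FVector::ZeroVector);
                                }

                            protected:
                                // Hypothetical: fade a vignette material / post-process in or out here.
                                void SetBlindersEnabled(bool bEnabled, const FVector& PushDirection) {}

                                FVector CachedPushbackNormal = FVector::ZeroVector;
                            };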

                            Comment


                              Originally posted by mordentral View Post

                              They do already; the problem is that Oculus auto-pauses the game when the headset is off. You'll have to change that behavior, as the plugin doesn't have any data to work with then.
                              Someone might have mentioned this already, but there is a way to get the Touch controllers and the headset to work without wearing the headset: the Oculus Debug Tool (remember the program that lets you increase the supersampling for the 'old' CV1, etc.?). It is at: "C:\Program Files\Oculus\Support\oculus-diagnostics\OculusDebugTool.exe"

                              I just checked with AIRCAR to be sure; it let me start and play on the monitor. (There is HOPE here for Half-Life: Alyx to play onscreen, maybe! Couple this debug routine with a 'Rift emulator' program and play onscreen?)
                              [Attached image: Debug.jpg]

                              And DUDE... T H A N K Y O U for VR Expansion. It's amazing not to have to deal with so many logistics as a character animator, etc.

                              ONE question: I have discovered a really great 'VR hand gripping' BP routine that:
                              1) allows the hands to always be 'hard' (they don't go through stuff; they are always 'physical'), and more importantly,
                              2) will curl its fingers and 'grip' any object you go to pick up (like Lone Echo, even a little more interactive), plus
                              3) a zero-G locomotion that I have adopted and, over a year, perfected (also like Lone Echo, but with the possibility of galactic-speed acceleration). I think it might even be a simple matter to switch to zero-G from VR Expansion; it is probably like when VR Exp says: start as a walker, be a car driver now, be a climber now. Maybe I can add: be an ASTRONAUT now, in the same manner?

                              I was wondering if you could give advice on incorporating these into VR Exp? I would be happy to send you working builds and BPs, answer any questions, etc.

                              I am anxiously about to go try the 3/23 repository download (which built perfectly for 4.24 all by itself!). I have been busy animating my characters and building my worlds for many months and haven't been in VR except in VR Mode (where locomotion isn't an issue). I thought, I'll deal with locomotion later... The Good Lord must be looking over my project, because I came here today after many, many months and you have UPDATED!
                              There is new magic! (Do I see weapons? Can I finally stop trying to merge VR Expansion and UVRF? LOL. Did I see a mention of a VR BODY?)
                              Just wonderful. WONDERFUL! Thank you! I hope my little debug tip above helps you!
                              Last edited by NextWorldVR; 03-28-2020, 09:57 AM.

                              Comment


                                Originally posted by RazkingZ View Post





                                Having an issue with this again: tracking is fine on the user's side, but gradually, from another client's end, the head and hands will stop replicating and stick. This only happens on one of my levels, which is very weird. Is there a way to have the head and hands always replicate whether or not the headset is on?

                                Thanks,
                                Hey buddy, I thought I'd give you a heads-up: I may have a fix for your headset pausing the game. I don't know if you need a systemic fix or just a working one, but this does work to keep it awake, without pausing, when the headset is not on one's head! (See the image and file path above.)

                                ( I'm Woody from youtube.com/NextWorldVR by the way...)

                                Comment
