VR Expansion Plugin
-
Consider supporting me on patreon
My Open source tools and plugins
Advanced Sessions Plugin
VR Expansion Plugin
-
Originally posted by mordentral View Post
If you are trying to directly control the camera's pitch you will need to either turn off LockToHMD (don't do that) or modify where it gets the tracked data and add an offset.
I'm trying to do a snap like you do with yaw on the seated character.
Basically, with some input (either a key press, or the mouse moved past a certain value), increment pitch on the camera and/or yaw on the character.
It's to help people who can't look up or down easily to still be able to look at something up high or down low (or behind them, in the case of yaw snap turning).
So on character tick, get the X/Y mouse axes. If the mouse movement is bigger than some threshold, call SetActorRotationVR to update the yaw rotation for the character and the pitch rotation for the VR replicated camera.
The code below is what I came up with; it's called with:
NewRot.Pitch = VRReplicatedCamera->RelativeRotation.Pitch + Mouse Y Axis
NewRot.Yaw = GetActorRotationVR().Yaw + Mouse X Axis
NewRot.Roll = GetActorRotationVR().Roll
bUseYawOnly = false
bAccountForHMDRotation = true
bApplyPitchToCamera = true
Code:
// Sets the actors rotation taking into account the HMD as a pivot point (also moves the actor), returns the location difference
// bAccountForHMDRotation sets the rot to have the HMD face the given rot, if it is false it ignores the HMD rotation
// bApplyPitchToCamera sets new rotation pitch to vr replicated camera instead of whole character root.
UFUNCTION(BlueprintCallable, Category = "BaseVRCharacter|VRLocations")
FVector SetActorRotationVR2(FRotator NewRot, bool bUseYawOnly = true, bool bAccountForHMDRotation = true, bool bApplyPitchToCamera = true)
{
    AController* OwningController = GetController();

    FVector NewLocation;
    FRotator NewRotation;
    FVector OrigLocation = GetActorLocation();
    FVector PivotPoint = GetActorTransform().InverseTransformPosition(GetVRLocation());
    PivotPoint.Z = 0.0f;

    FRotator OrigRotation = bUseControllerRotationYaw && OwningController ? OwningController->GetControlRotation() : GetActorRotation();

    // Original rotation is always applied to the full character.
    // This means that when pitch is applied, the whole character body will rotate,
    // which at 90 degrees for example will make most characters float unnaturally in the air.
    // Usually we just want to rotate the head (and possibly attached arms/hands in non vr)
    // when pitching and ignore pitch rotation for the full body.
    // To do this I first copy back pitch from VRReplicatedCamera if bApplyPitchToCamera is set.
    // Note: NewRot.Pitch value is based on a value of the VRReplicatedCamera->RelativeRotation.Pitch + Mouse Y axis offset.
    //       NewRot.Roll and NewRot.Yaw are based on GetActorRotationVR.
    if (bApplyPitchToCamera)
    {
        OrigRotation.Pitch = VRReplicatedCamera->RelativeRotation.Pitch;
    }

    if (bUseYawOnly)
    {
        // Zero out new supplied Pitch and Roll.
        NewRot.Pitch = 0.0f;
        NewRot.Roll = 0.0f;
    }

    if (bAccountForHMDRotation)
    {
        NewRotation = UVRExpansionFunctionLibrary::GetHMDPureYaw_I(VRReplicatedCamera->RelativeRotation);
        NewRotation = (NewRot.Quaternion() * NewRotation.Quaternion().Inverse()).Rotator();
    }
    else
    {
        NewRotation = NewRot;
    }

    NewLocation = OrigLocation + OrigRotation.RotateVector(PivotPoint);
    //NewRotation = NewRot;
    NewLocation -= NewRotation.RotateVector(PivotPoint);

    if (bUseControllerRotationYaw && OwningController) //&& IsLocallyControlled()
    {
        OwningController->SetControlRotation(NewRotation);
    }

    // Save new calculated pitch before nulling so that we can apply it to the vr camera later.
    float pitch = NewRotation.Pitch;

    // When char does not directly use controller pitch. (The bUseControllerRotationPitch should be on VRReplicatedCamera)
    if (bApplyPitchToCamera && !bUseControllerRotationPitch)
    {
        // Null out pitch (before applying location and rotation to root of character).
        NewRotation.Pitch = 0.0f;
        // Note: The pitch should still be applied to the Owning Controller in the code above
        // (so new pitch information can always be accessed with ControlRotation.Pitch).
    }

    // Also setting actor rot because the control rot transfers to it anyway eventually
    SetActorLocationAndRotation(NewLocation, NewRotation);

    // Apply pitch to camera after actor has been rotated.
    // Only do this when Root does not use controller pitch.
    // bUseYawOnly must be false (since we use pitch).
    if (bApplyPitchToCamera && !bUseControllerRotationPitch && !bUseYawOnly)
    {
        //FRotator camComRot = VRReplicatedCamera->GetComponentRotation();
        //camComRot.Pitch = pitch;
        //VRReplicatedCamera->SetWorldRotation(camComRot);
        FRotator rota = VRReplicatedCamera->RelativeRotation;
        rota.Pitch = pitch;
        VRReplicatedCamera->SetRelativeRotation(rota);
    }

    return NewLocation - OrigLocation;
}
Note: Currently it doesn't work and it seems the mouse Y change affects the character roll and not the pitch of the camera.
I have a feeling the VR replicated camera (or the whole character) is always rotated 90 degrees, causing this strange issue.
Edit:
I also tried just updating the pitch with a key press like this, but it didn't work:
https://blueprintue.com/blueprint/nw_4x_wz/
Final Edit:
I got it working. I had to change the pitch of the 'Net Smoother', not the VRReplicatedCamera, which is a child of the Net Smoother.
The example in the link allows the user to change pitch rotation with a key press.
This allows multiple functions:
1. Apply a 'snap' for pitch (& roll / yaw) to the first person (vr replicated) camera without directly affecting the character mesh.
2. Changing the netsmoother also rotates its child components, so in non-VR you can use it to rotate the hands (since they are attached to it).
In case anyone is interested.
https://blueprintue.com/blueprint/__1_4wog/
Last edited by TQwan; 06-04-2018, 08:12 AM.
-
Originally posted by TQwan View Post
Yeah, LockToHMD won't work, since that will stop VR tracking.
I'm trying to do snap like you do with YAW on the seatedcharacter.
Basicly with some input (either a key presss or mouse moved past a certain value, then increment pitch on camara and/or yaw on character)
It's to help people who can't look up or down easily to still be able to look at something up high or down low (or behind them in case of yaw snap turning)
So on character tick get x / y mouse axis.
Determine if mouse move bigger then some value then call SetActorRotationVR to update yaw rotation for character and update pitch rotation for vr replicated camera.
The code below is what I came up with, its called with
NewRot.Pitch = VRReplicatedCamera->RelativeRotation.Pitch + Mouse Y Axis
NewRot.Yaw = GetActorRotationVR()->.Yaw + Mouse X Axis
NewRot.Roll = GetActorRotationVR()->.Roll
bUseYawOnly = false
bAccountForHMDRotation = true
bApplyPitchToCamera = true
Note: Currently it doesn't work and it seems the mouse Y change affects the character roll and not the pitch of the camera.
I have a feeling the VR replicated camera (or the whole character) is always rotated 90 degrees, causing this strange issue.
Edit:
I also tried just updating the pitch with a keypress like this, but it didnt work:
https://blueprintue.com/blueprint/nw_4x_wz/
Final Edit:
I got it working. I had to change the pitch of the 'Net Smoother', not the VRReplicated camera which is a child of Net Smoother.
The example in the link allows the user to change pitch rotation with a key press.
This allows multiple functions:
1. Apply a 'snap' for pitch (& roll / yaw) to the first person (vr replicated) camera without directly affecting the character mesh.
2. Changing the netsmoother also rotates it child components, so in non vr you can use it to rotate the hands (since they are attached to it).
In case anyone is interested.
https://blueprintue.com/blueprint/__1_4wog/
Changing the pitch of the netsmoother is incorrect; it will result in an offset capsule and move everything from the center of the tracked space.
You should be taking the camera's transform and setting it to My_Offset_As_A_Transform * CameraTransform; that applies the pitch adjustment PRIOR to the camera's rotations. If you just ADD pitch to the HMD's rotation, the resulting final rotation will be incorrect depending on how they are holding their head.
You also don't want to do it in SetActorRotation, you want to do it in the camera itself (GetCameraView). The character ignores the camera's pitch for its standing capsule location and only takes yaw.
One thing to consider is that to do it correctly you would want the HMD's neck/eye offset transform taken into account.
-
Originally posted by mordentral View Post
So I thought it should be fine for non-VR users (since it would update the camera and attached motion controllers, which allows the user to click on various things up in the air).
For VR users the update in pitch would affect the motion controls, but it looks like they ignore the net smoother pitch, since they update their own rotation.
-----------------------------
Anyway, I tried implementing it as you suggested and tried various ways, but it seems that after the pitch is updated it automatically resets back.
All changes below are made in
void UReplicatedVRCameraComponent::GetCameraView(float DeltaTime, FMinimalViewInfo& DesiredView)
First I tried to update the pitch after:

Code:
if (XRCamera->UpdatePlayerCamera(Orientation, Position))

These changes are in the code below as commented-out lines.
Note 1: I had to figure out how quaternions worked, since initially I was working in the VR part. Therefore the code contains both quaternion and FRotator calculations for the new pitch.
Code:
// New Q = Transform based on quat multiplication.
// New R = Transform based on FRotator.
//FTransform origCameraTransform = CameraTransform;
//UE_LOG(LogBaseVRCharacter, Log, TEXT("ReplicatedVRCameraComponent::GetCameraView [Original] Transform: %s"), *origCameraTransform.ToString());

//FQuat pitchQuat(FVector::RightVector, FMath::DegreesToRadians(bAdditionalPitch));
//FQuat newOrientation = Orientation * pitchQuat;
//UE_LOG(LogBaseVRCharacter, Log, TEXT("ReplicatedVRCameraComponent::GetCameraView [New Q] Orientation: %s"), *newOrientation.ToString());

// Clamp pitch -88 to 88 degrees using FRotator (I guess there is also a quaternion way to do this).
//FRotator rotClamp = FRotator(newOrientation);
//rotClamp.Pitch = FMath::Clamp(rotClamp.Pitch, -88.0f, 88.0f);
//newOrientation = rotClamp.Quaternion();
//UE_LOG(LogBaseVRCharacter, Log, TEXT("ReplicatedVRCameraComponent::GetCameraView [New Q] Orientation clamp: %s"), *newOrientation.ToString());

//CameraTransform = FTransform(newOrientation, Position);
//UE_LOG(LogBaseVRCharacter, Log, TEXT("ReplicatedVRCameraComponent::GetCameraView [New Q] Transform w clamp: %s"), *CameraTransform.ToString());

// Try the same calculation using FRotator instead of quaternion.
//FRotator rot = FRotator(bAdditionalPitch, 0, 0);
//origCameraTransform.ConcatenateRotation(rot.Quaternion());
//UE_LOG(LogBaseVRCharacter, Log, TEXT("ReplicatedVRCameraComponent::GetCameraView [New R] Transform no clamp: %s"), *origCameraTransform.ToString());

// Clamp it again
//FRotator newRot = origCameraTransform.Rotator();
//newRot.Pitch = FMath::Clamp(newRot.Pitch, -88.0f, 88.0f);
//origCameraTransform.SetRotation(newRot.Quaternion());
//UE_LOG(LogBaseVRCharacter, Log, TEXT("ReplicatedVRCameraComponent::GetCameraView [New R] Transform w clamp: %s"), *origCameraTransform.ToString());

CameraTransform = origCameraTransform;
Now for both solutions it seems the pitch does get properly applied; however, directly after it the view gets reset to the original viewpoint (I don't know where).
At first I thought this was because I reset the bAdditionalPitch value after it was applied, but even not resetting the value to 0 didn't work.
Then I thought it might not have replicated properly, but that also didn't seem to be the case.
note 2: bAdditionalPitch = 10.0f or -10.0f
note 3: bUsePawnControlRotation = false for both VR and non VR pawns in VR Replicated camera.
Code:
///************REFERENCE POINT***********
if (bUseAdditiveOffset)
{
    FTransform OffsetCamToBaseCam = AdditiveOffset;
    FTransform BaseCamToWorld = GetComponentToWorld();
    FTransform OffsetCamToWorld = OffsetCamToBaseCam * BaseCamToWorld;

    DesiredView.Location = OffsetCamToWorld.GetLocation();
    DesiredView.Rotation = OffsetCamToWorld.Rotator();
}
else
{
    DesiredView.Location = GetComponentLocation();
    DesiredView.Rotation = GetComponentRotation();
}

// ************CHANGES START HERE *********
if (bAdditionalPitch != 0)
{
    UE_LOG(LogBaseVRCharacter, Log, TEXT("ReplicatedVRCameraComponent::GetCameraView [Original] DesiredRotation: %s"), *DesiredView.Rotation.ToString());
    DesiredView.Rotation.Pitch = FMath::Clamp(DesiredView.Rotation.Pitch + bAdditionalPitch, -88.0f, 88.0f);
    UE_LOG(LogBaseVRCharacter, Log, TEXT("ReplicatedVRCameraComponent::GetCameraView [Original] New DesiredRotation: %s"), *DesiredView.Rotation.ToString());
    //bAdditionalPitch = 0;
}
Final note: I also saw this flag: bUseAdditiveOffset, should this perhaps be used?
Last edited by TQwan; 06-05-2018, 06:01 AM.
-
Originally posted by TQwan View Post
I thought NetSmoother is inside the VRRootReference, so it will only affect the components that are its children, so the VR Replicated Camera and motion controls.
So then I tried to just update the DesiredView.Rotation.
Can you give me an example on how to do it?
I wasn't aware that you wanted the hands offset as well; I thought it was only visual. But setting the root is still wrong. Think about it: if someone walked 5' out in roomspace from the center of the tracked area (the root) and you then rotated that root, you would be putting them halfway into the floor. If you are doing head locked then it would be fine, but any roomscale application would be incorrect with that.
You can SetPlayerRotationVR to face upwards in pitch (you would likely want control rotation on for pitch); it accounts for the HMD location, however pitch values screw with the character movement component severely. So in the end it's likely still offsetting the camera by a pitch, but you'd have to use a pivot at the floor location underneath it since you want the hands to rotate as well.
Something to consider would be the SimpleVRCharacter; it zeroes out the X/Y of the HMD to center the character on it and moves the character with the HMD. You could rotate the netsmoother in that setup and it would work like how you want, though again... the collision would not follow the rotation.
-
Originally posted by mordentral View Post
bUseAdditiveOffset is something Epic has built in to add an offset to the camera view.
Basically I'm trying to make it easier for people with disabilities to use my game.
Worst case scenario is for people that can only look forward, so I supply control options for snap turning with both yaw and pitch.
When using VR without motion controllers, it's okay that pitching is also applied to the motion controller hands, because this still allows the user to interact with the world with the laser beam from the hands. This approach gives users an alternative to gaze/'look at' clicking.
When using VR with motion controls, the user should just be able to adjust the pitch without modifying the hands.
This currently seems to work, since the motion controllers keep updating their own location in world space.
When using non vr I currently hide the hands and use a 1p/3p mesh.
Any interaction can be drawn directly on screen (e.g. a menu as a HUD and not a 3D item in the world, and things like 'press X to pick up item').
However, in this case the yaw/pitch changes should still be replicated to reflect where the character is looking.
Side note: In order to simulate head movement on the character I simply get the rotation of the camera (or netsmoother) and use this in the animation BP of the character.
----
When using VR, I'm always using head locked, so I guess the roomscale thing won't be a problem.
Also, the pitch offset is clamped to -88 to 88 degrees, so they can't screw the camera up too much.
You are right; the pitch on the net smoother is only for the visual camera effect, which allows the users to change their view manually.
Yeah, I tried SetPlayerRotationVR, however there is hard-coded yaw stuff in there. Like you said, 'pitch values screw with the character movement component severely'; I can definitely account for that. That's why I made SetPlayerRotationVR2, shown in a previous post, which is a very hacky way to restore the pitch for the camera. Unfortunately it only worked for non-VR mode, since in VR the relative rotation of the VR Replicated Camera seems to be overwritten afterwards.
I think that as long as yaw change is applied to the VR root the collision should be fine right?
What do you mean with collision would not follow the rotation? I thought the VRRootComponent does the collision and the Motion Controller Hands have their own collision.
Edit: I just noticed that the VRMovementReference Perform Move Action Snap Turn rotates the character around a point offset a bit to the front left, instead of rotating around the center of the character. I wonder if this is what you are referring to. It should be easy to check whether this might be a bug by calling the function with a key and incrementing by 60 degrees (so it always rotates through the same points); you should see the character moving around a specific point instead of its own center.
Note: I put the character mesh below the VR root (not netsmoother), so that it doesn't get affected by pitch.
Note 2: I also made one character containing all the logic, since the FPSPawn sometimes causes crashes during startup. This is caused by how I set up the inheritance structure: the left and right motion controllers both load procedural meshes, which are sometimes loaded in the wrong order. (You can ignore this; for others who run into this problem and want to continue using both the FPSPawn and the VivePawn, you can simply move the FPSPawn out of the Content directory and place it back after the project is loaded. Note that this 'fix' gets very annoying after some time, and therefore I refactored them into one character.)
Anyway my character structure looks like this:
[BP] MyCharacter (contains both FPSPawn and VivePawn logic) >
[C] MyCharacter.h (contains inventory/weapon and other stuff) >
[C] AVRCharacter.h (your plugin, does the vr magic ) >
[C] AVRBaseCharacter.h (your plugin, does the vr magic ) >
[C] ACharacter.h (epic's character base)
Although I will probably refactor this, since my AI bots inherit from [C] MyCharacter.h in order to get a weapon & inventory; but because [C] MyCharacter.h inherits [C] AVRCharacter.h, they also have all the VR stuff, like hands, which I just disable on the inherited bot BP.
*BP = Blueprint
> = inherits
C = C++ file
Last edited by TQwan; 06-05-2018, 01:37 PM.
-
A 4.20 branch is up on the template repository (compiles and runs). I won't be taking it to main until I get it fully tested; they changed a ton of things relating to navigation/AI and some fairly core character movement sections.
Also there are some Preview 1 bugs that they need to fix in future iterations; as usual, I suggest not updating until out of preview.
-
Hi Mordentral,
First, thanks a ton for pointing me in the right direction with Yurinik's patch for using more devices than the current Unreal SteamVR implementation would allow. That's working for me.
I have noticed another issue lately, after I updated to the 4.19 version of the template, which I'm not sure was an issue before; at least I don't remember noticing this on a previous version I was using. With my IK body setup, which is driven by gripped motion controller components, there's a very apparent jitter on other clients' controlled characters, while my own character is smooth. This is the same for all players: the owned character is smooth, and other connected characters are jittery.
With the latest version of your template just before your 4.20 work I was able to isolate the issue to a repeatable example with these steps:
-With VivePawnCharacter I enabled the SteamVR representations of the controllers/trackers, and hid the other hands/controllers that come enabled with it.
-Ticked on autoActivate for the netsmoother on the components (not sure if this is necessary).
-Disabled late updates, and set the grip motion controllers to smooth replicated motion.
-Gripped motion controllers are set to a net update rate of 60 by default.
-Packaged in shipping.
-Connected server/client from two machines.
-From here I'm noticing jitter on the gripped motion controller of the other connected character that I don't see on my own character.
-To see the smoothing effect better, I played with the console command net pktLoss=50 to simulate losing half of the packets. The more the pktLoss, the more pronounced the jitter, and I don't notice any interpolation happening.
Other things I tried:
-Changing the net update rate to a really low value. This makes the jitter more pronounced.
-Changing the net update rate all the way up to 100. The higher the update rate, the higher the frequency of the jitter.
-Tried LAN mode; also the same.
-Swapping to the Epic MotionController components has the same effect as a net update frequency of 100 on your components, which makes sense as it's essentially full replication but no smoothing.
Right now I'm not noticing any smoothing on the gripped motion controller components from the other clients' point of view, so I was wondering if something has changed with the smoothing setup that I need to account for, or maybe I have just had it set up wrong from the beginning and am just now noticing.
Also, if I wanted to play with the network smoothing myself, where/how would be best to do that? I noticed your lowpassfilter_rolling average and lowpassfilter_exponential functions and was wondering how to swap between the different types of smoothing to try them out.
Last edited by MagicBots; 06-11-2018, 07:33 AM.
-
Originally posted by MagicBots View Post
And the smoothing on the controllers doesn't use a low-pass filter; I'm not smoothing out jitter, I'm smoothing between updates received, based on the expected time between updates. And nothing has changed in the smoothing area since 4.16 or so.
Also, the default MotionControllers DON'T have client-up replication at all, so I can say it is likely your hand/procedural mesh if you used theirs and noticed anything at all, since theirs isn't even replicating up to begin with.
*Edit* What do you mean by jitter, by the way? Do you have a gif or video of it?
You can verify that smoothing is doing anything at all by setting the update rate to something silly low, like 5, and then viewing with and without smoothing turned on.
Last edited by mordentral; 06-11-2018, 09:42 AM.
-
Originally posted by mordentral View Post
Sounds more like you have the hand/procedural mesh being replicated rather than the controller replication messing up. Setting it to 100 Hz shouldn't require smoothing at all, so if it is still doing it then you have something wrong.
-
Originally posted by MagicBots View Post
Ok, sorry for the false alarm. I also updated an IK plugin I was using when moving to 4.19, and didn't notice they added replication by default so what I was seeing was double replication causing jitter. Then it was just a matter of me confusing myself with the Net pktLoss thing leading me to think the smoothing was the problem. Smoothing is working great, and thanks for the response!
Since the controllers rep by default with the plugin, the IK should just be entirely client-side.
Last edited by mordentral; 06-12-2018, 10:01 AM.
-
Heads up to people still on 4.18 or who don't look at the logs: I added a warning in 4.19 that the InteractibleSettings were going to be deprecated soon and that you should move away from them to custom gripped objects and the pre-made levers/sliders instead.
4.20 is when that deprecation is happening; they will be removed at the same time as I add a new system, "GripScripts", that will have less overhead, more options, and is fairly easy to extend. GripScripts will come with a built-in script that emulates the original InteractibleSettings for convenience's sake (and as an example); however, the current settings that people may be using will be lost and have to be carried over.
The functionality of InteractibleSettings was essentially dead at this point, with custom grips doing it cleaner and with less overhead. I have been trying to find a good time to remove it for a few engine versions now; doing it alongside the addition of the GripScripts made the most sense to me.
*Edit* Grip scripts may actually be waylaid; currently they aren't panning out like I had hoped.
Last edited by mordentral; 06-15-2018, 08:30 AM.
-
Ok, so I'm a bit confused about the grip system with gameplay tags, and I couldn't find any documentation in your wiki.
I'm trying to set up a common Oculus scheme where items are picked up on grip and dropped on grip release; the triggers would be the use buttons. My test object is the gun, which I love and have expanded on vastly to do all kinds of guns now, plus flashlights and plain socket-gripped pickups. It's great code.
But I can't figure out the grip tag system.
I kind of got it working: I added a grip input and set the tags for the grabR and grabL commented code, and it picks up and drops correctly, but then I had issues figuring out how to properly re-hook the triggers so I can use them for use events.
Can you please tell me a bit about how the grip tag system works and how I could use it for what I need?
Last edited by sphinix257; 06-15-2018, 12:10 AM.