VR Expansion Plugin

bUseAdditiveOffset is something Epic has built in to add an offset to the camera view.

I wasn’t aware that you wanted the hands offset as well, I thought it was only visual. But setting the root is still wrong: consider what happens if someone walked 5’ out in room space from the center of the tracked area (the root) and you then rotated that root, you would be putting them halfway into the floor. If you are doing head locked then it would be fine, but any roomscale application would be incorrect with that.

You can use SetPlayerRotationVR to face upwards in pitch (you would likely want control rotation on for pitch), and it accounts for the HMD location; however, pitch values screw with the character movement component severely. So in the end it’s likely still best to offset the camera by a pitch, but you’d have to use a pivot at the floor location underneath it, since you want the hands to rotate as well.
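
As a rough illustration of that pivot math (my sketch only, not a plugin function; it assumes the camera and the motion controllers share a common parent scene component such as the net smoother, sitting directly under the root with no extra offset):

```cpp
// Illustrative sketch: pitch the view around a pivot at the floor directly
// under the HMD so the hands rotate along with the camera.
void AMyVRCharacter::AddPitchAroundFloorPivot(float DeltaPitchDeg)
{
	// HMD location in actor space, projected down to the floor
	const FVector HMDRel = GetTransform().InverseTransformPosition(VRReplicatedCamera->GetComponentLocation());
	const FVector Pivot(HMDRel.X, HMDRel.Y, 0.f);

	const FQuat DeltaRot = FRotator(DeltaPitchDeg, 0.f, 0.f).Quaternion();

	// Rotate the parent component's relative transform around the floor pivot
	const FTransform Rel = NetSmoother->GetRelativeTransform();
	const FVector NewLoc = Pivot + DeltaRot.RotateVector(Rel.GetLocation() - Pivot);
	NetSmoother->SetRelativeLocationAndRotation(NewLoc, DeltaRot * Rel.GetRotation());
}
```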

Something to consider would be the SimpleVRCharacter; it zeroes out the X/Y of the HMD to center the character on it and moves the character with the HMD. You could rotate the net smoother in that setup and it would work like you want, though again, the collision would not follow the rotation.

---- additional info ----
Basically I’m trying to make it easier for people with disabilities to use my game.
The worst case scenario is people that can only look forward, so I supply control options for snap turning in both yaw and pitch.

When using VR without motion controllers, it’s okay that pitching is also applied to the motion controller hands, because it would still allow the user to interact with the world with the laser beam from the hands. This approach gives users an alternative to the gaze/‘look at’ clicking.

When using VR with motion controllers, the user should just be able to adjust the pitch without modifying the hands.
This currently seems to work, since the motion controllers keep updating their own location in world space.

When using non-VR I currently hide the hands and use a 1p/3p mesh.
Any interaction can be drawn directly on screen (e.g. a menu as a HUD rather than a 3D item in the world, and things like ‘press X to pick up item’).
However, in this case the yaw/pitch changes should still be replicated to reflect where the character is looking.

Side note: in order to simulate head movement on the character I simply get the rotation of the camera (or net smoother) and use it in the animation BP of the character.
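
That lookup could be done in the anim instance roughly like this (a minimal sketch; UMyAnimInstance and the HeadRotation property are illustrative names, not from the plugin):

```cpp
// Illustrative sketch: expose the camera's rotation to the anim graph so a
// "Transform (Modify) Bone" node can drive the head bone with it.
void UMyAnimInstance::NativeUpdateAnimation(float DeltaSeconds)
{
	Super::NativeUpdateAnimation(DeltaSeconds);

	if (const APawn* Pawn = TryGetPawnOwner())
	{
		if (const UCameraComponent* Camera = Pawn->FindComponentByClass<UCameraComponent>())
		{
			// Component-space rotation of the camera (or the net smoother)
			HeadRotation = Camera->GetRelativeTransform().Rotator();
		}
	}
}
```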


When using VR, I’m always using head locked, so I guess the roomscale thing won’t be a problem.
Also the pitch offset is clamped to -88 to 88 degrees so they can’t screw the camera up too much :wink:
You are right; the pitch on the net smoother is only for the visual camera effect, which allows the users to change their view manually.

Yeah, I tried SetPlayerRotationVR, however there is hard coded yaw stuff in there. Like you said, ‘pitch values screw with the character movement component severely’, and I can definitely account for that. That’s why I made SetPlayerRotationVR2, shown in a previous post, which is a very hacky way to restore the pitch for the camera. Unfortunately it only worked for non-VR mode, since in VR the relative rotation of the VR Replicated Camera seems to be overwritten afterwards.

I think that as long as the yaw change is applied to the VR root, the collision should be fine, right?
What do you mean by ‘the collision would not follow the rotation’? I thought the VRRootComponent does the collision and the motion controller hands have their own collision.

Edit: I just noticed that the VRMovementReference Perform Move Action Snap Turn rotates the character around a point offset a bit to the left front, instead of rotating around the center of the character. I wonder if this is what you are referring to. It should be easy to check whether this might be a bug by calling the function with a key and incrementing by 60 degrees (so it always rotates to the same points); you should see the character moving around a specific point instead of its own center.
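
A debug binding for that test could look something like this (a sketch; the exact PerformMoveAction_SnapTurn signature may differ between plugin versions):

```cpp
// Bind to a key: snap turn a fixed 60 degrees each press. If the rotation is
// not centered on the capsule, the character will orbit a point rather than
// spinning in place.
void AMyVRCharacter::DebugSnapTurn()
{
	if (UVRBaseCharacterMovementComponent* MoveComp =
			Cast<UVRBaseCharacterMovementComponent>(GetCharacterMovement()))
	{
		MoveComp->PerformMoveAction_SnapTurn(60.f);
	}
}
```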

Note: I put the character mesh below the VR root (not the net smoother), so that it doesn’t get affected by pitch.
Note 2: I also made one character containing all the logic, since the FPSPawn sometimes causes crashes during startup. This is caused by how I set up the inheritance structure: the left and right motion controllers both load procedural meshes, which are sometimes loaded in the wrong order. (You can ignore this; for others that run into this problem and want to continue using both the FPSPawn and the VivePawn, you can simply move the FPSPawn out of the Content directory and place it back after the project has loaded. Note that this ‘fix’ gets very annoying after some time, which is why I refactored them into one character.)
Anyway my character structure looks like this (see the code sketch below the legend):
[BP] MyCharacter (contains both FPS Pawn and VivPawn logic) >
[C] MyCharacter.h (contains inventory/weapon and other stuff) >
[C] AVRCharacter.h (your plugin, does the vr magic ) >
[C] AVRBaseCharacter.h (your plugin, does the vr magic ) >
[C] ACharacter.h (epic’s character base)
Although I will probably refactor this, since my AI bots inherit from [C] MyCharacter.h in order to get a weapon & inventory, but because [C] MyCharacter.h inherits [C] AVRCharacter.h they also get all the VR stuff like hands, which I just disable on the inherited bot BP.

BP = Blueprint
> = inherits
C = C++ file
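
The same chain in (simplified) C++ form; AVRCharacter and AVRBaseCharacter come from the plugin, everything above them is my own:

```cpp
// MyCharacter.h -- inventory/weapons and the shared FPS + Vive logic.
// Plugin side continues: AVRCharacter -> AVRBaseCharacter -> ACharacter.
UCLASS()
class AMyCharacter : public AVRCharacter
{
	GENERATED_BODY()

public:
	// inventory / weapon handling shared by the desktop and VR code paths
	// ...
};

// "[BP] MyCharacter" (and the AI bot BP) then derive from AMyCharacter.
```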

For the record, if you are not working in roomscale (you aren’t in roomscale) then there is actually very little point in using my custom characters. You could just use a normal character and add the motion controllers to it, or a pawn. The extended features of my characters really only come into play in multiplayer or roomscale; there are some specific mechanic fixes (like a stutter that default characters have going from falling to walking) that I fixed, but most are livable. As for that rotational offset, that is likely from the capsule offset that is defaulted to simulate the neck position; you can clear that to 0 since you are head locked.

A 4.20 branch is up on the template repository (it compiles and runs); I won’t be taking it to main until I get it tested fully, as they changed a ton of things relating to navigation/AI and some fairly core character movement sections.

Also there are some Preview 1 bugs that they need to fix in future iterations; as usual, I suggest not updating until it’s out of preview.

Hi,

First, thanks a ton for pointing me in the right direction with Yurinik’s patch for using more devices than the current Unreal SteamVR implementation would allow. That’s working for me.

I have noticed another issue lately after I updated to the 4.19 version of the template, which I’m not sure was an issue before; at least I don’t remember noticing it on the previous version I was using. With my IK body setup, which is driven by gripped motion controller components, there’s a very apparent jitter on other clients’ controlled characters, while my own character is smooth. This is the same for all players: the owned character is smooth, and other connected characters are jittery.

With the latest version of your template just before your 4.20 work, I was able to isolate the issue to a repeatable example with these steps:
-With the VivePawnCharacter I enabled the SteamVR representations of the controllers/trackers, and hid the other hands/controllers that come enabled with it.
-Ticked on autoActivate for the net smoother on the components (not sure if this is necessary).
-I disabled late updates, and set the grip motion controllers to smooth replicated motion
-The gripped motion controllers are set to a net update rate of 60 by default.
-packaged in shipping.
-Connected server/client from two machines.
-From here I’m noticing jitter on the gripped motion controllers of the other connected character that I don’t see on my own character.
-To see the smoothing effect better, I played with the console command Net PktLoss=50 to simulate losing half of the packets, so I could see what the client smoothing is doing. The more packet loss, the more pronounced the jitter, and I don’t notice any interpolation happening.

Other things I tried:
-Changing the net update rate to a really low value. This makes the jitter more pronounced.
-Changing the net update rate all the way up to 100. The higher the update rate, the higher the frequency of the jitter.
-Tried LAN mode, also the same
-Swapping to the Epic MotionController components has the same effect as a net update frequency of 100 on your components, which makes sense as it’s essentially full replication but no smoothing.

Right now I’m not noticing any smoothing happening on the gripped motion controller components from the other clients’ point of view, so I was wondering if something has changed with the smoothing setup that I need to account for, or maybe I have just had it set up wrong from the beginning and am only now noticing.

Also, if I wanted to play with the network smoothing myself, where/how would be best to do that? I noticed your LowPassFilter_RollingAverage and LowPassFilter_Exponential functions and was wondering how to swap between the different types of smoothing to try them out.
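
(For reference, this is the general shape I understand those two filter types to have; my own sketch, not the plugin’s code:)

```cpp
// Exponential low pass: blend each new sample into a running value.
// Alpha in (0..1]; lower = smoother but laggier.
FVector LowPassExponential(const FVector& Prev, const FVector& Sample, float Alpha)
{
	return FMath::Lerp(Prev, Sample, Alpha);
}

// Rolling average: mean of the last MaxSamples samples.
FVector LowPassRollingAverage(TArray<FVector>& Window, const FVector& Sample, int32 MaxSamples)
{
	Window.Add(Sample);
	if (Window.Num() > MaxSamples)
	{
		Window.RemoveAt(0);
	}

	FVector Sum = FVector::ZeroVector;
	for (const FVector& V : Window)
	{
		Sum += V;
	}
	return Sum / (float)Window.Num();
}
```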

Sounds more like you have the hand/procedural mesh being replicated, rather than the controller replication messing up. Setting it to 100 htz shouldn’t require smoothing at all, so if it is still doing it then you have something wrong.

And the smoothing on the controllers doesn’t use a low pass filter; I’m not smoothing out jitter, I’m smoothing between received updates based on the expected time between updates. And nothing has changed in the smoothing area since 4.16 or so.
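
Roughly, that idea looks like this (a simplified sketch, not the plugin’s actual code):

```cpp
// Each frame, blend the rendered transform toward the newest replicated one
// over the expected interval between net updates, instead of snapping to it
// the moment it arrives.
void SmoothTowardLatest(FTransform& Rendered, const FTransform& LatestNetUpdate,
                        float DeltaTime, float NetUpdateRateHtz)
{
	const float ExpectedInterval = 1.f / NetUpdateRateHtz; // e.g. 1/60 s at 60 htz
	const float Alpha = FMath::Clamp(DeltaTime / ExpectedInterval, 0.f, 1.f);
	Rendered.BlendWith(LatestNetUpdate, Alpha);
}
```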

Also, the default MotionControllers DON’T have client-up replication at all, so I can say it is likely your hand / procedural mesh if you used theirs and noticed anything at all, since theirs isn’t even replicating up to begin with.

Edit: What do you mean by jitter, by the way? Do you have a gif or video of it?

You can verify that the smoothing is doing anything at all by setting the update rate to something silly low like 5, and then viewing with and without smoothing turned on.

Ok, sorry for the false alarm. I also updated an IK plugin I was using when moving to 4.19, and didn’t notice they had added replication by default, so what I was seeing was double replication causing jitter. Then it was just a matter of me confusing myself with the Net PktLoss thing, leading me to think the smoothing was the problem. Smoothing is working great, and thanks for the response!

Yeah… I have been suggesting to people for a while now to turn off IK replication; several of the different IK plugins use it, and the VR plugin already handles that better (less bandwidth, smoothing, on the actual component), so it’s just doubling it up.

Since the controllers rep by default with the plugin, the IK should just be client side entirely.

Heads up to people still on 4.18 or who don’t look at the logs: I added a warning in 4.19 that the InteractibleSettings are going to be deprecated soon, and to move away from them to custom gripped objects and the pre-made levers / sliders instead.

4.20 is when that deprecation is happening; they will be removed at the same time as I add a new system, “GripScripts”, that will have less overhead, more options, and is fairly easy to extend. GripScripts will come with a built-in script that emulates the original InteractibleSettings for convenience’s sake (and as an example); however, the current settings that people may be using will be lost and will have to be carried over.

The functionality of InteractibleSettings was essentially dead at this point, with custom grips doing it cleaner and with less overhead. I have been trying to find a good time to remove it for a few engine versions now, and doing it alongside the addition of the GripScripts made the most sense to me.

Edit: Grip scripts may actually be waylaid, currently they aren’t panning out like I had hoped.

OK, so I’m a bit confused about the grip system with gameplay tags, and I couldn’t find any documentation in your wiki.

I’m trying to set up a common Oculus scheme where items are picked up on grip and dropped on grip release; the triggers would be the use buttons. My test object is the gun, which I love and have expanded on vastly to do all kinds of guns now, plus flashlights and plain socket-gripped pickups. It’s great code.

But I can’t figure out the grip tag system.
I kind of got it working: I added a grip input and set the tags for the GrabR and GrabL commented code, and it picks up and drops correctly, but then I had issues figuring out how to properly re-hook the triggers so I can use them for use events.

Can you please tell me a bit about how the grip tag system works and how I could use it for what I need?

In the example gun I have that shown: on grip it rebinds the UseButton to the trigger and the drop button to grip if the gun is held at the handle; if it is held outside of the handle then it retains the trigger grip/drop.

You can swap gameplay tags around live on objects to control what buttons they want for what actions.

The gameplay tags aren’t part of the plugin itself; the template does all of the control code for what the plugin does in blueprint, and gameplay tags were the easiest method of allowing any object to have any interaction for testing. The BPs are just calling the correct plugin nodes for grip and drop depending on the tag values and which button was used for the action.

However, if you really just want those actions you mentioned above, you can skip all of the code-setting parts and just set the following in the object’s default tags:

Grip.OnGrip
Drop.OnGrip
Use.OnTrigger
EndUse.OnTriggerRelease

And it will work like you want without live re-assignment: it will grip and drop with the side grip press, use with the trigger press, and end use on trigger release.

How can I rotate the weapon in the hand a little bit?

Hi, this plugin is fantastic. We’re migrating our game over to it bit by bit; we’ve come across one thing that we can’t replicate with your plugin though -

We have a leaning system in our game where collisions at your feet prevent further non-roomscale locomotion, but the player can still use roomscale movement to lean over objects. So we can lean over railings or over worktops and tables, but are prevented from moving through them using controller movement.

With your plugin, the HMD position seems to be locked completely to the capsule, so leaning over objects is impossible. We have seen you mention a “neck offset” in some places, but we can’t seem to find any such option (perhaps it isn’t even a solution to our problem?).

A perfect solution would be to have the capsule recognize steps and ramps and handle vertical travel with precision, but to only block the player on the horizontal plane past 40-50cm or some arbitrary value, so there is some generous tolerance, and maybe as a backup have the camera position fully block the player if they try to look through a wall or something. Is anything close to this possible?

Anyway, help would be super appreciated for this, since not being able to look / lean over objects feels very limiting. Hopefully it’s just something we’re missing, keep up the awesome work :>

Bump. I am also extremely interested in modifying the collision capsule in game. I use a lot of climbing and leaning over ledges, and tips for having a bit of leniency for passing through certain objects in game would be awesome to know.

I don’t intend to officially add this myself, as the concept in general is flawed: without a waist tracker the “lean” effect can be triggered just by the player walking forward, so you don’t have any real way to actually require them to make the leaning motion. At that point there is no difference between an offset collision / no collision on the object and allowing “leaning”. Faking detection of a lean with head-to-floor and hand detection is impractical IMO.

And then WITH a waist tracker there is no need; the capsule will follow the waist instead and the problem doesn’t exist.

The neck offset is a value on the capsule that offsets it by a distance from the camera; you can modify it during runtime if you wish.
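
Modifying it at runtime could look something like this (a sketch; it assumes the offset is the VRCapsuleOffset property on the plugin’s VRRootComponent):

```cpp
// E.g. clear the neck offset entirely for head locked play:
if (UVRRootComponent* VRRoot = Cast<UVRRootComponent>(GetRootComponent()))
{
	VRRoot->VRCapsuleOffset = FVector::ZeroVector;
}
```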

I had a working concept with a separate capsule around the head and some logic to deal with the two, but I didn’t like what it required me to do for multiplayer, and I locked the branch away back in 4.12 or so. It’s on the very farthest back burner of my features TODO.

Also there is the WalkingMovementOverride collision option; it allows you to use two separate collision setups, one for locomotion and one for roomscale. So you can walk through objects like tables, but you can’t locomote through them.

Aha, thanks for the reply.

While I agree the concept is flawed without a waist tracker, VR hardware is so limited right now that workarounds are necessary to make up for the lack of tech :mad: We can only dream of full body tracking right now.

Anyway, WalkingMovementOverride is more or less exactly what I was after; it works very nicely after some quick tests, thanks so much!

There is your answer too, I think! :smiley:

How can I rotate the weapon in the hand a little bit on the Vive? The position on the Rift is good, but on the Vive it needs a little rotation.

It’s about a 30 degree pitch difference in general for how you hold the object between Touch and Vive.

The latest template has controller profiles implemented that take care of that difference for you, see: https://www.youtube.com/watch?v=cRVhdjpMyys

However, you can also manually offset grips with the OffsetTransformByControllerProfile node, or manually do that logic when using a slot grip, or use separate sockets.

The latest addition is the bOffsetByControllerProfile option on the motion controller itself, which moves the actual controller to account for the difference.
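
A manual fallback could be as simple as this (illustrative only; the ~30 degree value and its sign are per-project tuning, and AdjustGripForVive is a made-up helper name):

```cpp
// Apply a fixed pitch correction to a grip/socket transform when a Vive
// wand is detected, to compensate for the Touch/Vive holding difference.
FTransform AdjustGripForVive(const FTransform& GripTransform)
{
	const FTransform PitchOffset(FRotator(-30.f, 0.f, 0.f));
	return PitchOffset * GripTransform;
}
```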

Thanks for the info, is it possible to set the height for the Vive? The Oculus position is nice, but the Vive could be a little bit higher.

Same thing, the offset is a full transform, you can adjust location as well.