VR Expansion Plugin

Dear and VRExpansionPlugin devotees,

Wanted to post a quick update for my VR stealth project “Espire 1: VR Operative”.

The game is built 160% on top of the VRExpansionPlugin - clearly built upon the sliding locomotion & climbing from the template. The GripInterface - everything is being used.
The VR pawn is working great with both the Vive and the Rift, and the plugin’s VRAIController makes using a VR pawn with AI targeting a breeze. I have also been able to migrate the project from 4.11 (from memory) to 4.15 with very few issues and very little work maintaining the plugin components.

It is such a fantastic plugin and I hope to support it in any way I can. Of course, if Espire 1 makes it to release, I hope you will be happy to discuss a royalty for the use of the plugin if that is possible - no support requested in return. It is only because of the plugin that a budding VR designer like myself could test and prototype gameplay mechanics in such a short time. It is also a huge learning resource on the many ways to approach VR interaction, multiplayer replication and more.

Huge thanks for everything you do and I do not understand how you can commit so much time to a project and share the results with the whole world for free. I hope the generosity will be paid back to you tenfold!

I’m very interested to see what others are doing with the plugin. I will be making regular videos like the one above and will feature the VRExpansionPlugin and how it is being used. I won’t post again on here about that to avoid derailing the thread, just wanted to chime in and say thanks again!

Cool update. Thanks for sharing all your great and continuous work…

I wanted to check because I need to add or modify some behaviours for my purposes in the way constrained actors are grabbed or interacted with. Following the nodes, the scope goes up to Grip Object by Interface; I suppose all of that is managed between that and the VRGrip Interface, all done in C++ and not manageable from BP, isn’t it?

What exactly are you looking to modify? I have CustomGrips for things so far out there that none of the current options allow for them - they let you do all of the logic yourself for that object (the dial in the 4.16 template uses this).

Also the settings for the grips can be altered during the grip itself so most things are achievable between that and adjusting constraints on the simulating object.

I’m open to considering opening the physics grips up more to blueprint if you have found a need for it that can’t be solved another way currently, I have been debating how much to expose the constraint settings already.

Thanks, I forgot to reply to him

Good to hear, I’m glad everything is working out so far. It’s really nice to see you using a Rift in your video, as I can’t fully test the plugin with that hardware and have to make some assumptions about compatibility.

No need to avoid posting here though, it’s actually generally a good idea to show what others are doing with the plugin for reference of its capabilities; my template isn’t an actual game and doesn’t play like one, so it’s not the full picture.

I’d actually like to eventually make a list of all of the projects that I know of using it and post them with links in the OP.

Something I need to do is grab an object with two hands, but with the secondary hand fixed in a specific position (a grippable sphere, for example); maybe it could just be rotated and then launch a skeletal animation - not like the Gun example, where the second hand is movable along the mesh.
On another note, is there a mini tutorial covering the interface cases? I’ve realized that in the template the BP_PickupCube doesn’t need to be a “Grippable actor”, just a static mesh with the “VRGrip Interface”, but the Gun, Potion, Door and Drawer actors are based on the Grippable base…

Thumbs up if you also think Epic should donate a Rift to **** :cool:

Hey .

Sure thing. I’m now also wondering if going with a slightly delayed yaw rotation for UParentRelativeAttachmentComponent would be a good way to go. The addition of a neck model, so that UParentRelativeAttachmentComponent doesn’t move forwards (and down) as the player looks down, is also something I’m thinking of testing. Just trying to make it seem like there is a body there rather than feeling like there are items hanging off of the head. I’ll let you know how it goes once I get to it.

Cool beans! Thanks!

Grippable base is there to define the custom settings for a grip, in the template character I set a default set of gripping rules for anything that is just a simulating static mesh which originally the BP_PickupCubes fell under. I have since converted them to inherit the interface though as an example of custom grippable bases that don’t inherit from my pre-made components / actors.

For two handed interactions you “can’t” have both hands rooted in a single position, the object would either need to scale along with their movements or one of the hands needs to be free on an axis as you can’t “constrain” VR hands to specific points as the control of them is outside of the game itself.

However, you can attach the visible representation of the hand to the mesh instead so that it appears to be locked in place, as most gun systems in games do. If instead you mean that you just want the front hand to be the pivot, you would just have to reverse the gripping roles.

I’m not entirely sure of what you are saying, if you actually mean rotate a globe in place, physically constraining it with 2 axis of movement and using two manipulation grips would work for that.

I have been considering adding another type of secondary grip that rotates objects around the center of the two hands like in the VR editor, the original implementation was primarily designed around weapons and guns. However I haven’t needed it for any interaction so far so it hasn’t been a big priority (the current rotation method around primary grip works for most cases and even provides more accuracy in movements in a lot of cases as you have a defined point of pivot).

Updated the 4.16 template tonight - I forgot that I wanted to project the controller forward vector onto the floor plane so that you don’t slow down when the controller is facing partially upwards. I meant to do this a while ago but it slipped my mind.
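The floor-plane projection described above can be sketched with plain vector math (a stand-alone sketch, not the plugin’s actual code; the `Vec3` type and function name here are stand-ins, assuming UE’s Z-up convention):

```cpp
#include <cassert>
#include <cmath>

// Minimal stand-in for FVector, just for this sketch.
struct Vec3 { double x, y, z; };

// Project the controller's forward vector onto the floor (XY) plane and
// re-normalize it, so locomotion speed is unaffected by controller pitch.
Vec3 ProjectForwardOntoFloor(const Vec3& forward) {
    Vec3 flat{forward.x, forward.y, 0.0};           // drop the vertical component
    double len = std::sqrt(flat.x * flat.x + flat.y * flat.y);
    if (len < 1e-6)                                  // pointing nearly straight up/down
        return Vec3{0.0, 0.0, 0.0};
    return Vec3{flat.x / len, flat.y / len, 0.0};    // full-speed direction on the floor
}
```

Without the re-normalize step, a controller pitched 45 degrees upward would only contribute ~70% of its length to the floor direction, which is exactly the slowdown being fixed.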

If it’s ok I’d like to make sure I’m doing this right. From what I’ve tested, it seems only the inherited Parent Relative Attachment produces the desired behavior.

If I add a new, uninherited PRA to the VR camera with a default transform (l:0, r:0, s:1), I’m seeing a strange problem with its transform: the location of the PRA is about 100 meters above the camera, and when adding an offset to any meshes or child actors parented to the PRA, it shows the PRA is ~90 degrees rotated on Z.

Should I only be using the inherited PRA?

The next thing isn’t really related to the plugin specifically…

Is there a way to attach something (child actor / skeletal mesh / static mesh) with an offset to a PRA?

Example of why: I want to translate a skeletal mesh from the motion controller back to its default offset on the holster/PRA at any time (on button press).

I think I could accomplish this by parenting a dummy skeletal mesh to the PRA and adding 2 sockets to it, but that seems kind of like a hack to me.

Set the parent relative attachment to 0,0,0 relative to the camera always, and 0,0,0 rotation; it sounds like you added it in and it auto-filled some rotations. Also, there is no reason to have multiples of it - whatever you attach to it retains its overall offset, so keep one and attach multiple components to that one.

To attach something with an offset, put it where you want it and then attach it… or attach it and set its relative location. It behaves like a normal parent-child relationship, except the Parent Relative Attachment unrotates itself from the HMD pitch/roll.
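The unrotating behavior described above can be sketched with plain math (a stand-alone illustration, not the plugin’s actual code; `Vec3`/`Rot` are stand-ins for FVector/FRotator):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };
struct Rot  { double pitch, yaw, roll; };   // degrees, UE-style

const double kPi = 3.14159265358979323846;

// A parent-relative attachment keeps only the HMD's yaw and discards its
// pitch/roll, so a relative offset (e.g. a holster in front of the player)
// stays level with the floor no matter where the player looks.
Vec3 ParentRelativeOffset(const Vec3& offset, const Rot& hmd) {
    double yaw = hmd.yaw * kPi / 180.0;     // pitch and roll intentionally ignored
    double c = std::cos(yaw), s = std::sin(yaw);
    return Vec3{offset.x * c - offset.y * s,  // rotate the offset around Z only
                offset.x * s + offset.y * c,
                offset.z};
}
```

Looking down (pitch) or tilting the head (roll) changes nothing here; only turning in place (yaw) moves the attached item around the player.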

Hey, thanks a lot for the reply. That’s what I thought, but they were all default.

I set the relative location after attaching it, and then verified the loc/rot/scale of the PRA. The PRA transform is default (sorry I didn’t expand the rotation fully in the screenshot, but it is in fact 0,0,0). Sorry, when I saw your first suggestion last week I wrongly interpreted it as adding a new PRA to the camera. The inherited one the VR_Character comes with works perfectly with no issues. The non-inherited ones don’t work right for me - they were fine in the viewport, but wrong in game (I made sure all collisions on the meshes were off).

On a side note, seeing that the PRA needs to have a default transform relative to the camera, from what I can tell there isn’t an “easy” way to maintain the offset, because I change the parent at runtime. What I wanted to do was unparent the skel mesh from the PRA, translate it to the motion controller, and reparent it to that (and vice versa), but once I unparent it from the PRA I lose the offset.

But, I accomplished exactly what I wanted by doing the following (it’s dirty):

①Created an invisible skel mesh with 1 bone and added 2 sockets to it (to serve as the offset retainers for each skel mesh’s holster rotation/location)
②Set the invisible skel mesh as the child of the Inherited Parent Root Attachment.
③Added 2 skel meshes as children of the invisible skel mesh (referencing its respective sockets).

Granted I’m not doing something wrong, here’s a suggestion:
It would be cool if the Inherited PRA had 1 bone in it to which sockets could be attached.

The PRA doesn’t need a relative transform from the camera; the PRA is intended to be a proxy camera that you can set things relative to instead. It also doesn’t need sockets - you can store direct relative transforms to it instead, or just use a plain scene component if you want an object to pull a location from.

For example, if you add a holster at X:100 Y:0 Z:0 on the PRA it will float 100 units in front of you at all times. If you want to lerp something into place without problems when rotating then attach it on release to the PRA, and lerp it over time to the holster location IN RELATIVE SPACE not world space, that way it still rotates with you so you can’t just spin in circles to prevent it ever reaching its target (Lerp from current relative location to holsters relative location).

If your location is different when detaching, then you need to consider world space when moving the object to the hand. As soon as you detach the object it is no longer in relative space to the PRA, but to either its new parent or the world, so if you are using a stored variable then it is wrong. Either lerp to the hand in world space, or convert the location prior to detach to be relative to the skeletal mesh’s new parent and lerp there, or do like the above and lerp in relative space to the PRA, only attaching it to the new parent once it is within a radius of your intended location (probably best, for the same reasons as above - it handles body rotation better; lerp from the current relative position to the controller’s WorldTransform.GetRelativeTransform(PRA->ComponentTransform)).
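The relative-space lerp and radius check described above can be sketched like this (position only; `Vec3` and the function names are illustrative stand-ins, not plugin API):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Lerp in the PRA's *relative* space: because the interpolated value is a
// relative location, the in-flight object still rotates with the player, so
// spinning in circles can't keep it from ever reaching its target.
Vec3 LerpRelative(const Vec3& cur, const Vec3& target, double alpha) {
    return Vec3{cur.x + (target.x - cur.x) * alpha,
                cur.y + (target.y - cur.y) * alpha,
                cur.z + (target.z - cur.z) * alpha};
}

// "Within a radius of the intended location" check, used to decide when to
// finally attach the object to its new parent.
bool WithinRadius(const Vec3& a, const Vec3& b, double radius) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz <= radius * radius;
}
```

Each tick you would lerp the relative location a bit further, test `WithinRadius` against the hand’s relative position, and reparent once it passes.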

The problem isn’t the PRA’s behavior, you appear to be thinking about how to achieve your desired effect slightly incorrectly.

You’re right, I was going about it the wrong way using bones. Adding scene components seems to be the cleaner of the two ways of achieving what I want, sans the skeletal mesh. Sorry for wasting your time, but thanks a lot for all the info!

Pushed most of my recent work over into the plugin’s 4.16 branch, patch notes are below:


Changed VRGripInterface events to better allow for c++ overrides
(from BlueprintImplementableEvent to BlueprintNativeEvent)


Changed secondary grip bAllowSecondaryGrip into an enum with different types **(WILL REQUIRE BP CHANGES)**
+	SG_None, // No secondary grips
+	SG_Free, // Can secondary grip anywhere
+	SG_SlotOnly, // Can only secondary grip at a slot
+	SG_FreeWithScaling, // With scaling
+	SG_SlotOnlyWithScaling // With scaling

Added prelim secondary grips with scaling (needs more work, but functional).

SlotOnly and Free types are for the user’s convenience, for checking prior to
calling AddSecondaryGrip. (VRTemplate updated to do this).


Changed the build.cs to place a lot more of the dependency modules into
private includes instead of public.


Added a VRLogComponent; it can be used to render either the console or the
output log to a texture so that they can be visible and usable in VR.

Also contains functions for passing keyboard and command input into the console
without a keyboard (the template now has an example of this component).

Hi, firstly - what amazing work. I’m doing a short climate change animation for kids aged 10-13, in which the main character (a lemming) will be played by my wife. I want to combine your VR character setup and blend it with Kinect to give my wife an immersive acting experience to portray the lemming.

My question: I am getting an error message at the startup of the plugin, and even after right-clicking the files, executing, and compiling, it still doesn’t work.

Can anyone assist? I am a noob with Unreal, but I definitely want to learn it by doing so I can use it to tackle global issues.

A video tutorial would be helpful.

I’ll definitely show the result when done and credit the developer :slight_smile:

You’ll need to build it in Visual Studio, I only have pre-packaged binaries for 4.15.2 at the moment. The plugin page has a short TODO for installation and setup (should probably be an image-based step-by-step, I guess).

I’m probably gonna get hanged for asking… I couldn’t find specific info on it in the thread.

tl;dr
What’s the easiest way to implement only the climb ability on a blank VR_Character?

I realize all of the info is already provided, but I lack experience and so am having trouble making sense of it.

I started using a blank VR_Character. Now I’d like to implement climbing into it, rather than migrate it over to the Vive_Pawn_Char.

I want to enable climbing/grabbing on static objects only.

On a blank VR Char, I currently have it set up so that the player can press the thumbpad to equip/unequip each respective hand. The grips/triggers function differently depending on the thing equipped. When nothing is equipped in a hand, I want to allow the climb ability on that hand.

The Vive_Pawn_Character in the template project has awesome climbing. From what I can tell, it looks like I just need to copy some functions out of it, along with their corresponding vars, and hook them up to the motion controllers in my character, in addition to adding a sphere collision component to each MC. But I’m worried that’s not all.

I was hoping I could get a heads up on what’s required for getting climbing going in a blank VR_Character.

Sorry if this has already been covered.

Will try thanks :slight_smile:

The character has the climbing movement type built into the core classes so that networking works correctly. So yes, actually all you really need to do is handle the few blueprints I set up to tell it what to do.

The climbing movement doesn’t track things itself, it takes a direct movement that you pass in to it and makes sure that it is replicated correctly and turns on/off at times that would prevent hitching. You can pass in things like sliding down a ladder / zip line to it as well.

Just check for an overlap / trace when going for a climbing grip and make sure the hit object is static; then you can record the grip position relative to that object, and each tick afterwards pass the inverse of the difference from that first spot into the climbing movement mode’s direct movement, which should move the hand back into its original position again.
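The per-tick step above can be sketched like this (a stand-alone illustration, assuming the gripped object is static so the recorded grip point is fixed in world space; `Vec3` and the function name are stand-ins):

```cpp
#include <cassert>

struct Vec3 { double x, y, z; };

// Per-tick climbing input: the hand has drifted away from the recorded grip
// point, so feed the inverse of that drift to the climbing movement mode.
// Moving the character by this delta puts the hand back onto the grip point.
Vec3 ClimbingInput(const Vec3& gripWorld, const Vec3& handWorld) {
    Vec3 drift{handWorld.x - gripWorld.x,    // how far the hand moved off the grip
               handWorld.y - gripWorld.y,
               handWorld.z - gripWorld.z};
    return Vec3{-drift.x, -drift.y, -drift.z}; // inverse: move the body the other way
}
```

So pulling the hand down 10 units below the grip point produces an upward movement of 10 units, which is exactly the "pulling yourself up" feel of climbing.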

On release you just end the climbing movement, which sets falling. The character also has a separate ClimbingStepUpHeight variable that defines how far down you have to reach to be considered a valid step-up during climbing; values of 0 or negative should stop stepping up entirely if that is your wish.

Such requests are better left to PM, send me a message there and if you wouldn’t mind deleting that post too.

Thanks

Pushed a new commit to the repository and updated the template for the new changes



Made all grippables replicate their grip settings by default now, with
an optional boolean to turn it off.
Tested new scaling grips in a network environment (mostly functional).

Going over the changes as they are fairly visual