VR Expansion Plugin

That doesn’t have anything to do with the grasping hands implementation though, you are copying a pose that is skinned differently or something.
Curious why you are even copying pose from a mesh though, instead of passing in the animation reference.

Yeah, I knew it wouldn't do anything; it was a last-ditch effort as I'm getting kind of frustrated. I'm using Copy Pose From Mesh because the mesh includes two hands, positioned in a T-pose a distance apart. Using the AnimBP wouldn't work with this mesh. I'm probably just going to experiment and try to find a solution.

Oh, then the left hand would have to be animation mirrored in that case.

When a weapon is held with both hands in the Melee script, the On Secondary Grip event is not called. Is it intended that you distinguish this with the "Is Held" function?

The melee weapons use the bMultiGrip option. Each hand is a distinct full grip instead of one of them being the “secondary attachment” modifier.

Sorry about not getting back sooner - kinda got lost in development :slight_smile:

Did you get around to adding the utility function you mentioned above? From what I could figure out, the GripSettings struct is not usable for my purpose: I'm spawning BPs that each contain multiple Grippable components, and I need to set the GripPrio for those components. The problem would be solved by having the same approach for Grippable Mesh & Grippable Skeletal Mesh as for VRDial and the others.

No you didn’t get back so I forgot about it ;p

The dials don't have a "SetPriority" function either; they just have a UPROPERTY, so it has a setter. It's the same with the grip settings, but I can add a dedicated setter to all grippables just to ease use a bit. I'll toss it into an update in 4.24+ tonight.

Edit: Should be live in 4.24+

Hey, question for you whenever you get a chance.

I've managed to map and replicate your grasping hands onto my full IK mesh, using a custom grip component tied to the palm of the hand, and ripped off your collision capsule and overlap model to great effect. It works a treat and looks darn good doing it, but I can't for the life of me get both the IK and any sort of physics interactions on the hands.

My first inclination was to use a Physical Animation component and profiles on the mesh, setting all bones below hand_r and hand_l to simulate (everything is mapped to the Epic skeleton). That seems to have no effect at all. All of the examples I was able to find that used Physical Animation components were both 1) not using IK and 2) simulating only below a single bone (generally the pelvis) rather than two. This could be a general misunderstanding I have about the effect of these components, but I think the conclusion I came to is that since IK is based on evaluating toward a target position, it will always attempt to reach that position regardless of collision. I could be wrong here.

Since I came to that first conclusion, my second inclination was to create a separate mesh for the hands, then simulate those as floating hands and use their (post-physics) position data to inform the IK targeting. However, it never matched up perfectly: it seems there is an offset between the pivot of the fully simulated hand mesh and the IK targeting, and I can't for the life of me get those offsets to match up. I can get it to line up perfectly in one position and think I have the offset transform all figured out, but then moving still results in misalignment. I think this is probably related to the hand's center of mass being near the palm, whereas the IK target is always the wrist.

I’m a bit lost as to where to go from here, and was wondering if you might have any insight into what I’m trying to do, or if I’m just completely barking up the wrong tree. :slight_smile:

Thank you!


The Physical Animation component should work just fine with that setup (two bone chains); you can have multiple bone chains that you affect with it. However, the issue you would run into with your attempted approach is that the IK runs before the physics does. The way the Physical Animation component works is that it attaches constraints to each bone, and then to a dedicated kinematic actor on the physics thread for each bone. It samples the skeletal target location and rotates and positions the kinematic actor so that the constraint will attempt to position the bone in the animation pose.

#1. It only works if the bone is simulating.

#2. You have to have constraints between the bones or they will free-float (you also want the Physical Animation component to be using component space, not the world-space option).

#3. It will have no up-chain effect on non-simulating bones since it is post-IK. With the setup you describe, the hand would offset but the wrist would stay in place; you really need to fully simulate the entire arm up to the clavicle if you want that setup to work correctly.

For your second attempt, that is possible (though at some points you would need bone stretching), but you would have to change your skeletal mesh's tick group to PostPhysics in order to sync them correctly (the AnimBP is run from the skeletal mesh tick, so that would move your IK sampling to post-physics). You would also have to sample the hand pose IN the AnimBP there so it is also post-physics.

As a third option you could also run a proxy physics object constrained to the controllers as an invisible hand that you track with the IK.

However you might want to look into how the physical grasping hands handle the fingers instead of trying to fully simulate fingers, it works like the Alyx hands where you can still animate the collision but the collision body is one welded object, so there is no instability when colliding into things as forces oscillate up the chain.

Also if you do go fully physics, you may want to consider going the Custom grip type and constraining the object fully to your palm instead, latest template has an example of how to get the correct offsets for that. Though you can also constrain the hand to the object and leave everything as is and let the object pull the arm around as well.


Thank you so much! While attempting to follow a disconnected physical skeletal mesh, what I'm seeing is an offset from the hand to the resulting IK solve. I've attached some pictures showing this, with the effector axes visualized. There's clearly a roughly ~30-degree Z rotation offset, which can be solved simply with an offset transform, but you can see from the three different positions here that the axis of rotation is off. This is contrasted with the last two images, which show the hand directly following the motion controller, which is closer to the exact transform but still not quite there. When I wasn't trying to follow a physics object, being off wasn't a big deal; I could solve it with a simple transform. But with physics it's extremely noticeable.

The first set of images is following the world-space socket transform of the controller at hand_r. This is after setting the constraint reference frame to hand_r instead of the palm, as I saw in your blueprints. I've tried making a secondary socket on the hand mesh that would become the IK target, and I can get close while manually placing the socket, but never close enough it seems.

If you can think of a way to solve this, I'm all ears! Otherwise, I'm going to keep looking into other solutions. Based on your response regarding the first approach, it seems like you think it might be possible. If the IK runs before the physics does, isn't that the order I'd want it to go in? That way the IK effector could still be unaffected by physics while the mesh is, and I can keep the grip components tied to the skeleton.

The last picture is what I tried for physical animations, which resulted in...absolutely no difference whatsoever. Am I missing something? I've already applied a profile to the physics asset...it just seems to do nothing. EDIT: I got physical animations to work. Sort of. I had the wrong collision settings on the skeletal mesh. :) Now to play around with it more.

EDIT2: After toying around with some settings, I had to ramp up the orientation strength of the physical animation data to keep the hand oriented how it's supposed to be. However, this has introduced a lot of jitter in the physics. I'm guessing this may be related to tick ordering, though I'm not sure. I've even had a couple of outright crashes during this, with no debug data and no crash report. Not exactly sure what that is, but it might also be tick ordering! With a simulated mesh like this, do you think you'd recommend using physics constraints to pull the hands into position (like a marionette) rather than IK? That'd be a tad cumbersome because I'd have to redo the positioning logic for shoulder and elbow location to be physics-based, but if there's no way to realistically resolve the jitter, I'd much rather put in the time than leave it so bouncy.

If anyone else tries this, be sure to set your collision to either totally ignore traces or at least ignore the VR Trace channel. Otherwise you end up gripping yourself and flying into the air:

Thank you so much for your time. You’re a great help!


Hi all, I have just switched my VR project over from the VR template to the VR Expansion Plugin, and was wondering if anybody might know how to make one of the controllers do a 45-degree rotation like Half-Life: Alyx does, as since the switch my previous solution is unfortunately not working. I do realize that this might not be the right forum to ask in, but I figured it could be worth a shot anyway!! Cheers :slight_smile:

Hi. Is the Example Template released under the MIT license? There is no license file on its github repo, and I’d like to get a confirmation or denial before using things from it.
Either way, thanks for your work, saved me months of work.

See post

Thanks MaSe87!!

I shall now look further into that one then…

A problem that I have run into now with the VR Expansion Plugin is that if I place a physics water volume over the whole of the play area, it seems to stop my Knuckles controllers from working for some reason (i.e. the A button will no longer cycle through the control modes properly).

I am trying to create an underwater VR play area at the moment also using Galidar’s ‘Oceanology’ plugin and am eventually hoping that it will end up playing a little bit like ‘Lone Echo’ does I suppose…

Any help that anyone may be able to give would be very much appreciated, please! Thank you :slight_smile:

Yeah it is, I never placed a license file in it as it wasn’t intended to be a game base as is.
All assets are either free assets available for the engine, 10 min mockups from me, or from the community.
Should be good to use anything.

I think the default hands were set to not cycle movement modes when an object is overlapped that can be grabbed. Your water volume must be blocking the VRTraceChannel and registering as something that can be grabbed (climbed).

Hey there, back again with another question. I still can't comment too much on the added-back inherited "Mesh"… but I did notice that rotating them -90 seems to cause issues with other meshes that are also attached to the same PRC.

As for the question… here’s a small video example of the issue…
I'm not entirely sure this is solvable, but maybe it is… Tinkering with the Replication Type under the grip settings seems to do something, but not much…

[video]https://streamable.com/chcoh6[/video]

The first part shows what the physics body looks like when it's being dragged by the capsule with the Server as the player… The second half shows the same case, however as a client joining with 100-200 ms of simulated ping instead, and thus a weird "drag" problem arises…

It seems that as the character moves (the pelvis is kinematic to the PRC), the physics/simulated parts of the body get stuck behind and don't catch up until the "server" version of the Character/Actor has caught up as well… At least this is what it seems to be doing when both screens are placed side by side. These are both using VR Base Characters, and the character meshes are UGrippableSkeletalMeshComponents attached to the PRC. I haven't tried with the default inherited mesh yet, as I haven't switched that over to the Grippable one.

Seems like a strange thing to try and solve, but setting up non-VR characters this way allows the VR player to possess anyone and control them and their related functions, which is somewhat core to our required gameplay, especially in multiplayer. Let me know if any solutions come to mind; anything is always appreciated, as usual.

Are you component-replicating the mesh? Because you shouldn't be. Also make sure it's not colliding with another object.

Outside of that, there shouldn’t be any issues.

Also, rotating the inherited mesh shouldn't affect anything either. Edit: Well, there actually was an issue when net smoothing was enabled; I released a fix for that.

No replication is set on the mesh, and I tested with its own custom collision channel set to ignore everything; still the same results… The physics seem to be tied to the actor's server location rather than the actor's client location…

Thank you very much for your response !

Yes, you are correct. I tried to create a brand new physics volume on a brand new VR Expansion template level and it suffers from the same problem, unfortunately.

The player keeps getting stuck and the controls won’t cycle through their settings properly…

Hopefully there will be a fix for this at some point?

Best wishes,

J