VR Expansion Plugin
-
Originally posted by DarthJandis View Post
Sorry if I missed it, but has anyone successfully run this amazing plugin on 4.25.1 with Android/mobile VR (specifically Quest)? If so, would anyone want to help a fella? My understanding is that previous versions of UE would need to be uninstalled or would no longer work properly because of the updated NDK/SDK. I would like to know if this can run on the newest Oculus branch and what steps are needed to get it all together. Thank you to anyone with info.
-
Hi mordentral,
I watched the YouTube video (https://www.youtube.com/watch?v=qs_aeVAIcRA) related to the upper body IK.
Is the code related to the upper body IK going to be submitted to the template, or is it already in there?
Last edited by CokeKuma; 06-10-2020, 12:32 AM.
-
Originally posted by CokeKuma View Post
Hi mordentral,
I watched the YouTube video (https://www.youtube.com/watch?v=qs_aeVAIcRA) related to the upper body IK.
Is the code related to the upper body IK going to be submitted to the template, or is it already in there?
Right now the actual movements are far more 1:1 than other options; what I am not happy with are the edge cases, and some of the fixes for those lower the quality of the overall model.
-
Originally posted by mordentral View Post
Having separate hands is because of how the engine handles simulating inversely scaled meshes: the simulation is incorrect and flipped and doesn't account for it (neither collision nor position are correct). So you have to offset to account for that, and then when you attach to something you have to account for it again; it gets really messy to deal with, so a community member made a left-hand mirror mesh for me to use instead.
If you are using non-physical (not simulating) hands then you can use the inversed mesh fine, as it won't have those problems; you just need to spawn another right hand instead of the left-hand mesh.
Like this:
https://i.imgur.com/u4kjGQr.png
It will work fine as long as your attachment transform nodes look like this in the copy you have:
https://i.imgur.com/Lqnvfx4.png
I revised the attachment transform logic to be a lot cleaner later on: it just converts to world space and bases things off of there to avoid FTransform issues with inversing off-scale transforms (I had to use matrices to get around it before that). I don't know how old the copy you are working off of is. The current one also moved the offsets into a function, as it added in secondary gripping offsets.
*Edit* I'll note that the spawn picture is showing an additional transform, as I changed them to center and rotate to match real-life hands; you may want to try out the latest 4.25 copy of them.
I updated the grasping hand blueprint but the mesh-deforming issue is still present. The image I showed was just something similar. The only node I have in my AnimBP was a "Copy Pose From Mesh" node hooked up to the grasping hand mesh. The actual mesh deforms way more than the image shows.
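As a rough illustration of the world-space idea described in the quote above (not the template's actual logic), the pattern is to capture and re-apply the grip offset using fully resolved world transforms instead of chaining relative transforms through the (possibly mirrored) hand hierarchy; "HandMesh", "HeldObject" and "GripSocketName" are placeholder names:

    // Sketch only: capture a grip offset in world space at grip time, then
    // re-apply it against the socket's current world transform each update.
    #include "Components/PrimitiveComponent.h"
    #include "Components/SkeletalMeshComponent.h"

    FTransform CaptureGripOffsetWorldSpace(const USkeletalMeshComponent* HandMesh,
                                           const UPrimitiveComponent* HeldObject,
                                           FName GripSocketName)
    {
        // Both transforms are fully resolved in world space here, including any
        // mirroring applied to the hand mesh.
        const FTransform SocketWorld = HandMesh->GetSocketTransform(GripSocketName, RTS_World);
        return HeldObject->GetComponentTransform().GetRelativeTransform(SocketWorld);
    }

    void ApplyGripOffsetWorldSpace(USkeletalMeshComponent* HandMesh,
                                   UPrimitiveComponent* HeldObject,
                                   FName GripSocketName,
                                   const FTransform& GripOffset)
    {
        // Compose the stored offset back onto the socket's current world transform.
        const FTransform SocketWorld = HandMesh->GetSocketTransform(GripSocketName, RTS_World);
        HeldObject->SetWorldTransform(GripOffset * SocketWorld);
    }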
-
Originally posted by NebulyDev View Post
I updated the grasping hand blueprint but the mesh-deforming issue is still present. The image I showed was just something similar. The only node I have in my AnimBP was a "Copy Pose From Mesh" node hooked up to the grasping hand mesh. The actual mesh deforms way more than the image shows.
That doesn't have anything to do with the grasping hands implementation though; you are copying a pose that is skinned differently or something.
Curious why you are even copying a pose from a mesh, though, instead of passing in the animation reference.
Last edited by mordentral; 06-10-2020, 09:21 AM.
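As a general-purpose aside (not necessarily what is meant by "passing in the animation reference" here), a hand mesh can also be driven straight from an animation asset in single-node mode, skipping Copy Pose From Mesh entirely; "HandMesh" and "HandPoseAnim" are placeholder names:

    // Sketch only: play an animation asset directly on the hand mesh instead of
    // copying a pose from another mesh.
    #include "Components/SkeletalMeshComponent.h"
    #include "Animation/AnimationAsset.h"

    void DriveHandFromAnimationAsset(USkeletalMeshComponent* HandMesh, UAnimationAsset* HandPoseAnim)
    {
        if (!HandMesh || !HandPoseAnim)
        {
            return;
        }

        // Single-node mode evaluates one animation asset without an AnimBP at all.
        HandMesh->SetAnimationMode(EAnimationMode::AnimationSingleNode);
        HandMesh->PlayAnimation(HandPoseAnim, /*bLooping=*/true);
    }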
-
Originally posted by mordentral View Post
That doesn't have anything to do with the grasping hands implementation though; you are copying a pose that is skinned differently or something.
Curious why you are even copying a pose from a mesh, though, instead of passing in the animation reference.
Yeah, I knew it wouldn't do anything. It was a last-ditch effort, as I'm getting kind of frustrated. I'm using Copy Pose From Mesh because the mesh includes two hands, positioned in a T-pose with a distance apart. Using the AnimBP wouldn't work with this mesh. I'm probably just going to experiment and try to find a solution.
-
Originally posted by NebulyDev View Post
Yeah, I knew it wouldn't do anything. It was a last-ditch effort, as I'm getting kind of frustrated. I'm using Copy Pose From Mesh because the mesh includes two hands, positioned in a T-pose with a distance apart. Using the AnimBP wouldn't work with this mesh. I'm probably just going to experiment and try to find a solution.
-
Originally posted by CokeKuma View Post
When held with both hands in the melee script, the On Secondary Grip event is not called. Is it normal to distinguish this with the "Is Held" function?
-
Originally posted by Warner V View Post
Oh, and another thing I found: for VRDial, VRLever & VRMount, "Set Grip Priority" is exposed in Blueprint, but for Grippable Mesh and Grippable Skeletal Mesh it is not. Any chance these can be exposed as well with the update to 4.25?
Originally posted by mordentral View Post
That isn't a function; the interactables just aren't based on a grippable base and only implement specifically what they need, so they have a variable on their base with the grip priority. The GripSettings structure contains the normal grippable GripPriority, and you can set that directly.
I can add a utility function though.
Did you get around to adding the utility function you mentioned above? From what I could figure out, the GripSettings struct is not usable for my purpose: I'm spawning BPs that each contain multiple Grippable components and need to set the grip priority for those components. The problem would be solved by having the same approach for Grippable Mesh & Grippable Skeletal Mesh as for VRDial and the others.
-
Originally posted by Warner V View Post
Sorry about not getting back sooner - kinda got lost in development.
Did you get around to adding the utility function you mentioned above? From what I could figure out, the GripSettings struct is not usable for my purpose: I'm spawning BPs that each contain multiple Grippable components and need to set the grip priority for those components. The problem would be solved by having the same approach for Grippable Mesh & Grippable Skeletal Mesh as for VRDial and the others.
The dials don't have a "SetPriority" function either; they just have a UPROPERTY, so it gets a setter. It's the same with the grip settings, but I can add a dedicated setter for all grippables just to ease use a bit. I'll toss it into an update in 4.24+ tonight.
*Edit* Should be live in 4.24+.
Last edited by mordentral; 06-11-2020, 08:06 PM.
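For anyone on an older copy, the gist of such a setter is just writing the priority into the grip settings UPROPERTY. This is only an illustrative sketch, not the plugin's actual function; the member path (VRGripInterfaceSettings.AdvancedGripSettings.GripPriority), the include path, and the uint8 type are assumptions about the plugin's layout at the time, so verify them against your copy:

    // Hypothetical utility setter for a grippable's grip priority.
    // Include path and member names are assumed; check your plugin version.
    #include "Grippables/GrippableStaticMeshComponent.h"

    void SetGrippableGripPriority(UGrippableStaticMeshComponent* Grippable, uint8 NewPriority)
    {
        if (!Grippable)
        {
            return;
        }

        // GripPriority is a plain UPROPERTY inside the grip settings struct, so
        // writing it directly is all a dedicated setter would do.
        Grippable->VRGripInterfaceSettings.AdvancedGripSettings.GripPriority = NewPriority;
    }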
-
Hey Mordentral, question for you whenever you get a chance.
I've managed to map and replicate your grasping hands onto my full IK mesh, using a custom grip component tied to the palm of the hand, and ripped off your collision capsule and overlap model to great effect. It works a treat, and looks darn good doing it - but I can't for the life of me get both the IK and any sort of physics interactions on the hands.
My first inclination was to use physical animation components and profiles on the mesh, setting all bones below hand_r and hand_l to simulate (everything is mapped to the Epic skeleton). This seems to have no effect at all - all of the examples I was able to find that used physical animation components were both 1) not using IK and 2) simulating only below a single bone (generally pelvis) rather than two. This could be a general misunderstanding I have about the effect of these components, but the conclusion I came to is that since IK is based on evaluating toward a target position, it will always attempt to reach that position regardless of collision. I could be wrong here.
Since I came to that first conclusion, my second inclination was to create a separate mesh for the hands, then simulate those as floating hands and use their (post-physics) position data to inform the IK targeting. However, this never matched up perfectly - it seems there is an offset between the pivot of the fully simulated hand mesh and the IK target, and I can't for the life of me get those offsets to match up - meaning I can get it to line up perfectly in one position and think I have the offset transform all figured out, but then moving still results in misalignment. I think this is probably related to the hand's center of mass being near the palm, whereas the IK target is always the wrist.
I'm a bit lost as to where to go from here, and was wondering if you might have any insight into what I'm trying to do, or if I'm just completely barking up the wrong tree.
Thank you!
-
Originally posted by Benjamin Paine View Post
Hey Mordentral, question for you whenever you get a chance.
I've managed to map and replicate your grasping hands onto my full IK mesh, using a custom grip component tied to the palm of the hand, and ripped off your collision capsule and overlap model to great effect. It works a treat, and looks darn good doing it - but I can't for the life of me get both the IK and any sort of physics interactions on the hands.
My first inclination was to use physical animation components and profiles on the mesh, setting all bones below hand_r and hand_l to simulate (everything is mapped to the Epic skeleton). This seems to have no effect at all - all of the examples I was able to find that used physical animation components were both 1) not using IK and 2) simulating only below a single bone (generally pelvis) rather than two. This could be a general misunderstanding I have about the effect of these components, but the conclusion I came to is that since IK is based on evaluating toward a target position, it will always attempt to reach that position regardless of collision. I could be wrong here.
Since I came to that first conclusion, my second inclination was to create a separate mesh for the hands, then simulate those as floating hands and use their (post-physics) position data to inform the IK targeting. However, this never matched up perfectly - it seems there is an offset between the pivot of the fully simulated hand mesh and the IK target, and I can't for the life of me get those offsets to match up - meaning I can get it to line up perfectly in one position and think I have the offset transform all figured out, but then moving still results in misalignment. I think this is probably related to the hand's center of mass being near the palm, whereas the IK target is always the wrist.
I'm a bit lost as to where to go from here, and was wondering if you might have any insight into what I'm trying to do, or if I'm just completely barking up the wrong tree.
Thank you!
The physical animation component should work just fine with that setup (two bone chains); you can have multiple bone chains that you affect with it. However, the issue you would run into with your attempted approach is that the IK runs before the physics does. The way the physical animation component works is that it attaches constraints to each bone and then to a dedicated kinematic actor on the physics thread for each bone; it samples the skeletal target location and rotates and positions the kinematic actor so that the constraint will attempt to position the bone in the animation pose.
#1. This only works if the bone is simulating.
#2. You have to have constraints between the bones or it will free float (you also want the physical animation component to be using component space, not the world-space option).
#3. This will have no up-chain effect on non-simulating bones since it is post-IK; with the setup you describe, the hand would offset but the wrist would stay in place. You really need to fully simulate the entire arm up to the clavicle if you want that setup to work correctly.
For your second attempt, that is possible (though there are points where you would need bone stretching), but you would have to change your skeletal mesh's tick group to PostPhysics in order to sync them correctly (the AnimBP is run from the skeletal mesh tick, so that would move your IK sampling to post-physics). You would also have to sample the hand pose IN the AnimBP there so it is also post-physics.
As a third option, you could also run a proxy physics object constrained to the controllers as an invisible hand that you track with the IK.
However, you might want to look into how the physical grasping hands handle the fingers instead of trying to fully simulate fingers; it works like the Alyx hands, where you can still animate the collision but the collision body is one welded object, so there is no instability when colliding into things as forces oscillate up the chain.
Also, if you do go fully physics, you may want to consider going with the Custom grip type and constraining the object fully to your palm instead; the latest template has an example of how to get the correct offsets for that. Though you can also constrain the hand to the object, leave everything as is, and let the object pull the arm around as well.
Last edited by mordentral; 06-12-2020, 08:27 AM.
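For anyone attempting the same thing, here is a minimal C++ sketch of the engine-side pieces listed above: simulate the bodies below each hand, drive them with a physical animation profile, and move the mesh's tick to PostPhysics so the IK samples after the simulation. The bone names and the "HandsProfile" name are placeholders for whatever your physics asset uses; the calls themselves are stock UE4 API, not the template's own functions:

    // Sketch of the described setup: simulate below the hand bones, apply a
    // physical animation profile, and tick post-physics so IK sampling reads the
    // simulated result rather than last frame's pose.
    #include "Components/SkeletalMeshComponent.h"
    #include "PhysicsEngine/PhysicalAnimationComponent.h"

    void SetupPhysicalHands(USkeletalMeshComponent* BodyMesh, UPhysicalAnimationComponent* PhysAnim)
    {
        if (!BodyMesh || !PhysAnim)
        {
            return;
        }

        PhysAnim->SetSkeletalMeshComponent(BodyMesh);

        // #1: bones only respond to physical animation if they are actually simulating.
        BodyMesh->SetAllBodiesBelowSimulatePhysics(TEXT("hand_l"), true, /*bIncludeSelf=*/true);
        BodyMesh->SetAllBodiesBelowSimulatePhysics(TEXT("hand_r"), true, /*bIncludeSelf=*/true);

        // Apply a profile authored in the physics asset ("HandsProfile" is a placeholder).
        // Author that profile with the component-space (local) option, per the advice above.
        PhysAnim->ApplyPhysicalAnimationProfileBelow(TEXT("hand_l"), TEXT("HandsProfile"), /*bIncludeSelf=*/true);
        PhysAnim->ApplyPhysicalAnimationProfileBelow(TEXT("hand_r"), TEXT("HandsProfile"), /*bIncludeSelf=*/true);

        // Move the mesh (and therefore its AnimBP / IK sampling) to post-physics.
        BodyMesh->SetTickGroup(TG_PostPhysics);
    }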
-
Originally posted by mordentral View Post
The physical animation component should work just fine with that setup (two bone chains); you can have multiple bone chains that you affect with it. However, the issue you would run into with your attempted approach is that the IK runs before the physics does. The way the physical animation component works is that it attaches constraints to each bone and then to a dedicated kinematic actor on the physics thread for each bone; it samples the skeletal target location and rotates and positions the kinematic actor so that the constraint will attempt to position the bone in the animation pose.
#1. This only works if the bone is simulating.
#2. You have to have constraints between the bones or it will free float (you also want the physical animation component to be using component space, not the world-space option).
#3. This will have no up-chain effect on non-simulating bones since it is post-IK; with the setup you describe, the hand would offset but the wrist would stay in place. You really need to fully simulate the entire arm up to the clavicle if you want that setup to work correctly.
For your second attempt, that is possible (though there are points where you would need bone stretching), but you would have to change your skeletal mesh's tick group to PostPhysics in order to sync them correctly (the AnimBP is run from the skeletal mesh tick, so that would move your IK sampling to post-physics). You would also have to sample the hand pose IN the AnimBP there so it is also post-physics.
As a third option, you could also run a proxy physics object constrained to the controllers as an invisible hand that you track with the IK.
However, you might want to look into how the physical grasping hands handle the fingers instead of trying to fully simulate fingers; it works like the Alyx hands, where you can still animate the collision but the collision body is one welded object, so there is no instability when colliding into things as forces oscillate up the chain.
Also, if you do go fully physics, you may want to consider going with the Custom grip type and constraining the object fully to your palm instead; the latest template has an example of how to get the correct offsets for that. Though you can also constrain the hand to the object, leave everything as is, and let the object pull the arm around as well.
The first set of images is following the world-space socket transform of the controller at hand_r. This is after setting the constraint reference frame to hand_r instead of the palm, as I saw in your blueprints. I've tried making a secondary socket on the hand mesh that would become the IK target, and I can get close, but never close enough it seems, while manually placing the socket.
If you can think of a way to solve this, I'm all ears! Otherwise, I'm going to keep looking into other solutions. Based on your response regarding the first approach, it seems like you think it might be possible. If the IK runs before the physics does, isn't that the order I'd want it to go in? That way the IK effector could still be unaffected by physics while the mesh is, and I can keep the grip components tied to the skeleton.
The last picture is what I tried for physical animations, which resulted in...absolutely no difference whatsoever. Am I missing something? I've already applied a profile to the physics asset...it just seems to do nothing.
EDIT: I got physical animations to work. Sort of. I had the wrong collision settings on the skeletal mesh. Now to play around with it more.
EDIT2: After toying around with some settings, I had to ramp up the orientation strength of the physical animation data to keep the hand oriented how it's supposed to be. However, this has introduced a lot of jitter in the physics. I'm guessing this may be related to tick ordering, though I'm not sure. I've even had a couple of outright crashes during this - no debug data, no crash report. Not exactly sure what that is, but it might also be tick ordering! With a simulated mesh like this, do you think you'd recommend using physics constraints to pull the hands into position (like a marionette), rather than IK? That'd be a tad cumbersome, because I'd have to redo the positioning logic for shoulder and elbow location to be physics-based, but if there's no way to realistically resolve this jitter, I'd much rather put in the time than leave it so bouncy.
If anyone else tries this, be sure to set your collision to either totally ignore traces or at least ignore the VR Trace channel. Otherwise you end up gripping yourself and flying into the air.
Thank you so much for your time. You're a great help!
Last edited by Benjamin Paine; 06-12-2020, 10:43 PM.
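On that last collision tip, the C++ equivalent is a one-liner. Which engine channel the plugin's VR trace maps to is project-specific, so ECC_GameTraceChannel1 below is only an assumed stand-in; match it to your own project's collision settings:

    // Sketch of the collision tip above: make the simulated hand/body mesh ignore
    // the grip trace channel so you cannot grip (and launch) yourself.
    // ECC_GameTraceChannel1 standing in for the VR trace channel is an assumption.
    #include "Components/SkeletalMeshComponent.h"

    void IgnoreGripTraces(USkeletalMeshComponent* SimulatedBodyMesh)
    {
        if (SimulatedBodyMesh)
        {
            SimulatedBodyMesh->SetCollisionResponseToChannel(ECC_GameTraceChannel1, ECR_Ignore);
        }
    }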