Hi.
I created hand sockets on my character's skeletal mesh, and using the socket preview mesh I adjusted them so the preview mesh sits as if it were being held. All good so far.
I'm then moving the hands with IK to the transform of a similar mesh in the scene, but that sets the hand bones directly onto that transform. I want the hands to use the hand sockets' transforms as a kind of offset so everything aligns perfectly, just like it did with the preview mesh feature.
I'm just having a hard time doing the math to pull this off.
Any ideas for a good way to solve this?
Normally you attach the right hand (bone) to the object mesh socket. Then use IK on the left hand to get it to appear attached to another object mesh socket.
For right-hand alignment you create an Actor class for the mesh, then adjust the mesh's location in the viewport. Once that looks right you can create a Left Hand Offset variable (vector); you'll need this for the left hand IK.
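If it helps, here's roughly what that looks like in C++ (a minimal sketch of the attach-plus-IK idea; the socket name and offset handling are assumptions, not from your setup):

```cpp
#include "Components/MeshComponent.h"

// Assumes the held object has a mesh with a "LeftHandGrip" socket
// (the socket name and offset variable are placeholders).
FVector ComputeLeftHandIKTarget(const UMeshComponent* ObjectMesh,
                                const FVector& LeftHandOffset)
{
    // World transform of the grip socket on the held object.
    const FTransform GripSocketWorld =
        ObjectMesh->GetSocketTransform(TEXT("LeftHandGrip"), RTS_World);

    // Apply the hand-tuned offset in the socket's local space and return
    // a world-space location to feed the left hand IK effector.
    return GripSocketWorld.TransformPosition(LeftHandOffset);
}
```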
Hey, thanks for the reply. I fixed this last night. My requirements are a bit different since it's a VR thing, but I looked through some old code where I did something similar and worked it into the new setup.
I used a combination of the sockets' relative transforms, the world transform of the target, and an inverse transform.
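For anyone else hitting this, here's a rough C++ sketch of that idea (a minimal sketch, not the poster's actual code; the function and socket names are placeholders). The hand socket is parented to the hand bone, so solving SocketWorld = SocketRelative * HandBoneWorld for the hand bone gives the IK effector transform:

```cpp
#include "Components/SkeletalMeshComponent.h"

FTransform ComputeHandIKEffector(const USkeletalMeshComponent* CharacterMesh,
                                 FName HandSocketName,           // e.g. "hand_r_grip"
                                 const FTransform& TargetWorld)  // grab point on the object
{
    // Hand socket transform relative to the hand bone it is parented to.
    const FTransform SocketRelativeToBone =
        CharacterMesh->GetSocketTransform(HandSocketName, RTS_ParentBoneSpace);

    // We want the socket to land exactly on the target:
    //   SocketWorld = SocketRelativeToBone * HandBoneWorld == TargetWorld
    // so solve for the hand bone's world transform and use it as the
    // IK effector (e.g. a world-space Two Bone IK effector).
    return SocketRelativeToBone.Inverse() * TargetWorld;
}
```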
I'm trying to grab objects in VR. Unreal's Grab Component works, but it isn't accurate enough for what my application needs.
So can we move or manipulate the right and left Oculus hand skeletal meshes to match the grabbed object? For example, if I grab a sphere, my hand should wrap around it the way we grab a sphere in the real world.
Is that possible? Can you please help me, or explain how you fixed it?