Plugin

Hi. First of all, thanks for the great work. :smiley:
I’m new to UE4 and I’m starting to work on animating a character’s arms based on the Hydra controllers. I have a question regarding IK while rotating the hands. I’ve used @cleerusher123’s calibration and it is perfect, cheers man.
My questions are: "How do I rotate my forearm and arm while rotating my hands?" and "Is there a common position to set the arm pole vectors while using FABRIK?"

Cheers

I want to share with you my experience with hand rotation. I had a lot of problems before, and I think some other noob like me might benefit from my solution:
My first problem was the rotation. As we already know, Euler rotations aren’t great because of gimbal lock and because small angular changes can cause large jumps in the resulting rotation, so I had to keep tweaking my rotations: adding something here, subtracting there, inverting an axis, and so on and so forth.
Looking at your solution, it needed a different angular recalibration for each hand because the hand bones’ axes are rotated differently. Since quaternions don’t suffer from gimbal lock and are better suited to rotations, here is my solution to the problem:
Why not rotate the Hydra controller’s quaternion to match the hand’s quaternion? That way I can avoid the recalibration tweaks for the hands and just use the rotation of the hydra, and the hands rotate perfectly.

  1. First of all we need to calibrate: after putting ourselves in a T-pose we press the left Hydra button and then do the calibration.
    Sometimes the HydraID is inverted, so we check it and swap it if needed. We get the shoulder midpoint and, in Set Quaternion Offset, we compute the difference (as a quaternion) between the Hydra quaternion and the hand quaternion, then save the value locally (see the code sketch after this list).
  2. In the PlayerController we read the Hydra controller data every tick and save the position and rotation in local variables that are passed to the character’s animation blueprint.
    We add the position of the Hydra to the position of our Hydra shoulder midpoint and the shoulder midpoint of the character (which is a fixed position). GetProperHandRotation rotates the Hydra quaternion by the offset quaternion saved during calibration and outputs the "real" rotation of the hands relative to the T-pose.
  3. The code: I had to create temporary variables; it is probably a waste of resources, but without them it crashed all the time.
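
A minimal C++ sketch of the two quaternion steps above (the function names and signatures here are illustrative assumptions, not the plugin’s actual API; GetProperHandRotation simply mirrors the name used in step 2):

```cpp
// Sketch only: assumed names and signatures. FQuat comes from the engine's Core module.

// Step 1, during T-pose calibration: compute the offset quaternion that maps the
// Hydra's current orientation onto the hand bone's rest orientation, and store it.
FQuat ComputeQuatOffset(const FQuat& HandRestQuat, const FQuat& HydraQuatAtTPose)
{
	// Chosen so that QuatOffset * HydraQuatAtTPose == HandRestQuat.
	return HandRestQuat * HydraQuatAtTPose.Inverse();
}

// Step 2, every tick: rotate the live Hydra quaternion by the stored offset to get
// the hand rotation relative to the T-pose, ready to feed to the animation blueprint.
FQuat GetProperHandRotation(const FQuat& HydraQuat, const FQuat& QuatOffset)
{
	return (QuatOffset * HydraQuat).GetNormalized();
}
```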

PS: I had to modify the plugin to expose the quaternion in blueprint. The proper way would probably be to hook the plugin into the playerController.cpp, but I found it much easier and more readable to access it via blueprint.
Sorry for my bad English, I’m working on it.

[=iferoporefi;334674]
I want to share with you my experience with hand rotation. I had a lot of problems before, and I think some other noob like me might benefit from my solution:
My first problem was the rotation. As we already know, Euler rotations aren’t great because of gimbal lock and because small angular changes can cause large jumps in the resulting rotation, so I had to keep tweaking my rotations: adding something here, subtracting there, inverting an axis, and so on and so forth.
Looking at your solution, it needed a different angular recalibration for each hand because the hand bones’ axes are rotated differently. Since quaternions don’t suffer from gimbal lock and are better suited to rotations, here is my solution to the problem:
Why not rotate the Hydra controller’s quaternion to match the hand’s quaternion? That way I can avoid the recalibration tweaks for the hands and just use the rotation of the hydra, and the hands rotate perfectly.

  1. First of all we need to calibrate: after putting ourselves in a T-pose we press the left Hydra button and then do the calibration.
    Sometimes the HydraID is inverted, so we check it and swap it if needed. We get the shoulder midpoint and, in Set Quaternion Offset, we compute the difference (as a quaternion) between the Hydra quaternion and the hand quaternion, then save the value locally (code will be added below).
  2. In the PlayerController we read the Hydra controller data every tick and save the position and rotation in local variables that are passed to the character’s animation blueprint.
    We add the position of the Hydra to the position of our Hydra shoulder midpoint and the shoulder midpoint of the character (which is a fixed position). GetProperHandRotation rotates the Hydra quaternion by the offset quaternion saved during calibration and outputs the "real" rotation of the hands relative to the T-pose.
  3. The code: I had to create temporary variables; it is probably a waste of resources, but without them it crashed all the time.

PS: I had to modify the plugin to expose the quaternion in blueprint. The proper way would probably be to hook the plugin into the playerController.cpp, but I found it much easier and more readable to access it via blueprint.
Sorry for my bad English, I’m working on it.
[/]

Hey iferoporefi, cool share there, though some things could be a bit simpler :). In general the rotation of the hydra shouldn’t need calibration; instead you only calibrate the base location in relation to the user and then require the hydra base to be facing properly forward (cables coming out of the back of the base station). The other thing I would raise is that you can avoid quaternions in blueprints completely by using Combine Rotators, which takes rotators and uses quaternions internally.
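
For reference, combining two rotators like that boils down to quaternion multiplication under the hood; a rough C++ equivalent (the helper name here is an assumption, in blueprint you would just use the Combine Rotators node):

```cpp
// Roughly what combining two rotators amounts to: convert both to quaternions,
// multiply, convert back, so no gimbal-prone Euler arithmetic happens in between.
FRotator CombineRotatorsSketch(const FRotator& A, const FRotator& B)
{
	// Applies A first, then B (quaternion multiplication composes right-to-left in UE4).
	return FRotator(FQuat(B) * FQuat(A));
}
```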

[=iferoporefi;331999]
Hi. First of all, thanks for the great work. :smiley:
I’m new to UE4 and I’m starting to work on animating a character’s arms based on the Hydra controllers. I have a question regarding IK while rotating the hands. I’ve used @cleerusher123’s calibration and it is perfect, cheers man.
My questions are: "How do I rotate my forearm and arm while rotating my hands?" and "Is there a common position to set the arm pole vectors while using FABRIK?"

Cheers
[/]

Generally you IK/FABRIK to your wrists and then use FK for the wrist rotation. If you want to link the wrist rotation to the elbow, e.g. something like the chicken maneuver, you would need to move the effector location based on a vector pointing from your desired rotation 'bottom'.

[=Brunohbk;327099]
Hi, thanks for developing the plugin, it’s great and it’s really helpful.
Anyway, I have a question for you.
I’m trying to rig the arms of a 3D character with Hydra.
I have linked the wrist joints to the hydra controllers, but I would like to get the local rotation instead of the global rotation.
Is there a way to get it?

Any help appreciated.
[/]

In blueprint you use Combine Rotators with the rotations you want to combine. E.g. if you wanted to add the pawn’s facing rotation, you would combine the hydra rotation with the pawn/control rotation and the net result would be the two applied together.
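
A hedged C++ sketch of the same idea, including the reverse direction for getting a rotation local to the pawn (function names are assumptions; ComposeRotators is the C++ counterpart of the blueprint Combine Rotators node):

```cpp
#include "Kismet/KismetMathLibrary.h"

// Hydra rotations are reported relative to the base station; composing with the
// pawn's facing rotation yields a world-space rotation (hydra applied first).
FRotator HydraToWorld(const FRotator& HydraRot, const FRotator& PawnRot)
{
	return UKismetMathLibrary::ComposeRotators(HydraRot, PawnRot);
}

// Going the other way: strip the pawn's rotation back out to get a pawn-local rotation.
FRotator WorldToPawnLocal(const FRotator& WorldRot, const FRotator& PawnRot)
{
	return UKismetMathLibrary::ComposeRotators(WorldRot, PawnRot.GetInverse());
}
```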

Just want to confirm: the new motion controller support in the 4.9 preview doesn’t include the Hydra yet, does it (it didn’t seem to work for me, but I haven’t taken a look at the source yet to see if they were using the SDK)?

I’m very interested to know that too. I haven’t even figured out how to use the new motion controller interface in 4.9; is it accessible in BP or just in C++?
Thanks for all your work!
Edit²: I don’t know if some of you receive the STEM updates, but the STEM Core API and VR SDK (including UE4 integration) will be backward compatible with the Hydra.
The Core API is available now (but the UE4 integration is not entirely ready yet), and the VR SDK should be released in September.
So, good news!

I was wondering if the new motion controller stuff in 4.9 will affect the future of the plugin? Do you think it will have all the functionality the plugin gave us once 4.9 supports the Hydra? It’s the only motion/hand controller currently available to people like me who don’t have access to a Vive dev kit or an Oculus Touch dev kit. The plugin was a godsend.

On a completely different subject: Gitnamo, you have done really great things for the community so far and your work on all these plugins has been outstanding. I have just received my Perception Neuron motion suit with gloves and full-body mocap for VR. It is awesome. The only problem is that the guys who made it are not integrating it with UE4 anytime soon (they say they have a few other priorities). They have a Unity plugin, but I (and many, many others who backed it on Kickstarter) would really like UE4 integration. Is this something you would consider looking into? The SDK is available here. Don’t worry if not, as I imagine you are very busy, but just let it be known that you would be a huge hero to many of us Perception Neuron Kickstarter backers who are really eager for UE4 integration of the suit for VR.

[=muchcharles;343921]
Just want to confirm: the new motion controller support in the 4.9 preview doesn’t include the Hydra yet, does it (it didn’t seem to work for me, but I haven’t taken a look at the source yet to see if they were using the SDK)?
[/]

It does not yet; I am planning to make a 4.9 version which will support the motion controllers. You should hopefully see something sometime next week.

[=LNaej;353134]
I’m very interested to know that too. I haven’t even figured out how to use the new motion controller interface in 4.9; is it accessible in BP or just in C++?
Thanks for all your work!
Edit²: I don’t know if some of you receive the STEM updates, but the STEM Core API and VR SDK (including UE4 integration) will be backward compatible with the Hydra.
The Core API is available now (but the UE4 integration is not entirely ready yet), and the VR SDK should be released in September.
So, good news!
[/]

Also a STEM backer, but I haven’t had a chance to look into their API. I imagine they will want to support the motion controllers as well down the line.

[=Mrob76u;369601]
I was wondering if the new motion controller stuff in 4.9 will affect the future of the plugin? Do you think it will have all the functionality the plugin gave us once 4.9 supports the Hydra? It’s the only motion/hand controller currently available to people like me who don’t have access to a Vive dev kit or an Oculus Touch dev kit. The plugin was a godsend.

On a completely different subject: Gitnamo, you have done really great things for the community so far and your work on all these plugins has been outstanding. I have just received my Perception Neuron motion suit with gloves and full-body mocap for VR. It is awesome. The only problem is that the guys who made it are not integrating it with UE4 anytime soon (they say they have a few other priorities). They have a Unity plugin, but I (and many, many others who backed it on Kickstarter) would really like UE4 integration. Is this something you would consider looking into? The SDK is available here. Don’t worry if not, as I imagine you are very busy, but just let it be known that you would be a huge hero to many of us Perception Neuron Kickstarter backers who are really eager for UE4 integration of the suit for VR.
[/]

Appreciate the kind words Mrob76u, I’m looking forward to your VR dungeon crawler :D.

Regarding 4.9, see above: I do plan to have it directly support the Motion Controller abstraction, and I had one version working when I was at the London Vive jam. It allowed me to test Vive interaction without needing the limited units that were available for the jam, with no code change between the use cases. All you really need to do is set the motion controller position whenever the hydra-moved event fires (and have a calibration step).

For the hydra, 2x motion controller points + input mapping give you a pretty much complete use case, so that will probably be the recommended route going forward.
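
For anyone wanting the C++ side of that, a minimal sketch of the setup (class and file names are assumptions; written against the 4.9-era component API with the Hand property, and it assumes the HeadMountedDisplay module is in your build dependencies):

```cpp
// HydraPawn.h -- sketch only: two MotionControllerComponents, one per hand, parented
// to a common scene component whose location can later be shifted for calibration.
#pragma once

#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"
#include "HydraPawn.generated.h"

UCLASS()
class AHydraPawn : public APawn
{
	GENERATED_BODY()

public:
	AHydraPawn()
	{
		RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

		// Common parent; acts as the Hydra base station origin for both controllers.
		ControllerOrigin = CreateDefaultSubobject<USceneComponent>(TEXT("ControllerOrigin"));
		ControllerOrigin->AttachTo(RootComponent);

		LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
		LeftController->AttachTo(ControllerOrigin);
		LeftController->Hand = EControllerHand::Left;   // 4.9-era property; later engines use MotionSource

		RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
		RightController->AttachTo(ControllerOrigin);
		RightController->Hand = EControllerHand::Right;
	}

	UPROPERTY(VisibleAnywhere) USceneComponent* ControllerOrigin;
	UPROPERTY(VisibleAnywhere) UMotionControllerComponent* LeftController;
	UPROPERTY(VisibleAnywhere) UMotionControllerComponent* RightController;
};
```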

Regarding the Neuron, it was one of the Kickstarters I didn’t back, but if the community gets me a suit for integration I’ll make a plugin for it, np.

That’s great news. I look forward to the new 4.9 plugin. Thanks

The plugin is being heavily featured on the Twitch stream today =).

One point if you watched it: he talks about how the origin of the tracking is at the base station and how that may mean you need to move it closer, but if you read through the thread here, people have given blueprints for calibrating the coordinates by moving the controllers to certain positions and pressing a calibrate button.

Yep that’s cool. No pressure on a 4.9 update now lol :wink:

[=muchcharles;379647]
The plugin is being heavily featured on the Twitch stream today =).

One point if you watched it: he talks about how the origin of the tracking is at the base station and how that may mean you need to move it closer, but if you read through the thread here, people have given blueprints for calibrating the coordinates by moving the controllers to certain positions and pressing a calibrate button.
[/]

Haven’t seen the stream, but it could be Epic’s internal hydra plugin and not this one!

[=Mrob76u;379655]
Yep that’s cool. No pressure on a 4.9 update now lol :wink:
[/]

Ask and ye shall receive… well, an experimental commit at least :slight_smile:

Grab the experimental branch zip here.

You should be able to just follow Epic’s Motion Controller Setup documentation page and it will work as expected.

Since this is an experimental commit, it breaks all previous compatibility; however, you no longer need to add anything to the scene/blueprints in order to use input mapping, and you get motion support the same way as shown in the Twitch stream and Epic’s documentation. Please test it and give me feedback on bugs and on which features you really miss from the old style.

Once all the bugs are ironed out and we figure out how to do calibration easily (I’m thinking a function library callable from anywhere), I’ll add back the missing Hydra-specific components with multicast delegation and you’ll have the best of both worlds in a stable release :smiley:

Update to E0.8.0 (experimental branch)
- Adds support for the UE 4.9 hardware-agnostic Motion Controller interface, both for input mapping and left/right hand 1:1 motion.
- This commit removes past compatibility; wait for the stable release if you wish to have both.

[=;380733]
Haven’t seen the stream, but it could be Epic’s internal hydra plugin and not this one!

[/]

It was this one =) Awesome work, downloading the 4.9 experimental now after I mess around with the Infiltrator release.

You are the man. I will test it out and let you know.

Great work. So easy to set up, and it seems to track so much better than before. Thanks very much.

Maybe I’m being simple, but I cannot get the motion controllers to behave as I’d hope. I have to move the wands meters in front of the base unit to have them appear where they ought to be.

The pawn I’m using for my player character is mostly default - the only changes are a slightly narrower capsule, and some extra interaction functionality - but otherwise everything is standard.

I’ve tried moving the MC component to the inverse of the hand position against 0,0,0, but it doesn’t actually seem to do anything.

[=muchcharles;381338]
It was this one =) Awesome work, downloading the 4.9 experimental now after I mess around with the Infiltrator release.
[/]

Just watched the stream and you’re right! It was interesting to see someone else’s approach to the plugin from a blank slate. I think I will need to fix that docking problem by remembering the left/right offset directly and assuming 0=left, 1=right unless you dock them differently.

[=Crow87;382416]
Maybe I’m being simple, but I cannot get the motion controllers to behave as I’d hope. I have to move the wands meters in front of the base unit to have them appear where they ought to be.

The pawn I’m using for my player character is mostly default - the only changes are a slightly narrower capsule, and some extra interaction functionality - but otherwise everything is standard.

I’ve tried moving the MC component to the inverse of the hand position against 0,0,0, but it doesn’t actually seem to do anything.
[/]

The experimental branch doesn’t have a calibration function directly available yet; I’m thinking about what the best approach would be (probably a global function you can call). If you don’t want to wait for a stable release, which will include such a function, you can calculate the correct offset yourself and change the position of the parent of both motion controllers.

The hydras report their position from the center of the base station (the green orb). Whatever the parent of your motion controllers is (e.g. the pawn’s root component) will then act as that center. So to move the positions forward you need to calibrate the hydras to a known point. On page 3 of this thread we talked about one calibration method, which is to make a T-pose with your arms and hit calibrate (capturing the positions at that pose); this gives you the offset from the base to the shoulder midpoint (and the arm length from half the vector), and you can then add that offset so that the controllers appear almost 1:1. You can set the offset as your controllers’ parent position: e.g. make an invisible component the parent of the two motion controllers and just set that component’s position to the offset, and it will work as you expect.
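
Something like this minimal sketch captures the T-pose method (all names here are assumptions, not a function the plugin ships):

```cpp
// Sketch only: capture both controller positions at T-pose (reported relative to the
// base station), take their midpoint as the base-to-shoulder offset, and shift the
// controllers' common parent so the hands line up with the character's shoulder
// midpoint instead of the base station.
#include "Components/SceneComponent.h"

void CalibrateTPose(const FVector& LeftHydraPos,
                    const FVector& RightHydraPos,
                    const FVector& CharacterShoulderMidpoint, // fixed, in the parent's space
                    USceneComponent* ControllerOrigin)        // parent of both motion controllers
{
	// Vector from the base station to the user's shoulder midpoint at T-pose.
	const FVector HydraShoulderMidpoint = (LeftHydraPos + RightHydraPos) * 0.5f;

	// Half the hand-to-hand span gives an arm length estimate, handy for scaling later.
	const float ArmLength = FVector::Dist(LeftHydraPos, RightHydraPos) * 0.5f;
	(void)ArmLength;

	// Shift the parent so controller positions come out roughly 1:1 around the
	// character's shoulder midpoint.
	ControllerOrigin->SetRelativeLocation(CharacterShoulderMidpoint - HydraShoulderMidpoint);
}
```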

There are other calibration methods which may work better for VR, e.g. if you place your controller by your face (you know your HMD position) and then extend your arm and take a second calibration point, you get the same offsets but with higher reliability, giving a closer 1:1 match.

[=;383070]
Just watched the stream and you’re right! It was interesting to see someone else’s approach to the plugin from a blank slate. I think I will need to fix that docking problem by remembering the left/right offset directly and assuming 0=left, 1=right unless you dock them differently.

The experimental branch doesn’t have a calibration function directly available yet; I’m thinking about what the best approach would be (probably a global function you can call). If you don’t want to wait for a stable release, which will include such a function, you can calculate the correct offset yourself and change the position of the parent of both motion controllers.

The hydras report their position from the center of the base station (the green orb). Whatever the parent of your motion controllers is (e.g. the pawn’s root component) will then act as that center. So to move the positions forward you need to calibrate the hydras to a known point. On page 3 of this thread we talked about one calibration method, which is to make a T-pose with your arms and hit calibrate (capturing the positions at that pose); this gives you the offset from the base to the shoulder midpoint (and the arm length from half the vector), and you can then add that offset so that the controllers appear almost 1:1. You can set the offset as your controllers’ parent position: e.g. make an invisible component the parent of the two motion controllers and just set that component’s position to the offset, and it will work as you expect.

There are other calibration methods which may work better for VR, e.g. if you place your controller by your face (you know your HMD position) and then extend your arm and take a second calibration point, you get the same offsets but with higher reliability, giving a closer 1:1 match.
[/]

Ah, now I feel silly. Was trying to move the actual roots for the individual motion controllers - didn’t realise they could be parented and moved that way. Oops.

I’m assuming the experimental branch doesn’t package properly yet, as I’m hitting errors every time I build with the plugin loaded, but it builds successfully without it.

[=Constructive Illusions;388767]
It kinda works in 4.9… but only without the Rift (DK2). When I start the game in "VR Preview" mode, left and right head movements move the hands (MotionController components) in the opposite direction. When I look left, the hands go right, and the other way round.
[/]

You may have the pawn rotation attached to your HMD rotation; see the community wiki article on separating HMD and pawn rotation for details. The easiest option is the blueprint one: adjusting the camera manager in a custom player controller.
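
If anyone wants the C++ version of that blueprint route, a rough sketch (class and file names are assumptions; bFollowHmdOrientation is the 4.9-era APlayerCameraManager flag, handled differently in later engine versions):

```cpp
// HydraPlayerController.h -- sketch only: a camera manager that follows the HMD on its
// own, assigned from a custom player controller, so HMD look no longer rotates the pawn
// (and the motion controllers parented to it).
#pragma once

#include "Camera/PlayerCameraManager.h"
#include "GameFramework/PlayerController.h"
#include "HydraPlayerController.generated.h"

UCLASS()
class AHmdCameraManager : public APlayerCameraManager
{
	GENERATED_BODY()
public:
	AHmdCameraManager()
	{
		bFollowHmdOrientation = true; // camera tracks the HMD independently of pawn rotation
	}
};

UCLASS()
class AHydraPlayerController : public APlayerController
{
	GENERATED_BODY()
public:
	AHydraPlayerController()
	{
		PlayerCameraManagerClass = AHmdCameraManager::StaticClass();
	}
};
```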

[=Constructive Illusions;389549]
Thanks, that fixed it :slight_smile:

But there’s another problem :stuck_out_tongue: "Hydra Left Joystick Y (and X)" doesn’t work, while "Hydra Right Joystick Y (and X)" listens to both controllers :confused:
[/]

Just checked the source and it shouldn’t be a problem. Did you dock both hydras before trying the joysticks? You can also use the Motion Controller input mapping events to keep the binding hardware-agnostic.

First up, I want to thank you VERY much for the work you have put into this, and importantly the recent update to accommodate 4.9. I received my hydra today and am simply loving it!

I did want to let you know that I am experiencing consistent crashes when using the Rift DK2 in VR preview mode. During play everything works beautifully; however, when I go to exit the VR preview (Esc) the editor crashes, unfortunately without displaying crash info.

I’m not doing anything very interesting with the hydras, merely tracking their positions to display hand interactables.

I realise this is not tremendously helpful to debug, and it may indeed have more to do with the Rift. If there is anything you would like to have tested or debugged, please let me know how I can help.

Cheers