
Hey! First, thanks for your work!
However, is there any way to get the “Event Hydra Controller Moved” values for each controller? It works with the debug sphere; however, I’m trying to move different components differently and I can’t find any way to get it working. It would be really nice to have events like “Event Right Hydra Controller Moved” and “Event Left Hydra Controller Moved”.
I think there is a way by getting axis values and such, but the event is much simpler and more compact.

[=MattOstgard;133418]
Oh man that is awesome. Any idea when you’ll be ready to release an update?
[/]

This will be a separate plugin.

I’m currently still working on IMU-based controller integration for the plugin, which is a key input group in addition to direct position controllers such as the Hydra/STEM/Kinect. This is because the plugin is meant to be a middle-point plugin which will abstract away body position data from the actual hardware that provides it. Since we will have a lot of new input devices coming out in the near future, integrating each one directly would be counterproductive and not future-proof.

I believe that most of these devices are trying to forward parts of a whole ‘body position’ data set, meaning hands, fingers, limbs, and general skeletal information. The idea behind a plugin of this nature would be to abstract that data structure in a way that can be accessed by developers directly, and to allow the motion input plugin to forward and merge the actual hardware input into that data structure. This will allow developers to focus on the VR aspects of development and not the input binding, as well as allowing for easy integration of future input technology without changing any game logic code. Imagine something akin to Input Mapping, but for a body position data set.
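As a purely illustrative sketch (none of these types exist in the plugin yet), a per-hand entry in such an abstracted data set might look something like this:

#include "BodyHandFrame.generated.h" // hypothetical UHT-generated header name

// Illustrative only: one per-hand entry of an abstracted body-position data set.
USTRUCT(BlueprintType)
struct FBodyHandFrame
{
    GENERATED_USTRUCT_BODY()

    // Position relative to the player origin, merged from whichever device
    // (Hydra/STEM/Kinect/...) is actually providing it.
    UPROPERTY(BlueprintReadOnly, Category = "Body Input")
    FVector Position;

    UPROPERTY(BlueprintReadOnly, Category = "Body Input")
    FRotator Orientation;

    // 0 when the current hardware cannot track this point, so downstream logic
    // can fall back to animation.
    UPROPERTY(BlueprintReadOnly, Category = "Body Input")
    float Confidence;
};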

Downstream from that data structure will be a convenience character (pawn) blueprint that will have IK/FK handles attached to the UE default skeleton. If you use the body position data, you will be able to easily forward it to the skeleton fully or partially using IK/FK, or use animation where the data is missing. Or, if you want to use the data set in a non-skeletal way, you can ignore that convenience character or build your own (say, if you wanted to control wings instead of limbs and may not want a 1:1 mapping).

In the coming week I hope to release an early GitHub repository for this and to get input from other VR developers regarding structure, needs, and overall design for a plugin of this sort; hopefully some help can be had, so that we can build something robust which we will all use.

[=Darknoodles;134045]
Hey! First, thanks for your work!
However, is there any way to get the “Event Hydra Controller Moved” values for each controller? It works with the debug sphere; however, I’m trying to move different components differently and I can’t find any way to get it working. It would be really nice to have events like “Event Right Hydra Controller Moved” and “Event Left Hydra Controller Moved”.
I think there is a way by getting axis values and such, but the event is much simpler and more compact.
[/]

This question was asked by PMBallisticDK earlier in the thread; the answer remains the same:

Each blueprint event emits an integer called ‘controller’. Simply add an IF statement (a Branch node) and compare it to the controller you want (typically 0 for left, 1 for right); any statements after that IF statement will then only run for the controller you want. Additionally, if you want to support people potentially misplacing their controllers, you can make a call to ‘HydraWhichHand(int32 controller)’, which will determine which hand the controller is being held in (returning 0 for left, 1 for right). This is determined by where the controller was last docked (which side of the dock it was on).
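For illustration, here is the same logic expressed as a rough C++ sketch; the event signature and component names below are assumptions, and in Blueprint this is simply a Branch node on the ‘controller’ output pin:

// Assumed signature for illustration only; the Blueprint event pins work the same way.
void AMyHydraActor::OnHydraControllerMoved(int32 controller, FVector position, FRotator rotation)
{
    // HydraWhichHand returns 0 for left, 1 for right, based on which side of the
    // dock the controller was last placed on.
    if (HydraWhichHand(controller) == 0)
    {
        LeftHandMesh->SetRelativeLocation(position);   // hypothetical component
    }
    else
    {
        RightHandMesh->SetRelativeLocation(position);  // hypothetical component
    }
}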

Thanks! My bad, I did browse the previous page, but as it was very late I may have skipped it :frowning: Thanks again :slight_smile:

[=;135034]

In the coming week I hope to release an early GitHub repository for this and to get input from other VR developers regarding structure, needs, and overall design for a plugin of this sort; hopefully some help can be had, so that we can build something robust which we will all use.

[/]

Cool, I’ll keep an eye out for it. Thanks!

[=;135034]

In the coming week I hope to release an early GitHub repository for this and to get input from other VR developers regarding structure, needs, and overall design for a plugin of this sort; hopefully some help can be had, so that we can build something robust which we will all use.

[/]

Sounds awesome!

I can’t wait to take a look at the GitHub, test it, and help out. Are you going to set up an issue tracker so we can easily find current bugs/features to implement and easily submit them as pull requests? I think that would be great.

Hey! Thanks for all the work. I had a quick search and can’t quite find what I want to do with the plugin… rather than explicitly setting the input mapping from the editor (which works fine), I would like to set a default state for one of my pawns. Something similar to:

UPlayerInput::AddEngineDefinedAxisMapping(FInputAxisKeyMapping("CinePawn_Yaw", EKeys::MouseX, 1.f));

However, when I try to replace EKeys::MouseX with, say, EKeysHydra::HydraLeftRotationYaw, I can’t get it to compile because the required header FHydraPlugin.h cannot be found by my project. I don’t have an issue finding the public headers such as HydraDelegate.h, though. What is the best way to expose EKeysHydra so I can create my own defined axis mapping in code? Thanks!

Hi guys, I’m having some trouble getting the plugin to work. I set up the input as advised, but nothing seems to happen. If I turn on the Sixense Motion Creator 2 I get some movement, but as a joypad. Could someone please post an example project so I can see what I’m doing wrong?

Also, the VRMotionInput plugin looks incredible, I can’t wait to try it :slight_smile:

[=savantguarde;150021]
Hey! Thanks for all the work. I had a quick search and can’t quite find what I want to do with the plugin… rather than explicitly setting the input mapping from the editor (which works fine), I would like to set a default state for one of my pawns. Something similar to:

UPlayerInput::AddEngineDefinedAxisMapping(FInputAxisKeyMapping("CinePawn_Yaw", EKeys::MouseX, 1.f));

However, when I try to replace EKeys::MouseX with, say, EKeysHydra::HydraLeftRotationYaw, I can’t get it to compile because the required header FHydraPlugin.h cannot be found by my project. I don’t have an issue finding the public headers such as HydraDelegate.h, though. What is the best way to expose EKeysHydra so I can create my own defined axis mapping in code? Thanks!
[/]

When I moved the EKeysHydra structure in 0.6.2 to clean up the dependencies, it hid the keys from C++ input mapping. With 0.6.5, these have now been moved back to the delegate, and their object code definition has been moved to the delegate as well. This means your required use case will now work; just make sure your project has a copy of HydraDelegate.cpp in the project source folder in order for it to compile (which you will want to remove when you compile for shipping, since shipping collapses the DLL into a monolithic .exe).
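With that in place, a minimal sketch of the mapping you were after might look like the following (assuming EKeysHydra is reachable through HydraDelegate.h as described above; “CinePawn_Yaw” is your own axis name):

#include "HydraDelegate.h" // EKeysHydra should now be visible from here

// Sketch: register a Hydra axis as an engine-defined mapping, modeled on the MouseX example.
UPlayerInput::AddEngineDefinedAxisMapping(
    FInputAxisKeyMapping("CinePawn_Yaw", EKeysHydra::HydraLeftRotationYaw, 1.f));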

[=davidmcclure]

Hi guys, I’m having some trouble getting the plugin to work. I set up the input as advised, but nothing seems to happen. If I turn on the Sixense Motion Creator 2 I get some movement, but as a joypad. Could someone please post an example project so I can see what I’m doing wrong?

Also, the VRMotionInput plugin looks incredible, I can’t wait to try it

[/]

I will need more information to help you out. How are you trying to use the plugin? The simplest way to use it is to drop the HydraPluginActor into your scene and use the input mapping system. The second simplest way is to use the provided blueprint events that are emitted inside the HydraPluginActor to bind the received data to whatever you want to do, as shown in the video.

The VRMotionInput plugin is a bit delayed for now, I have a lot on my plate atm, but I will hopefully get around to releasing the base code soonish™.

Hello!

I’ve never used the hydra before and I’m currently attempting to follow along with the tutorial that you posted back in April using the hydra-ue4-master folder. However, it appears that due to the updates to both Unreal and possibly the plugin too, some of the nodes that are being used in your video tutorial do not appear to exist in any of my available event node lists. I am currently using version 4.4.3 of Unreal. For example, the node “Event Hydra Undocked/Docked” does not appear, nor does “Hydra Controller Moved” or “Hydra Trigger Changed”. In other words, I’m having trouble following along and could use some help! Would there happen to be a way of accessing the nodes you were using in the tutorial, or could there be a way to substitute them with ones that I can access in this version? Really, any information you could give me about how to go about learning to use the Hydra with UE4 would be awesome, and I would greatly appreciate your help! Thank you!

[=aialexander;155760]
Hello!

I’ve never used the hydra before and I’m currently attempting to follow along with the tutorial that you posted back in April using the hydra-ue4-master folder. However, it appears that due to the updates to both Unreal and possibly the plugin too, some of the nodes that are being used in your video tutorial do not appear to exist in any of my available event node lists. I am currently using version 4.4.3 of Unreal. For example, the node “Event Hydra Undocked/Docked” does not appear, nor does “Hydra Controller Moved” or “Hydra Trigger Changed”. In other words, I’m having trouble following along and could use some help! Would there happen to be a way of accessing the nodes you were using in the tutorial, or could there be a way to substitute them with ones that I can access in this version? Really, any information you could give me about how to go about learning to use the Hydra with UE4 would be awesome, and I would greatly appreciate your help! Thank you!
[/]

The nodes are the same; you need to have the plugin enabled first. Follow the video again and pay attention to how to install and enable the plugin. Once you’ve confirmed it’s enabled, sub-class the HydraPluginActor in blueprint (use the class viewer to find it). After placing the new blueprint in the scene, you will receive all of those notifications inside your sub-classed blueprint actor.

This is all shown in the video and explained in the wiki/readme.

The only thing that is different is that input mapping isn’t shown, but that is covered in both the wiki and the readme.

@, for the case where you want to use the Hydras to control the hands, how do you set that up to get from base-station-relative to something that’s usable as an IK target?

My current idea is to do something like this:

C = inverse(A) * B

Where A is the bone transform for the hand, B is the transform for the Hydra, and C is the bone-transform relative Hydra transform. Does that make sense at all? I’m working in Blueprint btw.
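In UE4 terms, that idea maps to FTransform::GetRelativeTransform, roughly like this sketch (the bone name and variable names are illustrative, and see the answer further below on why a calibration step is still needed):

// A: world-space transform of the hand bone (bone name is illustrative).
const FTransform HandBone = Mesh->GetBoneTransform(Mesh->GetBoneIndex(TEXT("hand_r")));

// B: world-space transform built from the raw hydra data.
const FTransform Hydra(HydraRotation, HydraPosition);

// C: the hydra expressed relative to the bone, i.e. inverse(A) * B in the notation above.
const FTransform HydraRelativeToBone = Hydra.GetRelativeTransform(HandBone);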

@

Hello again!

While I’m not sure what I did right this time, I took your advice, created a new project, and re-copied the “Plugins” folder to my project folder, and it worked. All of the events were there inside the HydraPluginActor Blueprint. I’m not sure what exactly caused the problem to begin with; as far as I know, I did not do anything differently. I know for certain that the plugin was enabled when I first tried it out, because I found the HydraPluginActor immediately when I searched in the Class Viewer. In any case, thank you for your response!

[=;156363]
@, for the case where you want to use the Hydras to control the hands, how do you set that up to get from base-station-relative to something that’s usable as an IK target?

My current idea is to do something like this:

C = inverse(A) * B

Where A is the bone transform for the hand, B is the transform for the Hydra, and C is the bone-transform relative Hydra transform. Does that make sense at all? I’m working in Blueprint btw.
[/]

I would also be interested in learning this. Ideally, I would like to set up a pair of hands in Unreal that will respond to the motion of the Hydra and be able to interact with objects in the scene. However, because I am not a programmer, I am uneducated in this process. So far, this thread has been my only resource for finding these answers. I have seen plenty of examples of this working in Unity, but not as many in Unreal, and those who have succeeded in implementing this functionality that I have reached out to have not yet been able to respond to my questions. If anyone with knowledge of this process would be willing to share it with me, I would greatly appreciate a push in the right direction. :slight_smile:

[=aialexander;155760]
Hello again!

While I’m not sure what I did right this time, I took your advice, created a new project, and re-copied the “Plugins” folder to my project folder, and it worked. All of the events were there inside the HydraPluginActor Blueprint. I’m not sure what exactly caused the problem to begin with; as far as I know, I did not do anything differently. I know for certain that the plugin was enabled when I first tried it out, because I found the HydraPluginActor immediately when I searched in the Class Viewer. In any case, thank you for your response!

I would also be interested in learning this. Ideally, I would like to set up a pair of hands in Unreal that will respond to the motion of the Hydra and be able to interact with objects in the scene. However, because I am not a programmer, I am uneducated in this process. So far, this thread has been my only resource for finding these answers. I have seen plenty of examples of this working in Unity, but not as many in Unreal, and those who have succeeded in implementing this functionality that I have reached out to have not yet been able to respond to my questions. If anyone with knowledge of this process would be willing to share it with me, I would greatly appreciate a push in the right direction.

[/]

My guess is you missed placing the actor in the scene, but I’m glad it’s working for you now :). Regarding the second part, see my answer below, or if you’re patient, the VR Motion plugin will eventually be released, which will have a setup for the hydra as a default down the line.

[=;156363]
@, for the case where you want to use the Hydras to control the hands, how do you set that up to get from base-station-relative to something that’s usable as an IK target?

My current idea is to do something like this:

C = inverse(A) * B

Where A is the bone transform for the hand, B is the transform for the Hydra, and C is the bone-transform relative Hydra transform. Does that make sense at all? I’m working in Blueprint btw.
[/]

In the actor where you wish to use IK, find where the root component origin is. Translate the Hydra base to that origin and all your position data will now be in component space (actor local space). You can then translate easily to bone space, or you can use the data directly by selecting the bone position relative to component space.

To translate the base to the origin I did the following: 1) Require a T-pose calibration; the center point of both hands at that pose is the shoulder midpoint. 2) Add the shoulder-to-origin offset; using a hardcoded Z (height) offset is OK for most use cases. You would need another data point to provide full flexibility, but to my knowledge torso lengths do not vary by a massive amount and can easily be corrected by the user by calibrating with their arms a little higher or lower. In my case the origin was near the waist, so it was something like 40cm down from the shoulder midpoint. Essentially you are getting a vector pointing from the base -> the player’s actor origin. All hydra positions with this correction will now report the distance from that point.

While I am planning to have this included automatically as a calibration class in the VR Motion plugin, my current workload is unfortunately delaying that plugin, so I hope this will help you get your hands working in the meantime!
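For reference, a minimal sketch of that calibration step in code (variable names are illustrative, not part of the plugin API):

// Captured once while the user holds the T-pose (e.g. on a button press).
FVector BaseToShoulderVector;   // shoulder midpoint, expressed in hydra (base) space
FVector ShoulderToOriginOffset; // from the shoulder midpoint down to the actor origin

void CalibrateTPose(const FVector& LeftHydraRaw, const FVector& RightHydraRaw)
{
    // The midpoint of both hands during the T-pose approximates the shoulder midpoint.
    BaseToShoulderVector = (LeftHydraRaw + RightHydraRaw) * 0.5f;

    // Hardcoded drop from the shoulder midpoint to the actor origin
    // (~40cm in the example above; tune this for your character).
    ShoulderToOriginOffset = FVector(0.f, 0.f, -40.f);
}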

[=;156555]
My guess is you missed placing the actor in the scene, but I’m glad it’s working for you now :). Regarding the second part, see my answer below, or if you’re patient, the VR Motion plugin will eventually be released, which will have a setup for the hydra as a default down the line.

In the actor where you wish to use IK, find where the root component origin is. Translate the Hydra base to that origin and all your position data will now be in component space (actor local space). You can then translate easily to bone space, or you can use the data directly by selecting the bone position relative to component space.

To translate the base to the origin I did the following: 1) Require a T-pose calibration; the center point of both hands at that pose is the shoulder midpoint. 2) Add the shoulder-to-origin offset; using a hardcoded Z (height) offset is OK for most use cases. You would need another data point to provide full flexibility, but to my knowledge torso lengths do not vary by a massive amount and can easily be corrected by the user by calibrating with their arms a little higher or lower. In my case the origin was near the waist, so it was something like 40cm down from the shoulder midpoint. Essentially you are getting a vector pointing from the base -> the player’s actor origin. All hydra positions with this correction will now report the distance from that point.

While I am planning to have this included automatically as a calibration class in the VR Motion plugin, my current workload is unfortunately delaying that plugin, so I hope this will help you get your hands working in the meantime!
[/]

Ohhhhh okay, that explains your code snippet from before. I didn’t realize that’s how you were doing it. For whatever reason, at the time I just figured it’d be better to do it more correctly math-wise and try to get the hydra position into bone space.

Anyway, I’ll give that a try, thanks :slight_smile:

Edit: So I suppose after calculating the offset, I just want to subtract it from any reported hydra positions?

[=;156658]
Ohhhhh okay, that explains your code snippet from before. I didn’t realize that’s how you were doing it. For whatever reason, at the time I just figured it’d be better to do it more correctly math-wise and try to get the hydra position into bone space.

Anyway, I’ll give that a try, thanks :slight_smile:

Edit: So I suppose after calculating the offset, I just want to subtract it from any reported hydra positions?
[/]

The problem with the math approach is that you do not have a reference to where the player is sitting/standing in relation to the base. You cannot escape needing a calibration unless you have a controller as a body-fixed point (which you will be able to do with a 3+ point STEM system). If you’re using the hydras as hands, they will both be movable points in relation to the body, so you need to get a vector to the player’s origin to adjust for base position variance (real-world table heights and offsets).

The other thing to consider is that different players have different arm lengths, and while you would think a direct 1:1 mapping would be ideal, the avatar people embody may be larger/smaller than they are in real life, and you will need to scale hydra movement to stay in sync with the avatar. A T-pose handles both of these things, which is why I consider it the ideal calibration for Hydras.

In terms of the offset, it depends on how you are getting your vectors (adding or subtracting), but it should be RawHydraPosition + BaseToShoulderVector + ShoulderToOriginOffset = HydraInOrigin. If you are using world positions, remember that all of these are relative, and you will need to subtract out the actor world position in the instances where you use world positions.

I just wanted to add that one of the best ways to debug vectors is to draw them (Draw Debug Arrow). One of the very cool things about Rift development is that you can see these vectors in true 3D, so you can visualize the whole setup to ensure every vector is doing the correct thing.
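For example, a rough sketch of the C++ equivalent (DrawDebugDirectionalArrow) drawing one of the vectors discussed above; ActorOrigin here is just an illustrative anchor point:

#include "DrawDebugHelpers.h"

// Draw the base -> shoulder vector for one frame so it can be inspected in 3D.
DrawDebugDirectionalArrow(GetWorld(), ActorOrigin,
                          ActorOrigin + BaseToShoulderVector,
                          /*ArrowSize=*/10.f, FColor::Green,
                          /*bPersistentLines=*/false, /*LifeTime=*/0.1f);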

[=;156986]
The problem with the math approach is that you do not have a reference to where the player is sitting/standing in relation to the base. You cannot escape needing a calibration unless you have a controller as a body-fixed point (which you will be able to do with a 3+ point STEM system). If you’re using the hydras as hands, they will both be movable points in relation to the body, so you need to get a vector to the player’s origin to adjust for base position variance (real-world table heights and offsets).

The other thing to consider is that different players have different arm lengths, and while you would think a direct 1:1 mapping would be ideal, the avatar people embody may be larger/smaller than they are in real life, and you will need to scale hydra movement to stay in sync with the avatar. A T-pose handles both of these things, which is why I consider it the ideal calibration for Hydras.

In terms of the offset, it depends on how you are getting your vectors (adding or subtracting), but it should be RawHydraPosition + BaseToShoulderVector + ShoulderToOriginOffset = HydraInOrigin. If you are using world positions, remember that all of these are relative, and you will need to subtract out the actor world position in the instances where you use world positions.

I just wanted to add that one of the best ways to debug vectors is to draw them (Draw Debug Arrow). One of the very cool things about Rift development is that you can see these vectors in true 3D, so you can visualize the whole setup to ensure every vector is doing the correct thing.
[/]

Sorry to keep bugging you about this, but I just wanna clear up some stuff in your equation there.

BaseToShoulderVector = Vector from the base station to the shoulder mid point?

ShoulderToOriginOffset = Shoulder mid point to character origin?

[=;156555]

In the actor where you wish to use IK, find where the root component origin is. Translate the Hydra base to that origin and all your position data will now be in component space (actor local space). You can then translate easily to bone space, or you can use the data directly by selecting the bone position relative to component space.

To translate the base to the origin I did the following: 1) Require a T-pose calibration; the center point of both hands at that pose is the shoulder midpoint. 2) Add the shoulder-to-origin offset; using a hardcoded Z (height) offset is OK for most use cases. You would need another data point to provide full flexibility, but to my knowledge torso lengths do not vary by a massive amount and can easily be corrected by the user by calibrating with their arms a little higher or lower. In my case the origin was near the waist, so it was something like 40cm down from the shoulder midpoint. Essentially you are getting a vector pointing from the base -> the player’s actor origin. All hydra positions with this correction will now report the distance from that point.

[/]

I hope you will forgive me, but I don’t understand what this means. Is this done with the character Blueprint, or is it done using C++? Am I correct to assume that the shoulder midpoint of the chosen actor will be the Hydra base, or does the base refer to something else? Is this done by dropping the Hydra Plugin actor into the Character Blueprint? Is the shoulder-to-origin offset added using a Blendspace? Is there a kind soul out there who would be willing to give me a breakdown of this? I am unsure of where in the editor to look first.

[=;157211]
Sorry to keep bugging you about this, but I just wanna clear up some stuff in your equation there.

BaseToShoulderVector = Vector from the base station to the shoulder mid point?

ShoulderToOriginOffset = Shoulder mid point to character origin?
[/]

Whoops! I made direction mistakes in the definitions I gave you; it should be

RawHydraPosition - BaseToShoulderVector - ShoulderToOriginOffset = HydraInOrigin

See this masterpiece :rolleyes: :

This can of course be rewritten as:
RawHydraPosition + ShoulderToBaseVector + OriginToShoulderOffset = HydraInOrigin

NB: A general way to visualize adding vectors:


If you subtract them, simply reverse the direction of the B vector.

[=aialexander;157272]
I hope you will forgive me, but I don’t understand what this means. Is this done with the character Blueprint, or is it done using C++? Am I correct to assume that the shoulder midpoint of the chosen actor will be the Hydra base, or does the base refer to something else? Is this done by dropping the Hydra Plugin actor into the Character Blueprint? Is the shoulder-to-origin offset added using a Blendspace? Is there a kind soul out there who would be willing to give me a breakdown of this? I am unsure of where in the editor to look first.
[/]

I plan to make an example of how to set this up once I get it set up myself, if you’re willing to wait a couple of days.

[=;157281]
Whoops! I made direction mistakes in the definitions I gave you; it should be

RawHydraPosition - BaseToShoulderVector - ShoulderToOriginOffset = HydraInOrigin

See this masterpiece :rolleyes: :

This can of course be rewritten as:
RawHydraPosition + ShoulderToBaseVector + OriginToShoulderOffset = HydraInOrigin

NB: A general way to visualize adding vectors:


If you subtract them, simply reverse the direction of the B vector.
[/]

Perfect, thanks dude, this is really helpful. I think I should be able to get it set up in the next day or two using the diagram and other info here.

[=aialexander;157272]
I hope you will forgive me, but I don’t understand what this means. Is this done with the character Blueprint, or is it done using C++? Am I correct to assume that the shoulder midpoint of the chosen actor will be the Hydra base, or does the base refer to something else? Is this done by dropping the Hydra Plugin actor into the Character Blueprint? Is the shoulder-to-origin offset added using a Blendspace? Is there a kind soul out there who would be willing to give me a breakdown of this? I am unsure of where in the editor to look first.
[/]

This is purely referring to the vector math required to translate the raw hydra positions (what you get from the Hydra Moved event, etc.) into actor space.

A T-pose means you ask the user to spread out their arms, forming a T (they can be standing or sitting). You will capture the positions that the hydras report at that point, for example by asking them to push a button while they hold the pose, then saving the left and right hydra positions you get when they do this.

If you then do (Left Hydra Raw + Right Hydra Raw) / 2, you get the shoulder midpoint position of your user in hydra space. You can now use this calibration point to translate all your future input into actor space.

This is achieved by
RawHydraPosition - BaseToShoulderVector - ShoulderToOriginOffset = HydraInOrigin

All of this is simply vector addition and division, which you can implement in your blueprint.

You can then use the HydraInOrigin position vector to drive your IK setup.
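As a rough sketch, the per-frame application of that equation looks like this (variable names are illustrative); the result can then feed an IK effector, e.g. the Effector Location of a Two Bone IK node in the Anim Blueprint:

// Re-express the raw hydra position in actor/component space each frame.
const FVector HydraInOrigin = RawHydraPosition - BaseToShoulderVector - ShoulderToOriginOffset;

// Hypothetical variable read by the Anim Blueprint as the IK effector target.
RightHandIKTarget = HydraInOrigin;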

[=]
I plan to make an example of how to set this up once I get it set up myself, if you’re willing to wait a couple of days.
[/]

Very cool, looking forward to seeing your implementation!