It is a GripInterface function on the grippable object that you can override and pass out anything from. It lets you override the default behavior of finding the closest correctly prefixed socket and instead pass out any transform that you want.
Would you be talking of “Closest Grip Slot in Range” or am I still looking in the wrong place?
Yeah, that one. You can return "true" for "found one" and any transform that you like (in world space). You can override it per object to return different results (like component locations instead of sockets, etc.).
In this case it would be a location projected onto your object based on the hand position relative to it.
You could also do all of this in the character itself, but it sounds like it is a one off thing?
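The projected-location idea above can be sketched in plain C++ (this is an illustration, not the plugin's actual API or signature): clamp the hand position to a box around the object to get the nearest "grip slot" point, the way an overridden Closest Grip Slot in Range could return a projected transform instead of a socket.

```cpp
#include <algorithm>
#include <cassert>

// Minimal sketch: project the hand position onto an axis-aligned box around
// the object to get a grip location. Names here are illustrative only.
struct Vec3 { double x, y, z; };

// Closest point on an axis-aligned box [boxMin, boxMax] to point p.
Vec3 ProjectOntoBox(const Vec3& p, const Vec3& boxMin, const Vec3& boxMax)
{
    return {
        std::clamp(p.x, boxMin.x, boxMax.x),
        std::clamp(p.y, boxMin.y, boxMax.y),
        std::clamp(p.z, boxMin.z, boxMax.z)
    };
}
```

An override would convert the hand's world position into the object's local space, project it like this, and return the result (converted back to world space) as the grip transform.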
First of all, amazing, amazing work you're doing. Like you've mentioned in your videos, I am having a tougher time getting started than I did with the SteamVR plugin for Unity, but I know it will be worth it once I get through this. Since this is being touched on right now, I wanted to chime in and mention that I would love to implement something similar to the skeleton poser in the SteamVR plugin on the Unity side, as I find it a major immersion booster and a lot of fun to do.
So this could be a combination of looking at the current transform of the object you're trying to grab and passing that on to the hand socket to make it transition more smoothly? In the Echo Arena video it seems like it's snapping to the hand but maintaining its own rotation on the horizontal axis while taking over some rotation from the hand, and then using an animated pose to match the transform of the object? I'm not exactly sure how I'll make those two play nice together, but I'm just trying to understand the basic theory behind implementing this.
I'll be donating to the Patreon as soon as I have a little spare cash, as I really think it's amazing what you've done over these last few years.
The SteamVR plugin over in Unity has a bunch of default poses that it blends between on the animation side, i.e. a default hand pose for gripping each object, which it then blends with an open hand pose using the current finger curl values of the Index controllers. When no object is held I believe that it just uses the finger skeletal transforms directly instead.
Their poser doesn't "live solve" geometry or deal with any positional poses (as far as I know it's all snap-grip based); it just blends pose states. Though I do think they have a manual poser in it so you can create poses in engine and save them out for use.
In UE4 that is fairly simple to achieve: you can pass out a "grip pose" animation from the object and apply it in the AnimBP. The reason I don't have an example of this in the template is that I honestly just don't have any hand animations to use for it. I was considering adding an interface query to get the suggested pose from the gripped object, but it's something I have mostly felt was better handled on the user's end, as they may end up wanting it very complex and any default implementation may not provide enough information.
Letting the end user poll for the requested animation pose themselves with a custom interface is likely better in the end.
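The interface-query pattern described here can be sketched like so (plain C++, hypothetical names, not the plugin's GripInterface): each grippable object answers which hand pose it wants, and the character polls it when a grip starts.

```cpp
#include <cassert>
#include <string>

// Hypothetical interface: the gripped object suggests a hand pose to play.
struct IGripPoseProvider
{
    virtual ~IGripPoseProvider() = default;
    // Return the name of the hand pose asset to blend to while held.
    virtual std::string GetSuggestedGripPose() const = 0;
};

// Example grippable: a pistol that requests a pistol-grip hand pose.
struct Pistol : IGripPoseProvider
{
    std::string GetSuggestedGripPose() const override
    {
        return "Pose_PistolGrip"; // hypothetical asset name
    }
};
```

In-engine this would be a Blueprint-implementable interface function that the AnimBP queries on grip, keeping the pose choice entirely on the user's side.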
Yuri, I believe, has a hand-posing plugin available on the marketplace (VR Hands Solver). I haven't tried it myself so I won't vouch for it, but you can look at it and see if it meets your needs for that part as well. It uses a custom skeletal mesh though, which means it will be a bit incompatible with some things.
Love the plugin. I have a simple question - how or where do I access the skeletal mesh component on the VRCharacter? I know I can add one, but I am using root motion movement, which only applies to the inherited mesh, so for now I am not using root motion but would very much like to.
Looking forward to your feedback.
You are using root motion movement with a VR Character?
Or do you mean you are using it with a VRCharacter but it is possessed by AI?
I removed the built-in skeletal mesh from the component array as I had assumed that no one would be using root motion for movement in VR and it would have to be re-parented anyway. I would be interested in (any) good reason why someone would actually be using it. I've actually been actively removing root motion checks in the character movement component in order to avoid the (minor) perf hit of checking if it is active.
Finished up cleaning up the OpenVRInput module over the past two days to prep for Index release.
It now cleanly handles the case where someone is using the official Valve plugin: it no longer tries to handle input itself and just provides the new component and gesture detection.
Also finished up the replication for it, with multiple replication modes and smoothing; the documentation on www.vreue4.com for it will be revised tomorrow with the new information.
In the GIF below the right hand is using replicated data (not local) and is updating at 5 Hz (5 times a second); the smoothing I added makes it a lot more fluid. I set the default to 10 Hz, but the value is fully customizable to whatever network load you are willing to allow.
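The smoothing idea can be sketched in plain C++ (illustrative only, not the module's implementation): new network samples only arrive at 5-10 Hz, so each rendered frame the displayed value exponentially converges toward the newest sample instead of snapping to it.

```cpp
#include <cassert>
#include <cmath>

// Sketch of smoothing a low-rate replicated value (1-D for simplicity;
// a real hand would smooth position and rotation).
struct Smoother
{
    double displayed = 0.0; // what the local client renders
    double target = 0.0;    // latest replicated sample (arrives at ~5-10 Hz)

    void OnNetworkSample(double sample) { target = sample; }

    // Call every rendered frame; 'rate' controls convergence speed (1/sec).
    // Framerate-independent exponential smoothing.
    void Tick(double dt, double rate = 10.0)
    {
        double alpha = 1.0 - std::exp(-rate * dt);
        displayed += (target - displayed) * alpha;
    }
};
```

Between network updates the hand keeps easing toward the last known sample, which is what makes a 5 Hz stream look fluid instead of stuttery.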
I have done some reading of the documentation (https://vreue4.com/) but I am still not sure whether this plugin will be good for my use case. And I can't afford to read all the pages in this post (213 pages, lol). So here is a question:
I have an Oculus Rift and also a Leap Motion for finger movement, and they need to be replicated. I know networking is the core feature of the plugin. Will this plugin help me? Is it going to be a plug-and-play thing or do I still need to integrate other libraries? Thank you.
Well, it would replicate everything except for the Leap Motion fingers; those would require custom replication as they don't tie into any of the base engine controller framework.
Ouch… thank you. Finger replication has been bugging me for quite some time.
I have an example of finger replication in my OpenInput module that you could reference. When it's cross-platform I have to drop Valve's official compression and self-replicate the finger transforms (for Index controllers / skeletal input); it's not all that hard to adapt to any skeletal input (I may end up doing that in the future).
That is at a 5 Hz (5 times a second) update rate over the network with smoothing; the right hand is replicated and the left hand is local. I default it to 10 Hz but wanted to show a lower rate for example's sake.
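Hand-rolled finger compression can be as simple as quantizing each normalized finger value to one byte before replicating (an illustrative sketch, not the module's actual wire format):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Quantize a normalized finger curl [0, 1] to a single byte for the network.
uint8_t QuantizeCurl(double curl)
{
    if (curl < 0.0) curl = 0.0;
    if (curl > 1.0) curl = 1.0;
    return static_cast<uint8_t>(std::lround(curl * 255.0));
}

// Dequantize on the receiving side; max error is ~1/510 per value.
double DequantizeCurl(uint8_t q)
{
    return q / 255.0;
}
```

Five curls per hand fit in five bytes per update, which is why even a 10 Hz replication rate stays cheap on the network.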
In VRExpPluginExample, Content\VRExpansion\Vive\Testing\SnapPoint\SnapActor is failing to compile. It needs to be updated to the latest changes.
Fixed and uploaded. I think that entire example actually needs an overhaul; I'm not entirely sure of its use currently. But it's compiling again.
Got a few questions about your OpenVRInput module and the official SteamVRInput plugin. I don’t fully understand how some of this stuff works so apologies if some of my questions don’t make sense.
- Firstly, am I right in thinking the documentation you mentioned hasn’t been updated yet regarding replication?
- Is there any reason to use the official plugin over yours? You say you have made yours cleanly handle using the official plugin while still allowing your gesture detection, but why use the official plugin at all when yours can also handle the skeletal input?
- Can SteamVRInput / your plugin handle using a VR glove controller (e.g. Manus)? This might not really make much sense as, at least for the SteamVRInput plugin, I don't think that's the point of the plugin. But for your plugin at least, if it were possible then the replication/gesture detection would be very useful.
Yeah, I got sidetracked dealing with some things with it and haven't updated the documentation yet. Generally it's just ticking on "replicate skeletal data" and choosing a replication mode; there are no real complicated steps. I'll get around to updating the documentation soon - I only finished final testing last night and just need to clean some things up now.
Mine only handles the action system currently because the built-in engine version does not contain skeletal input capability yet and I didn't want to ship a module that "requires" another party's module to function. The Valve plugin also handles the action system, and when it is installed mine turns that part of itself off in order to let it work as intended. Basically it comes down to whether you prefer the Valve plugin or the native engine "Beta Input" style (mine is the beta input but with modifications to allow the skeletal system). Eventually, when the engine gets some patches in that fix the problem, I'll be removing that part of my module entirely.
As far as what the difference is from the Valve plugin with that aside: theirs doesn't handle mapping the hands to full body meshes, or replication, or easy gesture detection. They are also using some proxy inputs and a large quantity of input keys for each supported controller for mapping inputs, which is a bit of a mess that some may prefer not to deal with until it is refactored. If you are doing single hand meshes in singleplayer though, I don't see a reason why you couldn't / wouldn't just use theirs.
- If the VR glove controller maker provides OpenInput drivers for them then they would work out of the box. Since they use Vive trackers in their promotional material, I would hope that they do it correctly and actually write a vendor driver for SteamVR so everything "just works".
Sweet. Just checking I wasn’t being blind.
Thanks for all the info! I don’t really understand some of what you’ve said but as I work a bit more with this stuff I’ll check back and hopefully the other things will make more sense.
Unfortunately that is not the case. The gloves work completely separately from SteamVR, and the trackers are independently used to position the wrist/hand. I'm going to submit a support ticket and ask if they have plans to do that now that I know, though!
My project is multiplayer and will have full-body IK, so from what you've said it looks like your plugin might be very helpful. Does your gesture detection/replication work only for the skeletal data from your OpenInput module, or will it also work with skeletal data coming from SteamVRInput or from something like the Manus glove? The main focus for me is actually Manus, with Vive controllers as a fallback.
The skeletal replication currently IS of the OpenInput skeletal data; I haven't expanded it to cover anything else yet. Assuming they correctly create their SteamVR controller driver, they should be able to pass out the skeletal pose through the API and have it just work.
Hey man! After getting your advice on using the slot grips, I found the tutorial you had on the documentation website and it seems to be working alright, yet my object's scaling is being multiplied by 100. On the actual object, which is a disc, I have the scale at 0.1, but when I grab it, it sets it to 10. Any idea how to fix this?
When you get a socket world transform it has its parent object's scale, so when you make it relative to the object it scales incorrectly (doubles up the scale). When you are passing out the transform, set the scale to 1. I may be switching that node over to passing out the relative-space transform instead soon, though; it depends on how that would affect users, as generally just passing out the world transform of things is easier.
Edit: That is assuming that you are overriding that function; if you aren't, then it should be working correctly.
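A simplified 1-D model of the scale doubling (illustrative numbers only, not the plugin's actual transform math): because the socket's world transform already carries the parent's scale, storing the object relative to that transform applies the scale a second time, unless the passed-out transform has its scale set to 1.

```cpp
#include <cassert>

// The grip stores the object relative to the slot transform you passed out,
// then re-applies that relative transform while held (hand scale assumed 1).
// 1-D scale only, for illustration.
double HeldScale(double objectScale, double passedSlotScale)
{
    double rel = objectScale / passedSlotScale; // relative-to-slot scale
    return rel; // scale the object renders at while held
}
```

With an object at scale 0.1, passing the raw socket world transform (scale 0.1) yields a held scale of 1.0 - the 0.1 is cancelled out and the object looks blown up - while passing the transform with scale 1 preserves the correct 0.1.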