I hope somebody can help me with this or at least point me in the right direction. I’m working on implementing the Valve Index controllers (a.k.a. Knuckles) over the network; however, I’m having a hard time understanding how the Index controllers’ generated hand pose would be replicated over the network.
As far as I understand, the SteamVR Input plugin offers two animation graph nodes: one takes care of the hand pose (given some parameters), and the other offsets the wrist bone so that the hands are positioned perfectly relative to the controllers. Traditionally you would replicate animations over the network simply by keeping the animation state machine in sync across all connected clients. The problem is that the hand pose given by the SteamVR Input plugin is generated on the fly and is not part of the state machine.
So far, the only method I could think of is using the snapshot pose feature, which lets you cache a skeletal mesh pose into a structure, send that structure over the network, and feed the pose into the animation blueprint of the matching skeleton. To get this working with just four hands I had to tweak some of the network bandwidth variables in DefaultEngine.ini, as sending these structures far exceeds the default limits. This really doesn’t feel like the right way of implementing the Knuckles animations over the network.
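For a sense of scale, here is a back-of-the-envelope sketch of what a raw pose snapshot costs on the wire. The 31-bone count matches the SteamVR hand skeleton; the four-hands-at-10-Hz figure and the absence of any packet/serialization overhead are assumptions for illustration only:

```cpp
#include <cstddef>

// Rough payload math for one uncompressed pose snapshot.
// Assumptions: 31 bones (the SteamVR hand skeleton), full-float
// transforms (rotation quat + translation), no scale, no overhead.
constexpr std::size_t kBonesPerHand = 31;
constexpr std::size_t kBytesPerBone = 4 * sizeof(float)   // rotation quaternion
                                    + 3 * sizeof(float);  // translation
constexpr std::size_t kBytesPerHand = kBonesPerHand * kBytesPerBone;

// Bytes per second for N hands replicated at a given update rate.
constexpr std::size_t BytesPerSecond(std::size_t Hands, std::size_t Hz)
{
    return Hands * kBytesPerHand * Hz;
}
```

That works out to roughly 34 KB/s for four hands at 10 Hz before any engine overhead, which is why the stock bandwidth settings get tripped.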
So, is there something obvious I’m missing? Do you know of any other ways of replicating the Knuckles animation over the network?
I have a standalone module that replicates the skeletal transforms for SteamVR that you could reference. It could be cleaner, but it still sits between the official engine module and the SteamVR module, so it can’t really be cleaned up until the engine version is eventually updated. OpenVR ships with a skeletal transform compression/decompression function pair, but it still has to be used manually and is platform locked.
It handles a few different cases, including the cross-platform case where you can’t actually decode the native SteamVR compressed skeletal data and have to use a custom compression setup. It also handles smoothing the data between updates in a naive way.
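As a sketch of what a custom cross-platform compression can look like (this is illustrative, not OpenVR’s actual function pair), the usual trick is “smallest three” quaternion packing: drop the largest component (its sign can be normalized away and its magnitude recovered from the unit constraint), then quantize the remaining three to 10 bits each, packing a whole bone rotation into 32 bits instead of 16 bytes:

```cpp
#include <cmath>
#include <cstdint>

struct Quat { float X, Y, Z, W; };

// Pack a unit quaternion into 32 bits: 2 bits for the index of the
// largest component, then 10 bits each for the other three.
static uint32_t CompressQuat(Quat Q)
{
    float C[4] = { Q.X, Q.Y, Q.Z, Q.W };
    int Largest = 0;
    for (int i = 1; i < 4; ++i)
        if (std::fabs(C[i]) > std::fabs(C[Largest])) Largest = i;
    // q and -q are the same rotation; force the dropped component positive.
    if (C[Largest] < 0.f)
        for (float& V : C) V = -V;

    const float Range = 0.70710678f; // remaining components lie in [-1/sqrt2, 1/sqrt2]
    uint32_t Packed = static_cast<uint32_t>(Largest);
    for (int i = 0; i < 4; ++i)
    {
        if (i == Largest) continue;
        // Map [-Range, Range] -> [0, 1023].
        uint32_t V = static_cast<uint32_t>((C[i] / Range * 0.5f + 0.5f) * 1023.f + 0.5f);
        Packed = (Packed << 10) | (V & 0x3FF);
    }
    return Packed;
}

static Quat DecompressQuat(uint32_t Packed)
{
    const float Range = 0.70710678f;
    float C[4];
    const int Largest = static_cast<int>(Packed >> 30);
    int Shift = 20;
    float SumSq = 0.f;
    for (int i = 0; i < 4; ++i)
    {
        if (i == Largest) continue;
        const uint32_t V = (Packed >> Shift) & 0x3FF;
        Shift -= 10;
        C[i] = (static_cast<float>(V) / 1023.f - 0.5f) * 2.f * Range;
        SumSq += C[i] * C[i];
    }
    // Recover the dropped component from the unit-length constraint.
    C[Largest] = std::sqrt(std::fmax(0.f, 1.f - SumSq));
    return { C[0], C[1], C[2], C[3] };
}
```

At 10 bits per component the round-trip error is well under a degree, which is plenty for replicated hands, and it cuts per-bone rotation bandwidth by 4x.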
(right hand is replicated here, updating at 10 Hz with smoothing)
As for the Valve plugin and replication… it’s not currently compatible with it; their blueprint nodes don’t allow feeding in custom values. But the actual input side of things can still be used, with the animation data replicated on top of it via alternate nodes just fine. I’m unsure when/if their plugin is intended to support replication of the skeletal data (or full-body meshes, for that matter…).
I believe their general stance is to replicate the finger curl/splay values and blend between open/closed states for each finger based on the value. That is likely “good enough” with current controllers, but it would start to age with full gloves or Leap Motion style input.