I have been struggling to set up IK on my character’s head, arms, and legs for a few days now (it’s actually a VR project with Vive Trackers). The idea is that the character’s mesh in-game follows the player’s real-world movement as closely as possible. It works fine in single player, but the moment I tried to set everything up for networking, it got worse and worse with each test ._.
Can anyone give me any insight into the proper way to do this? I’m not sure which setup was the least bad, but the idea I’ve been following is:
- Have the client send the server the variables for the camera rotation and the trackers’ positions/rotations
- Read those variables in the anim BP’s event graph on the server, then multicast them to the other players
- Use the variables to drive the IK nodes
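In case it helps, here is roughly what that flow looks like as C++ (names are made up for the sketch, and this is a fragment of the idea, not my exact code):

```cpp
// Sketch only: a struct bundling everything the anim BP needs for IK.
USTRUCT(BlueprintType)
struct FIKTargets
{
    GENERATED_BODY()

    UPROPERTY(BlueprintReadWrite) FRotator CameraRotation;
    UPROPERTY(BlueprintReadWrite) FTransform LeftHandTracker;
    UPROPERTY(BlueprintReadWrite) FTransform RightHandTracker;
    UPROPERTY(BlueprintReadWrite) FTransform LeftFootTracker;
    UPROPERTY(BlueprintReadWrite) FTransform RightFootTracker;
};

UCLASS()
class AVRCharacter : public ACharacter
{
    GENERATED_BODY()
public:
    // Step 1: the owning client samples the HMD/trackers each tick
    // and sends the values up to the server.
    UFUNCTION(Server, Unreliable)
    void ServerUpdateIKTargets(FIKTargets NewTargets);

    // Step 2: the server forwards the values to everyone.
    UFUNCTION(NetMulticast, Unreliable)
    void MulticastUpdateIKTargets(FIKTargets NewTargets);

    // Step 3: the anim BP's event graph reads this off the pawn
    // and feeds it into the IK nodes.
    UPROPERTY(BlueprintReadOnly)
    FIKTargets IKTargets;
};
```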
What is the trick to replicating IK?