Hi there, I'm new to this engine, so bear in mind that I might be missing something basic.
I'm using the VR Template and have set the VRPawn to replicate:
Unfortunately, when I start a multiplayer game the players don't see each other's hands/headset moving. I guess replication doesn't work that way, but I couldn't figure out what exactly is wrong. I also noticed that only the headset's static mesh appears in the VRPawn, and I can't find the hand controller mesh anywhere in my project files, which is strange because the hands definitely appear in game. Where are they?
I'm unsure about the specifics of what's happening here, but I wanted to mention the VR Expansion Plugin. In my experience, this plugin is practically a prerequisite for implementing VR multiplayer.
I think the hands are populated by the headset runtime itself. It's a plugin feature: it fetches the controller model that matches the headset and spawns it. If it can't find one, it spawns a default model.
Try adding a simple mesh and setting it to replicate to see if that works. I managed to make it work by adding a simple static mesh of a box on top of the player, under the camera component of the VR player pawn.
I also needed this.
After a bit of research I found that the best approach is to send the data to the server via RPC, and let the server replicate it.
Someone on Discord helped me clarify this, along with a friend who knows about networking.
" Vaei: The absolute minimum information you need to send is Location and Rotation. The usable minimum probably also includes Velocity and possibly Acceleration.
It would be good practice to compress the information in some manner also - possibly using an FFastArraySerializer so you only replicate what/when you need - after you have it working.
This should all be contained within a struct.
Then you send it from Client (Auth) → Server via unreliable RPC on Tick.
And the server replicates this data with COND_SimulatedOnly. Client auth is very simplistic.
Vaei: … (about replicating variables). Replication is server-to-client only, so the client must send the information in an RPC instead of using replication. The server can then replicate that to all other clients (i.e. COND_SimulatedOnly)."
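The "contain it in a struct and compress it" advice can be sketched in plain C++ like this. Note that every name here (PoseUpdate, PackedPose, QuantizeAngle) is hypothetical, and Unreal's own quantization types such as FVector_NetQuantize and its NetSerialize path differ in detail; this only illustrates the trade-off of shrinking the struct before sending it each tick:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Full-precision pose as produced on the owning client (hypothetical names).
struct PoseUpdate {
    float LocX, LocY, LocZ;   // location in cm
    float Yaw, Pitch, Roll;   // rotation in degrees
    float VelX, VelY, VelZ;   // velocity in cm/s
};

// Quantize an angle in degrees into 16 bits (~0.005 degree resolution).
uint16_t QuantizeAngle(float Deg) {
    float Wrapped = std::fmod(std::fmod(Deg, 360.0f) + 360.0f, 360.0f);
    return static_cast<uint16_t>(Wrapped / 360.0f * 65535.0f + 0.5f);
}

float DequantizeAngle(uint16_t Q) {
    return static_cast<float>(Q) / 65535.0f * 360.0f;
}

// Compressed wire form: 3 floats + 3 uint16 + 3 int16 = 24 bytes vs. 36.
struct PackedPose {
    float LocX, LocY, LocZ;
    uint16_t Yaw, Pitch, Roll;
    int16_t VelX, VelY, VelZ;   // clamped to +-32767 cm/s, 1 cm/s resolution
};

PackedPose Pack(const PoseUpdate& P) {
    auto ClampVel = [](float V) {
        return static_cast<int16_t>(std::fmax(-32767.0f, std::fmin(32767.0f, V)));
    };
    return { P.LocX, P.LocY, P.LocZ,
             QuantizeAngle(P.Yaw), QuantizeAngle(P.Pitch), QuantizeAngle(P.Roll),
             ClampVel(P.VelX), ClampVel(P.VelY), ClampVel(P.VelZ) };
}
```

When sent at tick rate, the per-update savings add up; the half-degree-scale rotation error is invisible on a simulated proxy.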
edit notes:
Use tick intervals to control the frequency.
You can tween things on the simulated proxies.
I found an issue with unreliable multicast RPCs and/or unreliable server calls. I can't remember how I solved it, but it was something with a setting (like SetReplicatesByDefault, SetReplicates, or SetActiveByDefault).
The answer to your question in literal form would be: profiling.
The most helpful answer would be: tick interval. By changing the tick interval you can specify how often you do something.
I've created a component that sends the info from the client to the server using an unreliable RPC, with a tick interval of 1 sec (configurable to your needs).
The server then forwards that to the other clients using unreliable multicasts (or you can use property replication, which incurs pretty much the same cost but is less controllable without extra code).
Then the simulated proxies tween and predict on those values in their own ticks. Each tick interval is controlled with a significance plugin (I made my own, but there's also the one from Tom Looman on GitHub), so it's pretty efficient.
And with net relevancy correctly configured, you only do as much computation/network traffic as needed.
Replication for other things doesn't work much differently than that.
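The send-at-an-interval plus tween-on-the-proxy pattern described above can be sketched in plain C++ (IntervalSender and Tween are hypothetical names, not Unreal API; in the engine this would live in a component's Tick):

```cpp
#include <cassert>

// Accumulates delta time and signals when it is time to fire the RPC
// (hypothetical helper; mirrors setting a component's tick interval).
struct IntervalSender {
    float Interval;            // e.g. 0.2f seconds between RPCs
    float Accumulated = 0.0f;

    // Call every tick; returns true when the next send is due.
    bool Tick(float DeltaSeconds) {
        Accumulated += DeltaSeconds;
        if (Accumulated >= Interval) {
            Accumulated -= Interval;
            return true;
        }
        return false;
    }
};

// On the simulated proxy, tween between the last two received values each
// local tick so motion stays smooth between the sparse network updates.
float Tween(float Previous, float Latest, float Alpha) {
    return Previous + (Latest - Previous) * Alpha;   // plain lerp
}
```

Alpha would typically be the time since the last update divided by the send interval, clamped to [0, 1].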
Edit: well done on thinking about network (and maybe CPU) performance and keeping that in mind.
I have tested this over the last few days and it basically works. But an RPC call interval of 1 sec leads to laggy updates of the headset and hand transforms on the server and the clients.
In my proof-of-concept project the RPCs are currently called with an interval of 0.2 sec, and the transforms in between are interpolated. But all in all I think it's better to create a new VRPawn, one that is not visually displayed as a headset and disembodied hands - at least for the clients.
Yes, lag is a problem I haven't been able to solve. I think implementing prediction would help, but that's more complex.
Using a lower interval is also what I've been playing with.
I don't understand why creating a new VRPawn would be better, but good luck.
A continuous update of the headset rotation, for example, would not be needed at all. You could call the updating RPC outside of Tick, only when the player turns their head by a certain number of degrees, just like the 'turn in place' operation on a third-person character. You could do the same with the hand rotation, except perhaps while grabbing something.
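That threshold idea can be sketched in plain C++ (RotationGate is a hypothetical helper; the delta computation handles wraparound at 360°):

```cpp
#include <cassert>
#include <cmath>

// Shortest angular distance between two headings in degrees.
float AngularDeltaDeg(float A, float B) {
    float D = std::fmod(std::fabs(A - B), 360.0f);
    return D > 180.0f ? 360.0f - D : D;
}

// Fires only when the rotation changed enough to be worth an RPC,
// like a turn-in-place threshold; small jitter never leaves the client.
struct RotationGate {
    float ThresholdDeg;        // e.g. 10 degrees
    float LastSentYaw = 0.0f;

    bool ShouldSend(float CurrentYaw) {
        if (AngularDeltaDeg(CurrentYaw, LastSentYaw) >= ThresholdDeg) {
            LastSentYaw = CurrentYaw;  // record what was actually sent
            return true;
        }
        return false;
    }
};
```

The same gate works for hand rotation, with the threshold dropped (or bypassed) while the player is grabbing something.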
I also have something in mind about how to implement movement prediction; I will keep you updated as soon as I have put together something that works. For now I will continue making the default VR pawn work with multiplayer.
I see. For my needs the RPC is more than enough and simplifies the code, which is important in terms of development and maintenance.
But if you want to optimize, here are some ideas:
You can use property replication instead of an RPC. I know that some structs allow partial data transmission, which is lighter and more efficient on the network (I just don't remember exactly how at the moment; maybe that's the default behaviour). Or you could split the data into several properties.
If you go with property replication you could do a check, and if the difference is negligible, don't set the property (or don't send it to the server), effectively preventing replication. You'll still need an RPC from client to server to send the data, AFAIK.
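A minimal plain-C++ sketch of that negligible-delta check (ReplicatedLocation is a hypothetical stand-in for a replicated property; the write counter stands in for the replication traffic that a changed value would trigger):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float X, Y, Z; };

// Component-wise comparison within a tolerance.
bool NearlyEqual(const Vec3& A, const Vec3& B, float Tolerance) {
    return std::fabs(A.X - B.X) <= Tolerance &&
           std::fabs(A.Y - B.Y) <= Tolerance &&
           std::fabs(A.Z - B.Z) <= Tolerance;
}

// Stand-in for a replicated property: only writes (and thus would only
// replicate) when the new value differs meaningfully from the stored one.
struct ReplicatedLocation {
    Vec3 Value { 0, 0, 0 };
    int WritesTriggered = 0;   // proxy for network sends caused

    void Set(const Vec3& NewValue, float Tolerance) {
        if (NearlyEqual(Value, NewValue, Tolerance))
            return;            // unchanged within tolerance: skip replication
        Value = NewValue;
        ++WritesTriggered;
    }
};
```

The tolerance sets the trade-off: larger values save bandwidth but let the proxy drift further from the true pose before it corrects.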
As for prediction, please do let me know when you find a solution.