I want to develop a multiplayer VR application that uses pixel streaming to deliver the video and audio to users over the internet. If it's possible, how would I go about it?
Hi ZL1202, welcome to the forums,
Cool question, but I don’t think that pixel streaming would be able to provide the sort of latency required for comfortable VR experiences. (I’ve yet to see any VR pixel streaming OR Multiplayer pixel streaming apps.)
My thought is that if you change the direction you're looking in an HMD, having to wait for your new look direction to be sent to a server, and for that server to render the frame and send it back, would feel delayed/sluggish and cause nausea. Network hiccups might also be acceptable in a 2D app, but probably not in VR.
I'd love to be proven wrong though! Surely one day in the near or distant future we'll be feeding our AR/VR glasses with cloud-computed images.
Regarding VR applications and pixel streaming, I think it's possible this way:
- the server renders a full-dome (equirectangular) frame, like a 360° video on YouTube; this avoids the round-trip delay when you change the HMD's look direction
- the client uses WebXR (three.js) to display the video
- the connection needs to stream very large video (8K)
Both server and client will use more resources than in the 2D-app case.
Do you have any experience distributing your Unreal Engine scene to customers through cloud computing services like Azure?