Hi, I was able to set up the PixelStreaming plug-in and stream audio/video both locally and over the internet, following the guide available online. However, the audio is not spatialized when heard on the client; i.e., all the spatialized audio sources in the engine project sound non-spatialized on the client device (browser). It basically sounds like the mix has collapsed to mono.
Is this because the pre-rendered audio frames the engine sends to the WebRTC proxy server are themselves mono/not spatialized, or because the browser/WebRTC stack on the client end has some setting or limitation that outputs only mono audio?
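In case it's relevant to the second possibility: my understanding is that WebRTC's Opus audio negotiation defaults to mono unless the client's SDP explicitly requests stereo via the `stereo=1` / `sprop-stereo=1` fmtp parameters (per the Opus RTP payload format, RFC 7587). Here is a sketch of the kind of SDP munge one could try on the browser side before setting the local description; the function name is mine, not from the Pixel Streaming player code:

```javascript
// Sketch (assumption: audio is carried as Opus at payload-dependent PT).
// Rewrites an SDP string so the browser advertises stereo decode support,
// which may stop WebRTC from downmixing the stream to mono.
function enableOpusStereo(sdp) {
  // Find the payload type assigned to Opus in the rtpmap line.
  const match = sdp.match(/a=rtpmap:(\d+) opus\/48000\/2/);
  if (!match) {
    return sdp; // no Opus codec in this SDP; leave it untouched
  }
  const pt = match[1];
  // Append the stereo flags to the matching fmtp line for that payload type.
  const fmtpLine = new RegExp(`(a=fmtp:${pt} [^\r\n]*)`);
  return sdp.replace(fmtpLine, '$1;stereo=1;sprop-stereo=1');
}
```

I haven't confirmed this is the actual cause, but it would at least rule out (or confirm) the browser-side mono limitation.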