Audio not spatial over PixelStreaming

Hi, I was able to set up the PixelStreaming plugin and stream audio/video both locally and over the internet, as per the guide available online. However, the audio is not spatialized when heard on the client: all the spatialized audio sources in the engine project sound unspatialized on the client device (browser). It basically sounds like the mix has collapsed to mono.

Is this because the audio frames rendered by the engine (and sent to the WebRTC proxy server) are themselves mono/not spatialized, or is it because the browser/WebRTC on the client end has some setting or limitation that outputs only mono audio?


Never mind, figured it out. The default encoding used by the WebRTC Proxy Server is Opus, and the default offer SDP negotiates mono. Updating the offer SDP to include `stereo=1;sprop-stereo=1` worked.
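For anyone hitting the same thing, here's a minimal sketch of the kind of SDP "munging" that does this. The function name and regex details are my own; the idea is just to find the payload type the offer assigns to Opus and append the `stereo`/`sprop-stereo` parameters (per RFC 7587) to the matching `a=fmtp:` line before the offer is applied:

```typescript
// Sketch: request stereo Opus by editing the offer SDP.
// Assumes the SDP contains an a=rtpmap line for opus/48000/2 and a
// matching a=fmtp line for the same dynamic payload type.
function enableStereoOpus(sdp: string): string {
  // Find the payload type assigned to Opus, e.g. "a=rtpmap:111 opus/48000/2"
  const match = sdp.match(/a=rtpmap:(\d+) opus\/48000\/2/i);
  if (!match) return sdp; // no Opus in the offer; leave SDP untouched
  const pt = match[1];
  // Append the stereo parameters to the existing fmtp line for that payload type
  const fmtp = new RegExp(`(a=fmtp:${pt} [^\r\n]*)`);
  return sdp.replace(fmtp, "$1;stereo=1;sprop-stereo=1");
}
```

In the browser client you'd run the offer through this before handing it to `RTCPeerConnection.setLocalDescription()` (or the remote description, depending on which side generates the offer in your setup).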

Nice one! I'm gonna look into making some PixelStreaming audio collaboration thing.