Playing SoundWave directly through pixel stream

Hi Guys,

Essentially, I am trying to play audio through a Pixel Stream: I want to manually play USoundWave assets from C++ through the streamer to the user’s browser.
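Roughly what I am attempting looks like this (a simplified sketch; PushAudioToStreamer is just a placeholder for however the samples get handed to the Pixel Streaming audio input, not a real engine function):

```cpp
#include "Sound/SoundWave.h"

// Placeholder for whatever actually feeds PCM into the Pixel Streaming
// audio input -- not a real engine API.
void PushAudioToStreamer(const int16* Samples, int32 NumSamples, int32 NumChannels, float SampleRate);

void PlayWaveThroughStream(USoundWave* Wave)
{
    if (!Wave)
    {
        return;
    }

    // In the editor this buffer is populated; the problem described below is
    // that in a packaged build RawPCMDataSize comes back as 0.
    const uint8* Pcm = Wave->RawPCMData;
    const int32 PcmSize = Wave->RawPCMDataSize;

    if (Pcm && PcmSize > 0)
    {
        // 16-bit signed, interleaved PCM.
        const int16* Samples = reinterpret_cast<const int16*>(Pcm);
        const int32 NumSamples = PcmSize / sizeof(int16);

        PushAudioToStreamer(Samples, NumSamples, Wave->NumChannels, Wave->GetSampleRateForCurrentPlatform());
    }
}
```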

However, whenever I package the game, the audio data disappears from USoundWave->RawPCMData. By “disappears” I mean the buffer always comes back with size 0.

How do I access the data in a packaged build from the sound wave to play?

Hi there!

The most important thing to note when it comes to audio through Pixel Streaming is that all audio is captured through the engine’s default audio submix.

This means if you’re playing audio through any other mechanism that does not go through the default submix, the Pixel Stream will not be able to stream that audio.
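For example, anything played through the normal gameplay audio path ends up in that submix and so gets captured (a minimal sketch):

```cpp
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundWave.h"

void PlayForStream(UObject* WorldContext, USoundWave* Wave)
{
    // PlaySound2D routes through the engine's audio mixer and its default
    // submix, so Pixel Streaming's audio capture will pick it up.
    UGameplayStatics::PlaySound2D(WorldContext, Wave);
}
```

Anything that writes samples to a device or third-party audio library directly, bypassing that submix, will be silent on the stream.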

It is possible, and I am doing it currently. This is my setup:

I have a camera class in C++ that creates render targets, so multiple cameras can all be streamed out from inside Unreal Engine.

On top of that, I have 16 audio files covering 1000-8000 RPM at 1000 RPM intervals, each recorded at either 0 or 100% load. I am using this to send audio to the Pixel Stream: Pixel Streaming Audio Input | Unreal Engine 5.4 Documentation | Epic Developer Community

This works really well, but the part I’m struggling with is the interpolation between the audio samples based on RPM. I have an async task that reads the current RPM and interpolates between these waves accordingly. However, I get a really weird static noise in between, I imagine because the buffer is not completely filled.
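For reference, the shape of the blend I have in mind is roughly this (a simplified sketch, not my actual code; it assumes 16-bit mono buffers and ignores pitch/resampling between the RPM points):

```cpp
#include "Containers/Array.h"
#include "Math/UnrealMathUtility.h"

struct FRpmBlendState
{
    int32 ReadPosLow = 0;   // persistent read cursors -- resetting these on
    int32 ReadPosHigh = 0;  // every RPM update is one source of clicks/static
};

// Crossfades between the two neighbouring RPM recordings and always emits a
// completely filled buffer, which seems to matter for avoiding the static.
void FillBlendedBuffer(
    const TArray<int16>& LowWave,   // e.g. the 3000 RPM sample
    const TArray<int16>& HighWave,  // e.g. the 4000 RPM sample
    float Alpha,                    // 0..1 position between the two RPM points
    FRpmBlendState& State,
    TArray<int16>& OutBuffer,
    int32 NumSamplesToFill)
{
    if (LowWave.Num() == 0 || HighWave.Num() == 0)
    {
        return;
    }

    OutBuffer.SetNumUninitialized(NumSamplesToFill);

    for (int32 i = 0; i < NumSamplesToFill; ++i)
    {
        const float Low  = static_cast<float>(LowWave[State.ReadPosLow]);
        const float High = static_cast<float>(HighWave[State.ReadPosHigh]);

        // Linear crossfade between the two recordings.
        const float Mixed = FMath::Lerp(Low, High, Alpha);
        OutBuffer[i] = static_cast<int16>(FMath::Clamp(Mixed, -32768.0f, 32767.0f));

        // Loop each source wave independently so neither cursor runs past
        // the end of its buffer.
        State.ReadPosLow  = (State.ReadPosLow  + 1) % LowWave.Num();
        State.ReadPosHigh = (State.ReadPosHigh + 1) % HighWave.Num();
    }
}
```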

Is there a better way to be doing this?

Furthermore

I just tried sending out a 440 Hz tone, and this is the sound received back. I send out new data every 10 ms.
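For context, the tone generator is roughly this (a sketch; the key detail is that the sine phase persists across the 10 ms packets, since regenerating from phase 0 every packet puts a discontinuity at each buffer boundary):

```cpp
#include "Containers/Array.h"
#include "Math/UnrealMathUtility.h"

class FTestToneGenerator
{
public:
    FTestToneGenerator(float InFrequency, int32 InSampleRate)
        : Frequency(InFrequency), SampleRate(InSampleRate) {}

    // Fills one packet (default 10 ms) of mono 16-bit samples.
    void Generate(TArray<int16>& OutBuffer, float PacketSeconds = 0.01f)
    {
        const int32 NumSamples = FMath::RoundToInt(SampleRate * PacketSeconds);
        OutBuffer.SetNumUninitialized(NumSamples);

        for (int32 i = 0; i < NumSamples; ++i)
        {
            OutBuffer[i] = static_cast<int16>(32767.0f * FMath::Sin(Phase));
            Phase += 2.0f * PI * Frequency / static_cast<float>(SampleRate);
        }

        // Wrap the accumulator so it does not lose float precision over time.
        Phase = FMath::Fmod(Phase, 2.0f * PI);
    }

private:
    float Frequency = 440.0f;
    int32 SampleRate = 48000;
    float Phase = 0.0f; // carried across packets
};
```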

Just as an update to this, I’ve managed to solve some of the problems.
I’ve modified the Unreal Engine code to give me access to the number of samples in the buffer so I don’t overfill it, but the audio still becomes degraded over time. By degraded I mean the amplitude drops and there are blips in it.
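The pacing I’m experimenting with now looks roughly like this (a sketch, not my exact code): instead of pushing a fixed-size packet every 10 ms timer tick, produce exactly as many samples as wall-clock time says have elapsed, so the streamer’s buffer neither overfills nor runs dry:

```cpp
#include "HAL/PlatformTime.h"

// Decides how many samples the async task should generate on each iteration,
// based on real elapsed time rather than an assumed 10 ms per tick.
class FPacedAudioPusher
{
public:
    explicit FPacedAudioPusher(int32 InSampleRate)
        : SampleRate(InSampleRate), LastPushSeconds(FPlatformTime::Seconds()) {}

    int32 NumSamplesToProduceNow()
    {
        const double Now = FPlatformTime::Seconds();
        const double Elapsed = Now - LastPushSeconds;
        LastPushSeconds = Now;

        // Carry the fractional remainder forward so production does not slowly
        // drift against the consumer (drift shows up as underruns/blips).
        SampleDebt += Elapsed * SampleRate;
        const int32 WholeSamples = static_cast<int32>(SampleDebt);
        SampleDebt -= WholeSamples;
        return WholeSamples;
    }

private:
    int32 SampleRate = 48000;
    double LastPushSeconds = 0.0;
    double SampleDebt = 0.0;
};
```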