I am essentially trying to play audio through a Pixel Streaming session. I want to manually play USoundWave assets from C++ through the streamer to the user’s browser.
However, whenever I package the game, the audio data disappears from USoundWave->RawPCMData. By “disappears” I mean it always has size 0.
How do I access the sound wave’s PCM data in a packaged build so I can play it?
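For reference, the kind of access I mean, which works in the editor but comes back empty once packaged (simplified sketch, not my exact code):

```cpp
#include "Sound/SoundWave.h"

void DumpRawPCM(USoundWave* SoundWave)
{
	if (!SoundWave)
	{
		return;
	}

	// In the editor this reports the decompressed PCM size, but in a packaged
	// build RawPCMDataSize comes back as 0 for me.
	UE_LOG(LogTemp, Log, TEXT("%s: RawPCMDataSize = %d"),
		*SoundWave->GetName(), SoundWave->RawPCMDataSize);

	if (SoundWave->RawPCMData != nullptr && SoundWave->RawPCMDataSize > 0)
	{
		// ... hand the PCM buffer to the Pixel Streaming audio path here ...
	}
}
```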
The most important thing to note about audio through Pixel Streaming is that all captured audio comes through the engine’s default audio submix.
This means that if you play audio through any mechanism that does not go through the default submix, the Pixel Stream will not be able to stream that audio.
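For example, if you already have PCM bytes in hand, one way to make sure they go through the default submix (and therefore get captured by Pixel Streaming) is to queue them into a USoundWaveProcedural and play it like any other sound. A rough sketch, assuming 16-bit interleaved PCM at 48 kHz stereo — adjust the format values to match your data:

```cpp
#include "Sound/SoundWaveProcedural.h"
#include "Kismet/GameplayStatics.h"

// Sketch: play raw 16-bit interleaved PCM through normal gameplay audio so it
// routes into the default submix that Pixel Streaming captures.
void PlayPCMThroughDefaultSubmix(UObject* WorldContext, const uint8* PCMData, int32 PCMDataSize)
{
	USoundWaveProcedural* ProceduralWave = NewObject<USoundWaveProcedural>();
	ProceduralWave->SetSampleRate(48000); // assumption: 48 kHz source data
	ProceduralWave->NumChannels = 2;      // assumption: stereo, interleaved

	// Queue the PCM bytes for playback.
	ProceduralWave->QueueAudio(PCMData, PCMDataSize);

	// Played as an ordinary 2D sound, so it ends up in the default submix.
	UGameplayStatics::PlaySound2D(WorldContext, ProceduralWave);
}
```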
This works really well, but the part I’m struggling with is the interpolation between the audio samples based on RPM. I essentially have an async task that reads the current RPM and, based on that, interpolates between these waves. However, I get a really weird static noise in between, I imagine because the buffer is not completely filled.
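Roughly what I mean by the interpolation, as a simplified sketch (the names and buffer layout are placeholders, not my actual code):

```cpp
// Blend one block of the "low RPM" loop with one block of the "high RPM" loop
// using an alpha derived from the current RPM, producing the block to queue.
void MixRpmBlock(const TArray<int16>& LowRpmBlock,
                 const TArray<int16>& HighRpmBlock,
                 float CurrentRpm, float LowRpm, float HighRpm,
                 TArray<int16>& OutMixed)
{
	const int32 NumSamples = FMath::Min(LowRpmBlock.Num(), HighRpmBlock.Num());
	const float Alpha = FMath::Clamp((CurrentRpm - LowRpm) / (HighRpm - LowRpm), 0.0f, 1.0f);

	// Equal-power gains avoid the level dip a plain linear blend produces.
	const float GainLow  = FMath::Cos(Alpha * HALF_PI);
	const float GainHigh = FMath::Sin(Alpha * HALF_PI);

	OutMixed.SetNumUninitialized(NumSamples);
	for (int32 i = 0; i < NumSamples; ++i)
	{
		const float Mixed = GainLow * (float)LowRpmBlock[i] + GainHigh * (float)HighRpmBlock[i];
		OutMixed[i] = (int16)FMath::Clamp(Mixed, -32768.0f, 32767.0f);
	}
}
```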
Just as an update to this, I’ve managed to solve some problems.
I’ve modified the Unreal Engine code to give me access to the number of samples in the buffer so I don’t overfill it, but the audio still degrades over time. By degraded I mean the amplitude drops and there are blips in it.
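The pacing I’m describing looks roughly like this (simplified sketch; if the samples go through a USoundWaveProcedural, its public GetAvailableAudioByteCount() exposes the queued amount without engine changes, and the ~100 ms target here is just a placeholder):

```cpp
#include "Sound/SoundWaveProcedural.h"

// Sketch: top the procedural wave's queue up to a target fill level instead of
// blindly appending, so the queue neither overflows nor runs dry.
void TopUpProceduralQueue(USoundWaveProcedural* ProceduralWave,
                          TFunctionRef<void(TArray<int16>&, int32 /*FramesNeeded*/)> GenerateFrames,
                          int32 SampleRate, int32 NumChannels)
{
	const int32 BytesPerFrame     = NumChannels * sizeof(int16);
	const int32 TargetQueuedBytes = (SampleRate / 10) * BytesPerFrame; // ~100 ms (arbitrary)

	// How much audio is still waiting to be consumed.
	const int32 QueuedBytes = ProceduralWave->GetAvailableAudioByteCount();
	if (QueuedBytes >= TargetQueuedBytes)
	{
		return; // already full enough, don't overfill
	}

	// Only generate the shortfall, e.g. via the RPM crossfade above.
	const int32 FramesNeeded = (TargetQueuedBytes - QueuedBytes) / BytesPerFrame;

	TArray<int16> MixedBlock;
	GenerateFrames(MixedBlock, FramesNeeded);

	ProceduralWave->QueueAudio(
		reinterpret_cast<const uint8*>(MixedBlock.GetData()),
		MixedBlock.Num() * sizeof(int16));
}
```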