Route audio stream from UMediaPlayer to external audio system

I’m looking for a way to redirect the audio stream from a media player to an external audio system (Wwise).

We are making a game for low-end mobile platforms (700 MB of RAM), and to reduce memory consumption we have disabled the AudioMixer and MetaSounds in our project (AudioMixerModuleName is set to empty for all platforms and the MetaSound plugin is disabled). The whole project uses Wwise, except for the built-in UMediaPlayer + UMediaSoundComponent, which now plays back without audio.

Because this is the only place in the project where the native UE audio system is used, I would like to try to manually capture the audio stream and feed it into Wwise, if that is not too hard to implement. Wwise provides the AkAudioInputComponent component (documentation), which can receive audio from an external source, but I don’t fully understand how to properly collect audio samples from the media player.

I spent some time investigating how UMediaPlayer works and how the audio stream can be obtained; this is what I was able to find out:

  1. UMediaPlayer owns an FMediaPlayerFacade. The facade is responsible for playing the media file; when the player is started, it copies audio samples from the file into an FMediaAudioSampleQueue on a worker thread.
  2. The FMediaAudioSampleQueue is created in UMediaSoundComponent, which registers the queue with the FMediaPlayerFacade. The queue is then handed to FMediaSoundGenerator, and it looks like this generator is what the UE audio system uses under the hood to obtain the audio stream.
  3. The only method I found for obtaining audio samples from the generator is FMediaSoundGenerator::GetNextBuffer, but I don’t understand how to take the audio buffer and split it between the different audio channels (a rough sketch of what I mean is below this list).
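
To make point 3 above concrete, this is the kind of thing I imagine doing with one sample dequeued from the queue (using the Media and MediaUtils modules). It is only a sketch: I’m assuming the sample data arrives as interleaved 32-bit float (EMediaAudioSampleFormat::Float) and that I can dequeue from my own FMediaAudioSampleQueue instead of letting UMediaSoundComponent consume it.

```cpp
// Sketch only: dequeue one sample from an FMediaAudioSampleQueue and split the
// interleaved buffer into two per-channel arrays. Assumes the sample format is
// interleaved 32-bit float; other formats (Int16, etc.) would need conversion.
#include "CoreMinimal.h"
#include "IMediaAudioSample.h"
#include "MediaSampleQueue.h"

void DrainOneStereoSample(FMediaAudioSampleQueue& SampleQueue, TArray<float>& OutLeft, TArray<float>& OutRight)
{
	TSharedPtr<IMediaAudioSample, ESPMode::ThreadSafe> Sample;
	if (!SampleQueue.Dequeue(Sample) || !Sample.IsValid())
	{
		return; // nothing queued yet
	}

	if (Sample->GetFormat() != EMediaAudioSampleFormat::Float)
	{
		return; // this sketch only handles float samples
	}

	const float* Interleaved = static_cast<const float*>(Sample->GetBuffer());
	const uint32 NumChannels = Sample->GetChannels(); // expect 2 for stereo
	const uint32 NumFrames = Sample->GetFrames();

	OutLeft.Reset(NumFrames);
	OutRight.Reset(NumFrames);

	// Interleaved layout is [L0, R0, L1, R1, ...], one pair per frame.
	for (uint32 Frame = 0; Frame < NumFrames; ++Frame)
	{
		OutLeft.Add(Interleaved[Frame * NumChannels]);
		OutRight.Add(NumChannels > 1 ? Interleaved[Frame * NumChannels + 1] : Interleaved[Frame * NumChannels]);
	}
}
```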

Based on the above, I have the following questions:

  1. Could you recommend the best way to retrieve audio buffers from a media player for stereo sound?
  2. Do I need to process the buffers somehow before sending them to Wwise?
  3. Is there anything specific I should pay attention to in this solution, for example to ensure synchronization between audio and video?
  4. Do you see any potential pitfalls in the approach I’ve chosen to solve this problem? In particular, I’m interested in whether there are any planned major changes to the player’s architecture that could require rewriting the component or significantly altering the current approach in the future.

UE’s native audio system is generally responsible for consuming audio from the `FMediaSoundGenerator` if you are playing the media through a UMediaSoundComponent.

The most well-supported route here is likely AudioLink, an API offered by UE that lets third-party tools access audio being rendered in UE.

https://dev.epicgames.com/documentation/en-us/unreal-engine/audiolink?application_version=5.3

I believe Wwise has an up-to-date AudioLink implementation, but you’d have to check with them.

With this approach, there may be some additional latency introduced in the Wwise AudioLink layer as the two systems communicate, though folks have generally found it is still within the range of perceptual acceptability. If you find that the audio significantly trails the video, I suggest experimenting with the audio buffer sizes in UE and Wwise to minimize latency.

If AudioLink doesn’t work, I would suggest adding your own `FMediaAudioSampleSink` to the `FMediaPlayerFacade` and accessing the audio that way.
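
As a very rough illustration of that second route, something like the following might work, combining a custom sample sink with Wwise’s AkAudioInputComponent. I’m assuming the `UMediaPlayer::GetPlayerFacade` / `FMediaPlayerFacade::AddAudioSampleSink` APIs and the `FillSamplesBuffer` / `GetChannelConfig` overrides described in Wwise’s AkAudioInputComponent documentation; class and function names such as `UMediaToWwiseComponent` and `BindToMediaPlayer` are placeholders, and exact signatures differ between engine and Wwise versions, so verify against your integration.

```cpp
// Rough sketch: feed UMediaPlayer audio into Wwise through a custom audio-input component.
// Names (UMediaToWwiseComponent, BindToMediaPlayer) are placeholders; exact Wwise/UE
// signatures vary by version, so treat this as a starting point only.
#include "AkAudioInputComponent.h"
#include "MediaPlayer.h"
#include "MediaPlayerFacade.h"
#include "MediaSampleQueue.h"
#include "MediaToWwiseComponent.generated.h"

UCLASS(ClassGroup = Audio, meta = (BlueprintSpawnableComponent))
class UMediaToWwiseComponent : public UAkAudioInputComponent
{
	GENERATED_BODY()

public:
	void BindToMediaPlayer(UMediaPlayer& MediaPlayer)
	{
		// Register our own sample sink alongside (or instead of) UMediaSoundComponent's queue.
		SampleQueue = MakeShared<FMediaAudioSampleQueue, ESPMode::ThreadSafe>();
		MediaPlayer.GetPlayerFacade()->AddAudioSampleSink(SampleQueue.ToSharedRef());

		// The component's associated AkEvent must use the Wwise Audio Input source plug-in.
		PostAssociatedAudioInputEvent();
	}

protected:
	// Wwise pulls deinterleaved float buffers from us, one pointer per channel.
	virtual bool FillSamplesBuffer(uint32 NumChannels, uint32 NumSamples, float** BufferToFill) override
	{
		// Dequeue pending IMediaAudioSample data from SampleQueue, convert/deinterleave it
		// (see the earlier per-channel sketch), and write into BufferToFill[Channel][Sample].
		// Returning false stops the Wwise voice.
		return true;
	}

	virtual void GetChannelConfig(AkAudioFormat& AudioFormat) override
	{
		// Should match the media's format; hard-coded stereo at 48 kHz here for illustration.
		AudioFormat.uSampleRate = 48000;
		AudioFormat.channelConfig.SetStandard(AK_SPEAKER_SETUP_STEREO);
	}

private:
	TSharedPtr<FMediaAudioSampleQueue, ESPMode::ThreadSafe> SampleQueue;
};
```

Keep in mind that `FillSamplesBuffer` is likely called from an audio thread rather than the game thread, so keep the hand-off from the media sample queue thread-safe.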