Metasounds - Subtitles don't randomise when using "random get" node

Hi, hopefully the repro steps explain what’s going on, but within Metasounds is there a way to get the index of a wave asset after shuffling them with “Random Get” and applying that to the subtitle index in the Metasound settings?

We have our dialogue system which takes a Metasound, plays the audio, and displays the subtitle information from that Metasound - but at the moment the subtitles only play whatever is in index 0, and don’t reflect the randomisation of the wave assets themselves. How would we do this currently?

Steps to Reproduce
Set up a MetaSound with the “Random Get” (WaveAsset:Array) node and plug in different sound waves so that it can randomly select between them.

Right-click your MetaSound in the Content Browser and add your subtitle information via the Property Matrix editor.

Ensure the subtitle indexes match the sound wave indexes in your MetaSound input.

Now trigger your MetaSound: the wave is randomised, but the subtitles are not; the subtitle at index 0 plays every time.

Hi Matthew,

The intention was for MetaSounds not to support subtitles currently. We blocked out the subtitle fields in the MetaSound Editor, but it looks as though the Property Matrix editor erroneously exposes them again. Philosophically, we try to keep MetaSound graphs focused on audio rendering needs rather than the needs of other systems.

I suggest you go the route of SoundCues if subtitles are an important aspect of your project.

There are alternative ways to make MetaSounds work, but they would require some custom setup. Roughly, it would involve:

In the metasound

  1. Recreate the random wave node:
    1. Use a random int node that chooses an int between 0 and the “number of sounds” minus 1.
    2. Use the resulting int to select an entry from the array of sounds.
  2. Create a MetaSound output of type int and route the chosen index to that output.

In a BP

  1. Use the UMetasoundGeneratorHandle to get the Output int value from the metasound
  2. Use that int value in BP to trigger choosing of subtitles.
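Taken together, the two halves above boil down to “pick one index, and let that one index drive both the wave and the subtitle.” Here is a minimal sketch of that logic in plain C++; the names are illustrative stand-ins, not engine API (in practice the random pick is a MetaSound node and the BP side reads the int output via UMetasoundGeneratorHandle):

```cpp
// Hypothetical sketch of the suggested flow. The MetaSound graph picks a
// random index and exposes it on an int output; the game side reads that
// index and uses it to select the subtitle with the same index.
#include <cstdlib>
#include <string>
#include <vector>

// Stands in for the MetaSound's "Random Int" node: an int in [0, NumSounds - 1].
int PickRandomIndex(int NumSounds)
{
    return std::rand() % NumSounds;
}

// Stands in for the BP step: the index read from the MetaSound output
// selects the subtitle matching the chosen wave.
std::string SelectSubtitle(const std::vector<std::string>& Subtitles, int WaveIndex)
{
    if (WaveIndex < 0 || WaveIndex >= static_cast<int>(Subtitles.size()))
        return {}; // out-of-range index: show nothing rather than the wrong line
    return Subtitles[static_cast<size_t>(WaveIndex)];
}
```

Because the subtitle lookup is driven by the same int the graph used to choose the wave, the two can no longer drift apart the way the stock subtitle handling does.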

Another way to do this is to put the random wave selection in BP instead of within the MetaSound.
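That alternative can be sketched the same way: roll the index once in game code, then use it for both the wave you hand to the audio side and the subtitle you display. Again, these names are purely illustrative, and the sketch assumes the wave and subtitle arrays are kept in matching order:

```cpp
// Hypothetical sketch of the BP-side alternative: the game code, not the
// MetaSound, chooses the index, so audio and subtitle always agree.
#include <random>
#include <string>
#include <vector>

struct DialogueChoice
{
    std::string Wave;     // stands in for the wave asset to play
    std::string Subtitle; // the matching subtitle line
};

// Pick one index and use it for both arrays, so they can never diverge.
DialogueChoice ChooseLine(const std::vector<std::string>& Waves,
                          const std::vector<std::string>& Subtitles,
                          std::mt19937& Rng)
{
    std::uniform_int_distribution<size_t> Dist(0, Waves.size() - 1);
    const size_t Index = Dist(Rng);
    return {Waves[Index], Subtitles[Index]};
}
```

The trade-off is that the MetaSound then just plays whichever wave it is given, so any per-wave graph behaviour has to be parameterised from the outside.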

It’s a lot simpler to use SoundCues in this case, but there certainly is a path in MetaSounds.

Yeah, there are some rules of thumb we use for determining what to use. MetaSounds are really about audio rendering, and there are a lot of fast, dynamic calculations happening in the MetaSound graph. Basically, we try not to have any logic within the MetaSound affect the game. We carve out an exception for letting MetaSounds drive visuals, but beyond that we try not to have them driving outside systems. It’s mostly a performance and complexity concern.

Right now, it’s a little messy to choose between USoundCue and UMetaSoundSource. You can play a MetaSound from a SoundCue. But we’ve certainly felt the fuzziness of this issue and are working on a replacement for SoundCues: a new asset that is clearly about event chaining and rendering multiple sounds at once. It’s in its nascent stage, but hopefully it will make it simpler to implement what you need in the future.

Hi Phil,

Apologies for the delay, I was on AL last week so only saw this now.

Thanks for your reply, interesting to hear that this was an intentional step. So it sounds like SoundCues is going to be my best option, in which case I’m happy to do that. It will take a little tinkering to apply the desired FX, as I had that set up as a Metasound patch for ease, but shouldn’t be difficult to replicate.

Out of interest, is the intended application from Epic for people to use a combination of MetaSounds alongside SoundCues going forward? I viewed MetaSounds as the new and improved option for basically everything and, at the moment, every sound in our game is a MetaSound of some kind - but it sounds like there are certain things you’ve earmarked them not to be used for? Are there any other aspects, aside from subtitles, that Epic wouldn’t really recommend using MetaSounds for? Just so I can make sure I’m following the intended usage as much as I can!

Gotcha, thanks for that! It sounds like I’m using them as intended then. I’m not really using Metasounds to drive any of the game (and the only thing I would consider would be visuals, as you say), so seems like we’re in a fine place for our project. I have multiple Metasounds set up to take in game parameters and drive the Audio from within the Metasound, but that seems like it’s well within the intended usage. I’ll keep an eye on updates and any future documentation in terms of USoundCue vs UMetaSoundSource, and if that workflow gets more clarification in future.

Thanks for your help Phil! Appreciate it.