Why does the MetaSound OneShot interface warn about looping? (UE 5.1)

Hi,

When using the UE.Source.OneShot interface, the On Nearly Finished output says you should remove this interface for looping sounds to avoid leaking the source. What does this mean? The sound still loops if you tick Loop on the Wave Player, even with the On Nearly Finished node connected. I don't understand the problem with using this interface for looping sounds…

Thank you


Sorry for the two-year-late reply, but in case anybody else shows up here from a Google search, I'll explain the issue. I understand it's a bit confusing to people new to game audio.

The issue with looping (indefinite-duration) sounds versus one-shots in MetaSounds is analogous to the problem in particle systems (e.g., Niagara). A MetaSound is fundamentally a procedural asset, so there is no duration you can automatically derive from it; the same is true of particle systems. You could attempt a guess based on the MetaSound's topology (i.e., by inspecting the graph), but technically speaking, there's no way to do that that is mathematically "correct". The issue compounds when you consider that MetaSounds have an open, extensible node API that any plugin can add to: you can't make assumptions about intent or functionality by inspecting existing nodes, since a future node may play sounds in entirely different ways.

Even if you were to infer a duration from specific, existing default nodes (e.g., MetaSound Wave Players finishing), you would still be making significant assumptions about the intended use of the MetaSound. For example, a sound designer may intend for the MetaSound to persist even when it's not actively playing sound, because it has a series of input triggers it's waiting for the game to send before making anything audible. And we certainly can't inspect all of a project's gameplay code (C++ and Blueprint) to work out what system somebody built around their MetaSound and what the intended behavior is.
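To make that "waiting for game triggers" case concrete, here's a minimal C++ sketch, not anything from the engine docs: the class, member, and parameter names (AMyNoisyActor, AmbientMetaSound, "PlayImpact") are made up for illustration, and it assumes your MetaSound graph actually exposes a trigger input with that name. The sound is spawned once with a handle and then sits there, silent, until the game pokes it:

```cpp
// Sketch only: assumes an actor class with a USoundBase* AmbientMetaSound asset
// assigned in the editor and a UAudioComponent* AmbientAudio member for the handle.
// "PlayImpact" is a hypothetical trigger input that must exist on the MetaSound graph.
#include "Components/AudioComponent.h"
#include "Kismet/GameplayStatics.h"

void AMyNoisyActor::BeginPlay()
{
    Super::BeginPlay();

    // Spawn the MetaSound and keep the returned handle; it keeps rendering
    // (silently, in this design) until something explicitly stops it.
    AmbientAudio = UGameplayStatics::SpawnSound2D(this, AmbientMetaSound);
}

void AMyNoisyActor::OnImpact()
{
    if (AmbientAudio && AmbientAudio->IsPlaying())
    {
        // Fire the MetaSound's trigger input so the waiting graph produces audio.
        AmbientAudio->SetTriggerParameter(TEXT("PlayImpact"));
    }
}
```

Nothing about this graph tells the engine when (or whether) it will ever finish, which is exactly why a duration can't be inferred for you.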

Because of that, when your prediction of the asset's duration is wrong (which is likely), it will be extremely confusing when your sound suddenly stops: even more confusing than this One-Shot interface.

So, to help with this issue, the One-Shot interface is a default (but still optional) interface that MetaSound Source assets add, which makes it a bit harder to create a "stuck" sound when you play one. It's a helping hand to remind you to think about one-shot vs. non-one-shot (looping) behavior.

To illustrate a stuck (or leaked) sound more concretely: if you were to play a MetaSound Source without ever indicating to the audio renderer that it's done playing, and without a handle to the playing sound, the sound would literally continue rendering for as long as the application is alive. This is exactly the same problem you had with "looping" sounds before MetaSounds (i.e., with SoundCue or SoundWave playback): to play a looping sound, you need an audio component.

That's because an audio component is a handle to the sound: it automatically stops the sound when the owning actor is destroyed, and because it exposes a BP/C++ API that lets you control the sound (e.g., stop it!), you can also clean the sound up whenever you want.
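Here's a hedged C++ sketch of the handle idea, using the same made-up actor and member names as above. UGameplayStatics::SpawnSound2D returns a UAudioComponent you can store and stop later, whereas a fire-and-forget call like UGameplayStatics::PlaySound2D returns nothing you could use to stop the sound:

```cpp
#include "Components/AudioComponent.h"
#include "Kismet/GameplayStatics.h"

void AMyNoisyActor::StartLoopingSound()
{
    // Keep the handle so this looping / indefinite MetaSound can be stopped later.
    // A fire-and-forget call (e.g. UGameplayStatics::PlaySound2D) would give us no
    // handle, which is exactly the "stuck sound" scenario described above.
    LoopingAudio = UGameplayStatics::SpawnSound2D(this, LoopingMetaSound);
}

void AMyNoisyActor::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
    // Clean up explicitly via the handle when this actor goes away.
    if (LoopingAudio)
    {
        LoopingAudio->Stop();
        LoopingAudio = nullptr;
    }

    Super::EndPlay(EndPlayReason);
}
```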

SoundCues and SoundWaves have an advantage over MetaSounds in this ONE regard: they are not fundamentally procedural. They are built on the core concept of sound-file playback, which, by default, ends. You then optionally add "looping" in various ways, which indicates that the sound will repeat when it finishes. SoundCue and SoundWave have default one-shot behavior, and there is no way to generate truly deep procedural (i.e., synthetic) audio with them. This, notably, is one of the key reasons SoundCues could not evolve into something like MetaSounds: they are fundamentally about playing sound files in varied recombinations, not about procedural generation. In graphics, the analog would be a strange tool that randomly picks textures, cross-fades them (alpha blending), and maybe applies different color tints, as opposed to a material/shader graph.

So, back to the OneShot interface. It's designed to warn you in the MetaSound editor if you do not hook up the SoundDone output trigger. There are also warnings logged when you use various BP functions without this interface: if you play a MetaSound that does NOT implement the OneShot interface through a function that doesn't return an audio component handle (e.g., PlaySound2D), it will warn you that the sound will get stuck unless you implement the OneShot interface.
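As a rough rule of thumb in C++ (again with made-up names): reserve fire-and-forget calls for MetaSounds that actually implement the OneShot interface and end themselves, and use the handle-returning spawn functions (as in the earlier snippet) for anything indefinite:

```cpp
#include "Kismet/GameplayStatics.h"

void AMyNoisyActor::PlayImpactOneShot()
{
    // Fire-and-forget is only appropriate for a MetaSound that implements the
    // OneShot interface and hits its finished trigger on its own. For an
    // indefinite/looping MetaSound, spawn it and keep the handle instead;
    // per the explanation above, PlaySound2D would log a warning in that case.
    UGameplayStatics::PlaySound2D(this, ImpactOneShotMetaSound);
}
```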
