Why is MetaSounds so "awkward" compared to Blueprint?

Hello,

So, I tried to do some very simple things with MetaSounds, but I cannot get it to work the way I am used to when building the same logic in Blueprint.

I simply do not understand why everything has a different name, why there are limitations that don't exist in Blueprint (you can't route a “play” trigger into something that came before, even if it wouldn't create a loop, etc.), why there is no obvious “order of execution,” and other things.

It is also confusing that in delay nodes, the float-like input is called “Time” and has a color closer to the integer color, while regular floats still exist but are used for things like “percent” values instead, and then there is an actual integer type on top of that.
So we technically have two float-like types, in two different colors.

I don't know, I think the color scheme needs some work. (Yellow/orange etc. are still in use, and vectors or transforms aren't something you would expect to appear there anyway.)

Obviously there is also the fact that an “exec” (trigger) can now work as a sequence, but without an actual Sequence node, everything just happens at the same time?
On the other end, I can't feed multiple “execs” into one node without using “Trigger Any,” but those are hardcoded to a fixed number of inputs, so I constantly have to swap them out instead of just having an “Add pin” option like Blueprint's Sequence node has. (This is also completely inverse to how Blueprint handles it.)

It feels like I have zero control over what is going on, despite knowing exactly what I want and being able to recreate the same thing very quickly in a normal Blueprint.
I don't know, MetaSounds “breaks” every rule that Blueprint has taught us over the years, when it doesn't really need to.

==============
Is there a specific reason I am missing for why MetaSounds is so awkwardly different from Blueprint, when it more or less does the same thing?

The short answer is that MetaSounds are not Blueprints and thus do not behave the same way, similar to how a Material graph is not the same as a Blueprint.

MetaSounds are directed acyclic graphs (DAGs), not execution graphs, meaning that everything in a MetaSound happens within the same render block and there is no “order of execution.” This is simply the nature of DAGs.
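
If it helps to see that difference in code form, here is a minimal conceptual sketch (plain C++ with made-up names, not the actual MetaSounds API) of how a DAG renderer works: the nodes are topologically sorted once when the graph is built, and then every node processes a full block of samples on every render callback, so there is no control flow to reorder at runtime.

```cpp
// Conceptual sketch only -- hypothetical types, not the real MetaSounds API.
#include <vector>

constexpr int BlockSize = 256; // samples rendered per callback (assumed value)

struct FNode
{
    std::vector<float> Output = std::vector<float>(BlockSize, 0.0f);

    // Each node runs exactly once per render block, after all of its inputs.
    virtual void Process() = 0;
    virtual ~FNode() = default;
};

struct FGraph
{
    // Topological order, fixed when the graph is built; a cycle would make
    // this sort impossible, which is why routing a pin "back" is disallowed.
    std::vector<FNode*> SortedNodes;

    // Called once per audio render block. Unlike a Blueprint exec chain,
    // *every* node runs each block; "when" something happens is carried as
    // data flowing through the graph, not as control flow between nodes.
    void RenderBlock()
    {
        for (FNode* Node : SortedNodes)
        {
            Node->Process();
        }
    }
};
```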

A trigger is not an execution pin: triggers are marked samples in the audio buffer, and they cause other things to fire on that exact sample. This is one of the main benefits of the DAG, and of MetaSounds in general: sample-accurate precision for triggers.
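
In the same sketch-level spirit (again hypothetical names, not the real API), you can picture a trigger as a list of sample offsets inside the current block, with a downstream node reacting on exactly those samples:

```cpp
// Conceptual sketch only -- hypothetical types, not the real MetaSounds API.
#include <vector>

// A "trigger" is not control flow: it is data marking which samples
// (frames) within the current render block the event fires on.
struct FTrigger
{
    std::vector<int> TriggeredFrames; // offsets into the current block
};

struct FGateNode
{
    const FTrigger* Play = nullptr; // upstream trigger input
    std::vector<float> Output;

    void Process(int NumFrames)
    {
        Output.assign(NumFrames, 0.0f);
        if (!Play) return;
        for (int Frame : Play->TriggeredFrames)
        {
            // React on the exact sample the trigger landed on, which is
            // what makes triggers sample-accurate rather than quantized
            // to the game tick or to the start of a block.
            Output[Frame] = 1.0f;
        }
    }
};
```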

I agree there are some pain points with MetaSounds still being relatively new, but I’m rather optimistic about the UE audio roadmap for the future.


MetaSounds are like a Material for audio, rather than like a Blueprint procedural graph.

If you have worked with other audio software, then MetaSounds are more like a tool such as Reaktor, and less like a language such as Csound.

And if you haven’t played with Reaktor and are into sound design, it’s totally worth it to download a demo and give it a spin, or at least watch a few YouTube videos :) Just beware that the company making it has had trouble porting it to Apple ARM hardware; it’s x64-only, which of course is currently not a problem on Windows.
