Setting a MetaSounds Variable

Finally got a MetaSoundSource working, and the first thing I needed to do was programmatically set parameters of the MetaSound. I started with a simple proof of concept to figure out what I was doing: Start, Stop, change gain, change waves, and set variables from a Blueprint. I found examples from 5.0EA, but they no longer seem to apply because the Blueprint API has changed.

I was able to accomplish some of it: I can set Input gains, and I can Start and Stop the audio. “Execute Trigger Parameter” and “Set Float Parameter” worked great, so I was in the ballpark.

I can’t seem to get either the Variable categories or Input Waves to take any values from a Blueprint, so I decided to ask whether I’m doing it correctly. Here are three examples: the first Starts the audio and works; the second sets an Int32 Variable within the MetaSoundSource and doesn’t work; the third sets a Float gain on an Input Gain parameter and works fine. I also had another example of setting the Wave, which would not work whether setting an Input Wave or a Variable Wave. I’d appreciate any guidance on how to get a MetaSoundSource to accept a Variable Integer, or any type of Wave. Here are the BP examples and my debug MetaSoundSource. Since this is debug stuff, not everything is hooked up in the MetaSound, but I have had them hooked up: the Int32 Idleindex never gets set, and the Wave Asset and WA1 also refuse to take wave values. I’m just looking for whether “Set Wave Parameter” or “Set Integer Parameter” are simply not the way to set their respective values. Ideas?
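For reference, here is roughly the same thing expressed in C++ terms. This is only a sketch: the parameter names are just the ones from my own debug MetaSound, and I’m assuming the standard Audio Component setters are the C++ equivalents of those BP nodes.

```cpp
// Sketch of the Blueprint calls above in C++ (parameter names are from my
// debug MetaSound and are examples only).
#include "Components/AudioComponent.h"
#include "Sound/SoundWave.h"

void SetDebugMetaSoundParams(UAudioComponent* AudioComp, USoundWave* NewWave)
{
    if (!AudioComp)
    {
        return;
    }

    AudioComp->Play();                                        // Start the audio - works
    AudioComp->SetTriggerParameter(TEXT("OnPlay"));           // "Execute Trigger Parameter" - works
    AudioComp->SetFloatParameter(TEXT("InputGain"), 0.5f);    // "Set Float Parameter" on an Input - works
    AudioComp->SetIntParameter(TEXT("Idleindex"), 2);         // "Set Integer Parameter" - only takes effect on an Input, not a Variable
    AudioComp->SetWaveParameter(TEXT("WaveAsset"), NewWave);  // "Set Wave Parameter" - not working for me in either context
}
```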



[Image: metasbp2 - Blueprint examples and debug MetaSoundSource]

Couldn’t find any docs on these new BP nodes to sort this out. As mentioned, some good folks documented the process for 5.0EA, but those nodes no longer exist in this version of the BP library.

Edit: TL;DR synopsis of the solution: “Set Integer Parameter” works if the Int32 is created as an Input and not as a Variable. Setting Wave parameters doesn’t appear to work in either context (might be a bug in 5.0P2).


Set int param works, but it looks like you’re writing to a Variable (Idleindex) and not an Input. So far, for me, MS Variables only apply to values used inside the MetaSound graph(s); any external values need to come in through an Input. I think the only way of making a Variable work from a BP is to set a trigger before it in the MS first. I’ve not worked that much with the Wave Player, but I suspect it might be more straightforward to set and change wave assets in the MS via an array rather than from a BP.

Thanks for the response. So the Variables are not supposed to work? I thought it odd that setting the Input Wave would not work either, since all other Input parameters do work (this was the OnConstruction sample I showed, which was hooked up to poke a wave into the Input Wave Asset). This matters because my existing theme manager controls intros, outros, idle, combat, and stingers from TArrays based on levels and player choices. I’ll try playing with the Input Wave Asset some more.

MetaSounds doesn’t seem to have the ability to access external game arrays, and thus likely can’t respond well to game events that take input from GameMode, GameInstance, and various actors. I had assumed it would provide the ability to control not only triggers but also data, primarily wave data. It sounds like you are saying I should move my waves and control information from my level-controlling arrays into individual MetaSoundSource assets per level, then use trigger execution to alter what are in effect identical MetaSoundSource assets per level with different data according to gameplay events? That would not have been my design, because I was managing all that information in a consolidated database, but I can see how to make it work.

Thanks, I’ll play with it some more and redesign my process to fit MS as necessary.

Np, glad to hear it was helpful at least to some extent. Tbh, I don’t know the answer to those questions, as I’m working out most of these things myself, and if there’s an official approach, I’ve not been able to find any UE documentation yet. But I think it’s worth noting that MetaSounds can work exactly like Sound Cues (e.g., spawn at location, etc.), but unlike Cues you can set up and run something like an entire music system in a MetaSound with minimal (or super complex) BP input. So how to approach this probably needs some rethinking compared to how it used to work, but it opens some really cool and exciting new alternatives. It also seems that a lot of behaviour can be controlled via control buses, so you could control all the volumes, filtering, and time of day for a new game level simply by loading a CB mix profile. And the new MS interfaces for managing attenuation also seem super useful.

I think Dan Reynolds’ videos go a long way toward demonstrating how different uses could be approached, and I’m sure there’ll be more info once the official release rolls around.

As you mentioned, using an Input that is an Int32 (rather than a Variable category) does work, and that integer can then be used to determine which sound is pulled from an internal array within the MetaSound.

Hard to say whether the inability to set the Variable from BP is by design or a bug. I say that because Set Wave Parameter doesn’t appear to work in either context (Variable or Input), and the node wouldn’t exist unless there was some plan to allow setting the Wave from BP. But now I’m thinking this is irrelevant. Edit: the text in italics is false; Set Wave Parameter works fine. In my tests, I was getting flaky results where the set wave would not play the majority of the time. The set worked fine; the play was the problem in the way I was using it (a Stop immediately prior to the Play). I’m also now realizing that there is no reason to access the Variables even if you could.

I hadn’t messed with the CB mix profile which would take care of converting over my sound mixes, so I’ll check that out.

After our discussion, I’m evaluating whether, instead of a theme manager like the one I built with Sound Cues and Mixes tied to the level, I should create a single MS for every theme. It seems like MS will like that better, and then I can easily swap in completely different themes within a level. This has a benefit because more than one antagonist may be battled within a level, and they would each have their own idle and combat themes while the level retains its own default theme. It would also solve a complaint I keep getting from the level designer, because each antagonist can carry his own MS into his role in the level. In other words, I will put all the numerous components of the theme mentioned in the OP (intro, outro, idle, combat, and stingers) into the unique MS for that theme. Then, using Input Integers and Bools, we still have granular external event control from gameplay.

Thanks again, the discussion has been enlightening as I try to convert to this new method.

I am struggling to change the index of an internal array from BP via an int32. So far I’ve had no success with this workflow: Event BeginPlay -> set the index of the Wave Asset array in the MS via an Interface Parameter (int32) -> trigger “Get Wave Asset”. For debug purposes I am monitoring the state of the index, and it does not appear to change. I have exposed that int32 as a variable in the BP so I can access it directly from the editor, because I have multiple actors featuring the same MS graph but I want different files playing. I really can’t believe I need to make several BPs, each with a specific MS inside. Any suggestions or info on this?

I was trying to control the index into the arrays (and also tried to push the waves through Set Wave Parameter), and after beating my head against the wall for some time, I changed approach: I now plug the arrays into MetaSounds with the Blueprint node Set Object Array Parameters, then let the MetaSound handle looping through the wave arrays I’ve provided it. I can’t even show the graph because it’s such a disastrous mess.

My latest attempt (and this is just music, I’ve not begun to deal with the non-music sounds from various actors) is a single GameMode Blueprint->MetaSound that controls all themes based on GameInstance variables from other actors and from the level itself. When changing themes, it reloads that MetaSoundSource, because I couldn’t reload the wave arrays easily and MetaSounds has no way to compress the array once created. This works, but I still have timing issues that are troubling me; these are not related to the Blueprint->MetaSounds architecture, just to the internal loop restrictions on using the outputs of a Wave to control that wave (which is a completely different rant).

I should point out that I tried another completely different method: again a GameMode Blueprint, but letting Quartz control everything through Play Quantized. This surprisingly worked perfectly, except that at the start of every track there was a momentary drop in gain, which destroyed the seamless nature of the compositions.

Bottom line is that Set Object Array Parameters was the only way I could reliably get waves into the MetaSound.
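Roughly what that looks like in C++ terms, as a sketch only: “ThemeWaves” is a hypothetical wave-array Input name I’m using for illustration, and I’m assuming the Audio Component’s object-array setter is the C++ equivalent of the BP node.

```cpp
// Sketch: hand an entire array of waves to the MetaSound and let its graph
// index/loop through them internally. "ThemeWaves" is a hypothetical array Input.
#include "Components/AudioComponent.h"
#include "Sound/SoundWave.h"

void PushThemeWaves(UAudioComponent* AudioComp, const TArray<USoundWave*>& ThemeWaves)
{
    if (!AudioComp)
    {
        return;
    }

    // The array-parameter call takes UObject pointers, so convert the wave list first.
    TArray<UObject*> WaveObjects;
    WaveObjects.Reserve(ThemeWaves.Num());
    for (USoundWave* Wave : ThemeWaves)
    {
        WaveObjects.Add(Wave);
    }

    AudioComp->Play(); // parameters only take effect once the component is playing
    AudioComp->SetObjectArrayParameter(TEXT("ThemeWaves"), WaveObjects);
}
```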

Hey Jabbinuz,

Is your MetaSound playing when you set the Index? A MetaSound only responds to parameter changes once it has actually started - calling Set [X] Parameter on a stopped Audio Component won’t do anything.

The other thing that might be worth doing is confirming that this Index is set as an Input in your MetaSound graph. Variables in MetaSound and Variables in Blueprints are different things - Variables in MetaSounds cannot be altered externally, but Inputs can.

You definitely don’t need to make several Blueprints with the same MetaSound, though. As long as each instance of the MetaSound is playing on its own Audio Component, you can have multiple instances of the same MetaSound running with different parameters.
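As a rough sketch of what I mean (the names here are examples, and I’m assuming “Frequency” is a Float Input on the MetaSound in question):

```cpp
// Sketch: two instances of the same MetaSoundSource, each on its own Audio
// Component, with different parameter values. "Frequency" is an example Input.
#include "Components/AudioComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"

void PlayTwoInstances(UObject* WorldContext, USoundBase* MetaSoundSource,
                      const FVector& LocationA, const FVector& LocationB)
{
    UAudioComponent* A = UGameplayStatics::SpawnSoundAtLocation(WorldContext, MetaSoundSource, LocationA);
    UAudioComponent* B = UGameplayStatics::SpawnSoundAtLocation(WorldContext, MetaSoundSource, LocationB);

    if (A) { A->SetFloatParameter(TEXT("Frequency"), 220.f); } // first instance
    if (B) { B->SetFloatParameter(TEXT("Frequency"), 440.f); } // second instance, different pitch
}
```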

“Is your MetaSound playing when you set the Index?” Yes. I have a box-collision BP which sends calls via BPI to the referenced “sonic” BPs, which in turn activate the MS components (in the BP) AND trigger the play state. I then send the int32 index value as an INPUT, and I have implemented an extra trigger (Execute Trigger - interface component) to be sure it gets set, and I even cascaded a second trigger for playback. I also have other variables/INPUTs which I can successfully change; the only one not changing is the one associated with the index array in the MS. These are all calls made after the player pawn gets inside the collision box. No success. I tried several approaches. What I want to do is have several “sound” objects distributed around an area which interact with the character. All these objects should feature a different sound file, which I would like to set via a BP variable and then send to the MS as an Input. So I should be able to just change the BP variable and thereby change the index of the array containing the references to the audio files… I have to say that this same scheme worked in UE5 EA and Preview 1/2, but not in 5.0.1.

If instead I implement the collision directly into the “sonic” BPs, bypassing the BPI situation, I can effectively change the BP variable and consequently the exposed MS Input, thus changing the playing file.

If it’s only the one parameter that’s not working, my guess is there’s either something unexpected in the MetaSound itself, or something’s going on with Blueprint Interfaces. (I’m assuming your Audio Component is auto-activated, i.e., playing at creation? If not, just calling “Set Active” on the Audio Component won’t be enough; it needs to be playing for parameter changes to take. In that case, try replacing “Set Active” with the “Play” function.)

I can confirm that on my end, setting parameters on MetaSounds is working as expected, even when there are multiple instances of the same MetaSound. For example, if I put two instances of the following Blueprint (which contains an auto-activated Audio Component) in my level, and alter the Frequency in the Details panel of only one instance, I hear two instances of the MetaSound in question, at different pitches.

Is that sort of setup sufficient to do what you want? Obviously your setup is more elaborate; I’m just trying to narrow down where the problem might be.

(I suspect the issue you’re seeing is something with the BPI, which I admittedly don’t have as much experience with. But if the only thing you want to change is an integer per Blueprint instance, you can do that by making the integer public and altering it in the instance-specific Details panel.)
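In C++ terms, that pattern would look something like the sketch below; the class, property, and parameter names are all hypothetical, and I’m assuming the actor owns an auto-activated Audio Component with the MetaSound assigned.

```cpp
// Sketch: expose an integer per actor instance and push it to the MetaSound.
// "WaveIndex" (both the property and the MetaSound Input name) is hypothetical.
#include "GameFramework/Actor.h"
#include "Components/AudioComponent.h"
#include "SoundObjectActor.generated.h"

UCLASS()
class ASoundObjectActor : public AActor
{
    GENERATED_BODY()

public:
    ASoundObjectActor()
    {
        AudioComp = CreateDefaultSubobject<UAudioComponent>(TEXT("AudioComp"));
        SetRootComponent(AudioComp);
        AudioComp->bAutoActivate = true; // auto-activated, i.e. playing at creation
    }

    // Editable per instance in the Details panel, like a public Blueprint variable.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Audio")
    int32 WaveIndex = 0;

    UPROPERTY(VisibleAnywhere, Category = "Audio")
    UAudioComponent* AudioComp = nullptr;

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (AudioComp)
        {
            // The component is already playing (auto-activate), so the parameter takes effect.
            AudioComp->SetIntParameter(TEXT("WaveIndex"), WaveIndex);
        }
    }
};
```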


First of all, thank you for the support. I am not an expert in UE; coming from the Max/MSP world, and with my production experience in immersive sound, I am genuinely enjoying learning (and making mistakes) with this amazing environment thanks to MetaSounds. That said, I can confirm (at least in this experience) that the issue is related to BPI and triggering Play in an MS contained in BPs. Values change; what really does not work is the file-selection mechanism described above (BP public integer variable -> MS Input int32 -> assets array -> Wave Player). EDIT: all other variables do work (like Frequency changes, Bar/Tempo changes, etc.). This scheme (changing the file via a public variable, etc.) does work if the BP is activated directly with its own collision box.

Weird. If you’re using the exact same method for changing Frequency and Bar Tempo, and that’s working, that might actually be something besides BPI (Are they the same param type? If not, can you test it with the same param type to see if that’s a factor?). If you can get the wave index to change in other circumstances, I’m guessing it’s not an inherent issue in the MetaSound, either.

Potentially weird suggestion: what happens if you change the name of the input param you’re trying to alter, on the MetaSound’s side? There were some weird pre-release bugs with name collisions and input name propagation, and I can’t think of much else that might be going on.

Hi, after a lot of testing I think I’ve narrowed down the file issue.
What I want to do is trigger playback of several sound objects in space when I press a key. These are spatialized via attenuation. I agree with you, the problem is not the BPI per se. What I am encountering is that if I change the file index, the BP containing the MetaSound Source will not pass this variable change unless the character/player is in the attenuation field, i.e., the MS is activated. Could it have to do with the fact that the variable is only passed when the BP detects the character is in range and gets activated? I strongly suspect that.
As a matter of fact I tried the following:

  1. Put two BPs in the level (containing the same MetaSound, which contains an array with two different sound assets/files).
  2. Change the variable from the editor; the variable is then passed to the MS via Set Integer Parameter on Event BeginPlay. The MS/Audio Component is set to activate on play, and via Print String I can check that the variable is set.
  3. Trigger the key (send control via Event Dispatcher to the Audio_BPs from the character_BP).
  4. If the character is NOT in the attenuation zone, the variable, even if changed, has no effect: the file is not changed.
  5. If the character IS in the attenuation zone, then it works.

I have tested this in many ways…

EDIT: The MS assets are set to Play When Silent. I think this is all related to polyphony management and related activity. The VERY, VERY weird thing is that the variables are passed, but only and solely the file-change variable does not yield any effect unless the character is in the attenuation area.

Edit 2: attaching PNGs of the BP.




Oh! Yeah, normally I would expect virtualization to be the culprit if this was happening specifically in the attenuation zone, but you mentioned you already checked Play When Silent.

That said, am I understanding correctly that the MetaSound variable is changing, it’s just not impacting behavior the way you’d expect, i.e., the wrong file is still playing? Out of curiosity, is the relevant wave file itself set to Play When Silent? There might be something odd going on with how the virtualization method is communicated, or something going wrong with the Array->Get system.

Play When Silent is on, also on the wave file (after your suggestion). The variable is changing, but it does not impact the behaviour unless the character is in the attenuation zone. I tried different attenuation zones and overriding them in the BP. I am also experiencing out-of-sync behaviour even when the Audio Component is triggered with Play Quantized (the Quartz clock is set up correctly; if the sounds are not spatialized, everything is in sync, and I followed several Epic tutorials on the subject). To make it short: even if the Quartz clock is on, the variable is passed, and everything looks OK (auto-activate, Play When Silent, Play Quantized, etc.), but unless the character/listener is in the attenuation zone (cone or sphere) the Audio Component appears to be switched off, until, obviously, the listener/character enters the attenuation zone, at which point I need to re-trigger the commands to time-align and get a result from the variable changes.

Hmm. Question: if you put au.debug.sounds 1 into the command prompt, do you see the MetaSound disappear from the list of playing objects at some point? That sounds consistent with weird virtualization behavior. At least on my end, attenuation by itself isn’t causing virtualization as long as things are set to Play When Silent... but active Concurrency settings can override that. (E.g., if a Concurrency rule kicks it out of a voice slot, the sound will virtualize regardless.)
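If it helps to see where that setting lives in code, here is a minimal sketch; normally you would just set the Virtualization Mode on the asset in the editor, and I’m assuming here that the sound asset’s VirtualizationMode property is writable from game code.

```cpp
// Sketch: make sure the sound asset being played is allowed to virtualize
// (keep ticking over) instead of being stopped when it becomes inaudible.
#include "Components/AudioComponent.h"
#include "Sound/SoundBase.h"

void EnsurePlayWhenSilent(UAudioComponent* AudioComp)
{
    if (AudioComp && AudioComp->Sound)
    {
        // Equivalent of setting Virtualization Mode = "Play when Silent" on the asset
        // (note: this changes the shared asset, so the editor checkbox is usually preferable).
        AudioComp->Sound->VirtualizationMode = EVirtualizationMode::PlayWhenSilent;
    }
}
```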

Yes, the MS disappears regularly and reappears when I get into the attenuation area. Default Sound Concurrency = None, and in the tests there are only 5 sounds playing.
The problem appears to be the combination of Quartz + Play Quantized + MS as player + attenuation. I need the MS to be in sync with a master clock, which is why I am using the Quartz system. If I instead use the sound asset file directly with Play Quantized, everything is smooth and super tight in timing. If I do not use spatialization/attenuation, even with Play Quantized + MS everything is fine. Triggering an MS without Play Quantized, even via subscribed events, leaves the MS slightly off (I work too much with audio not to notice :smile: ), but the file is changed.

What you may have found would explain why I had to drop Quartz Play Quantized. I mentioned this in another thread, and it was suggested not to use Quartz but to make the MS standalone. That didn’t work well, because controlling the waves from within an MS becomes impossible due to loop detection: you can’t use the outputs from a wave to control the stop of that wave, and working around that creates indecipherable MS graphs.

But originally I was attempting to use Quartz. My finding was that I would experience a dropout between “Play Quantized”-initiated waves. Your findings suggest to me that Play Quantized is starting a new MS, and that could explain what I was calling a dropout.

Misinformed Comment Deleted here.

Hey eagletree,

Have you checked out using Variables with DelayedGet for dealing with loop detection? Using the end of one Wave Player to trigger another is a pretty common workflow; the syntax looks a bit like this:

But, yes, PlayQuantized involves making a new MetaSound. It needs some time to queue the sound in advance in order to play sample-accurately; it’s a scheduler. (I think Max wrote a really detailed description of this somewhere, let me see if I can find it.)

“Not integrating the product with UE” is definitely not an MS design goal; I’m not sure where you’re getting that from. It has a bunch of stuff intended for Blueprint interactions.


Weird. I assume the same number of total sounds is present whether Quartz is used or not? I can do a smoke test to see if something weird is happening with Quartz and voice slots. That said, it does occur to me that, because Quartz is a scheduler, I’m not sure how well it handles changing parameters on MetaSounds that have already been queued.