AudioComponent SetWaveParameter with PlayQuantized

I have an actor that plays AudioComponent->PlayQuantized()

Previously the actor would be initialized with a unique Metasound. On PlayQuantized() it would work as expected.

This approach required me to instrument many Metasounds, and instead I’d like to use a single MetasoundSource* that has parameterized inputs.

However, now when I call AudioComponent->SetWaveParameter I get unexpected behavior: audio plays as if it’s not on the quantization audio thread. Sounds are clipped and barely audible; basically it sounds like PlayQuantized() isn’t actually working.

I’ve tried placing the SetWaveParameter call in different places, before and after the PlayQuantized() call, but there’s no change in behavior.

It does seem like SetWaveParameter is being called, and setting the correct sound wave, because I can hear sounds, but the sounds are being played as if they’re not on the Quartz audio thread.

My code can be found here. The clock, quantizationBoundary, etc. all are working as expected, it’s just this weird behavior with PlayQuantized().

Is there something I’m doing wrong in setting the MS parameters?

I have two theories as to what’s going on:

  1. I’m wondering if the problem is that I’m continuously overriding the MetaSound. There was no problem with unique MetaSounds per sound wave. Maybe the quantization is all working correctly, but on each beat SetWaveParameter is overriding the wave for all conflicting beats.

I can test this. If that’s the case, I’ll need to find a way to clone/duplicate the MetaSound at runtime. That doesn’t seem particularly easy to do, though.

  2. If it’s not this, my other thought is that it’s something to do with the IsPlaying bool. Audio parameters can only be set while IsPlaying is true, for some reason.

My theory is that PlayQuantized essentially delays IsPlaying until the audio thread plays the queued sound. I’m wondering if the clipped audio I’m hearing is SetWaveParameter happening on the game thread while IsPlaying is governed by the audio thread.

I wonder if I can consume AudioComponent->OnAudioPlayStateChanged, and trigger the SetWaveParameter function if the state is Playing.

I’ve confirmed the issue occurs even if there are no duplicate MetaSounds playing, so I tried looking at my other idea.

I added the following code to consume the event that fires when the Playstate changes:

	AudioComponent->OnAudioPlayStateChanged.AddDynamic(this, &AAudioCuePlayer::OnPlayStateChange);

	void AAudioCuePlayer::OnPlayStateChange(EAudioComponentPlayState PlayState)
	{
		switch (PlayState)
		{
		case EAudioComponentPlayState::Playing:
			UE_LOG(LogTemp, Display, TEXT("Playing"));
			AudioComponent->SetWaveParameter(SoundPlayerData.WaveAssetName, SoundPlayerData.WaveAsset);
			break;
		default:
			return;
		}
	}

My theory was this would set the param when the sound should be definitely playing.

Unfortunately with this code, no sounds are playing. I’ve confirmed it’s hitting the Playing switch case, but it seems SetWaveParameter is either too late or not triggering… or something.

I dunno, I’ll have to come back to this and figure out what’s going on.

Ok, so trolling through the engine code to see if I can find where the issue might be.

PlayQuantized()

AudioComponent->PlayQuantized() sets up the objects required to play or queue a quantized sound.

There is some logic about whether or not to “steal a voice slot”. It triggers AudioComponent->PlayQuantizedInternal() if allowed, otherwise AudioMixerClockHandle->QueueQuantizedSound().

My assumption is that in my case, I am queuing the event.

QueueQuantizedSound() calls UQuartzClockHandle->SendCommandToClock which does some QuartzSubscription->PushCommand()

PushCommand logic is non-obvious, but I’m going to make a leap and assume it somehow ends up in FQuartzTickableObject::ExecCommand(), which ends up leading back to our AudioComponent via ProcessCommand

AudioComponent->ProcessCommand() finally calls PlayQueuedQuantizedInternal()


PlayQueuedQuantizedInternal

So let’s look at PlayQueuedQuantizedInternal.

This loops through the objects sent by Quartz, trying to match the CommandID.
It sets some vars, then checks whether Sound is set.

I’m assuming Sound is set in my case, set to a MetaSound. The clock handle should also exist. The sound should not be stopped, and I assume we have a valid command.

This leads us to AudioComponent->PlayInternal()


PlayInternal()

AudioComponent->PlayInternal() gets the world and sets our MetaSound as a local var.

What’s interesting is this UE_LOG giving us log info on the sound to play. I’ve never seen this in logs, perhaps I need to run Verbose or something? This would be useful information.

It checks if the AudioComponent IsActive, stopping the sound in cases where it’s looping. My case is a one-shot.

We do some more checks, sets, etc. Then there’s a block of code for attaching this component(?) to a new parent.

We then do some Attenuation settings, set Volume, Pitch, Filters, Envelopes, more Attenuation and other settings.

There’s an interesting block in if (AudioDevice->IsBakedAnalysisQueryingEnabled())

Now we pass in Quantized data NewActiveSound.QuantizedRequestData = InPlayRequestData.QuantizedRequestData;

Some more objects set and checked.

One thing that does jump out to me is

AudioComponent.cpp line 856:

	TArray<FAudioParameter> SoundParams = DefaultParameters;
	
	if (AActor* Owner = GetOwner())
	{
		TArray<FAudioParameter> ActorParams;
		UActorSoundParameterInterface::Fill(Owner, ActorParams);
		FAudioParameter::Merge(MoveTemp(ActorParams), SoundParams);
	}

	TArray<FAudioParameter> InstanceParamsCopy = InstanceParameters;
	FAudioParameter::Merge(MoveTemp(InstanceParamsCopy), SoundParams);

I noticed Merge is also used in SetWaveParameter, which only actually sets the parameter if IsPlaying() is true.

My theory is there’s some shenanigans happening with FAudioParameter. I suppose I should troll through this code next.

But I am weary. Due to the relative smoothness of my brain, tumbling this far down Engine code has left me tired.

Next Steps

There’s a couple things I can do here.

  1. Experiment to see if SetWaveParameter works correctly when not playing the sound via PlayQuantized
  2. Continue investigating FAudioParameter, try and understand what this Merging stuff is.
  3. See if there’s a programmatic way to duplicate MetaSounds at runtime, directly assigning the SoundWave I’d otherwise use.
  4. See if I can use SoundCues rather than waves?
  5. Abandon all hope, revert this entire set of changes and just use 50 MetaSounds with direct soundwave assignment. Kind of defeats the purpose of MetaSounds, and forces me to do the equivalent by hand.
  6. See if there’s a way to submit this behavior as a bug. It feels like PlayQuantized and SetWaveParameter should work more-or-less out of the box.

Ok, Imma replicate this info on my devblog Building A Music Engine then get back to my day job.

Today I am investigating the block of code in AudioComponent::PlayInternal mentioned above where SoundParams are being set.

AudioComponent->PlayInternal()

This block (AudioComponent.cpp line 856) starts by pulling DefaultParameters to a local array:

TArray<FAudioParameter> SoundParams = DefaultParameters;

The AudioComponent header file has interesting comments about this:

	/** Array of parameters for this AudioComponent. Changes to this array directly will
	  * not be forwarded to the sound if the component is actively playing, and will be superseded
	  * by parameters set via the actor interface if set, or the instance parameters.
	  */
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = Parameters, meta = (DisplayAfter = "bDisableParameterUpdatesWhilePlaying"))
	TArray<FAudioParameter> DefaultParameters;

	/** Array of transient parameters for this AudioComponent instance. Not serialized and can be set by code or BP.
	  * Changes to this array directly will not be forwarded to the sound if the component is actively playing.
	  * This should be done via the 'SetParameterX' calls implemented by the ISoundParameterControllerInterface.
	  * Instance parameter values superseded the parameters set by the actor interface & the components default
	  * parameters.
	  */
	UPROPERTY(Transient)
	TArray<FAudioParameter> InstanceParameters;

In PlayInternal, DefaultParameters are pulled first, then some merge happens with the Owner actor, then the InstanceParameters are merged. Then the sound is sent to the AudioDevice with the fully formed sound parameters.

The thing with the Owner is interesting:

	if (AActor* Owner = GetOwner())
	{
		TArray<FAudioParameter> ActorParams;
		UActorSoundParameterInterface::Fill(Owner, ActorParams);
		FAudioParameter::Merge(MoveTemp(ActorParams), SoundParams);
	}

UActorSoundParameterInterface::Fill takes the Owner and pushes any audio parameters that are set, if the Actor implements UActorSoundParameterInterface.

The interface is described as /** Interface used to allow an actor to automatically populate any sounds with parameters */

I don’t know if Actors implement this out of the box. Potentially I can implement this interface in my custom actor class, forcing me to add a GetActorSoundParams… or maybe GetActorSoundParams_Implementation. I’d have to check which one.

Here’s the point, and this is the thing I probably should’ve picked up on earlier: All these merge functions, all this parameter setup is being done via:

FAudioParameter

This is a struct describing the properties related to an AudioComponent. Each property is essentially a wrapper around an enum of AudioTypes.

This is interesting because I recognize the types as the list of available properties that can be set as inputs in MetaSounds. It would make sense there would be C++ for this.

What is sus is the lack of “Soundwave” anywhere in here.

There is a single mention of SoundWave:

	// Object value (types other than SoundWave not supported by legacy SoundCue system)
	Object,

Now I recognize from other parts of the engine code I investigated that AudioComponent->SetWaveParameter is a wrapper around a SetObject.

So it’s safe to say that at this level of the code, a SoundWave is just an object in terms of audio parameters.

This struct isn’t all that interesting, which should generally be true of structs. But it does confirm that if I can get this to set, in theory the MetaSound will pick it up.

Next Steps

The problem I’m trying to solve: setting parameters for a MetaSound does not work intuitively with PlayQuantized().

The way I see it, I have three candidates for setting:

  1. Have my AudioCuePlayer class implement UActorSoundParameterInterface, see if this sets the value correctly.
  2. Attempt to wrap my SoundWave in an FAudioParameter and stuff it in AudioComponent->DefaultParameters
  3. Or do the same and stuff it in InstanceParameters
  4. Hell, see if I can do both, or all three.

My hope is that setting an audio parameter this way avoids the IsPlaying() requirement that AudioComponent->SetParameters has: the theory being that PlayQuantized() sets IsPlaying to true on audio-thread timing, while SetParameters runs on game-thread timing, resulting in the clipped audio I hear when testing.

If I can set these audio parameters before play is initiated, the theory is PlayQuantized() will play the sound and already have the juice it needs. You know I’m here for that juice.

I’ll report back tomorrow to see if this idea works.

Okie, I couldn’t figure out how to get my Actor to implement UActorSoundParameterInterface, as you can only inherit from a single UObject.

But it was easy enough to push the FAudioParameters into AudioComponent->InstanceParameters and DefaultParameters.

Unfortunately there was no change in behavior. The actual sounds in game are still clipped and missing.

This is unfortunate.

It makes me feel that assignment of the MetaSound variables is correct, and that the issue lies somewhere deeper in PlayQuantized’s relationship to MetaSound.

It could also be PlayQuantized’s relationship to setting audio params, but I feel like if I’ve set these variables in Instance/Default Parameters, those vars should be set well in advance of sending the sound to Quartz’s dedicated audio thread.

So my thinking is this is something related to MetaSound.

I didn’t find any specific code in the AudioComponent engine code related to MetaSound, which is kind of interesting. I think I will troll engine code for MetaSound and see if there are clues there about how it’s actually (intended to be) working, see if I can find some useful comments or functionality.

Alas, a task for tomorrow.

The current problem is that AudioComponent->PlayQuantized has an unusual interaction with MetaSounds when the MetaSound has parameterized inputs.

I am assuming from my tests above that the problem is not AudioComponent setting parameters with SetWaveParameter. Today we investigate MetaSound engine code itself to see if we can determine what’s going on.

MetasoundSource

The first thing that jumps out is

virtual void InitParameters(TArray<FAudioParameter>& ParametersToInit, FName InFeatureName) override;

The code for this appears to validate the incoming params, then runs some ConstructProxies lambda(?) assigning valid params to a MetaSound vertex. I’m assuming a vertex is a node in a graph, with MetaSounds using a custom graph data structure.

That sounds dope as hell, and something worth investigating further, but doesn’t sound like it will reveal any relevant information to my current task.

The question is: can I just call MetasoundSource->InitParameters() directly, instead of AudioComponent->SetWaveParameter()?

The answer appears to be no… No sound plays.

What’s weird is that the behavior of my test in the previous post also resulted in no sound, instead of the clipped sounds I reported. Makes me think I didn’t actually compile the test? But that feels slightly impossible, as I was testing multiple methods.

I’m shook.

Point is, calling MetaSoundSource->InitParameters() does SFA, meaning I’m SOL.

The one weird thing about InitParameters is that it takes an array of FAudioParameters, which is normal, but also a FName InFeatureName which doesn’t appear to be used at all. I’m calling it with an empty FName(), since it seems unused, but there might be magic I’m missing somewhere causing my call to do nothing.

The problem I have now is that there doesn’t seem to be much left in MetasoundSource engine code that seems relevant to my current problem.

I’m still not clear exactly where the interaction points are between the MetaSound / quantization stuff. I’m pretty sure they don’t really talk to each other directly; AudioComponent feels like a mediator between the two. But all my tests of setting audio parameters have failed.

I have one last trick, I think: upgrade from 5.3 → 5.4 and hope this is magically fixed.

And that did nothing.

Cool.

Going to ping @MaxHayes and @lantz.anna_1 with the abridged description of the issues described in this thread:

I have a MetaSound with a single WaveAsset input connecting to a WavePlayer.

I have an Actor that sets the wave asset via AudioComponent->SetWaveParameter()

The Actor plays the sound via AudioComponent->PlayQuantized()


The sound is being set, but on play the sounds are clipped.

As far as my understanding goes, PlayQuantized() queues the audio to a dedicated audio thread- this is working correctly.

It feels like AudioComponent->SetWaveParameter, or the MetaSound param initialization itself, is being done on the game thread, ignorant of the audio thread.

The majority of this thread has been me trolling through engine code trying to find workarounds. That’s failed. I’m on 5.4.4 now, issue still happening.

I’ve trolled the forums/community discord/random blogs for leads.

Pinging directly is my last hail mary before backing out of these changes completely, and duplicating my MetaSound ~50 times and directly setting each wave asset. That would suck, but I’m just not clever enough to figure this one out.

Bug, if it is a bug and not me being dumb, can be recreated via blueprints like this:

[Blueprint screenshot]

With a simple MetaSound as:

[MetaSound graph screenshot]
If I make it Play() and not PlayQuantized(), my test works when playing every beat.

Unfortunately in my code, I’m counting every bar, then triggering PlayQuantized with a Quantization Boundary that will actually play the sound on say, the third beat of the current bar.

I could refactor my code to try and exploit the Play() workaround, but I figure that’s dangerous as now I’m playing the sound on the game thread, instead of the dedicated audio thread.

Well I’m dumb.

I had the MetaSound Output → On Finished connected to On Play, not On Finished.

Sometimes I surprise even myself.
