Custom audio classes...day 4...

I’ve spent the last few days playing around with various audio classes in Unreal, and I’m still unclear on how best to proceed. I would like to build a wrapper for an audio library that I use a lot, so I started off by embedding it in a USoundWave subclass that I instantiated within a USoundNode class. This worked fine: I could drop it into the SoundCue editor and play back audio generated on the fly. But I can already see issues when it comes to controlling things in real time; this approach feels too restrictive.
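For context, the heart of the USoundWave approach is overriding GeneratePCMData() and filling a signed 16-bit buffer on demand. Here is an engine-free sketch of just that inner loop; the function name and parameters are my own placeholders, not Unreal API:

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Engine-free sketch of the work a GeneratePCMData() override does:
// fill a buffer with signed 16-bit samples (a sine tone here).
// Phase is carried between calls so the waveform stays continuous.
std::vector<int16_t> FillPCMBuffer(int32_t SamplesNeeded, float Frequency,
                                   float SampleRate, float& Phase)
{
    std::vector<int16_t> Buffer(SamplesNeeded);
    const float PhaseStep = 2.0f * 3.14159265f * Frequency / SampleRate;
    for (int32_t i = 0; i < SamplesNeeded; ++i)
    {
        // Scale [-1, 1] into the int16 range, leaving a little headroom.
        Buffer[i] = static_cast<int16_t>(std::sin(Phase) * 32000.0f);
        Phase += PhaseStep;
    }
    return Buffer;
}
```

In the real override you would memcpy a buffer like this into the `uint8*` Unreal hands you and return the number of bytes written.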

My next approach was to create a UAudioComponent-derived class, assign my USoundWave subclass to it using UAudioComponent::SetSound(), and wrap the whole thing in an AActor. This looks like it might be the most robust approach: I can expose the AudioComponent member variable and call Play() on it from my Blueprint, and I can expose lots of my library methods this way too.
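For reference, the wrapper shape I’m aiming for looks roughly like this. Treat it as a sketch, not working code; the class and member names are my own, and `UMySoundWave` stands in for the procedural USoundWave subclass from earlier:

```cpp
// Sketch of an actor wrapping an audio component. Names are placeholders.
UCLASS()
class AMySynthActor : public AActor
{
    GENERATED_BODY()

public:
    AMySynthActor()
    {
        // Create the component as a subobject so it's owned by the actor.
        AudioComponent = CreateDefaultSubobject<UAudioComponent>(TEXT("AudioComponent"));
        RootComponent = AudioComponent;
    }

    // Exposed so Blueprints can call Play()/Stop() on it directly.
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Audio")
    UAudioComponent* AudioComponent;

    UPROPERTY(EditAnywhere, Category = "Audio")
    UMySoundWave* SoundWave;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        AudioComponent->SetSound(SoundWave);
    }
};
```

Library-specific control methods would then be added as BlueprintCallable UFUNCTIONs on the actor, forwarding into the wrapped sound wave.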

But there’s a problem: I get no audio when I play my AudioComponent member. I’ve placed some breakpoints around my source code, and I can see that triggering Play() does call my USoundWave’s GeneratePCMData() method. The samples are being written; they are just not being output. I’ve also stepped through my UAudioComponent subclass, and its Play() method is being called. Does anyone have any ideas? It must be something stupid I’m overlooking.

In the end I just called one of the UGameplayStatics audio methods instead, and it works fine. To be honest, I’m not sure there is any benefit to exposing the AudioComponent after all. Time will tell.
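For anyone curious, the fallback is essentially a one-liner. PlaySound2D is a real UGameplayStatics call; `MySoundWave` here is a placeholder for my wrapped sound wave instance:

```cpp
// Fire-and-forget playback via UGameplayStatics: no component to manage,
// but also no handle left over for real-time control of the sound.
UGameplayStatics::PlaySound2D(GetWorld(), MySoundWave);
```

The trade-off is the point of this whole experiment: the statics route is dead simple, but only the component route leaves you a live object to manipulate while the audio plays.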