Could the sound be going to both the speakers and the submix? Set the effect to +20 dB and see if the speakers blow up.
Set the audio to a 100% send, routed only to the submix, and what you hear will be the pure submix signal.
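The dry/send split can be sketched as a toy mix model (plain Python, not Unreal API; `dry_level` and `send_level` are made-up names standing in for the sound's direct output level and its submix send level):

```python
def route(sample: float, dry_level: float, send_level: float):
    """Split one source sample between the speakers (dry) and a submix (send)."""
    to_speakers = sample * dry_level
    to_submix = sample * send_level
    return to_speakers, to_submix

# 100% send, 0% dry: the speakers get nothing directly, the submix gets the
# full signal, so whatever you hear must be coming through the submix.
dry, sent = route(0.5, dry_level=0.0, send_level=1.0)
print(dry, sent)  # 0.0 0.5
```

If the effect still sounds doubled with the dry path at zero, the source is probably reaching the output by some other route.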
Or in this case, if it's meant to be a master limiter, we can add effects to the Master Submix (it's always there by default). It will pick up all of the game's sounds, and we can compress/limit etc. there.
Actually, I would rather put this limiter across a Master Submix, but I haven't been able to figure out exactly where to access that.
The docs mention something about an execute path, but I’m afraid I have no programming experience and I’m still reasonably fresh to blueprints as well.
I'll check into the suggestions you gave, but if you could shed some light on adding the limiter to the master bus, I would be very grateful.
Cheers,
Quick edit to explain myself further:
I know how to add an "Add Master Submix Effect" node to a BP and select the limiter within it (as shown in the sticky post); I just have no idea which Blueprint I would put that in, or what I would connect its Exec pin to.
Hey thanks, no problem!
Hmm, I think any Blueprint is fine? And it could be from any exec, but you probably want it from the start, so it could be on the "BeginPlay" exec node, and the BP could be something that is active from early on…
If there is some BP that is used early, like a loading screen or menu, we can do it on that. BeginPlay is called once, when a BP is coming alive/being first used. Or, for example, the Game Instance is another default BP thing that is always active from start to end while the game is running. It's often used for persistent stuff, referencing variables/objects so that they are not forgotten and tossed out prematurely. It's unique per project and set in the world details or game mode options. It could be a neat place to do this… but at the same time, for simplicity, I'd do it on any regular actor for starters.
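Conceptually, the one-time setup looks like this (a toy Python model of the lifecycle, not engine code; "Add Master Submix Effect" and BeginPlay are the real node names from the thread, but every class and variable below is made up for illustration):

```python
class MasterSubmix:
    """Stands in for the master submix that's always there by default."""
    def __init__(self):
        self.effects = []

MASTER = MasterSubmix()  # one global master bus, like in the engine

class LimiterSetupActor:
    """Any regular actor works; BeginPlay fires once when it enters the world."""
    def begin_play(self):
        # Equivalent of wiring "Add Master Submix Effect" to the BeginPlay exec.
        MASTER.effects.append("LimiterPreset")

actor = LimiterSetupActor()
actor.begin_play()       # the "engine" calls this once at spawn
print(MASTER.effects)    # ['LimiterPreset']
```

The point is just that the call runs once, early, from some always-present object; after that, every sound summed into the master bus passes through the limiter.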
Ah amazing, thank you!
I think I had a fundamental misunderstanding of BPs (coming from a mostly DAW background) where I assumed there was some representation of signal flow to the audio listener that it would have to be hooked up to. But I think I get it now.
I will try this as soon as I have the chance and post back.
No problem :) Yeah, that's my main gripe about a lot of programming: the lack of visual representation. Blueprints is the closest we can get to the sun for now.
For the structure of how sound will flow, it's probably best to sketch it on a whiteboard and have it reflect content development.
Submixes can be used kinda like a track in a DAW, or a send, or a summing bus, or an effect rack… Then we have effects that work on the source vs. effects that work on the tracks/summing buses. Then there are source buses, which can do something very similar while pretending to be a regular audio source in a level. And all these things can also be used in very untraditional ways!
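The summing-bus idea above can be sketched as a toy graph (plain Python, nothing engine-specific; the bus names and the simple min-based limiter are invented for illustration):

```python
class Submix:
    """Toy summing bus: children's outputs are summed, then bus effects run."""
    def __init__(self, name, effects=None):
        self.name = name
        self.effects = effects or []
        self.children = []  # sources or other submixes feeding this bus

    def render(self):
        mixed = sum(child.render() for child in self.children)
        for fx in self.effects:          # bus effects see the summed signal,
            mixed = fx(mixed)            # not each source individually
        return mixed

class Source:
    """Stands in for a plain sound playing at some level."""
    def __init__(self, level):
        self.level = level
    def render(self):
        return self.level

def limiter(threshold):
    """Crude brickwall: clamp the summed level at the threshold."""
    return lambda x: min(x, threshold)

# SFX and Music sum into Master, and only the Master bus limits the total.
master = Submix("Master", effects=[limiter(1.0)])
sfx, music = Submix("SFX"), Submix("Music")
master.children = [sfx, music]
sfx.children = [Source(0.8)]
music.children = [Source(0.7)]
print(master.render())  # 1.0 (0.8 + 0.7, limited at the master bus)
```

Swapping where `limiter(...)` sits in the tree is exactly the track-vs-send-vs-summing-bus choice: an effect on `sfx` only shapes that branch, while an effect on `master` shapes everything.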
The whole audio system together is like a very modular DAW in pieces: the abstract concepts and components are there, but we have to decide, design, and build for ourselves whether it's supposed to work/look like a traditional DAW app, a game-audio driving system, a sampler plugin, a live set/studio production setup… whatever in the universe doesn't exist yet! But yeah, we have to make the visual representations ourselves, except for those blessed with the right kind of imagination.