
New Audio Engine: Quick-Start Guide

Yup yup! You’ll want to read the section above about Getting Started with Source Effects!

You just need a reference to your Source Effect.

I managed to do it! Thank you @dan.reynolds once again for your help and patience :slight_smile:

Somebody give that @dan.reynolds a raise! Wow, this is really neat stuff…!

teak

Thanks! :cool:


This is a great thread, thanks for the clear instructions!

Dan - great to hear your voice at QMUL yesterday. Look forward to trying this stuff out and seeing what I come up with. Mwhahaahaha

-Rob.

This is incredible! Thank you so much for all the hard work gone into the new audio engine and for taking the time to explain how to use the new features.

Is it possible to make a preset while simultaneously listening to the changes? Can I make a synth, run it, dial in a sound I like, then save that sound as a preset? Can I hook my keyboard up and use it to trigger notes and change CC values like the filter, etc.?
Dream scenario: use my keyboard to make a preset in real time, then save it for use in my game.

At the moment I am just using Tick to update the preset while changing it in the bank, which works fine anyway :slight_smile: And an event timer to trigger a note every second.

There is stuff in the thread about the midi plugin that will help you with a fair chunk of the stuff needed to get a keyboard working with the synth. ( https://forums.unrealengine.com/showthread.php?144312-Setting-Up-a-Blueprint-MIDI-Manager-with-4-14-version-of-MIDI-Device-Support-Plugin )

And a quick note to anyone who can't get the submix and chain to work: open the editor using the -audiomixer command-line argument! :wink:

Thanks! :D

Thanks! :slight_smile:

Request for a future addon: some kind of mini-sequencer to use inside the event graph, for making short melodies (like a power-up melody). Something like Timeline, but intended for note sequences. Not for making music or anything, just for those short sequences of notes. It could of course be used for other float values in Blueprints too, and therefore sync synth values with visual values.

Usually for this I just have an array of notes and then I loop through them.

Hi Dan.
Sorry if this is a stupid question. Can the synths only be used in a BP, or is there (or will there be) a way to use the synths in Sound Cues?
Key parts of our game's implementation systems are set up as data assets with exposed variables for slotting in Sound Cues. I would really like to start using some procedural aspects with the modular synth. Would I be able to do that, or would I need to have the data assets trigger custom events and use those in a synth BP or something?

Also in the now famous presentation you guys did about the new system, the synth had a nice and friendly GUI to work with, is this in there somewhere and I’m missing it?

Thanks

Ashton

Hey Ashton,

The synths are components, not assets, so there isn't currently a way to trigger them from within SoundCues, and frankly I'm not completely sure how that would be accomplished or what parameter control would even look like. Parameter passing in SoundCues is extremely painful, and SoundCues are so obfuscated from the rest of the game logic that my hearty recommendation is to use SoundCues sparingly, only for cases where similar logic would be less convenient in Blueprints. Generally speaking, Blueprints are far superior in terms of logic and will usually perform better. SoundCues are very much a legacy system: the entire SoundCue graph is evaluated every tick, which for complex SoundCue graphs is actually really terrible for performance (compared to the execution-oriented BPs).

As for the GUI, we've considered trying to include it in a Content Example some time down the line, but it was just a runtime UMG interface I made for modulating synth parameters in real time. It had limitations because the Modular Synth is modular and the UMG widget wasn't. With that said, we are interested in exploring ways to edit preset data in the Editor, and we hope to improve interfacing with Audio assets, effects, synths, etc. overall. At the moment, Slate is something we'll need to learn more about to achieve this, so I don't know when we'll be able to get to it, but we want to do it, for sure!

Thanks Dan. This is really useful to know.

So you recommend using SCs for things like random containers and modulators, but most things can be done more easily in BP?

More dumb questions:

- Do both oscillators have to be routed through the filter?
- Can you have multiple mod synth components on one BP?

And similarly, can you patch the LFO to the gain of one oscillator instead of the gain of the whole patch?