New Audio Engine: Early Access Quick-Start Guide

  • replied
    Hi no7hing, I describe the problem here in my slide-show on making an interactive music system in blueprints:



  • replied
    Originally posted by pfaudio View Post
    Hi,

    Excited to get into the stuff in the new audio engine. I have a couple questions involving the best way to build a music system in BP that I think tie into that.

    Currently we are on UE4.17 and planning to jump to 4.19 when it’s out. I note that timing stuff was covered in this thread back around post #73 from @drfzjd.

    Probably the most critical timing thing for me is tracking playback time of a music file, and stopping it at designated “exit points” where we then play/stitch an “ending stinger” Cue.

    To track timing for the currently playing music cue, we are multiplying % of Cue’s progress by its duration. So for instance 43% complete * 1:12.434. We have a binding from the audio component’s OnAudioPlaybackPercent event to multiply the Percent float that it outputs by the duration of the sound cue (https://docs.unrealengine.com/latest...aybackPercent/).

    This brings me to my first question: Is this the most accurate way to monitor a music Cue’s time?
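The percent-times-duration bookkeeping described above can be sketched outside the engine as plain C++. The names here are illustrative, not UE4 API; in the engine this math would live in the handler bound to OnAudioPlaybackPercent:

```cpp
#include <cassert>

// Models the math behind binding OnAudioPlaybackPercent:
// the engine reports playback progress as a 0..1 fraction,
// and elapsed seconds are recovered by scaling by the cue duration.
// (Names are illustrative, not engine API.)
struct CuePlaybackClock {
    double DurationSeconds = 0.0;   // duration of the sound cue
    double ElapsedSeconds = 0.0;    // last computed playback time

    // Equivalent of the bound event handler: Percent is in [0, 1].
    void OnPlaybackPercent(double Percent) {
        ElapsedSeconds = Percent * DurationSeconds;
    }
};
```

For the example above, a 1:12.434 cue (72.434 s) at 43% progress gives roughly 31.1 s of elapsed playback.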


    Also, I just watched the “Procedural Audio in the new Unreal Audio Engine” video from May of last year. At about 43 minutes in, Aaron mentions that he addressed some stuff where the old audio engine was not queueing up and executing events at the same time.

    Next question: He mentions this was done for 4.16, but is that fix only in the new audio engine that you have to enable, or is it part of the default engine at this point?


    Ultimately I'm hoping to be able to stop a track and play an ending stinger with <20 ms of latency, so not quite "sample accuracy". Still testing, but we may already be there. One thing that appeared to cause the end stinger cues to play late is when the game requests to stop the current Cue and the next exit point is not far enough away. After some experimentation, it looks best to skip an exit point and go to the next one if it is <0.5 seconds after the request.
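The exit-point rule described above (skip any exit point that falls less than 0.5 s after the stop request) can be sketched as plain C++. The function and names are illustrative, not engine API:

```cpp
#include <vector>

// Given sorted exit points (seconds into the track), the playback
// time at which the stop was requested, and a minimum lead time,
// pick the first exit point that can realistically be hit.
// Returns -1.0 if no usable exit point remains.
double PickExitPoint(const std::vector<double>& ExitPoints,
                     double RequestTime,
                     double MinLeadSeconds = 0.5) {
    for (double Exit : ExitPoints) {
        if (Exit >= RequestTime + MinLeadSeconds) {
            return Exit;  // far enough away to schedule the stinger
        }
    }
    return -1.0;  // too close to the end: no exit point usable
}
```

With exit points at 4, 8, and 12 seconds and a stop request at 7.8 s, the 8 s exit is only 0.2 s away, so it is skipped in favor of the 12 s exit.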


    Final question(s):

    If we switched to new audio engine now with 4.17:
    • Are things pretty much the same, stability-wise if we aren’t using any of the new plugins?
    • Will existing audio related BP or Sound Cue nodes change in functionality at all?
    Thanks
    I'd also be super interested in getting answers to pfaudio's questions, as I'm in a similar spot: trying to play two Sound Cues back to back and seeing ~18-20 ms of additional time (Sound Cue duration + said additional time) between starting a Sound Cue and receiving OnAudioFinished. I'm using a single AudioComponent on an otherwise empty actor, alternating between two Sound Cues when receiving the OnAudioFinished delegate in C++.

    What is the recommended way of playing e.g. a list of WAVs with as little delay as possible between one ending and the next one starting? Are Sound Cues still the way to go, or should I use a granular synth even though I'd play the source file start to finish?

    I'm currently on 4.18.3 but could move to a later version if really required. I'm not using the new Audio Plugin right now, but would switch to it if that's required to decrease the current 20 ms delay.

    An answer, or pointers in the right direction, would be really appreciated.
    Last edited by no7hing; 06-08-2018, 03:27 PM.
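One way to quantify the extra ~18-20 ms mentioned above is to compare each cue's wall-clock finish time against its expected finish (start time plus asset duration). A minimal sketch, with illustrative names, not engine API:

```cpp
// Measures scheduling latency: how late the finished callback fired
// relative to when the cue should have ended (start + duration).
// All times are in seconds.
double MeasureFinishLatency(double StartTime,
                            double CueDuration,
                            double FinishedTime) {
    return FinishedTime - (StartTime + CueDuration);
}
```

Logging this value per cue over a session makes it easy to see whether the gap is constant (a fixed scheduling cost) or drifts (accumulating per-cue delay).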



  • replied
    I managed to get it to compile! I forgot to add it to the AdditionalDependencies inside the .uproject file. Adding "Synthesis" in the .uproject file, to PublicDependencyModuleNames in .build.cs, and including "SynthComponents/EpicSynth1Component.h" got it to compile properly.
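For anyone hitting the same error, the three edits described in the post above look roughly like this. File and module names here are placeholders for your own project; verify the exact layout against your engine version:

```
// MyProject.Build.cs: add the Synthesis module to the dependency list
PublicDependencyModuleNames.AddRange(new string[] { "Synthesis" });

// MyProject.uproject: add it to the module's AdditionalDependencies
"Modules": [
    {
        "Name": "MyProject",
        "Type": "Runtime",
        "AdditionalDependencies": [ "Engine", "Synthesis" ]
    }
]

// In the C++ file that uses the synth component:
#include "SynthComponents/EpicSynth1Component.h"
```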



  • replied
    How would you get the UModularSynthComponent added to an Actor in C++? I've got everything set up in Blueprints already, but I wanted to bring the logic down to a lower level and the compiler can't find the header file. I tried adding "Synthesis" to my PublicDependencyModuleNames in the .build.cs file, but it still has issues finding the header files.



  • replied
    I just enabled this and it fixed all my android issues.



  • replied
    Originally posted by dan.reynolds View Post

    -audiomixer still works. But if you want it on by default for YOUR project, then you need to set up the config override. You can check ini files into your version control so that everyone gets them. I can't imagine how you're collaborating otherwise.
    Very easy: we only source-control the project itself, and everyone has to use the launcher engine (the packaged build is done from engine source, though).

    Thanks for your answer. Sadly I still have problems with Steam Audio on the current launcher version; maybe switching to engine source might be an idea now.



  • replied
    How do I complete the following step on a Mac? Do I use Terminal?

    Launching the Editor with the New Audio Engine on via Command Line:

    1. Open the project using command line flag: -audiomixer
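On a Mac this is done from Terminal by launching the editor binary directly with the flag and your project path. The paths below assume a default launcher install of 4.19; adjust the engine version and project path to your setup:

```
"/Users/Shared/Epic Games/UE_4.19/Engine/Binaries/Mac/UE4Editor.app/Contents/MacOS/UE4Editor" \
    "/path/to/YourProject/YourProject.uproject" -audiomixer
```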



  • replied
    Originally posted by Dudester01 View Post

    Thanks, Dan
    I know Blueprints and such. But I don't know, for example, what Note On and Note Off are, or how the sound side works; Google is silent on this matter.
    Note On and Note Off are functions called on the Modular Synth Component. They mimic MIDI Note On and Note Off events, which have been a standard for hardware synth communication since the 1980s.

    They start a synth note and end a synth note, respectively.
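As a toy model (plain C++, not the Modular Synth Component's actual API): Note On starts a voice for a given note and Note Off releases it, which is essentially all that MIDI-style note events carry:

```cpp
#include <cstddef>
#include <set>

// Toy model of MIDI-style note handling: NoteOn starts a voice,
// NoteOff releases it. A real synth would also track velocity,
// envelopes, and voice stealing; this only shows the event semantics.
class ToySynth {
public:
    void NoteOn(int MidiNote) { ActiveNotes.insert(MidiNote); }
    void NoteOff(int MidiNote) { ActiveNotes.erase(MidiNote); }
    bool IsPlaying(int MidiNote) const { return ActiveNotes.count(MidiNote) > 0; }
    std::size_t VoiceCount() const { return ActiveNotes.size(); }

private:
    std::set<int> ActiveNotes;  // notes currently sounding
};
```

For example, NoteOn(60) starts middle C and NoteOff(60) releases it; until the Note Off arrives, the voice keeps sounding.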



  • replied
    Originally posted by dan.reynolds View Post

    The white connections are EXECUTIONS.

    They come from EVENTS.

    I would recommend a generalized Blueprints tutorial, something like turning on and off a light or opening and closing a door. The gist is that an event happens and it executes a series of functions and macros. These happen in a specific order, a sequence determined by the order in which your functions are connected. Sometimes there are macros that alter the execution pathway (like a Branch or a ForEach, etc.) in those cases, you might have several possible execution pathways. How you construct these pathways determines your game logic.

    The question of what Event prompts a Note On for a Synth or a Set Frequency for a Filter is not one posed to me, that is up to you.

    Maybe the Note On plays when your BP interprets a MIDI Event, or maybe it plays when your character walks into a Trigger Volume. Maybe you set the Filter Frequency of your Source Effect when your player loses the game or when you enter a menu.

    The point is, I don't know what comes before those any more than you do.
    Thanks, Dan
    I know Blueprints and such. But I don't know, for example, what Note On and Note Off are, or how the sound side works; Google is silent on this matter.



  • replied
    Originally posted by Dudester01 View Post
    Good feature, but there are a lot of things in the guide that I don't understand, like in the modified screenshots that I attached.
    The guide doesn't go step by step, I don't see the connections, and there is no screenshot showing a complete Blueprint setup like in the MIDI tutorial.
    Sorry if my question is very basic; I just never had any experience with the sound side of things.
    Please help,
    Kind Regards
    The white connections are EXECUTIONS.

    They come from EVENTS.

    I would recommend a generalized Blueprints tutorial, something like turning on and off a light or opening and closing a door. The gist is that an event happens and it executes a series of functions and macros. These happen in a specific order, a sequence determined by the order in which your functions are connected. Sometimes there are macros that alter the execution pathway (like a Branch or a ForEach, etc.) in those cases, you might have several possible execution pathways. How you construct these pathways determines your game logic.

    The question of what Event prompts a Note On for a Synth or a Set Frequency for a Filter is not one posed to me, that is up to you.

    Maybe the Note On plays when your BP interprets a MIDI Event, or maybe it plays when your character walks into a Trigger Volume. Maybe you set the Filter Frequency of your Source Effect when your player loses the game or when you enter a menu.

    The point is, I don't know what comes before those any more than you do.



  • replied
    Good feature, but there are a lot of things in the guide that I don't understand, like in the modified screenshots that I attached.
    The guide doesn't go step by step, I don't see the connections, and there is no screenshot showing a complete Blueprint setup like in the MIDI tutorial.
    Sorry if my question is very basic; I just never had any experience with the sound side of things.
    Please help,
    Kind Regards
    Attached Files
    Last edited by Dudester01; 04-23-2018, 09:18 AM.



  • replied
    Originally posted by Th120 View Post
    Alright, so the -audiomixer parameter has no impact anymore?
    I am working with a small team, and for some of them even setting the right ini value in the launcher engine might be a challenge.
    -audiomixer still works. But if you want it on by default for YOUR project, then you need to set up the config override. You can check ini files into your version control so that everyone gets them. I can't imagine how you're collaborating otherwise.
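For reference, the config override mentioned here is a per-platform ini file in the project. On Windows, a sketch of what it looks like (verify the exact module name against your engine's own Engine/Config/Windows/WindowsEngine.ini; other platforms such as Mac use their own AudioMixer module name):

```
; YourProject/Config/Windows/WindowsEngine.ini
[Audio]
AudioDeviceModuleName=AudioMixerXAudio2
```

Because this file lives under the project's Config folder, checking it into version control enables the new audio engine for the whole team, launcher engine or not.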



  • replied
    Alright, so the -audiomixer parameter has no impact anymore?
    I am working with a small team, and for some of them even setting the right ini value in the launcher engine might be a challenge.



  • replied
    Originally posted by Th120 View Post
    Is the new audio engine already default @ 4.19?
    Is it possible to run it by default? The last time I tried it I had to start the editor with -audiomixer and after that open my project; I usually just open it by clicking on the project file (I tried putting the flag in the project file shortcut, but it did not work back then).
    It is not on by default; we want to road test it on Fortnite before we consider it properly released. That's just our due diligence. Follow the instructions at the top of the thread for adding the config file; this will make sure it's always on for your project.



  • replied
    Is the new audio engine already default @ 4.19?
    Is it possible to run it by default? The last time I tried it I had to start the editor with -audiomixer and after that open my project; I usually just open it by clicking on the project file (I tried putting the flag in the project file shortcut, but it did not work back then).

