New Audio Engine: Early Access Quick-Start Guide

  • replied
    Originally posted by rasamaya View Post
    For the Android and iOS .ini files, could I force mute if no headphones are detected? I saw a workaround for Unity, but can't get this to work with Unreal. I basically don't want audio to play, or just want zero volume, if no headphones are being used. Any help would be super.
    Hi Rasamaya!

    You will need to take advantage of some kind of device notification message. You will probably need to look into the APIs for the various platforms, as they will differ.

    You can create a mute button, though, and use the SoundMix system to set 0.0f volume on the Master SoundClass.

  • replied
    Originally posted by ArjunTheMiella View Post
    Hi there, quick question about patches:

    I've wired a very simple patch with Source->envelope and Destination->gain, and it works great; the envelope correctly affects the patched destination (in this case, overall gain). However, when I change the destination to Destination->osc 1 gain, it seems to have no effect. Is there something I'm missing? I'm seeing similar behavior with any of the individual osc parameters (gain, freq, etc.).

    I'm trying to use osc 1 and osc 2 to make a sort of 808-sounding bass kick, where there's the clicky sound (noise in osc 1) and the resonating bass sound (sine in osc 2). To do that, I'm trying to apply different ADSR envelopes to each oscillator for the two different parts of the sound. I assume that changing the gain on osc 1 independently of osc 2 is possible; otherwise I don't see why there would be a distinction between osc 1 and 2 in the patch destination dropdown.

    I'm very new to synthesis but have been doing lots of outside reading to learn the basics. Is there one simple step that I'm missing, or perhaps a parameter that is obviously set incorrectly that I wouldn't know about?

    Thanks,

    - Arj
    Hi Arj!

    Yeah, the patch system can get a bit weedy. When I made my drum kit for our GDC floor demo, I conceded to using two synthesizers per kit piece. A bit pricier, but it was way easier to program.

  • replied
    Originally posted by Tomavatars View Post
    After investigation, it is due to the start/stop behavior. You need to use the Stop node if you want to avoid the editor freeze, which is tricky.
    Hi Tomavatars! Thanks for the report; I'll ask Ethan if he has an idea about this!

  • replied
    After investigation, it is due to the start/stop behavior. You need to use the Stop node if you want to avoid the editor freeze, which is tricky.

  • replied
    Hey Dan!
    I just upgraded to 4.18 on Mac. When I play a working synth patch in the editor and then stop the game, it freezes the editor. Am I doing something wrong, or is it a Mac-only problem?
    Thanks!

  • replied
    Hi there, quick question about patches:

    I've wired a very simple patch with Source->envelope and Destination->gain, and it works great; the envelope correctly affects the patched destination (in this case, overall gain). However, when I change the destination to Destination->osc 1 gain, it seems to have no effect. Is there something I'm missing? I'm seeing similar behavior with any of the individual osc parameters (gain, freq, etc.).

    I'm trying to use osc 1 and osc 2 to make a sort of 808-sounding bass kick, where there's the clicky sound (noise in osc 1) and the resonating bass sound (sine in osc 2). To do that, I'm trying to apply different ADSR envelopes to each oscillator for the two different parts of the sound. I assume that changing the gain on osc 1 independently of osc 2 is possible; otherwise I don't see why there would be a distinction between osc 1 and 2 in the patch destination dropdown.

    I'm very new to synthesis but have been doing lots of outside reading to learn the basics. Is there one simple step that I'm missing, or perhaps a parameter that is obviously set incorrectly that I wouldn't know about?

    Thanks,

    - Arj

  • replied
    For the Android and iOS .ini files, could I force mute if no headphones are detected? I saw a workaround for Unity, but can't get this to work with Unreal. I basically don't want audio to play, or just want zero volume, if no headphones are being used. Any help would be super.

  • replied
    Originally posted by mindridellc View Post

    Hi Dan,

    So, if I understand you correctly, it's not really possible to manually specify the output device of the Unreal audio engine (new OR old) in VR Preview. I've booted my project both with and without the -audiomixer flag, and changing the device setting as you showed in your screenshot never stopped piping Modular Synth Component output and Play Sound 2D output to my Rift so long as I was previewing in VR.

    Maybe it's different with Vive, but this seems to be how the Rift behaves no matter what. Maybe I'll have to use a virtual mixer like JACK or VoiceMeeter to intercept the Rift audio output on the way to the monitor output.
    Hi Mindridellc,

    So the Modular Synth should only work in the new Audio Engine, and after discussing this with Minus_Kelvin, it looks like this is something that still needs to be worked out in the new Audio Engine. With that said, specifying the Windows Target Device in your Project Settings should override the HMD output in the old Audio Engine. Are you certain you have checked this in the old Audio Engine as well?

  • replied
    Originally posted by dan.reynolds View Post

    Hey Mindridellc! Are you using the new Audio Engine or the old Audio Engine? (-audiomixer)

    And you're saying that when you do VR Preview, the synth is coming out of your Rift and when you do regular PIE it comes out of your Windows default?

    And you say that when you do VR Preview, sounds being used by another engine come out of your Windows Default and NOT the Rift?


    The expectation from Unreal Engine's perspective is that if you're doing a VR Preview, the audio should come out of your Rift's headphones (or whatever VR device you have active), and that regular PIE should come out of whatever your computer default is OR whatever you've set as your default output for your Windows Project Settings:

    [Attachment: screenshot of the Windows Project Settings audio output device option]
    Hi Dan,

    So, if I understand you correctly, it's not really possible to manually specify the output device of the Unreal audio engine (new OR old) in VR Preview. I've booted my project both with and without the -audiomixer flag, and changing the device setting as you showed in your screenshot never stopped piping Modular Synth Component output and Play Sound 2D output to my Rift so long as I was previewing in VR.

    Maybe it's different with Vive, but this seems to be how the Rift behaves no matter what. Maybe I'll have to use a virtual mixer like JACK or VoiceMeeter to intercept the Rift audio output on the way to the monitor output.

  • replied
    Very nice!
    But I have a problem. I'm using it in an Android app, and when I suspend the app, the music keeps running in the background. I need to kill the app process to stop the music. Can anybody help?

  • replied
    Originally posted by dan.reynolds View Post
    Aaron and I gave a presentation a few months ago to PA Now about how to get into using Blueprints to create Procedural Audio and Aaron's portion of the discussion is focused on how to transition from using Pure Data or MAX/MSP to using Blueprints:

    https://www.youtube.com/embed/auh342KUUoM
    Fantastic presentation, very impressive.
    Still, we can't do much dynamic content with timelines without sample-accurate timing. As a heavy REAKTOR user, I'm very excited.

  • replied
    Originally posted by chrisjdarcy View Post
    Could this be used in a similar way to Pure Data for sound design? I'd love to convert my patches to this.
    Hey ChrisJDarcy,

    Aaron and I gave a presentation a few months ago to PA Now about how to get into using Blueprints to create Procedural Audio and Aaron's portion of the discussion is focused on how to transition from using Pure Data or MAX/MSP to using Blueprints:

  • replied
    Could this be used in a similar way to Pure Data for sound design? I'd love to convert my patches to this.

  • replied
    Sorry, that stuff about the Start node is a load of rubbish. Long day. An Activate node has to be used if Auto Activate has been disabled (obviously); a Start node needs to be used on BeginPlay (or whatever) before you do anything else, so that the synth will be ready to make a sound when instructed to, but you need to trigger the NoteOn to instruct it.

  • replied
    The Start node is only for activation of the Modular Synth if the "Auto Activate" option in the Details pane has been unticked. I think it's ticked by default when you add a Modular Synth to your Blueprint, so using a Start node will have no effect.

    Try setting your Synth Preset on BeginPlay (or some other game environment-driven event), deleting the Start, then hooking up your keypress event to the NoteOn.

    If you want the frequency of the tone changed *directly* by a variable, you might try mapping that variable to pitchbend. What you're doing here isn't starting the synth with Osc 1's frequency set to 2000Hz; NoteOn sets the oscillator's base frequency by converting the incoming MIDI note to Hz, and then preset or modulator data is factored in to account for semitone/octave/LFO adjustment. SetOscFrequencyMod is intended for FM synthesis, and while I haven't tried this out yet, I think it needs to work in conjunction with another modulation source, whereas this just sets the modulation scale factor.

    Also bear in mind that while oscillators in the preset structure are numbered from 1, I *think* that you have to count them from 0 when addressing them in relevant BP nodes. Don't quote me on that, though!
