New Audio Engine: Early Access Quick-Start Guide


    #91
    Originally posted by vvlay9090 View Post
    Hi, Dan.
    You made a super good thing for us.
    I have a simple question...
    How do I do this step: '1. Open the project using command line flag: -audiomixer'?
    vvlay9090, good question. You can open the project using a command prompt, OR, if you are using the GitHub version, you can enter the command line into your Visual Studio project configuration to ensure that it loads the Audio Mixer module.

    You can learn more about command prompts here:
    https://docs.unrealengine.com/latest...LineArguments/
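
    For example (the project name here is just a placeholder), launching the editor from a command prompt could look something like:
    UE4Editor.exe YourProject.uproject -audiomixer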
    Dan Reynolds
    Technical Sound Designer || Unreal Audio Engine Dev Team

    Comment


      #92
      Originally posted by dan.reynolds View Post

      vvlay9090, good question. You can open the project using a command prompt, OR, if you are using the GitHub version, you can enter the command line into your Visual Studio project configuration to ensure that it loads the Audio Mixer module.

      You can learn more about command prompts here:
      https://docs.unrealengine.com/latest...LineArguments/
      Thanks, Dan. Gotcha.

      Comment


        #93
        Is there a way to get the frequency of a certain spectrum range on tick? I want to be able to pick out the brilliance from the sound.

        Comment


          #94
          Originally posted by Onarbest View Post
          Is there a way to get the frequency of a certain spectrum range on tick? I want to be able to pick out the brilliance from the sound.
          Maybe use a high-pass filter? If you need to hear the rest of the sound too, duplicate the sound but give the duplicate a fitting low-pass filter so that together they sound good. If you're using it with an envelope follower it can cause complications, though, because we don't have dummy channels yet (I think), so the envelope follower can only react to stuff the player hears. So the split would be necessary, depending on your case.
          Last edited by ArthurBarthur; 08-27-2017, 10:26 AM.
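
          As a generic, engine-agnostic illustration of that idea (not the actual Blueprint node setup, and with the cutoff only a rough guess at where "brilliance" starts), a high-pass filter feeding an envelope follower could be sketched in C++ like this:

#include <cmath>

// One-pole high-pass filter followed by a peak envelope follower.
// Feed it audio samples; poll GetEnvelope() once per tick to read the
// level of the high band ("brilliance").
struct HighBandEnvelope
{
    float SampleRate = 48000.0f;
    float CutoffHz   = 6000.0f;   // rough guess at where "brilliance" starts
    float ReleaseSec = 0.05f;     // how quickly the envelope falls back down

    float PrevIn = 0.0f, PrevHp = 0.0f, Envelope = 0.0f;

    void ProcessSample(float In)
    {
        // High-pass: y[n] = a * (y[n-1] + x[n] - x[n-1])
        const float Dt    = 1.0f / SampleRate;
        const float RC    = 1.0f / (2.0f * 3.14159265f * CutoffHz);
        const float Alpha = RC / (RC + Dt);
        const float Hp    = Alpha * (PrevHp + In - PrevIn);
        PrevIn = In;
        PrevHp = Hp;

        // Envelope follower: instant attack, exponential release.
        const float Rectified = std::fabs(Hp);
        const float Release   = std::exp(-Dt / ReleaseSec);
        Envelope = (Rectified > Envelope) ? Rectified : Envelope * Release;
    }

    float GetEnvelope() const { return Envelope; }
};

          Polling GetEnvelope() once per tick would give a smoothed level for that band, which is roughly what the filter-plus-envelope-follower split above achieves.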

          Comment


            #95
            Originally posted by kjekkbart View Post

            Maybe use a high-pass filter? If you need to hear the rest of the sound too, duplicate the sound but give the duplicate a fitting low-pass filter so that together they sound good. If you're using it with an envelope follower it can cause complications, though, because we don't have dummy channels yet (I think), so the envelope follower can only react to stuff the player hears. So the split would be necessary, depending on your case.
            Thanks a lot for the quick reply, gonna try that ASAP!

            Comment


              #96
              Hi, I have a question about working with the modular synth. If there's a better place to ask it, please direct me.

              My first issue is that I don't seem to understand how the "Start" node works. The description says "starts the synth generating audio," but in my blueprint it doesn't seem to be doing anything. The only way I can get the synth to make sound is if I use the "Note On" node, which I think I'd rather not do.

              The second issue, probably stemming from the first, is that I can't get the "Set Osc Frequency Mod" node to affect the sound. I assume this is related to the fact that so far I can only hear audio when a specific MIDI note is being played. Ultimately, my goal is to have the frequency of the tone change based on a variable.

              I assumed that "starting" the synth with the Osc Frequency set to 2000 would generate a 2000 Hz tone until I told it to stop, but clearly I'm not getting something. I don't have much experience with modular synthesizers, so maybe that ignorance is showing.

              In my Blueprint, the Set Osc Frequency Mod Node is based on a tick event and the Set Synth Preset (and thus Start) is set to a button press event. Note On works if I connect it to Start.

              Thanks!

              [Attached image: synth blueprint.png]

              Comment


                #97
                The Start node is only for activation of the Modular Synth if the "Auto Activate" option in the Details pane has been unticked. I think it's ticked by default when you add a Modular Synth to your Blueprint, so using a Start node will have no effect.

                Try setting your Synth Preset on BeginPlay (or some other game environment-driven event), deleting the Start, then hooking up your keypress event to the NoteOn.

                If you want the frequency of the tone changed *directly* by a variable, you might try mapping that variable to pitchbend. What you're doing here isn't starting the synth with Osc 1's frequency set to 2000Hz; NoteOn sets the oscillator's base frequency by converting the incoming MIDI note to Hz, then preset or modulator data is factored in to account for semitone/octave/LFO adjustment. SetOscFrequencyMod is intended for FM synthesis, and while I haven't tried this out yet, I think it needs to work in conjunction with another modulation source whereas this just sets the modulation scale factor.

                Also bear in mind that while oscillators in the preset structure are numbered from 1, I *think* that you have to count them from 0 when addressing them in relevant BP nodes. Don't quote me on that, though!
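
                (For reference, the MIDI-note-to-Hz conversion mentioned above is normally the standard equal-temperament mapping, frequency = 440 * 2^((note - 69) / 12), so MIDI note 69 is 440 Hz and each semitone up multiplies the frequency by 2^(1/12).)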

                Comment


                  #98
                  Sorry, that stuff about the Start node is a load of rubbish. Long day. An Activate node has to be used if Auto-activate has been disabled (obviously); a Start node needs to be used on BeginPlay (or whatever) before you do anything else, so that the synth will be ready to make a sound when instructed to, but you need to trigger the NoteOn to instruct it.
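
                  Putting the corrected advice together, here is a rough C++ sketch of the same flow (preset and Start on BeginPlay, NoteOn on a key press, and a variable mapped to pitch bend on Tick). The class, header path, and exact signatures are my best guesses at the C++ counterparts of the Blueprint nodes discussed here, so treat it as a sketch rather than a reference:

// Sketch only: mirrors the Blueprint flow discussed above (preset + Start on
// BeginPlay, NoteOn on a key press, pitch bend driven by a variable on Tick).
// Header path and exact signatures may differ by engine version.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SynthComponents/EpicSynth1Component.h" // UModularSynthComponent (Synthesis plugin)
#include "SynthToneActor.generated.h"

UCLASS()
class ASynthToneActor : public AActor
{
    GENERATED_BODY()

public:
    ASynthToneActor()
    {
        PrimaryActorTick.bCanEverTick = true;
        Synth = CreateDefaultSubobject<UModularSynthComponent>(TEXT("ModularSynth"));
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        Synth->SetSynthPreset(Preset); // set the patch first...
        Synth->Start();                // ...then Start so the synth is ready to sound
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // Map a gameplay variable onto pitch bend, in semitones relative to the
        // held note (A4 = MIDI 69 = 440 Hz). If the engine expects a normalized
        // pitch-bend value instead, divide by the configured pitch-bend range.
        const float Semitones = 12.0f * FMath::Log2(TargetFrequencyHz / 440.0f);
        Synth->SetPitchBend(Semitones);
    }

    // Call from a key-press event: NoteOn is what actually makes the synth audible.
    UFUNCTION(BlueprintCallable, Category = "Synth")
    void PlayTone()
    {
        Synth->NoteOn(69.0f, 127, -1.0f); // MIDI note 69 (440 Hz), full velocity, held until NoteOff
    }

    UPROPERTY(VisibleAnywhere)
    UModularSynthComponent* Synth = nullptr;

    UPROPERTY(EditAnywhere, Category = "Synth")
    FModularSynthPreset Preset;

    UPROPERTY(EditAnywhere, Category = "Synth")
    float TargetFrequencyHz = 440.0f;
};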

                  Comment


                    #99
                    Could this be used in a similar way to Pure Data for sound design? I'd love to convert my patches to this.

                    Comment


                      Originally posted by chrisjdarcy View Post
                      Could this be used in a similar way to Pure Data for sound design? I'd love to convert my patches to this.
                      Hey ChrisJDarcy,

                      Aaron and I gave a presentation a few months ago to PA Now about how to get into using Blueprints to create Procedural Audio, and Aaron's portion of the discussion is focused on how to transition from using Pure Data or MAX/MSP to using Blueprints:
                      https://www.youtube.com/embed/auh342KUUoM

                      Dan Reynolds
                      Technical Sound Designer || Unreal Audio Engine Dev Team

                      Comment


                        Originally posted by dan.reynolds View Post
                        Aaron and I gave a presentation a few months ago to PA Now about how to get into using Blueprints to create Procedural Audio, and Aaron's portion of the discussion is focused on how to transition from using Pure Data or MAX/MSP to using Blueprints:

                        https://www.youtube.com/embed/auh342KUUoM
                        Fantastic presentation, very impressive.
                        Still, we can't do much dynamic content with timelines without sample-accurate timing. As a heavy REAKTOR user, I'm very excited.

                        Comment


                          Very nice!
                           But I have a problem. I'm using it in an Android app, and when I suspend the app the music keeps running in the background. I need to kill the app process to kill the music. Can anybody help?
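
                           (One approach worth trying, which is not from this thread: hook Unreal's application lifecycle delegates in C++ and pause the music while the app is in the background. A minimal sketch, assuming the music plays on a UAudioComponent that a hypothetical game instance class keeps a reference to:)

// Sketch: pause music when the app is backgrounded and resume it on return.
// Assumes a UAudioComponent* MusicComponent owned by this game instance.
#include "CoreMinimal.h"
#include "Engine/GameInstance.h"
#include "Components/AudioComponent.h"
#include "Misc/CoreDelegates.h"
#include "MyGameInstance.generated.h"

UCLASS()
class UMyGameInstance : public UGameInstance
{
    GENERATED_BODY()

public:
    virtual void Init() override
    {
        Super::Init();
        // Fired when the app is suspended (e.g. Android home button) and resumed.
        FCoreDelegates::ApplicationWillEnterBackgroundDelegate.AddUObject(
            this, &UMyGameInstance::OnEnterBackground);
        FCoreDelegates::ApplicationHasEnteredForegroundDelegate.AddUObject(
            this, &UMyGameInstance::OnEnterForeground);
    }

    void OnEnterBackground()
    {
        if (MusicComponent) { MusicComponent->SetPaused(true); }
    }

    void OnEnterForeground()
    {
        if (MusicComponent) { MusicComponent->SetPaused(false); }
    }

    UPROPERTY()
    UAudioComponent* MusicComponent = nullptr;
};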

                          Comment


                            Originally posted by dan.reynolds View Post

                            Hey Mindridellc! Are you using the new Audio Engine or the old Audio Engine? (-audiomixer)

                            And you're saying that when you do VR Preview, the synth is coming out of your Rift and when you do regular PIE it comes out of your Windows default?

                            And you say that when you do VR Preview, sounds being used by another engine come out of your Windows Default and NOT the Rift?


                            The expectation from Unreal Engine's perspective is that if you're doing a VR Preview, the audio should come out of your Rift's headphones (or whatever VR device you have active), and that regular PIE should come out of whatever your computer default is OR whatever you've set as your default output for your Windows Project Settings:

                            [Attached screenshot: audio device setting under Windows Project Settings]
                            Hi Dan,

                            So, if I understand you correctly, it's not really possible to manually specify the output device of the Unreal audio engine (new OR old) in VR Preview. I've booted my project both with and without the -audiomixer flag, and changing the device setting as you showed in your screenshot never stopped piping Modular Synth Component output and Play Sound 2D output to my Rift so long as I was previewing in VR.

                            Maybe it's different with Vive, but this seems to be how the Rift behaves no matter what. Maybe I'll have to use a virtual mixer like JACK or VoiceMeeter to intercept the Rift audio output on the way to the monitor output.

                            Comment


                              Originally posted by mindridellc View Post

                              Hi Dan,

                              So, if I understand you correctly, it's not really possible to manually specify the output device of the Unreal audio engine (new OR old) in VR Preview. I've booted my project both with and without the -audiomixer flag, and changing the device setting as you showed in your screenshot never stopped piping Modular Synth Component output and Play Sound 2D output to my Rift so long as I was previewing in VR.

                              Maybe it's different with Vive, but this seems to be how the Rift behaves no matter what. Maybe I'll have to use a virtual mixer like JACK or VoiceMeeter to intercept the Rift audio output on the way to the monitor output.
                              Hi Mindridellc,

                              So the Modular Synth should only work in the new Audio Engine, and after discussing this with Minus_Kelvin, it looks like this is something that still needs to be worked out in the new Audio Engine. With that said, specifying the Windows Target Device in your project settings should override the HMD output in the old Audio Engine. Are you certain you have checked the old Audio Engine as well?
                              Dan Reynolds
                              Technical Sound Designer || Unreal Audio Engine Dev Team

                              Comment


                                For the Android and iOS .ini files, could I force mute if no headphones are detected? I saw a workaround for Unity, but can't get this to work with Unreal. I basically don't want audio to play (or just have zero volume) if no headphones are being used. Any help would be super.

                                Comment
