New Audio Engine: Early Access Quick-Start Guide

    #76
    Never mind, I didn't see the Source Effect (Some Effect) Object type before, only the Source Effect (Some Effect) Preset type.

    Comment


      #77
      Yes, they are globally controlled at the moment, but we're looking at ways we can modulate instances of the Source Effects without having to create a bunch of Instanced assets (like with Materials).
      Dan Reynolds
      Technical Sound Designer || Unreal Audio Engine Dev Team

      Comment


        #78
        Hi Dan, another quick one for you. Is it possible to control which audio source the Modular Synth Component outputs to? Whenever I test my patch while in VR, the rest of my audio (which is coming from a different engine entirely, never mind that) is coming out of whatever my Windows System Setting is (my monitor) while just the Synthesizer component is playing only on my Rift headphones. If I PIE, it comes out of the monitor as expected.

        Thanks

        Comment


          #79
          Originally posted by mindridellc View Post
          Hi Dan, another quick one for you. Is it possible to control which audio source the Modular Synth Component outputs to? Whenever I test my patch while in VR, the rest of my audio (which is coming from a different engine entirely, never mind that) is coming out of whatever my Windows System Setting is (my monitor) while just the Synthesizer component is playing only on my Rift headphones. If I PIE, it comes out of the monitor as expected.

          Thanks
          Hey Mindridellc! Are you using the new Audio Engine or the old Audio Engine? (-audiomixer)

          And you're saying that when you do VR Preview, the synth is coming out of your Rift and when you do regular PIE it comes out of your Windows default?

          And you say that when you do VR Preview, sounds being used by another engine come out of your Windows Default and NOT the Rift?


          The expectation from Unreal Engine's perspective is that if you're doing a VR Preview, the audio should come out of your Rift's headphones (or whatever VR device you have active), and that regular PIE should come out of whatever your computer default is OR whatever you've set as your default output for your Windows Project Settings:

          [Attached image: WindowsAudioDeviceProjectSettings.JPG — the Windows Audio Device setting in Project Settings]
          Dan Reynolds
          Technical Sound Designer || Unreal Audio Engine Dev Team

          Comment


            #80
            Originally posted by dan.reynolds View Post
            Yes, they are globally controlled at the moment, but we're looking at ways we can modulate instances of the Source Effects without having to create a bunch of Instanced assets (like with Materials).
            This would be great. Right now I have a Blueprint Actor which, when one of its variables is set on some type, it uses the Effect Chain, and when that variable is set to another type, it bypasses the chain. Swapping it out at runtime is not an option, since the Effect Chain is global, so one instance of that object's message to the chain overrides the other.

            Comment


              #81
              Originally posted by mindridellc View Post
              This would be great. Right now I have a Blueprint Actor which, when one of its variables is set on some type, it uses the Effect Chain, and when that variable is set to another type, it bypasses the chain. Swapping it out at runtime is not an option, since the Effect Chain is global, so one instance of that object's message to the chain overrides the other.
              Either making this instanceable, or making the ability to set the chain per Actor scriptable in Blueprint would work for me. I'm sure there are other workarounds I can do in the meantime, I just need to approach it with a fresh head, heh.

              Comment


                #82
                Hi @dan.reynolds,

                In your GDC presentation, you had some materials that seemed to be influenced by audio waveforms. I'm trying to build a HAL 9000-esque interface that will modify color & size based upon spoken dialogue.

                Has anything like that been exposed in these 4.16 experimental features?

                Thanks!

                Comment


                  #83
                  Not Dan here. Do you have the spoken dialogue ready as an audio file, or do you need it to react to the user's voice live? If it's audio files, you can do it in Blueprints: set up the 'Envelope Follower' source effect. Instructions are in the first or second post of this thread.
                  Live voice is trickier (for now... dun-dun-duuun), but if you're comfortable with C++ you can do it.

                  Have fun!
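                  (Editor's note: for anyone curious what an envelope follower actually does under the hood, here is a minimal standalone C++ sketch of the standard DSP idea — rectify the signal, then smooth it with separate attack and release coefficients. This is not the engine's implementation, just the textbook technique the source effect is named after; sample rate and timing values are illustrative.)

```cpp
#include <cmath>

// Minimal one-pole envelope follower sketch.
// Rectify the input sample, then smooth it: a fast "attack"
// coefficient when the signal is rising, a slower "release"
// coefficient when it is falling.
struct EnvelopeFollower
{
    float Env = 0.0f;
    float AttackCoeff;
    float ReleaseCoeff;

    EnvelopeFollower(float SampleRate, float AttackMs, float ReleaseMs)
        : AttackCoeff(std::exp(-1.0f / (SampleRate * AttackMs * 0.001f)))
        , ReleaseCoeff(std::exp(-1.0f / (SampleRate * ReleaseMs * 0.001f)))
    {
    }

    // Call once per audio sample; returns the current envelope value,
    // roughly 0..1 for full-scale input. This is the kind of value you
    // would feed into a material parameter each tick.
    float Process(float Sample)
    {
        const float Rectified = std::fabs(Sample);
        const float Coeff = (Rectified > Env) ? AttackCoeff : ReleaseCoeff;
        Env = Coeff * Env + (1.0f - Coeff) * Rectified;
        return Env;
    }
};
```

In-engine you wouldn't write this yourself — the Envelope Follower source effect computes it for you and the Blueprint route above is enough — but seeing the math makes it clear why the output is a smooth 0..1 control signal rather than the raw waveform.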

                  Comment


                    #84
                    Originally posted by kjekkbart View Post
                    Not Dan here. Do you have the spoken dialogue ready as an audio file, or do you need it to react to the user's voice live? If it's audio files, you can do it in Blueprints: set up the 'Envelope Follower' source effect. Instructions are in the first or second post of this thread.
                    Live voice is trickier (for now... dun-dun-duuun), but if you're comfortable with C++ you can do it.

                    Have fun!
                    Oh, that's perfect! Yes, I'm using audio files and, somehow, just completely missed this. Thanks so much!

                    Comment


                      #85
                      @SVR33

                      Yup, we used the Envelope Follower to drive Instanced Material Parameters to create a reactive effect.
                      Last edited by dan.reynolds; 08-04-2017, 04:28 PM. Reason: Spelling
                      Dan Reynolds
                      Technical Sound Designer || Unreal Audio Engine Dev Team

                      Comment


                        #86
                        Originally posted by dan.reynolds View Post
                        @SVR33

                        Yup, we used the Envelope Follower to drive Instanced Material Parameters to create a reactive effect.
                        I got it working! I'm really psyched. Thanks!

                        Comment


                          #87
                          When I make one Pawn and one Actor that each have their own Modular Synth and try to run it, Unreal crashes.

                          I am using the same preset bank for both, though.

                          Am I not able to create more than one instance of a Modular Synth?

                          If so, is there a good way for many actors to have their own preset playing from one Modular Synth simultaneously?

                          Thanks.

                          Comment


                            #88
                            You can have many, everywhere!
                            Just tried it with a couple of actors and a pawn, each playing a note on their own synth. Works great!
                            Maybe the problem is in your preset setup or somewhere else?

                            Comment


                              #89
                              Thanks kjekkbart... good to know. I copied some objects from one synth to the other, so I'm going to rebuild the second one from scratch and see how it goes.

                              Comment


                                #90
                                Hi, Dan.
                                You've made some super good things for us.
                                I have a simple question...
                                How do I do this step: '1. Open the project using command line flag: -audiomixer'?
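                                (Editor's note: one common way to pass a command-line flag to the editor is to launch it from a command prompt or a shortcut with the flag appended. This is a sketch — the install path and project name below are placeholders; substitute your own.)

```shell
REM Launch the editor with the new audio engine enabled (UE 4.16 early access).
REM Both paths are placeholders; point them at your engine install and .uproject.
"C:\Program Files\Epic Games\UE_4.16\Engine\Binaries\Win64\UE4Editor.exe" "C:\Projects\MyProject\MyProject.uproject" -audiomixer
```

You can also add `-audiomixer` to the Target field of a Windows shortcut that opens the editor, which saves retyping it each session.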

                                Comment
