New Audio Engine: Early Access Quick-Start Guide


  • replied
    Originally posted by dan.reynolds View Post

    Instead of using the Envelope Follower Source Effect, you can use the Envelope Follower Delegate that is available on each Audio Component. Your two characters just need to be speaking on separate Audio Components. For lip-flap implementation, you can just put an Audio Component on the head/face/mouth of each of your characters and use that Audio Component to drive your lip-flap. Then just make sure you play all your dialog sounds on that Audio Component.
    That's exactly the answer I needed, thank you so much!! I didn't realise it was possible to use the envelope follower this way.
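    For context on what the per-component approach buys you: each Audio Component runs its own envelope follower over the audio it plays, so two characters speaking on separate components produce independent envelope values. A minimal sketch of the underlying computation in plain C++ (this is not Unreal's implementation; `attackMs`/`releaseMs` are assumed parameter names):

```cpp
#include <cmath>

// One-pole envelope follower with separate attack and release smoothing,
// the same idea that drives per-Audio-Component envelope values.
struct EnvelopeFollower {
    float attackCoeff;
    float releaseCoeff;
    float envelope = 0.0f;

    EnvelopeFollower(float sampleRate, float attackMs, float releaseMs)
        : attackCoeff(std::exp(-1.0f / (sampleRate * attackMs * 0.001f))),
          releaseCoeff(std::exp(-1.0f / (sampleRate * releaseMs * 0.001f))) {}

    // Feed one sample; returns the smoothed amplitude (0..1 for normalized
    // input), suitable for driving a lip-flap blend weight.
    float Process(float sample) {
        const float rectified = std::fabs(sample);
        const float coeff = (rectified > envelope) ? attackCoeff : releaseCoeff;
        envelope = coeff * envelope + (1.0f - coeff) * rectified;
        return envelope;
    }
};
```

    Giving each character its own follower instance (one per Audio Component) is what keeps the mouths from flapping in sync.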



  • replied
    Originally posted by anti_zer0 View Post
    Hi, I'm trying to build a basic system to animate the mouth of my characters using the envelope follower.

    If I just have one character in a scene it works well, but if I have more than one then they all move their mouths together because they all use the same envelope follower preset.

    Is there a way I can automatically create a new envelope follower for each character? There's no limit to how many characters can be in a scene so all I can think of is to create a new one for each character like I am trying to do in this image, but it doesn't work at all!


    [Attached image: Annotation-20191106-141810.jpg]
    Instead of using the Envelope Follower Source Effect, you can use the Envelope Follower Delegate that is available on each Audio Component. Your two characters just need to be speaking on separate Audio Components. For lip-flap implementation, you can just put an Audio Component on the head/face/mouth of each of your characters and use that Audio Component to drive your lip-flap. Then just make sure you play all your dialog sounds on that Audio Component.



  • replied
    Hi, I'm trying to build a basic system to animate the mouth of my characters using the envelope follower.

    If I just have one character in a scene it works well, but if I have more than one then they all move their mouths together because they all use the same envelope follower preset.

    Is there a way I can automatically create a new envelope follower for each character? There's no limit to how many characters can be in a scene so all I can think of is to create a new one for each character like I am trying to do in this image, but it doesn't work at all!


    [Attached image: Annotation-20191106-141810.jpg]



  • replied
    Originally posted by RianStephens View Post
    Hi, I am trying to dynamically change the source effect settings with blueprints and the set settings node doesn't seem to be working for me. It seems that someone else has experienced a problem with this but I couldn't seem to find an answer. https://answers.unrealengine.com/que...dToView=925369

    Any help or information is greatly appreciated.
    What engine version is it? There was a temporary bug about this around 4.21 or so, but it's been fixed for a few versions!



  • replied
    Hi, I am trying to dynamically change the source effect settings with blueprints and the set settings node doesn't seem to be working for me. It seems that someone else has experienced a problem with this but I couldn't seem to find an answer. https://answers.unrealengine.com/que...dToView=925369

    Any help or information is greatly appreciated.



  • replied
    It's worth noting that for me the component did not have "auto activate" on by default, so no sound played.



  • replied
    that's so useful



  • replied
    Originally posted by Alan Edwardes View Post

    Fantastic, thanks for the explanation Dan. This sounds like a massive step forwards. I released an update switching my Early Access game Estranged: Act II over to the new audio engine yesterday (mainly to alleviate AMD compatibility issues). I had to re-work all of the reverb, but from what you're saying this was worth doing.
    Yes, it was definitely worth doing.

    With that switch over, you will also open up the ability to apply Source Effects or Submix Effects as well (so if you wanted different EQ settings on two different sources, you could do this now). Or if you wanted to add Master Bus Compression to your Master Submix or to just a group of sounds, this is now possible.



  • replied
    Originally posted by dan.reynolds View Post

    The reverb in the old engine was not supported on every platform, and on the platforms where it was supported, it sounded different on each, because the reverb was an effect provided by that platform. So on Windows and Xbox, it was Microsoft's own XAudio2 reverb effect. On PS4 it was Sony's NGS2 reverb. On Mac it was Apple's AudioUnit reverb.

    The new Unreal Audio Engine uses Epic's own reverb. It renders within our own DSP pipeline, and so it will sound exactly the same on every platform you deploy to--the old one will not.

    As far as redesigning the old Engine Content, we're aware of this issue and we're looking at the best approach to take.
    Fantastic, thanks for the explanation Dan. This sounds like a massive step forwards. I released an update switching my Early Access game Estranged: Act II over to the new audio engine yesterday (mainly to alleviate AMD compatibility issues). I had to re-work all of the reverb, but from what you're saying this was worth doing.



  • replied
    Originally posted by Alan Edwardes View Post
    Hi,

    I switched to the new audio engine and noticed that reverb (the "classic" reverb-asset based reverb that you can apply to audio volumes) is a lot more pronounced (more echo-y?) than with the stock audio engine.

    Is that expected, given that the algorithm is now different, or will the discrepancy be fixed in a future update?

    Edit: I should mention that some of the engine content reverb settings are too echo-y now, for example Engine/EngineSounds/ReverbSettings/Mountains

    Thanks!
    Alan
    The reverb in the old engine was not supported on every platform, and on the platforms where it was supported, it sounded different on each, because the reverb was an effect provided by that platform. So on Windows and Xbox, it was Microsoft's own XAudio2 reverb effect. On PS4 it was Sony's NGS2 reverb. On Mac it was Apple's AudioUnit reverb.

    The new Unreal Audio Engine uses Epic's own reverb. It renders within our own DSP pipeline, and so it will sound exactly the same on every platform you deploy to--the old one will not.

    As far as redesigning the old Engine Content, we're aware of this issue and we're looking at the best approach to take.



  • replied
    Hi,

    I switched to the new audio engine and noticed that reverb (the "classic" reverb-asset based reverb that you can apply to audio volumes) is a lot more pronounced (more echo-y?) than with the stock audio engine.

    Is that expected, given that the algorithm is now different, or will the discrepancy be fixed in a future update?

    Edit: I should mention that some of the engine content reverb settings are too echo-y now, for example Engine/EngineSounds/ReverbSettings/Mountains

    Thanks!
    Alan
    Last edited by Alan Edwardes; 07-23-2019, 04:34 PM.



  • replied
    Originally posted by ArthurBarthur View Post

    I did this today; it has several presets (in-game buttons for changing presets coming soon). The blueprint is very rough/elaborate and sorely needs more elegant solutions: https://youtu.be/aumCMQFEFoQ



    PS Nice guide! Appreciate it. Many many thanks!
    This is such a cool idea



  • replied
    Sorry if this has already been answered (I did a search). I want sidechain/envelope-follower-style ducking, not a "push mix/pop mix" thing, but dynamically ducking things. How would I go about that? The compressor has no sidechain input.
    But I'm guessing I don't really need that.
    All I want is to control a submix's audio volume based on the envelope follower output.
    How would I go about doing that?

    Also another case: modulating EQ frequencies based on distance.

    I have read about "Dynamically Setting Source Effect Values in Blueprints" at the beginning of this thread, but I'm pretty confused.
    A short YouTube video about these things would be super helpful, but I perfectly understand if that isn't being made.

    Thanks



  • replied
    Is anyone else having issues with hot reloading while using the new audio engine? My editor keeps crashing with this error:
    Exception thrown at 0x00007FFDD47D9349 (UE4Editor-AudioEditor.dll) in UE4Editor.exe: 0xC0000005: Access violation reading location 0xFFFFFFFFFFFFFFFF

    I have to close the editor, recompile, and then relaunch the editor every time I change any C++ code.



  • replied
    Originally posted by ArthurBarthur View Post

    That's a current bug in the latest launcher version (4.22.2 atm) of UE. If you wait for the next launcher version, or grab a fresh master off GitHub right now, dynamic source effect parameters should work normally again.
    Thanks for the info! Just updated to the latest hotfix 4.22.3, and updating the source effect values at runtime now works. Still ran into the issue where setting the values at begin play on a media sound component is buggy on the first launch. My workaround is just to use a 0.5 second delay before setting the values.

