Modular Synth Widget and Example Project


  • #1


    Hey Everyone,

    I wanted to get to grips with the Modular Synth, so while we wait for an official example I wrote a widget interface similar to the one shown at GDC 2017. It allows configuring and saving presets, but doesn't support playing MIDI files or playing from an external source (e.g. a keyboard).

    The source for an example project is available here: https://github.com/graysan/ModularSynthUI

    The example project plays two different notes in a loop by default, driven from within the SynthActor BP. The synth uses its default settings until a preset is saved; if you want to inspect the preset details after saving, open the Preset Bank in the editor.
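
    For anyone who'd rather drive the loop from C++ instead of the SynthActor BP, a minimal sketch might look like the following. This assumes the Synthesis plugin's UModularSynthComponent API; the class, header path, and timer setup here are illustrative, not code from the project:

    Code:
    // SynthNoteLooper.h -- hypothetical actor that alternates two notes on a timer.
    // Assumes the Synthesis plugin is enabled and its module is in your Build.cs.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "TimerManager.h"
    #include "SynthComponents/EpicSynth1Component.h" // UModularSynthComponent
    #include "SynthNoteLooper.generated.h"

    UCLASS()
    class ASynthNoteLooper : public AActor
    {
        GENERATED_BODY()

    public:
        ASynthNoteLooper()
        {
            Synth = CreateDefaultSubobject<UModularSynthComponent>(TEXT("Synth"));
        }

    protected:
        virtual void BeginPlay() override
        {
            Super::BeginPlay();
            // Fire every half second, forever, alternating between two notes.
            GetWorldTimerManager().SetTimer(NoteTimer, this, &ASynthNoteLooper::PlayNextNote, 0.5f, true);
        }

        void PlayNextNote()
        {
            // MIDI note numbers: 60 = middle C, 67 = the G above it (arbitrary pair).
            const float Note = bToggle ? 60.0f : 67.0f;
            bToggle = !bToggle;
            Synth->NoteOn(Note, /*Velocity=*/100, /*Duration=*/0.4f);
        }

        UPROPERTY(VisibleAnywhere)
        UModularSynthComponent* Synth;

        FTimerHandle NoteTimer;
        bool bToggle = false;
    };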

    I haven't played with synths before, nor used UMG, so it's pretty much bare bones. Feel free to use it as you wish, and if you make any improvements to the project please commit the changes back.

    Known Issues:

    1) The Reverb dial adjusts the preset's master reverb send amount but appears to have no audible effect. Probably something simple; maybe you can solve it.
    2) I wasn't able to create an executable, as any build with a ModularSynth component fails to cook on my machine. Not sure if it's a universal issue or just local: https://answers.unrealengine.com/que...k-failure.html

    Thanks!

    [Image: ModularSynthUMGWidget.jpg]
    ModularSynthUI - UE4.17.1 Example Project with ModularSynth Widget

  • #2
    Hah! Awesome!

    It's a good exercise in learning both -- if you want to ensure a solid grasp, recreating the full functionality would be a great way to do it!

    Hint: for the GDC MIDI playback, I built the playback system in Blueprints by converting my MIDI files to CSV (using a free tool called mid2csv), importing the results as DataTables, and creating a sequence player that fires off MIDI-like events to the synth each frame.
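
    In outline it was something like this. Treat it as an illustrative sketch: the row layout, names, and dispatch below are made up for the example, not the demo's actual code:

    Code:
    // One DataTable row per MIDI event (the CSV from mid2csv maps onto this).
    #pragma once

    #include "CoreMinimal.h"
    #include "Engine/DataTable.h"
    #include "SynthComponents/EpicSynth1Component.h"
    #include "MidiEventRow.generated.h"

    USTRUCT(BlueprintType)
    struct FMidiEventRow : public FTableRowBase
    {
        GENERATED_BODY()

        UPROPERTY(EditAnywhere, BlueprintReadOnly) float TimeSeconds = 0.0f; // offset from song start
        UPROPERTY(EditAnywhere, BlueprintReadOnly) int32 Note = 60;          // MIDI note number
        UPROPERTY(EditAnywhere, BlueprintReadOnly) int32 Velocity = 100;
        UPROPERTY(EditAnywhere, BlueprintReadOnly) bool  bNoteOn = true;     // false => note-off
    };

    // Call once per frame: fires every event whose timestamp falls inside the
    // window [Cursor, Cursor + DeltaTime), then advances the cursor.
    inline void DispatchMidiWindow(const TArray<FMidiEventRow*>& Events, float& Cursor,
                                   float DeltaTime, UModularSynthComponent* Synth)
    {
        const float WindowEnd = Cursor + DeltaTime;
        for (const FMidiEventRow* Event : Events)
        {
            if (Event->TimeSeconds >= Cursor && Event->TimeSeconds < WindowEnd)
            {
                if (Event->bNoteOn) { Synth->NoteOn((float)Event->Note, Event->Velocity); }
                else                { Synth->NoteOff((float)Event->Note); }
            }
        }
        Cursor = WindowEnd;
    }

    You'd fill the Events array once up front (e.g. via UDataTable::GetAllRows) and keep one cursor per sequence.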
    Dan Reynolds
    Technical Sound Designer || Unreal Audio Engine Dev Team



    • #3
      Dan - this is huge, thank you! I'm in the early stages of writing my own Node.js parser to turn MIDI files into a custom format I can stuff into UE structs, but now I think I'll abandon that and concentrate on DataTables!

      I'd been really struggling to get the Procedural MIDI plugin's file reader to keep accurate time, even when pre-physics tick priority was set and not much else was going on in my level. Did you have any timekeeping issues with your technique? If so, do you recall any workarounds?



      • #4
        Keep the framerate super high!

        So yes!

        I think my average frame had 10-30 MIDI ticks.

        A proper MIDI playback system would probably have its own thread and communicate with the Audio Thread more directly or something. Driving streaming data from the Game Thread will always present timing problems.
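
        For a sense of scale, with assumed, typical values (the tick resolution depends on the MIDI file):

        Code:
        // Back-of-envelope: how many MIDI ticks fit inside one 60 fps frame?
        const float BPM = 120.0f;                                       // tempo
        const float TicksPerQuarter = 480.0f;                           // common MIDI resolution
        const float SecondsPerTick  = 60.0f / (BPM * TicksPerQuarter);  // ~1.04 ms
        const float FrameSeconds    = 1.0f / 60.0f;                     // ~16.7 ms
        const float TicksPerFrame   = FrameSeconds / SecondsPerTick;    // ~16 ticks per frame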
        Dan Reynolds
        Technical Sound Designer || Unreal Audio Engine Dev Team



        • #5
          Wow, tremendous work, and thanks for sharing!



          • #6
            Thanks Dan, most interesting! But would a MIDI playback system with its own thread even need to communicate with the Audio Thread? All I want it to do is fire note events at regular intervals; those events get picked up either by my 2D-sound-based sample instrument or by the Modular/Granular Synths. I don't care too much about tracking the progress/drift of each note as long as it fires off when it should. If it could also fire off longer Sound Cues to keep *them* in sync, that would be great.

            What I think/thought I need to do is ask our programmer for a dead-on regular tick, independent of the framerate and the audio thread, that I can consume in Blueprint according to some BPM logic. Am I underestimating how complex or wrongheaded this might be?



            • #7
              Blueprints run on the Game Thread, which is synchronized to the frame rate, so if you want a tick that is asynchronous to the framerate you'll need a separate thread. The Audio Thread is where synth logic happens; synth and DSP rendering happens on the Audio Rendering Thread, which is separate.

              Game Thread, Audio Thread, Audio Rendering Thread.

              You would want MIDI to interact with the Audio Thread as efficiently as possible, so that's where you'd want proper MIDI playback to occur.
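
              If you do go the separate-thread route from #6, a bare-bones sketch might look like this. It's illustrative only -- the consumer that drains the queue, and how precise the OS sleep actually is, are left open:

              Code:
              // Hypothetical metronome on its own thread. A consumer on the game or
              // audio side drains BeatQueue and fires the actual note events.
              #include "CoreMinimal.h"
              #include "HAL/Runnable.h"
              #include "HAL/RunnableThread.h"
              #include "HAL/ThreadSafeBool.h"
              #include "HAL/PlatformProcess.h"
              #include "Containers/Queue.h"

              class FMetronomeRunnable : public FRunnable
              {
              public:
                  explicit FMetronomeRunnable(float InBPM)
                      : SecondsPerBeat(60.0 / InBPM)
                  {
                      Thread = FRunnableThread::Create(this, TEXT("MetronomeThread"));
                  }

                  virtual ~FMetronomeRunnable()
                  {
                      bStop = true;
                      if (Thread) { Thread->WaitForCompletion(); delete Thread; }
                  }

                  virtual uint32 Run() override
                  {
                      double NextBeat = FPlatformTime::Seconds();
                      while (!bStop)
                      {
                          NextBeat += SecondsPerBeat;
                          BeatQueue.Enqueue(NextBeat); // consumer turns this into NoteOn/NoteOff
                          const double Wait = NextBeat - FPlatformTime::Seconds();
                          if (Wait > 0.0) { FPlatformProcess::Sleep((float)Wait); }
                      }
                      return 0;
                  }

                  virtual void Stop() override { bStop = true; }

                  // Single-producer single-consumer queue of scheduled beat timestamps.
                  TQueue<double, EQueueMode::Spsc> BeatQueue;

              private:
                  FRunnableThread* Thread = nullptr;
                  double SecondsPerBeat;
                  FThreadSafeBool bStop = false;
              };

              Sleep granularity on most platforms is on the order of milliseconds, so for tight musical timing you'd still want to schedule slightly early and let the audio side do the final alignment.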

              I built my MIDI playback system in Blueprints because I am versed in it and I didn't want to burden a programmer.
              Dan Reynolds
              Technical Sound Designer || Unreal Audio Engine Dev Team



              • #8
                Very helpful, thanks again! I hadn't fully understood the distinctions between Game, Audio and Audio Rendering threads. This all makes sense and should get us started - cheers!



                • #9
                  There's also the issue of "sample accurate" timing with audio -- i.e. being able to schedule an audio event at the precise intended time. If you set a clock to run at 120 BPM, you want each tick/drum/event to occur at exactly that rate. Right now a BPM clock running in BP (which is easy to create) would actually get quantized to the game thread tick rate. If you wanted, for example, to make a fast bebop track at 320 BPM, you'd probably run into stuttering issues as sub-frame events cluster at the game-thread tick boundaries. We found for our GDC demo, as you can see in our GDC talk, that as long as the game thread frame rate is steady and relatively fast you don't really notice this clustering -- but we were generally way above the rate at which it becomes audible.
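
                  To put rough numbers on that quantization (the tempo and frame rate here are assumed for illustration):

                  Code:
                  // Worked example: where does a 320 BPM sixteenth-note grid actually
                  // land when events can only fire on 30 fps frame boundaries?
                  #include "CoreMinimal.h"

                  void LogQuantizedGrid()
                  {
                      const float SecondsPerSixteenth = 60.0f / 320.0f / 4.0f; // ~46.9 ms
                      const float FrameSeconds = 1.0f / 30.0f;                 // ~33.3 ms

                      for (int32 i = 0; i < 8; ++i)
                      {
                          const float Ideal = i * SecondsPerSixteenth;
                          // The event can't fire mid-frame, so it snaps to the next boundary:
                          const float Actual = FMath::CeilToFloat(Ideal / FrameSeconds) * FrameSeconds;
                          UE_LOG(LogTemp, Log, TEXT("event %d: ideal %.1f ms, actual %.1f ms (late by %.1f ms)"),
                              i, Ideal * 1000.0f, Actual * 1000.0f, (Actual - Ideal) * 1000.0f);
                      }
                  }

                  The gaps between successive events alternate between roughly one and two frames instead of a steady 46.9 ms -- that's the stutter you'd hear.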

                  To work around this issue, what's needed is a separate mechanism that schedules events to be executed later, at the appropriate time, on the audio render thread (sub-render-buffer). You could theoretically write something like that using a synth component or submix/source effect now, since those give you access to the audio render thread outside of the audio mixer or engine modules (i.e. you can write audio-render-thread-level features in your game project today). You'd then want to create a separate scheduler component which counts audio render thread frames and begins playing audio at the precise moment needed. It'd be an advanced audio programming project though, since you'd also need to emulate audio rendering, voice management, etc. in a synth component -- but it's doable.
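
                  A very rough sketch of that synth-component route (everything here is hypothetical, and note that the OnGenerateAudio signature has changed across engine versions):

                  Code:
                  // Hypothetical scheduler: a USynthComponent that counts render-thread
                  // samples and marks the sample-accurate moment an event should begin.
                  // Actually rendering/voice-managing the scheduled sound is the hard
                  // part and is omitted here.
                  #pragma once

                  #include "CoreMinimal.h"
                  #include "Components/SynthComponent.h"
                  #include "SampleClockComponent.generated.h"

                  UCLASS(ClassGroup = Synth, meta = (BlueprintSpawnableComponent))
                  class USampleClockComponent : public USynthComponent
                  {
                      GENERATED_BODY()

                  public:
                      // Schedule an event at an absolute sample position on the render clock.
                      // (A real version would pass this through a thread-safe command queue.)
                      void ScheduleAtSample(int64 InTargetSample) { TargetSample = InTargetSample; }

                  protected:
                      // Runs on the audio render thread once per buffer.
                      virtual int32 OnGenerateAudio(float* OutAudio, int32 NumSamples) override
                      {
                          // This sketch outputs silence; a real version renders into OutAudio.
                          FMemory::Memzero(OutAudio, NumSamples * sizeof(float));

                          if (TargetSample >= SamplesRendered && TargetSample < SamplesRendered + NumSamples)
                          {
                              // OutAudio[TargetSample - SamplesRendered] is the sample-accurate
                              // moment: begin rendering (or mixing in) the scheduled event here.
                          }

                          SamplesRendered += NumSamples;
                          return NumSamples;
                      }

                  private:
                      int64 SamplesRendered = 0;
                      int64 TargetSample = -1;
                  };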

                  It's on my list of features I'd like to eventually make -- basically a "MusicComponent" (or maybe an "AudioSchedulerComponent", to be more general) which provides utilities for precise scheduling of audio events. But it's not a major priority at the moment.
                  Last edited by Minus_Kelvin; 09-22-2017, 02:55 PM.

