Setting Up a Blueprint MIDI Manager with the 4.14 Version of the MIDI Device Support Plugin


    Setting Up a Blueprint MIDI Manager in UE4 with the 4.14 Beta version of the MIDI Device Support Plugin

    This is a demonstration of the Blueprints-based MIDI Manager I built for our GDC 2017 presentation. While MIDI input is something we would like to support more robustly in the future, this is a demo of how one might set up a MIDI Manager in Blueprints using the current version of the MIDI Device Support Plugin.

    Important: The MIDI Device Support plugin is an Experimental plugin which must be activated in the Plugin Manager. The plugin collects MIDI device data from your OS on startup, so devices must be connected beforehand in order to be used.




    My Approach, or Why I Built It The Way I Did:

    There are a lot of different ways to build a MIDI Manager using the MIDI Device Support plugin, and this is just one possible approach. For my manager, I wanted an Actor that would communicate via a Blueprint Interface with an unknown number of Synth Actors in my scene. I wanted to be able to specify the name of the MIDI Device, and I wanted to be able to override Channel assignments so that I could direct the traffic to a specified Synth Actor for manual auditioning.

    In this way, I could quickly drag and drop my MIDI Manager Actor BP into my scene and start controlling my Synths.

    At the same time, I let my Synth Actor Blueprints determine how to interpret Interface Functions or Event Messages locally.



    Let's Take a Closer Look:

    Here is a closer look at the variables for my Blueprint. You'll notice that I've made MIDIDeviceName, bDebug, bOverrideChannelOut, and OverrideChannel public. This lets me serialize which MIDI Input Device I want (allowing me to have multiple managers for multiple input devices), and it lets me specify an override channel when I want to target specific recipients rather than use the MIDI Channel reported on the event. I also have a Debug toggle that prints useful information when turned ON.
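
    If you prefer reading these as code, here is a rough sketch of equivalent C++ declarations (illustrative only; the manager itself is pure Blueprints, and these names simply mirror my Blueprint variables):

    Code:
    // Illustrative C++ equivalents of the public Blueprint variables.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "MIDI")
    FString MIDIDeviceName;        // which OS-reported MIDI input device to use

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "MIDI")
    bool bDebug;                   // print useful event info when ON

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "MIDI")
    bool bOverrideChannelOut;      // ignore the event's channel and use OverrideChannel instead

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "MIDI")
    int32 OverrideChannel;         // target channel for manual auditioning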




    1. Initial Setup (see the C++ sketch after this list):
      1. I want to collect and store all the MIDI Devices reported by the OS

        1. If I've enabled Debug mode on my BP, I like to print out a list of all the connected devices





      2. Then I want to search for my specific MIDI Input Device and, once found, assign it as my MIDI Device




      3. Then I collect all the Actors in the scene with my custom Blueprint Interface and store the list as an array




      4. Then I bind my MIDI Device to my On MIDI Event
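
    For those who prefer code, here is a rough C++ sketch of the same four setup steps. This is illustrative only: the manager is pure Blueprints, the class and interface names here are made up, and only the UMIDIDeviceManager/UMIDIDeviceController calls come from the MIDI Device Support plugin itself.

    Code:
    #include "MIDIDeviceManager.h"        // from the MIDI Device Support plugin
    #include "Kismet/GameplayStatics.h"

    // Assumed members on the hypothetical manager Actor:
    //   UMIDIDeviceController* MIDIController;   TArray<AActor*> SynthActors;
    //   plus the public variables shown earlier (MIDIDeviceName, bDebug, etc.).
    void AMIDIManagerActor::BeginPlay()
    {
        Super::BeginPlay();

        // 1. Collect and store all the MIDI Devices reported by the OS.
        TArray<FFoundMIDIDevice> FoundDevices;
        UMIDIDeviceManager::FindMIDIDevices(FoundDevices);

        for (const FFoundMIDIDevice& Device : FoundDevices)
        {
            // 1a. In Debug mode, print out a list of the connected devices.
            if (bDebug)
            {
                UE_LOG(LogTemp, Log, TEXT("MIDI Device %d: %s"), Device.DeviceID, *Device.DeviceName);
            }

            // 2. Search for the specific MIDI Input Device and assign it.
            if (Device.bCanReceiveFrom && Device.DeviceName == MIDIDeviceName)
            {
                MIDIController = UMIDIDeviceManager::CreateMIDIDeviceController(Device.DeviceID);
            }
        }

        // 3. Collect all the Actors in the scene with the custom Blueprint Interface.
        UGameplayStatics::GetAllActorsWithInterface(GetWorld(), UMySynthInterface::StaticClass(), SynthActors);

        // 4. Bind the MIDI Device to the On MIDI Event handler.
        if (MIDIController)
        {
            MIDIController->OnMIDIEvent.AddDynamic(this, &AMIDIManagerActor::HandleMIDIEvent);
        }
    }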





    2. On MIDI Event (see the C++ sketch after this list)
      1. Store all the MIDI Data for evaluation or to pass to other parts of the Blueprint

      2. Switch on MIDI Event Type
        1. Account for the special condition where, instead of Note Off, the MIDI Device sends a Note On Event with a Velocity of 0 (some devices/software do this, some do not)




        2. Parse MIDI Pitch Data (hacky, but it does the trick)




      3. Send out data to scene via Event Dispatcher or some kind of custom Blueprint Interface (I actually do both for flexibility)

      4. Print Event Data if Debug is ON
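
    And here is the event handling as a rough C++ sketch. Again, this is illustrative: the handler name and routing details are made up, while the parameters are the ones the plugin's On MIDI Event reports.

    Code:
    #include "MIDIDeviceController.h"     // for UMIDIDeviceController and EMIDIEventType

    // Must be a UFUNCTION so it can bind to the dynamic OnMIDIEvent delegate.
    void AMIDIManagerActor::HandleMIDIEvent(UMIDIDeviceController* Controller, int32 Timestamp,
                                            int32 Type, int32 Channel, int32 MessageData1, int32 MessageData2)
    {
        // 1. Store the MIDI data for evaluation / other parts of the Blueprint.

        // 2a. Special condition: some devices send Note On with Velocity 0 instead of Note Off.
        if (Type == (int32)EMIDIEventType::NoteOn && MessageData2 == 0)
        {
            Type = (int32)EMIDIEventType::NoteOff;
        }

        // 2. Switch on MIDI Event Type.
        switch ((EMIDIEventType)Type)
        {
        case EMIDIEventType::NoteOn:      // MessageData1 = note (0-127), MessageData2 = velocity
        case EMIDIEventType::NoteOff:
            break;

        case EMIDIEventType::PitchBend:
        {
            // 2b. Parse the pitch data: combine the two 7-bit bytes into one 14-bit
            // value (0-16383, centered at 8192), assuming MessageData1 carries the
            // "little number" (LSB) and MessageData2 the "big number" (MSB).
            const int32 PitchValue = (MessageData2 << 7) | MessageData1;
            (void)PitchValue; // forward to listeners here
            break;
        }

        default:
            break;
        }

        // 3. Send the data out via an Event Dispatcher and/or the Blueprint Interface,
        //    using OverrideChannel instead of Channel when bOverrideChannelOut is set.
        // 4. Print the event data if bDebug is ON.
    }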

    Last edited by dan.reynolds; 05-07-2017, 08:49 PM.
    Dan Reynolds
    Technical Sound Designer || Unreal Audio Engine Dev Team

    #2
    You guys are wizards -- this is brilliant, thanks for the deep dive, I can't wait to try it!

    SINGMETOSLEEP / PARASOMNIA / DARKDRIFT / ETERNAL APEX / PRETTY ABRASIVE MUSIC / TWITTER @ACATALEPT



      #3
      I really want to make the MIDI plugin reasonably workable out of the box, but we haven't had the time to translate these BP scripts into C++ code for the plugin. The MIDI plugin is a pretty raw/thin layer that just feeds direct MIDI data into BP.



        #4
        Looks great! Will this workflow support system exclusive and MIDI clock messages, as well as MIDI out, in the future? I've currently been using and extending the free marketplace MIDI plugin to support this - https://www.unrealengine.com/marketp...rocedural-midi and https://github.com/Geromatic/Midi-Unreal.



          #5
          Thanks very much for the excellent guide. My crude initial attempts from before this guide existed worked, but I like the way you have implemented things to communicate with other actors, so I set about following your guide. It's all working well so far, and I learnt much about blueprints along the way (relative UE4 newbie here).

          One thing I would like is an option not to trigger a MIDI event from the plugin if unknown message types are received. I plugged in a device that was probably outputting midi clock data and it seemed a bit wasteful to see the event triggering continually.

          I'm also curious as to what, from a UE4 efficiency point of view, would be the best way to handle cases where I need to deal with specific pairs of events together? I'm thinking specifically of the high-res version of MIDI cc data that a handful of controllers support - they work by sending two 7-bit cc messages using specific pairs of controller numbers, and combining the values into a 14-bit message. ( example of the tech details of this http://little-scale.blogspot.co.uk/2...tion-midi.html )

          edited to add - as a UE4 blueprint newbie I'm also considering the most efficient way of normalising values that come in as 0-127 to a float ranging between 0 and 1. I figure I'm quite likely to be lerping a lot with controller and velocity values, so it might make sense to do this step in the master midi blueprint rather than in each actor. And I'm not yet too familiar with the most efficient ways to do maths in blueprints. I can make something that works, I just don't know if I am being wasteful!
          Last edited by SteveElbows; 05-09-2017, 12:14 PM.



            #6
            Originally posted by SteveElbows:
            Thanks very much for the excellent guide. [...]
            Pitch Bend is a 14-bit value made of two combined 7-bit messages (a little number and a big number); you can use the same math I did for the other types of messages you're receiving.

            For my purposes, I just used a Map Range Clamped node to map 0-127 to 0.0f-1.0f, but it's a linear map. If you want a logarithmic or other type of mapping function, you'll have to play around a bit.

            I've been playing around with Float Curves in a Timeline for non-linear custom mapping.
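
            In C++ terms, the math boils down to something like this (helper names are made up; the usual high-res CC convention is MSB on controller n, LSB on controller n + 32):

            Code:
            // Combine a high-res CC pair into one 14-bit value (0-16383).
            // MSB arrives on CC n, LSB on CC n + 32; both bytes are 0-127.
            int32 Combine14Bit(int32 MSB, int32 LSB)
            {
                return (MSB << 7) | LSB;
            }

            // Linear 0-127 -> 0.0-1.0; the same thing Map Range Clamped does here.
            float NormalizeByte(int32 Value)
            {
                return FMath::Clamp(Value, 0, 127) / 127.0f;
            }
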
            Dan Reynolds
            Technical Sound Designer || Unreal Audio Engine Dev Team



              #7
              Thanks! I may return to some of those subjects another time.

              For now I just wanted to share how I am using the pitch values that your blueprints send with my Roli Seaboard controller and the experimental synth that is in the 4.16 preview.

              I could save myself some maths if I changed things in your blueprint, but the example below keeps your math intact and just modifies the separate actor I've made, which implements the midi interface and has a synth.

              [Attached image: PitchToSynth.PNG]

              The Roli, with the standard recommended settings, has a pitch bend range of 48 semitones. This is because you can move your finger left and right across the whole range of the key bed, and the key your finger ends up over when bending should correspond to the pitch you have bent to. In the above blueprint, the multiplication by 96 deals with this (twice 48, because in previous steps I end up with a range of +/- 0.5). People with more normal controllers may want to bend by a much smaller range in total, so play with that number to suit your needs. Also, in this example I am doing exactly the same semitone stuff to both of the synth oscillators.

              I will post a video of this stuff in action at some point, but it will have to wait until I have decided whether to try to make polyphony with MPE work (a topic for another day) and hook something visually pleasing up.

              edit - oops, upon inspection my maths explanation is a little screwy, because I actually end up with values of +/- 0.25, which when multiplied by 96 gives me +/- 24 semitones, which is correct for this device. But I need to try with a normal midi controller and double-check my implementation of your pitch value macro, because with this instrument the pitch values I get from your stuff are not using the full range of 0-16383; but maybe I am confusing myself. The Seaboard is not the easiest device to start this stuff with, but it is worth the challenge, and whatever the underlying explanation, my implementation does give exactly the right pitch results regardless of what note I start on and how far I move my finger to bend.

              edit again - OK, I checked with a normal controller, and not getting the full range is just down to the way the Roli bend works; normal devices will give the full range. So people with normal controllers will want to use a value way smaller than 96, e.g. 4.
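
              Putting the corrected maths in one place, the conversion is roughly the following (names made up; the x96 scale comes from the Roli settings described above, and a typical +/- 2 semitone controller would use x4 instead):

              Code:
              // Raw 14-bit pitch bend is 0-16383, centered at 8192.
              float PitchBendToSemitones(int32 RawPitchBend, float BendScale)
              {
                  const float Normalized = (RawPitchBend - 8192) / 16384.0f; // about -0.5 .. +0.5
                  return Normalized * BendScale; // Roli setup: BendScale = 96; typical device: 4
              }

              // If the synth wants a frequency multiplier rather than semitones:
              float SemitonesToFrequencyMultiplier(float Semitones)
              {
                  return FMath::Pow(2.0f, Semitones / 12.0f);
              }
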
              Last edited by SteveElbows; 05-09-2017, 04:38 PM.



                #8
                Thanks to the design of your Midi Manager it was trivial for me to set up 15 instances of the synth (each set to use 1 voice) for use with the Roli! With the Roli, each key pressed has its own unique parameters, so every note is sent on a separate midi channel (but channel 1 is reserved for overall controls that are not per-note, e.g. the x-y pad on the left). Since I don't have 15 fingers, I should really set up some kind of pooling system that can reuse a smaller collection of synth instances, but at this stage I am just testing the concept, and so far, even without a pool, it is working surprisingly well. For example, I can hold down 3 notes and bend the pitch of one of them up and one of them down without affecting the pitch of the 3rd note at all.
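
                The routing itself is simple; something like this sketch (names made up, and assuming the plugin reports channels as 1-16):

                Code:
                // Route per-note MPE events to one synth instance per MIDI channel.
                // SynthsPerChannel is an assumed TArray of 15 synth actors; channel 1
                // carries the global (non-per-note) controls, channels 2-16 the notes.
                void RouteNoteToSynth(const TArray<AActor*>& SynthsPerChannel, int32 Channel, int32 Note, int32 Velocity)
                {
                    const int32 SynthIndex = Channel - 2; // channel 2 -> synth 0, ... channel 16 -> synth 14
                    if (SynthsPerChannel.IsValidIndex(SynthIndex))
                    {
                        // Forward Note/Velocity to that channel's dedicated synth here.
                        // A pooling system could reuse fewer instances instead.
                    }
                }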



                  #9
                  Originally posted by SteveElbows:
                  Thanks to the design of your Midi Manager it was trivial for me to set up 15 instances of the synth (each set to use 1 voice) for use with the Roli! [...]
                  Would love to see a video of this posted. I love the ROLI stuff.



                    #10
                    Originally posted by Minus_Kelvin:
                    Would love to see a video of this posted. I love the ROLI stuff.
                    I promise I will post one one day, but it will have to wait until 4.16 is out and some of the nvidia gameworks stuff like flex and/or flow works with 4.16, since that is what I plan to use for the visual side. And don't expect me to actually be able to play the Roli properly.



                      #11
                      Although maybe I will manage to knock out a much cruder example without the visuals done in the meantime. If you or anyone fancies sending me a nice synth preset to try, that would probably help motivate me.



                        #12
                        Originally posted by SteveElbows:
                        Thanks to the design of your Midi Manager it was trivial for me to set up 15 instances of the synth (each set to use 1 voice) for use with the Roli! [...]
                        That's awesome, man!
                        Dan Reynolds
                        Technical Sound Designer || Unreal Audio Engine Dev Team



                          #13
                          OK, I managed to punch well above my weight and got nvidia Flow working with a recent github 4.16 version, so I should be able to start working on the demo sooner rather than later.



                            #14
                            Originally posted by SteveElbows:
                            OK, I managed to punch well above my weight and got nvidia Flow working with a recent github 4.16 version, so I should be able to start working on the demo sooner rather than later.
                            Can't wait!!! The nvidia physics have been calling me too, but I've already chosen VR as my big performance-eater.



                              #15
                              Can you play .mid files using the plugin?

