Setting Up a Blueprint MIDI Manager with the 4.14 Version of the MIDI Device Support Plugin

**Setting Up a Blueprint MIDI Manager in UE4 with the 4.14 Beta version of the MIDI Device Support Plugin**
This is a demonstration of the Blueprints-based MIDI Manager I built for our GDC 2017 presentation. **While MIDI input is something we would like to support more robustly, this is a demo of how one might set up a MIDI Manager in Blueprints using the current version of the MIDI Device Support plugin.**
*Important: The MIDI Device Support plugin is an Experimental plugin that must be activated in the Plugin Manager. It collects MIDI device data from your OS on startup, so devices must be connected beforehand in order to be used.*

https://forums.unrealengine.com/attachment.php?attachmentid=140146&stc=1

**My Approach, or Why I Built It the Way I Did:**
There are a lot of different ways to build a MIDI Manager using the MIDI Device Support plugin, and this is just one possible approach. For my manager, I wanted it to be an **Actor** that would communicate via a Blueprint Interface with an unknown number of Synth Actors in my scene. I wanted to be able to specify the name of the MIDI Device, and I wanted to be able to override channel assignments so that I could direct the traffic to a specified Synth Actor for manual auditioning.

In this way, I could quickly drag and drop my **MIDI Manager Actor BP** into my scene and start controlling my Synths.

At the same time, I let my Synth Actor Blueprints determine how to interpret Interface Functions or Event Messages locally.
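To make the pattern concrete, here is a minimal C++ sketch of the kind of Blueprint Interface the manager could broadcast through (the interface name and function signatures are my own illustration, not the exact ones from the GDC demo):

```cpp
// MidiListenerInterface.h - a hypothetical interface that Synth Actors can
// implement; the MIDI Manager can call it without knowing concrete types.
#include "UObject/Interface.h"
#include "MidiListenerInterface.generated.h"

UINTERFACE(Blueprintable)
class UMidiListenerInterface : public UInterface
{
	GENERATED_BODY()
};

class IMidiListenerInterface
{
	GENERATED_BODY()

public:
	// Implemented per Synth Actor (in Blueprints or C++), so each actor
	// decides locally how to interpret the message.
	UFUNCTION(BlueprintImplementableEvent, Category = "MIDI")
	void OnMidiNoteOn(int32 Channel, int32 Note, int32 Velocity);

	UFUNCTION(BlueprintImplementableEvent, Category = "MIDI")
	void OnMidiNoteOff(int32 Channel, int32 Note, int32 Velocity);
};
```

The manager would then call something like `IMidiListenerInterface::Execute_OnMidiNoteOn(TargetActor, Channel, Note, Velocity)` on any actor in the scene that implements the interface.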

https://forums.unrealengine.com/attachment.php?attachmentid=140148&stc=1

**Let’s Take a Closer Look:**
Here is a closer look at the variables for my Blueprint. You’ll notice that I’ve made MIDIDeviceName, bDebug, bOverrideChannelOut, and OverrideChannel public. This is so I could serialize which MIDI device input I wanted (allowing me to have multiple managers for multiple input devices), and it allowed me to specify an override channel when I wanted to target specific recipients rather than use the MIDI Channel reported on the event. I also had a Debug toggle that would print useful information when turned ON.
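In C++ terms, those public variables would look something like the sketch below (assuming a hypothetical AMidiManager Actor; the categories and defaults are my guesses):

```cpp
// Inside a hypothetical AMidiManager : public AActor declaration.
public:
	// Which OS-level MIDI device this manager listens to; serializing it
	// per-instance allows multiple managers for multiple input devices.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "MIDI")
	FString MIDIDeviceName;

	// Print useful event information when turned ON.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "MIDI")
	bool bDebug = false;

	// When true, ignore the channel reported on the event and send to
	// OverrideChannel instead, to target a specific recipient.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "MIDI")
	bool bOverrideChannelOut = false;

	// MIDI channels run 1-16.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "MIDI",
		meta = (EditCondition = "bOverrideChannelOut", ClampMin = "1", ClampMax = "16"))
	int32 OverrideChannel = 1;
```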

https://forums.unrealengine.com/attachment.php?attachmentid=140149&stc=1


You guys are wizards – this is brilliant, thanks for the deep dive, I can’t wait to try it!

I really want to make the MIDI plugin reasonably workable out of the box, but we haven’t had the time to translate these BP scripts into C++ code for the plugin. The MIDI plugin is a pretty raw/thin layer that just feeds direct MIDI data into BP.

Looks great! Will this workflow support system exclusive and MIDI clock messages, as well as MIDI out, in the future? I’ve currently been using and extending the free marketplace MIDI plugin to support this - Procedural Midi in Code Plugins - UE Marketplace and GitHub - Geromatic/Midi-Unreal: Midi for Unreal Engine.

Thanks very much for the excellent guide. My crude initial attempts from before this guide existed worked, but I like the way you have implemented things to communicate with other actors, so I set about following your guide. It’s all working well so far, and I learnt much about Blueprints along the way (relative UE4 newbie here).

One thing I would like is an option not to trigger a MIDI event from the plugin when unknown message types are received. I plugged in a device that was probably outputting MIDI clock data, and it seemed a bit wasteful to see the event triggering continually.

I’m also curious as to what, from a UE4 efficiency point of view, would be the best way to handle cases where I need to deal with specific pairs of events together. I’m thinking specifically of the high-res version of MIDI CC data that a handful of controllers support - they work by sending two 7-bit CC messages using specific pairs of controller numbers, and combining the values into a 14-bit message. (Example of the tech details of this: little-scale: Ableton Live and High Resolution MIDI Control)

edited to add - as a UE4 Blueprint newbie, I’m also considering the most efficient way of normalising values that come in as 0-127 to a float ranging between 0 and 1. I figure I’m quite likely to be lerping a lot with controller and velocity values, so it might make sense to do this step in the master MIDI Blueprint rather than in each actor. And I’m not yet too familiar with the most efficient ways to do maths in Blueprints. I can make something that works, I just don’t know if I am being wasteful!

Pitch Bend is a 14-bit value made of two combined 7-bit messages (a little number and a big number - the LSB and MSB), so you can use the same math I did for the other types of messages you’re receiving.
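In code form, that combination is just a shift and an OR; here is a small sketch (the function names are mine):

```cpp
// Combine the two 7-bit data bytes of a Pitch Bend (or a paired high-res CC)
// message into a single 14-bit value in the range 0..16383.
int32 Combine14Bit(int32 Msb, int32 Lsb)
{
	return ((Msb & 0x7F) << 7) | (Lsb & 0x7F);
}

// Pitch Bend is centered at 8192, so a signed -1..+1 bend amount is:
float NormalizePitchBend(int32 Value14Bit)
{
	return (Value14Bit - 8192) / 8192.0f;
}
```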

For my purposes, I just used a Map Range Clamped node to map 0-127 to 0.0-1.0, but it’s a linear map. If you want a logarithmic or other type of mapping function, you’ll have to play around a bit.
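For reference, here is a C++ equivalent of that node (a sketch using UE4’s FMath helper):

```cpp
#include "CoreMinimal.h"

// Linear, clamped map of a 7-bit MIDI value (0-127) onto 0.0-1.0,
// mirroring the Map Range Clamped Blueprint node.
float NormalizeMidiValue(int32 Value)
{
	return FMath::GetMappedRangeValueClamped(
		FVector2D(0.0f, 127.0f), FVector2D(0.0f, 1.0f), (float)Value);
}
```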

I’ve been playing around with Float Curves in a Timeline for non-linear custom mapping.

Thanks! I may return to some of those subjects another time.

For now I just wanted to share how I am using the pitch values that your blueprints send with my Roli Seaboard controller and the experimental synth that is in the 4.16 preview.

I could save myself some maths if I changed things in your blueprint, but the example below keeps your math intact and just modifies the separate actor I’ve made, which implements the MIDI interface and has a synth.

The Roli, with the standard recommended settings, has a pitch bend range of 48 semitones. This is because you can move your finger left and right across the whole range of the keybed, and the note your finger ends up over when bending should correspond to the same pitch you have bent to. In the above blueprint, the multiplication by 96 deals with this (twice 48, because in previous steps I end up with a range of +/- 0.5). People with more normal controllers may want to bend by a much smaller range in total, so play with that number to suit your needs. Also, in this example I am doing exactly the same semitone stuff to both of the synth oscillators.

I will post a video of this stuff in action at some point, but it will have to wait until I have decided whether to try to make polyphony with MPE work (a topic for another day) and hook up something visually pleasing.

edit - oops, upon inspection my maths explanation is a little screwy, because I actually end up with values of +/- 0.25, which when multiplied by 96 gives me +/- 24 semitones, which is correct for this device. But I need to try with a normal MIDI controller and double-check my implementation of your pitch value macro, because with this instrument the pitch values I get from your stuff are not using the full range of 0-16383 - but maybe I am confusing myself. The Seaboard is not the easiest device to start this stuff with, but it’s worth the challenge, and whatever the underlying explanation, my implementation does give exactly the right pitch results regardless of what note I start on and how far I move my finger to bend.

edit again - OK, I checked with a normal controller, and not getting the full range is just down to the way the Roli bend works; normal devices will give the full range. So people with normal controllers will want to use a value way smaller than 96, e.g. 4.
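For anyone rebuilding this math: the normalized bend amount gets scaled into semitones, and a semitone offset maps onto a frequency (or oscillator pitch) multiplier of 2^(semitones/12). A sketch (the names and the scale parameter are illustrative):

```cpp
// Scale a normalized bend amount into semitones; 96 suits the Roli setup
// described above, while something like 4 suits a typical controller.
float BendToSemitones(float NormalizedBend, float BendRangeScale)
{
	return NormalizedBend * BendRangeScale;
}

// Standard equal-temperament conversion: each semitone multiplies the
// frequency by the twelfth root of two.
float SemitonesToFrequencyMultiplier(float Semitones)
{
	return FMath::Pow(2.0f, Semitones / 12.0f);
}
```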

Thanks to the design of your MIDI Manager, it was trivial for me to set up 15 instances of the synth (each set to use 1 voice) for use with the Roli! With the Roli, each key pressed has its own unique parameters, so every note is sent on a separate MIDI channel (but channel 1 is reserved for overall controls that are not per-note, e.g. the x-y pad on the left). Since I don’t have 15 fingers, I should really set up some kind of pooling system that can reuse a smaller collection of synth instances, but at this stage I am just testing the concept, and so far, even without a pool, it is working surprisingly well. For example, I can hold down 3 notes and bend the pitch of one of them up and one of them down without affecting the pitch of the 3rd note at all.
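A rough sketch of that per-channel fan-out, reusing the hypothetical interface from earlier in the thread (all names are mine):

```cpp
// Route a note event to the synth instance that owns its MPE channel.
// SynthsByChannel is a TArray<AActor*> with 15 entries covering channels
// 2-16; channel 1 carries only the global (non-per-note) controls.
void AMidiManager::RouteNoteOn(int32 Channel, int32 Note, int32 Velocity)
{
	const int32 SynthIndex = Channel - 2;
	if (Channel >= 2 && SynthsByChannel.IsValidIndex(SynthIndex))
	{
		IMidiListenerInterface::Execute_OnMidiNoteOn(
			SynthsByChannel[SynthIndex], Channel, Note, Velocity);
	}
}
```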

Would love to see a video of this posted. I love the ROLI stuff.

I promise I will post one one day, but it will have to wait until 4.16 is out and some of the NVIDIA GameWorks stuff like Flex and/or Flow works with 4.16, since that is what I plan to use for the visual side. And don’t expect me to actually be able to play the Roli properly :wink:

Although maybe I will manage to knock out a much cruder example without the visuals in the meantime. If you or anyone else fancies sending me a nice synth preset to try, that would probably help motivate me :wink:

That’s awesome, man!

OK, I managed to punch well above my weight and got NVIDIA Flow working with a recent GitHub 4.16 version, so I should be able to start working on the demo sooner rather than later :slight_smile:

Can’t wait!!! The NVIDIA physics have been calling me too, but I’ve already chosen VR as my big performance-eater.

Can you play .mid files using the plugin?

This version of the plugin only provides support for device input. We would like to expand support for files eventually.

Cool, looking forward to that then.

I’m no programmer, so as a workaround, I was able to get MIDI data into UE4 by using a utility to convert MIDI to CSV and then importing the music as a Data Table, and then I programmed some Blueprints to read it off like a sequencer.

It worked out!

I was able to prototype a track by hooking Ableton Live into UE4 via the MIDI device input (and a free utility to create virtual MIDI cables) and writing MIDI in Ableton; then I exported from Ableton, converted it to CSV, imported it into UE4, and played it back!


Yeah, I remember something like that from the GDC demo. Very interesting! I might give this a try, but I have never programmed any kind of sequencer, so I would be thankful if you could share how you set that part up.

Each MIDI song has a ticks-per-quarter-note resolution. Usually it’s something like 480, but I’ve seen 96 in some cases. All your MIDI data will have its tick time in one of the columns. Basically, what I would do is, every frame, get the delta time (which is how long your last frame was) and then work out how many ticks I needed for that frame based on that many seconds (plus the tempo and the MIDI ticks per quarter note).

Then I would just read the data table starting from my last position until the MIDI tick was higher than how far I’d gotten so far. Any MIDI events that occurred during that window, I would send to an array, and then I would fire off all the events from that array.

Then I’d do that every frame.

So basically, it’s like this: let’s say my tempo is 120, which means each quarter note is 0.5 seconds long, and I have 480 MIDI ticks per quarter note, which means each MIDI tick is about 0.00104 seconds long. Then let’s say my project is running at 30 frames per second, so the delta time is 0.033… seconds. Then I go: DeltaTime / SecondsPerMidiTick = NumberOfMidiTicks. In this case, the number of MIDI ticks would be about 32, so I add 32 to my total number of MIDI ticks counted so far. Let’s say I’m about 5000 MIDI ticks into my song; then I look at my data table and say, hey, give me any MIDI events that happened between ticks 5000 and 5032. Let’s say there are two that happened. Then I add those to an array, fire off those events in a loop, and start over.

The trick is to not read the whole data table each frame, but only read starting from the last MIDI event you fired off.
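Here’s a condensed C++ sketch of that loop (the struct, the members, and the FireMidiEvent dispatch are illustrative, not the actual Blueprint setup):

```cpp
// One imported CSV row: the absolute MIDI tick plus the event columns kept.
struct FMidiRow
{
	int32 Tick;
	int32 Status, Channel, Data1, Data2;
};

// Assumed members on a hypothetical AMidiCsvSequencer actor:
//   TArray<FMidiRow> Rows;        // sorted by Tick, loaded from the Data Table
//   double SongPositionTicks = 0; // fractional ticks accumulated so far
//   int32  LastRowIndex = 0;      // resume point, so we never rescan the table
//   float  Tempo = 120.0f;        // beats per minute
//   int32  TicksPerQuarterNote = 480;

void AMidiCsvSequencer::Tick(float DeltaTime)
{
	Super::Tick(DeltaTime);

	// Seconds per MIDI tick = 60 / (BPM * TPQN), e.g. 60 / (120 * 480) ~ 0.00104 s.
	const double SecondsPerMidiTick = 60.0 / (Tempo * TicksPerQuarterNote);

	// Advance the song position by however many ticks this frame covered.
	SongPositionTicks += DeltaTime / SecondsPerMidiTick;

	// Fire every event between the last row played and the new position,
	// resuming from LastRowIndex rather than re-reading the whole table.
	while (Rows.IsValidIndex(LastRowIndex) &&
	       Rows[LastRowIndex].Tick <= SongPositionTicks)
	{
		FireMidiEvent(Rows[LastRowIndex]); // hypothetical dispatch to listeners
		++LastRowIndex;
	}
}
```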
