Setting Up a Blueprint MIDI Manager with the 4.14 Version of the MIDI Device Support Plugin

@MagicNono

Output is coming to 4.22:

Can I use VSTis and VSTs, put them in chains, insert FX, create sends, and a master chain? Without a DAW, all in UE4, and ideally with all the VSTs used placed in the content folder, so the gamer just launches the game?

If you have partnered with a VST developer, then they could build their instruments into your project. I do not know if anyone has made a VST wrapper for UE4; certainly not Steinberg, who owns the VST format.

Hi everyone,
I have been playing around with the MIDI plugin and I must say it's an amazing thing to have inside the engine. I can now play with the Synthesis plugin using my phone as a MIDI controller!

I have a question: when using the MIDI plugin, is it built so that an Android device running a UE project could find a MIDI device such as a DAW? Or am I completely off? :slight_smile:

So far I've only been able to read MIDI from an Android device with other apps that are controlling my UE project on a Mac, but I want to be able to use my own app that I developed in UE to send MIDI data to a DAW. Is it possible?

I'm wondering how you get the MIDI listener class used in Get All Actors with Interface, and how to use it towards the bottom part of the Blueprint here: Screenshot by Lightshot

I don't suppose you could post screenshots of the MIDI Interface and an example actor that implements that interface, could you? I'm really new to this. I've read the interface documentation, but I'm having problems grokking it.
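Not the original screenshots, but here is a rough C++ equivalent of the Blueprint interface pattern this thread uses, in case it helps it click. The interface name (MIDIListen), the NoteOn function, and the APianoKeyLight actor are illustrative stand-ins for the Blueprint assets, not the exact setup from the screenshots:

```cpp
// Sketch of the Blueprint interface pattern in C++ (names are placeholders).
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "UObject/Interface.h"
#include "Kismet/GameplayStatics.h"
#include "MIDIListen.generated.h"

UINTERFACE(BlueprintType)
class UMIDIListen : public UInterface
{
	GENERATED_BODY()
};

class IMIDIListen
{
	GENERATED_BODY()

public:
	// The MIDI manager calls this on every listener when a note-on arrives.
	UFUNCTION(BlueprintNativeEvent, BlueprintCallable, Category = "MIDI")
	void NoteOn(int32 Channel, int32 Note, int32 Velocity);
};

// Example actor that implements the interface and just logs the note.
UCLASS()
class APianoKeyLight : public AActor, public IMIDIListen
{
	GENERATED_BODY()

public:
	virtual void NoteOn_Implementation(int32 Channel, int32 Note, int32 Velocity) override
	{
		UE_LOG(LogTemp, Log, TEXT("Note %d on channel %d, velocity %d"), Note, Channel, Velocity);
	}
};

// In the manager, this is what "Get All Actors with Interface" plus the interface call do:
static void NotifyListeners(UObject* WorldContextObject, int32 Channel, int32 Note, int32 Velocity)
{
	TArray<AActor*> Listeners;
	UGameplayStatics::GetAllActorsWithInterface(WorldContextObject, UMIDIListen::StaticClass(), Listeners);

	for (AActor* Listener : Listeners)
	{
		// Execute_* works whether the listener implements the interface in C++ or in a Blueprint.
		IMIDIListen::Execute_NoteOn(Listener, Channel, Note, Velocity);
	}
}
```

In Blueprint terms: Get All Actors with Interface returns every actor that implements MIDIListen, and calling the interface message node on each of them is the equivalent of the Execute_NoteOn loop above.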

Hey everyone,
I am really new to this kind of stuff, so I apologize for any inconvenience due to my noob questions.

I set up the MIDI manager and it can find my digital piano. So far so good.
But how can I get the input to trigger events?

Actually, I want to visualize every key on the digital piano (76 keys) with one specific color (e.g. a colored point light or an emissive material). So when I play, I can see my playing visualized on a screen.

So I have the MIDI manager and the setup for the light, but I don't know how to do the part with the input and the key mapping.

Was that understandable? :slight_smile:

UPDATE:

I figured out that I had written the input device name wrong. So now I can get input, but I can't separate the different keys, so it fires the event (point light) for all keys. I looked in the output log for the Control ID, but somehow if I call the Note On or Note Off function and write a Control ID into it, nothing changes.
Can someone help me get the Control ID the right way?

Okay, I fixed it myself. If anyone has the same problem, I can help you out now.
^.^
@.reynolds Great setup by the way. Totally works well for me!
If I am done with the artistic stuff, I will post a video.
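For anyone who hits the same Control ID confusion later: on the On MIDI Event delegate, the Control ID pin carries the note number for note-on/note-off messages, so the per-key split is just a branch on the event type plus the Control ID. Below is a rough C++ sketch of that idea, assuming the plugin's UMIDIDeviceManager / UMIDIDeviceController API; the class name and the SetKeyLightEnabled helper are made up for the example, and the same logic maps one-to-one onto Branch / Switch on Int nodes in Blueprint.

```cpp
// Rough sketch: bind the MIDI Device Support plugin's OnMIDIEvent and react per key.
// Requires the MIDI Device Support plugin enabled and its "MIDIDevice" module in Build.cs.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MIDIDeviceManager.h"
#include "MIDIDeviceController.h"
#include "MIDILightManager.generated.h"

UCLASS()
class AMIDILightManager : public AActor
{
	GENERATED_BODY()

public:
	virtual void BeginPlay() override
	{
		Super::BeginPlay();
		// DeviceID 0 is only an example -- look the ID up with FindMIDIDevices first.
		MIDIController = UMIDIDeviceManager::CreateMIDIDeviceController(0);
		if (MIDIController)
		{
			MIDIController->OnMIDIEvent.AddDynamic(this, &AMIDILightManager::HandleMIDIEvent);
		}
	}

	// Must be a UFUNCTION so AddDynamic can bind it; the signature has to match the
	// plugin's OnMIDIEvent delegate (controller, timestamp, type, channel, control ID, velocity).
	UFUNCTION()
	void HandleMIDIEvent(UMIDIDeviceController* Controller, int32 Timestamp,
		int32 Type, int32 Channel, int32 ControlID, int32 Velocity)
	{
		// 9 / 8 are the MIDI note-on / note-off status values the plugin reports in Type.
		const bool bNoteOn  = (Type == 9) && (Velocity > 0);
		const bool bNoteOff = (Type == 8) || (Type == 9 && Velocity == 0);

		if (bNoteOn)
		{
			SetKeyLightEnabled(ControlID, true);   // hypothetical per-key light helper
		}
		else if (bNoteOff)
		{
			SetKeyLightEnabled(ControlID, false);
		}
	}

	void SetKeyLightEnabled(int32 Note, bool bEnabled)
	{
		// Map the note number (middle C = 60) to one of the 76 lights/materials here.
	}

	UPROPERTY()
	UMIDIDeviceController* MIDIController = nullptr;
};
```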

Hello, could you provide a list of the new Blueprint nodes needed in 4.22 to output MIDI info?
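For anyone searching later: the 4.22 output side lives in the same MIDI Device Support plugin, and as far as I can tell the new Blueprint nodes are Find All MIDI Device Info, Create MIDI Device Output Controller, and the Send MIDI Note On / Send MIDI Note Off / Send MIDI Control Change calls on the output controller. A rough C++ sketch of the same path follows; treat the exact names and signatures as assumptions and check MIDIDeviceManager.h in your engine version.

```cpp
// Rough sketch of the 4.22+ MIDI output path. The output controller class may live in a
// different header depending on engine version -- check the plugin source.
#include "MIDIDeviceManager.h"
#include "MIDIDeviceController.h"

void SendTestNote()
{
	// List input and output devices so you can pick the right output DeviceID.
	TArray<FMIDIDeviceInfo> InputDevices;
	TArray<FMIDIDeviceInfo> OutputDevices;
	UMIDIDeviceManager::FindAllMIDIDeviceInfo(InputDevices, OutputDevices);

	if (OutputDevices.Num() == 0)
	{
		return;
	}

	// Open the first output device (use the DeviceID of your synth / virtual port instead).
	UMIDIDeviceOutputController* Output =
		UMIDIDeviceManager::CreateMIDIDeviceOutputController(OutputDevices[0].DeviceID);
	if (Output)
	{
		Output->SendMIDINoteOn(/*Channel=*/1, /*Note=*/60, /*Velocity=*/100);  // middle C on
		Output->SendMIDINoteOff(/*Channel=*/1, /*Note=*/60, /*Velocity=*/0);   // ...and off
	}
}
```

The Blueprint version is the same three steps: find the output device, create an output controller for its Device ID, then call the Send MIDI nodes on it.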

Hi,

I am very new to MIDI in Unreal and Blueprints. I tried to recreate the MIDI Manager BP in UE 4.22, but I cannot create the correct communication with the interface.

I always get something like this:

Instead of "Using Interface MIDIListen_C", which is what it is supposed to say, I guess.

I first created the interface named "MIDI_Listen" and the functions (see below), and after that the MIDI_Manager_BP, very close to the one created by :

I read through the various chapters about Blueprint Interfaces, but somehow I cannot get it working.

Any help, tips or new tutorial links for MIDI and the new 4.22 implementation is very appreciated!

Below is how it looks so far…

Best
Stanley

Implementing the VST SDK would be amazing, but that's maybe a UE5 thing. I haven't looked at the synth stuff in the new audio engine, but by automating Reaktor in Unreal you could probably do everything from submix effects to sound design and scoring.

Exactly. Until then, I'm thinking about using VCV in SaviHost and controlling it with UE4 via the new 4.23 OSC support. Apart from the release notes, there is apparently no tutorial about it for now. I've never used OSC, so I'm not sure where or what to look at.

Is there any kind of tutorial with how to build this out currently?

Hey there,

I realize this is an old topic and there's probably a much easier way of doing this stuff by now. However, this thread is the only place that gave me something to go on, so I thought I'd try my luck here.

I followed the Blueprint structure in this thread and I can work with the MIDI inputs to trigger stuff in my scene. So far so good.
The problems start when I try to use the wireless functionality of my MIDI device. Since I have to go through loopMIDI and MIDIberry to properly connect to Windows, the device name changes.
The implemented debug functionality tells me that it's now called loop MIDI port, but if I use that as the standard value for the MIDIDeviceName string variable, the AND bool check fails.

Debug.PNG
This is what the debug gives me, Iā€™d really appreciate some Input on this.

EDIT:

Fixed this by literally spelling the name correctly. Now I just added a second OR bool to allow for both input device names (USB and wireless).
So now everything works as expected in the editor.
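(In case it saves someone else the same round trip: instead of exact-matching the device name string, it can be more forgiving to enumerate the devices and match on a substring, since virtual ports like loopMIDI can report slightly different names than you expect. A rough sketch, assuming the plugin's FindMIDIDevices / FFoundMIDIDevice types; the helper name is made up.)

```cpp
// Sketch: pick the MIDI input device by a partial, case-insensitive name match
// instead of comparing against an exact string like "loopMIDI Port".
#include "MIDIDeviceManager.h"

int32 FindMIDIDeviceIDByName(const FString& NameFragment)
{
	TArray<FFoundMIDIDevice> Devices;
	UMIDIDeviceManager::FindMIDIDevices(Devices);

	for (const FFoundMIDIDevice& Device : Devices)
	{
		// FString::Contains is case-insensitive by default.
		if (Device.DeviceName.Contains(NameFragment))
		{
			return Device.DeviceID;
		}
	}
	return INDEX_NONE;  // not found
}

// Usage: FindMIDIDeviceIDByName(TEXT("loopMIDI")) matches "loopMIDI Port", "loopMIDI Port 1", etc.
```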

However, there are two minor issues that I still have with the MIDI plugin.

1: While everything runs fine in the editor, the inputs are not recognized in a build. Do I have to specifically include experimental Plugins somewhere?

2: This is regarding the interface that is created in this Blueprint. Whenever I change something in a Blueprint that the interface function is called in (one of the actors in the scene), I have to manually recompile my MIDI manager as well. I don't really see why this happens; am I doing something wrong here?

Thank you!

Thank you for setting all this up in the first place!
Cheers

Hi there, has anyone created this file to share? I'd love to have MIDI I/O in my project, but it feels a little daunting to have to manually copy and recreate the nodes from the image, especially considering nodes can be copy/pasted as text.
Regardless, I'm glad we have this available in Unreal. I hope to find the time to implement it, or OSC, when either becomes more beginner friendly.


There is a newer simplified version with in/out; check 's Twitter link earlier in the thread! Here's a basic implementation example project I made: Arthurs Audio BPs - Audio - Epic Developer Community Forums

Hello all! So I'm having trouble getting MIDI data from Unreal to my hardware synth. I can get data from the synth into Unreal, but not the other way around. Even when trying out Arthur's "MIDI input/output" example I still ran into the same problem. I can get MIDI data into the synth through a DAW, so I know that it does work; I just have no idea why it's not receiving the data from Unreal. I've included a photo of what I've done; maybe it can provide some clues as to why it's not working for me? Any help is greatly appreciated!

Also, I realize I don't need the "Integer to Float" and "Truncate" nodes; those were left over from some previous versions. All cleaned up now!
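One debugging step that can help with the Unreal-to-synth direction: log what the plugin actually reports as MIDI output devices, since the Device ID passed to Create MIDI Device Output Controller has to be one of those, and virtual or DAW ports can grab a device exclusively. A short sketch, assuming the 4.22+ FindAllMIDIDeviceInfo API (struct and field names may differ slightly by engine version):

```cpp
// Sketch: log every MIDI output device the plugin can see, to check that the hardware
// synth's port shows up and to grab its DeviceID.
#include "MIDIDeviceManager.h"

void LogMIDIOutputDevices()
{
	TArray<FMIDIDeviceInfo> InputDevices;
	TArray<FMIDIDeviceInfo> OutputDevices;
	UMIDIDeviceManager::FindAllMIDIDeviceInfo(InputDevices, OutputDevices);

	for (const FMIDIDeviceInfo& Device : OutputDevices)
	{
		UE_LOG(LogTemp, Log, TEXT("MIDI out %d: %s"), Device.DeviceID, *Device.DeviceName);
	}
}
```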

Just a quick hello from the guy on the thread who was going on about Roli Seaboard stuff 3 years ago and then never did anything. I won't bore you with the full story; let's just say I went down a dead end with certain engine customisations by NVIDIA, and eventually decided to retreat and see how the various engines evolved. I like a bunch of things that have happened since, so now I am back. I'm not sure how much I will actually use MIDI this time around; I need to evaluate things like the Niagara audio stuff, and I also have some scenarios where I might consider using Open Sound Control instead of, or as well as, MIDI and audio. Anyway, I am very rusty with UE, so it will take me a while to get back up to speed, but I liked what I saw in that long video about the audio engine in 4.25 and am about to go and try the samples from that. Given the non-MIDI tech that will be involved in parts of my work going forwards, I suppose next time I talk about this here I might start my own thread for it, but I just wanted to say hello again first. I have some time to complete my mission, as some of the stuff I'm doing is aiming to be used with the forthcoming Osmose expressive keyboard, and even the early bird for that has been pushed back to autumn.

Hey @HaensPaenda

Could you share how you managed to make a specific Control ID value trigger events? I'm trying to do the same virtual piano thing as you did.

I'm receiving all kinds of MIDI data as in the 's guide, but I would like to filter some of it so that each pressed key on my MIDI controller triggers a specific event in Unreal.
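One way to do that filtering (a sketch, not necessarily how HaensPaenda set it up): keep a map from note number to whatever each key should trigger, and only act when the incoming Control ID is found in the map. Inside the On MIDI Event handler shown earlier in the thread, that looks roughly like this in C++; KeyColors and OnKeyPressed are made-up names for the example.

```cpp
// Sketch: per-key dispatch inside the OnMIDIEvent handler.
// KeyColors is a TMap<int32, FLinearColor> filled with the notes you care about.
if (Type == 9 && Velocity > 0)  // note-on only; ignore CC, pitch bend, clock, etc.
{
	if (const FLinearColor* Color = KeyColors.Find(ControlID))
	{
		OnKeyPressed(ControlID, *Color);  // e.g. a BlueprintImplementableEvent that drives the light
	}
}
```

In Blueprint terms this is a Map variable plus a Find node feeding a Branch, so only the keys you added to the map fire their events.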