How you get the actors is up to you. If they’re all the same class, then you can Get All Actors By Class. For me, the Interface meant I could send messages to all Actors regardless of Class and let them sort out which message belonged to which.
Hello, I want to bind events to some MIDI input.
But I could not find any MIDI events (like the keyboard events) to bind to. Can you please help? I am desperate.
There is currently only one Event to bind; it spits out generic MIDI data along with some parsed data, but if you’re looking for anything out of the ordinary, you’ll want to parse it yourself.
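To give a sense of what parsing the raw data involves, here is a minimal sketch in Python rather than Blueprint (the function name is mine, not part of the plugin; the bit layout is from the MIDI 1.0 spec and is the same whatever language you parse it in):

```python
def parse_midi_message(status, data1, data2):
    """Split a raw MIDI channel message into (type, channel, data1, data2).

    Illustrative only -- the plugin's single event already hands you parsed
    fields, but this is the underlying byte layout if you go raw.
    """
    msg_type = status & 0xF0        # high nibble: message type
    channel = (status & 0x0F) + 1   # low nibble: channel, shown as 1-16
    names = {
        0x80: "Note Off",
        0x90: "Note On",
        0xA0: "Aftertouch",
        0xB0: "Control Change",
        0xC0: "Program Change",
        0xD0: "Channel Pressure",
        0xE0: "Pitch Bend",
    }
    return names.get(msg_type, "Unknown"), channel, data1, data2

# 0x91 = Note On on channel 2; note 60 (middle C) at velocity 100
print(parse_midi_message(0x91, 60, 100))
```

The same two bit masks cover every channel-voice message, so one small function replaces a pile of special cases.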
A question in relation to optimization and correct (logical) setup:
Controller IDs:
As these represent each button/knob, and I have 56 of them…
**Where is the best spot to “switch on string”?** (I use strings because the IDs fall in weird ranges — 13>20, 29>36, 105>108, etc. — so I can’t switch on a contiguous integer range.)
In your setup, the Event Dispatchers (Transmits) come right after the Note On/Off/Controller Change events, etc. (so there are 7 kinds).
So in my case I would have to Switch on String wherever each Event Dispatcher is called (ending up duplicating the switch in many places).
Wouldn’t it be better to create the MANY (56) Event Dispatchers after the Switch on String results, one for each ID?
That way it’s all in one Master BP in the level that distributes the Event Dispatchers?
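For what it’s worth, the routing you’re describing can be sketched in ordinary code (Python rather than Blueprint; the group names are made up for illustration). The idea is that the weird ID ranges go through one lookup in a single place, instead of a 56-way switch duplicated at every dispatcher call site:

```python
# Sketch of range-based dispatch. The ranges are the ones from the
# question (13>20, 29>36, 105>108), taken as inclusive; the group
# names "knob_bank_a" etc. are hypothetical placeholders.
RANGES = [
    (range(13, 21), "knob_bank_a"),
    (range(29, 37), "knob_bank_b"),
    (range(105, 109), "transport_buttons"),
]

def route_controller(controller_id):
    """Return the logical group a controller ID belongs to, or None."""
    for ids, group in RANGES:
        if controller_id in ids:
            return group
    return None  # an ID we don't care about

print(route_controller(30))   # a knob in the second range
print(route_controller(200))  # unmapped ID
```

One master routing table like this is the code equivalent of the “one Master BP that distributes the Event Dispatchers” you’re proposing.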
At the moment, the MIDI implementation is for input support only. I would love to see MIDI output support so we could transmit MIDI events to external devices and synthesizers. I think Blueprints is really fun to program in, and it would be cool to make some procedural music systems in BP and have them control my hardware.
In general, we would like to revisit the MIDI implementation entirely. I have heard some good things about Open Sound Control implementations, but I haven’t tried them personally (all 3rd-party stuff). Might be worth looking into if you need more I/O for the time being.
As far as event dispatchers go, I didn’t want to over-define how my BPs were going to use or parse the information, so I just packaged the CCs into a single Event. That lets my receiving BPs deal with the information however they want.
Hey guys, I’m confused and new to the MIDI world. Is there a way to get the note being played by the keyboard/MIDI device in Unreal? Figured it out: the Controller ID gives you the number of the key pressed, and hence the note. I made a table to translate the key number to a note.
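For anyone else doing the same translation: a lookup table isn’t strictly needed, because the note name follows directly from the note number. A small sketch (treating note 60 as middle C / C4 is a common convention, but some gear labels it C3, so adjust the octave offset to taste):

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(note_number):
    """Convert a MIDI note number (0-127) to a name like 'C4'.

    Assumes note 60 = middle C = C4; some devices use C3 instead.
    """
    octave = note_number // 12 - 1
    return f"{NOTE_NAMES[note_number % 12]}{octave}"

print(note_name(60))  # middle C
print(note_name(69))  # concert A (440 Hz)
```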
Hi!
I might be missing something obvious since I am new to Blueprint programming, but from where do you get the Midi Listener interface from?
When I search in that dropdown there is no such element. Is it a custom actor, or did you create a C++ interface somewhere?
Hi all,
I’m using this plugin and it’s amazing, I love it. I was just wondering: is there some way to read the MIDI tempo?
I’m not that familiar with working with MIDI, so I am unsure if this is actually a property that exists, but surely it must?
As a plan B, I guess I could feed in a dummy metronome MIDI input and reverse-engineer the BPM that way, but that’s not ideal.
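For context on where tempo lives: a live MIDI input stream has no tempo property as such. Tempo is either a Set Tempo meta event stored inside a MIDI *file* (FF 51 03, three bytes of microseconds per quarter note), or something you derive from MIDI Clock messages (status 0xF8, sent 24 times per quarter note) if your device transmits them — which is essentially your metronome plan B. A sketch of both conversions (Python, function names mine):

```python
def bpm_from_tempo_meta(b1, b2, b3):
    """BPM from a Set Tempo meta event (FF 51 03 tt tt tt).

    The three bytes are microseconds per quarter note, big-endian.
    Only MIDI files carry this event; a live input stream does not.
    """
    usec_per_quarter = (b1 << 16) | (b2 << 8) | b3
    return 60_000_000 / usec_per_quarter

# 0x07 0xA1 0x20 = 500,000 us per quarter note = 120 BPM
print(bpm_from_tempo_meta(0x07, 0xA1, 0x20))

def bpm_from_clock_interval(seconds_between_clocks):
    """BPM from live MIDI Clock (0xF8), which ticks 24 times per quarter note."""
    return 60.0 / (seconds_between_clocks * 24)
```

So if your device sends MIDI Clock, timing consecutive 0xF8 messages gets you the BPM without a dummy metronome track.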
I hope this post is relevant to this thread.
Hey everyone who has contributed to this. Thanks so much for the awesome work!
I’m pretty new to MIDI, and very rusty on UE4, so please be patient.
I am trying to build a simple game like https://www.synthesiagame.com/ to help my daughter practice drums.
We have an affordable electronic drum set, and I have managed to assign each drum hit to play a sound in UE4.
This was a bit harder than expected, because the drum sends 3 events on each hit, and I had to use a combination of two of those events to identify each drum.
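I don’t know your kit’s exact messages, but a common pattern with e-drums is that each hit produces a Note On, a Note Off, and a pressure/aftertouch message; keying only on Note On with velocity > 0 collapses those to one hit per pad. A hypothetical sketch (the note-to-drum map uses General MIDI percussion numbers, which your kit may or may not follow):

```python
# Hypothetical drum identification: react only to Note On with
# velocity > 0, ignore the companion Note Off / pressure events.
# 36/38/42 are the General MIDI kick/snare/closed hi-hat notes;
# check what your kit actually sends.
DRUM_MAP = {36: "kick", 38: "snare", 42: "hi-hat"}

def identify_hit(msg_type, note, velocity):
    """Return the drum name for a real hit, or None for the extra events."""
    if msg_type == "Note On" and velocity > 0:
        return DRUM_MAP.get(note)
    return None

print(identify_hit("Note On", 38, 90))   # a real snare hit
print(identify_hit("Note Off", 38, 0))   # companion event, ignored
```

(Note that some devices signal Note Off as a Note On with velocity 0, which this filter also discards.)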
The problem I’m having right now is LATENCY
There is about half a second of delay between hitting the drum and hearing the sound.
Any ideas on how to address this?
Thanks