Simultaneous MIDI notes controlling custom data values through Harmonix

Hello, I am trying to recreate a real-life light installation in Unreal Engine. There are around 1,500 lights (which are just emissive meshes in the engine) controlled by a MIDI file. I discovered the Harmonix plugin's MIDI triggers and have set up these outputs in the MetaSoundSource for each channel of the MIDI file.

I then receive these in the blueprint actor that holds the instanced static mesh, and use this code to drive the animation.

Each MIDI note corresponds to one light with the same number. When a new channel is added, I just add 127 to the light index so the right notes connect to the right lights.
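In rough C++ terms the mapping looks like the sketch below (`LightISM`, the function names, and custom data slot 0 are assumptions about the setup, not the actual blueprint). One detail worth noting: MIDI note numbers span 0-127, which is 128 values, so a per-channel stride of 128 keeps the last note of one channel from colliding with the first note of the next.

```cpp
#include "Components/InstancedStaticMeshComponent.h"

// MIDI note numbers run 0-127, i.e. 128 values per channel, so a
// per-channel stride of 128 keeps the note ranges from overlapping.
static constexpr int32 NotesPerChannel = 128;

static int32 NoteToLightIndex(int32 Channel, int32 NoteNumber)
{
    return Channel * NotesPerChannel + NoteNumber;
}

// Drive one emissive light via per-instance custom data on the ISM.
// Custom data slot 0 is assumed to feed the emissive scalar in the
// material through a PerInstanceCustomData node.
static void SetLight(UInstancedStaticMeshComponent* LightISM,
                     int32 Channel, int32 NoteNumber, float Brightness)
{
    const int32 InstanceIndex = NoteToLightIndex(Channel, NoteNumber);
    LightISM->SetCustomDataValue(InstanceIndex, /*CustomDataIndex=*/0,
                                 Brightness, /*bMarkRenderStateDirty=*/true);
}
```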

The problems are the following: this all falls apart when multiple notes are hit at the same time. Also, the MIDI note-off event won't hold the info for its own MIDI note if another note is hit in the meantime, so it fires the event for the last active note instead. As a result, all the lights go on but rarely go off. The MIDI file is the same one that animates the lights in real life, so it is full of notes being hit simultaneously and overlapping. As far as I can tell, the MIDI note # output from the MetaSoundSource is a single integer value, so it doesn't account for multiple notes at once. Is there no other way to do this than to filter each note one by one and output them individually?

I understand this is a new plugin, and it's already wonderful, and I'm asking a lot, so I might have just run into its limitations. However, I am also not very versed in programming or MetaSounds, so maybe someone could offer a solution that avoids making 1,500 graph outputs and custom events.

So, long story short: the ‘MIDI Note Trigger’ node is not polyphonic and will only pick out the lowest note if several are playing at the same time; this is a limitation of the node. Fortunately, you don’t need to use it at all in this scenario. Just promote the MIDI Stream to an output, then use the MetaSound Output Watch subsystem to receive that output from the MetaSound. When receiving the MIDI stream in BP via the output watch subsystem (and the ‘Get MIDI Note Info’ method), you’ll get all the notes in the stream and can use them for polyphonic triggers.
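A minimal C++ sketch of that watch setup (the actor class `AMidiLightWall`, the handler, and the output name `MidiStreamOut` are invented for the example; `UMetaSoundOutputSubsystem::WatchOutput` is the stock engine API, but check the exact header for your engine version):

```cpp
#include "Components/AudioComponent.h"
#include "MetasoundOutputSubsystem.h" // UMetaSoundOutputSubsystem / FMetaSoundOutput

// Subscribe to the promoted MIDI stream output on the playing MetaSound.
void AMidiLightWall::BindMidiWatch(UAudioComponent* AudioComp)
{
    if (UMetaSoundOutputSubsystem* Watcher =
            GetWorld()->GetSubsystem<UMetaSoundOutputSubsystem>())
    {
        FOnMetasoundOutputValueChanged OnChanged;
        OnChanged.BindDynamic(this, &AMidiLightWall::OnMidiStreamOutput);
        Watcher->WatchOutput(AudioComp, TEXT("MidiStreamOut"), OnChanged);
    }
}

// Declared as a UFUNCTION() in the header so the dynamic delegate can
// bind to it; parsing the stream into notes is covered further down.
void AMidiLightWall::OnMidiStreamOutput(FName OutputName, const FMetaSoundOutput& Output)
{
    // ... hand Output to the Harmonix 'Get MIDI Note Info'-style helpers ...
}
```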

I promoted the MIDI stream to an output, but all I can access from it in the blueprint is the track index and the tempo via the timestamp; I don't see a way to get information about which notes are being played, let alone which one.

Thank you, yes, that works to get all the MIDI notes that are hit simultaneously. I don't know if there's a way to use these with the note-on or note-off events, because those are going to be key for my use case. I might just stick with the workaround that has given the best results so far, which is converting the MIDI file to a CSV and using that data to tell which lights go on and which go off at specific times.
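For reference, a minimal sketch of that CSV fallback (the `time,light,on` row format and all names here are invented for illustration, assuming an external MIDI-to-CSV conversion step):

```cpp
#include "Misc/FileHelper.h"

// Hypothetical cue format: one "TimeSeconds,LightIndex,bOn" row per event.
struct FLightCue
{
    float Time = 0.f;
    int32 LightIndex = 0;
    bool  bOn = false;
};

// Parse the exported CSV into a time-sorted cue list that can be
// replayed against game time to switch lights on and off.
static TArray<FLightCue> LoadLightCues(const FString& CsvPath)
{
    TArray<FLightCue> Cues;
    TArray<FString> Lines;
    FFileHelper::LoadFileToStringArray(Lines, *CsvPath);

    for (const FString& Line : Lines)
    {
        TArray<FString> Cells;
        Line.ParseIntoArray(Cells, TEXT(","));
        if (Cells.Num() < 3)
        {
            continue; // skip headers and malformed rows
        }
        FLightCue Cue;
        Cue.Time       = FCString::Atof(*Cells[0]);
        Cue.LightIndex = FCString::Atoi(*Cells[1]);
        Cue.bOn        = FCString::Atoi(*Cells[2]) != 0;
        Cues.Add(Cue);
    }
    Cues.Sort([](const FLightCue& A, const FLightCue& B) { return A.Time < B.Time; });
    return Cues;
}
```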

There’s also an ‘Is Note On’ and an ‘Is Note Off’ method. You can 100% visualize MIDI data with the MIDI stream. I don’t know what alternative solution you already built, but the Harmonix framework is pretty robust, and it supports this exact use case.
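In C++ terms the dispatch might look roughly like the sketch below. `GetMidiNoteInfo`, `IsNoteOn`, and `IsNoteOff` are stand-ins for the Harmonix helpers named above, and the `Channel`/`NoteNumber` fields are assumptions; the exact struct and helper names depend on the engine version.

```cpp
// Fan each watched MIDI event out to note-on/note-off handling.
void AMidiLightWall::OnMidiStreamOutput(FName OutputName, const FMetaSoundOutput& Output)
{
    FMidiEventInfo Event;                 // Harmonix event struct
    if (!GetMidiNoteInfo(Output, Event))  // hypothetical wrapper
    {
        return; // not a note event (could be CC, tempo, etc.)
    }

    if (IsNoteOn(Event))                  // hypothetical wrapper
    {
        SetLight(Event.Channel, Event.NoteNumber, /*Brightness=*/1.f);
    }
    else if (IsNoteOff(Event))            // hypothetical wrapper
    {
        SetLight(Event.Channel, Event.NoteNumber, /*Brightness=*/0.f);
    }
}
```

Because each note arrives as its own event rather than being collapsed into a single integer output, simultaneous notes come through as separate callbacks and polyphony falls out naturally.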

Here’s a small example I made using the harmonix midistream output driving a blueprint keyboard visualizer - https://www.youtube.com/watch?v=jldTWFm_YBk&list=PLnC4434gGyWHi6OtNmQmJqOdjTMzaERWO&index=7

Okay, yeah, thanks for sharing. According to this it should be possible to set this up. Do you have all the keys of the keyboard in one blueprint as instanced static meshes, or is there only one key per blueprint, which is then instanced all over the place? Would you mind sharing a few things about how you set up the logic after watching the output of the MIDI stream?

It’s all up on GitHub. I’m not sure if that 3D piano BP works on the latest main, because I’ve been working on other things and I think I broke it a while ago, but generally all the objects used to set it up are still there - GitHub - Amir-BK/unDAW: Editor and Runtime modules that aim to add some Digital Audio Workstation capabilities to Unreal Engine 5, requires Unreal Editor 5.4

In general I set up one actor per MIDI track, either outputting a filtered (one-track) MIDI stream from the MetaSound itself or filtering the ‘full MIDI’ into tracks in BP; neither option is too difficult, since you can get the MIDI track and channel from the output watch. The piano actor itself is just a spline that spawns however many mesh actors are needed. I didn’t actually do any instancing; each mesh actor is mapped to a note and has simple interp-to movement logic that moves it into position on note-on and then plays the release animation on note-off (using the note’s velocity to determine the speed of the release). I found that this has a pleasing effect. Of course, I also used a dynamic material instance on each mesh for the glow-activation effect.
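A rough sketch of that per-key logic, with all class and member names invented (the real implementation lives in the unDAW repo linked above):

```cpp
#include "GameFramework/Actor.h"
#include "Materials/MaterialInstanceDynamic.h"

// Hypothetical per-key actor: eases toward a pressed/released offset each
// tick and drives a glow parameter on its dynamic material instance.
void AKeyMeshActor::NoteOn(float Velocity)
{
    TargetOffset = PressedOffset;                 // move to pressed position
    InterpSpeed  = PressSpeed;
    GlowMID->SetScalarParameterValue(TEXT("Glow"), Velocity);
}

void AKeyMeshActor::NoteOff(float Velocity)
{
    TargetOffset = 0.f;                           // release back to rest
    InterpSpeed  = ReleaseSpeedScale * Velocity;  // velocity shapes the release
    GlowMID->SetScalarParameterValue(TEXT("Glow"), 0.f);
}

void AKeyMeshActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    CurrentOffset = FMath::FInterpTo(CurrentOffset, TargetOffset,
                                     DeltaSeconds, InterpSpeed);
    SetActorRelativeLocation(RestLocation + FVector(0.f, 0.f, CurrentOffset));
}
```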

Thank you for all the help. I've got it working to the point it was at with my workaround, so I'm going to mark this as the solution. Right now I'm struggling with having the Set Custom Data Value node not just set the value directly but interpolate it for all instances. That is a bit of a different question, but if you have any experience with it, I wouldn't mind some wisdom. I haven't tried separating the lights into different actor blueprints, which I think would work, but if I could somehow keep it all in the instanced static mesh actor, that would be amazing. Thanks again!
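One hedged way to get that interpolation while keeping everything in a single instanced static mesh actor is to stop writing custom data directly from the note events and instead ease current values toward per-instance targets in Tick. `SetCustomDataValue` and `MarkRenderStateDirty` are the stock component APIs; the brightness arrays and `FadeSpeed` are invented:

```cpp
#include "Components/InstancedStaticMeshComponent.h"

// Ease every instance's custom-data brightness toward its target each
// tick; note-on/off handlers only write TargetBrightness[Index].
void AMidiLightWall::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    for (int32 i = 0; i < CurrentBrightness.Num(); ++i)
    {
        CurrentBrightness[i] = FMath::FInterpTo(
            CurrentBrightness[i], TargetBrightness[i], DeltaSeconds, FadeSpeed);
        // Defer the render-state update until after the loop.
        LightISM->SetCustomDataValue(i, /*CustomDataIndex=*/0,
                                     CurrentBrightness[i],
                                     /*bMarkRenderStateDirty=*/false);
    }
    LightISM->MarkRenderStateDirty(); // push all updates once per frame
}
```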
