Setting Up a Blueprint MIDI Manager with the 4.14 Version of the MIDI Device Support Plugin


    #16
    Originally posted by cyaoeu View Post
    Can you play .mid files using the plugin?
    This version of the plugin only provides support for device input. We would like to expand support for files eventually.
    Dan Reynolds
    Technical Sound Designer || Unreal Audio Engine Dev Team



      #17
      Originally posted by dan.reynolds View Post
      This version of the plugin only provides support for device input. We would like to expand support for files eventually.
      Cool, looking forward to that then.



        #18
        Originally posted by cyaoeu View Post
        Cool, looking forward to that then.
        I'm no programmer, so as a workaround I was able to get MIDI data into UE4 by using a utility to convert MIDI to CSV, importing the music as a DataTable, and then building some Blueprints to read it off like a sequencer.

        It worked out!

        I was able to prototype a track by hooking Ableton Live into UE4 via the MIDI device input (and a free utility to create virtual MIDI cables) and writing MIDI in Ableton. Then I exported from Ableton, converted it to CSV, imported it into UE4, and played it back!
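For anyone curious what reading the converted file might look like, here is a minimal sketch (Python standing in for the DataTable import; `parse_midicsv` is a hypothetical name, and the column layout `Track, Tick, Type, Channel, Note, Velocity` follows the free midicsv utility):

```python
import csv
import io

# Keep only note events from midicsv-style text as (tick, type, note, velocity)
# tuples -- roughly the rows a sequencer DataTable would hold.
def parse_midicsv(text):
    rows = []
    for row in csv.reader(io.StringIO(text)):
        row = [field.strip() for field in row]
        if len(row) >= 6 and row[2] in ("Note_on_c", "Note_off_c"):
            rows.append((int(row[1]), row[2], int(row[4]), int(row[5])))
    # A sequencer wants the events in tick order.
    return sorted(rows, key=lambda r: r[0])

sample = "1, 0, Note_on_c, 0, 60, 90\n1, 480, Note_on_c, 0, 60, 0\n"
parse_midicsv(sample)  # -> [(0, 'Note_on_c', 60, 90), (480, 'Note_on_c', 60, 0)]
```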
        Dan Reynolds
        Technical Sound Designer || Unreal Audio Engine Dev Team



          #19
          Originally posted by dan.reynolds View Post
          I'm no programmer, so as a workaround I was able to get MIDI data into UE4 by using a utility to convert MIDI to CSV, importing the music as a DataTable, and then building some Blueprints to read it off like a sequencer.

          It worked out!

          I was able to prototype a track by hooking Ableton Live into UE4 via the MIDI device input (and a free utility to create virtual MIDI cables) and writing MIDI in Ableton. Then I exported from Ableton, converted it to CSV, imported it into UE4, and played it back!
          Yeah, I remember something like that from the GDC demo. Very interesting! I might give this a try but I have never programmed any kind of a sequencer so I would be thankful if you could share how you set that part up.



            #20
            Each MIDI song has a ticks-per-quarter-note resolution. Usually it's something like 480, but I've seen 96 in some cases. All your MIDI data will have the tick time in one of the columns. Basically, every frame I would get the delta time (how long your last frame took) and then work out how many ticks elapsed over that frame, based on the seconds elapsed, the tempo, and the MIDI ticks per quarter note.

            Then I would read the data table starting from my last position until the MIDI tick was higher than how far I'd gotten so far. Any MIDI events that occurred during that window, I would push into an array, and then I would fire off all the events from that array.

            Then I'd do that every frame.

            So basically, let's say my tempo is 120 BPM, which means each quarter note is 0.5 seconds long, and I have 480 MIDI ticks per quarter note, which means each MIDI tick is about 0.00104 seconds long (0.5 / 480). Now say my project is running at 30 frames per second, so the delta time is about 0.0333 seconds. Then I compute: DeltaTime / SecondsPerMidiTick = NumberOfMidiTicks. In this case that comes to 32 ticks, so I add 32 to my running tick count. Say I'm about 5000 MIDI ticks into my song so far: I look at my data table and ask for any MIDI events that happened between ticks 5000 and 5032. Say two happened. I add those to an array, fire off those events in a loop, and start over.

            The trick is not to read the whole data table each frame, but only to read starting from the last MIDI event you fired off.
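The per-frame logic above can be sketched in a few lines (Python here; the real thing is Blueprint nodes, and names like `advance` and `events` are illustrative only):

```python
# Minimal sketch of the per-frame MIDI tick accumulator described above.
TICKS_PER_QUARTER = 480                 # from the MIDI file header
TEMPO_BPM = 120

SECONDS_PER_QUARTER = 60.0 / TEMPO_BPM                      # 0.5 s at 120 BPM
SECONDS_PER_TICK = SECONDS_PER_QUARTER / TICKS_PER_QUARTER  # 1/960 s

def advance(events, cursor, current_tick, delta_time):
    """Advance the song by delta_time seconds; return the events to fire.

    events       -- list of (tick, event_data) sorted by tick (the data table)
    cursor       -- index of the first event not yet fired
    current_tick -- total ticks counted so far
    """
    new_tick = current_tick + delta_time / SECONDS_PER_TICK
    fired = []
    # Only read forward from the last event fired, never the whole table.
    while cursor < len(events) and events[cursor][0] <= new_tick:
        fired.append(events[cursor][1])
        cursor += 1
    return fired, cursor, new_tick

# At 30 fps one frame is 1/30 s, i.e. (1/30) / (1/960) = 32 ticks.
events = [(5010, "note_on C4"), (5020, "note_off C4"), (5040, "note_on E4")]
fired, cursor, tick = advance(events, 0, 5000, 1.0 / 30.0)
# fired -> ["note_on C4", "note_off C4"]; the tick-5040 event waits for a later frame
```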
            Dan Reynolds
            Technical Sound Designer || Unreal Audio Engine Dev Team



              #21
              Originally posted by dan.reynolds View Post
              Each MIDI song has a ticks-per-quarter-note resolution. Usually it's something like 480, but I've seen 96 in some cases. All your MIDI data will have the tick time in one of the columns. Basically, every frame I would get the delta time (how long your last frame took) and then work out how many ticks elapsed over that frame, based on the seconds elapsed, the tempo, and the MIDI ticks per quarter note.

              Then I would read the data table starting from my last position until the MIDI tick was higher than how far I'd gotten so far. Any MIDI events that occurred during that window, I would push into an array, and then I would fire off all the events from that array.

              Then I'd do that every frame.

              So basically, let's say my tempo is 120 BPM, which means each quarter note is 0.5 seconds long, and I have 480 MIDI ticks per quarter note, which means each MIDI tick is about 0.00104 seconds long (0.5 / 480). Now say my project is running at 30 frames per second, so the delta time is about 0.0333 seconds. Then I compute: DeltaTime / SecondsPerMidiTick = NumberOfMidiTicks. In this case that comes to 32 ticks, so I add 32 to my running tick count. Say I'm about 5000 MIDI ticks into my song so far: I look at my data table and ask for any MIDI events that happened between ticks 5000 and 5032. Say two happened. I add those to an array, fire off those events in a loop, and start over.

              The trick is not to read the whole data table each frame, but only to read starting from the last MIDI event you fired off.
              Thanks! Things got a bit awkward because my MIDI file had Note On events and fake Note Off events (Note On with 0 velocity) on the same tick, and the Find node only outputs the index of the first instance found in an array. I wish there were a FindAll node, but I guess it's not that hard to make yourself. I should be able to get something working at least.
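For what it's worth, the FindAll logic is just a loop that collects every matching index instead of stopping at the first. A sketch (Python standing in for the Blueprint graph; `find_all` is the hypothetical node):

```python
def find_all(array, target):
    """Return every index whose value equals target.

    Blueprint's Find node stops at the first match; a custom FindAll like
    this is needed when a note-on and a zero-velocity note-off share a tick.
    """
    return [i for i, value in enumerate(array) if value == target]

ticks = [100, 240, 240, 480]
find_all(ticks, 240)  # -> [1, 2]
```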



                #22
                Here's what I got for now: https://gfycat.com/WarmMinorAgama
                A kind of self-playing piano (no sound yet, though).



                  #23
                  Originally posted by dan.reynolds View Post
                  This version of the plugin only provides support for device input. We would like to expand support for files eventually.
                  Do you think MIDI out is also part of the plan at some unknown future point?

                  I hope to have a very rough and simple demo of my ROLI/UE4 stuff at some point over the extended weekend coming up here in the UK. The main problem I have right now is that I haven't got a way to properly record myself using the ROLI and the screen showing the results at the same time, and my attempts at improvising something have failed so far.



                    #24
                    Originally posted by SteveElbows View Post
                    Do you think MIDI out is also part of the plan at some unknown future point?

                    I hope to have a very rough and simple demo of my ROLI/UE4 stuff at some point over the extended weekend coming up here in the UK. The main problem I have right now is that I haven't got a way to properly record myself using the ROLI and the screen showing the results at the same time, and my attempts at improvising something have failed so far.
                    When we were preparing our GDC demonstration, we got help with our reactive materials from one of the Technical Artists here, and to help him design the material, we set him up with a MIDI keyboard so he could audition interactivity.

                    This blew his mind!

                    [Attachment: 776.gif]

                    One of the things we would love to support is generic device interaction (MIDI included). We think it would be great for all designers to incorporate other types of device interaction: faders, knobs, soft-keys, etc. So this interests us a lot. It's just a matter of human resources and time. We still have a lot of big things on our plate, but this is on our radar.

                    As far as your problem, I have had some success with OBS. You can mix multiple video sources (web cam and screen share) as well as multiple audio sources.
                    Dan Reynolds
                    Technical Sound Designer || Unreal Audio Engine Dev Team



                      #25
                      Thanks for the tips. When it comes to changing parameters with knobs, I quite like the Ableton Push 2 controller because of its display. Some months ago I wrote a Node.js app that runs on a Raspberry Pi connected to the Push 2. The app does a number of things, including sending MIDI CC data from the physical knobs on the Push 2 (including a hi-res version that combines two CC numbers, similar to how standard pitch bend uses a high and a low set of bits).

                      But the real magic comes from its control of the display and the use of OSC: I can send the names of the parameters I would like a 'page' of knobs to control, min and max ranges, etc. from any app that supports OSC, and send that data back to the app using either MIDI or OSC. Once I have got to grips with UE4 more, I would like to use it with UE4 in a way that makes it very easy to wire up variables for control and have their names show on the display, without too much messing around with templates and manual configuration in more than one place. I already got the UE4 OSC plugin that someone wrote working with 4.16, so I'm pretty confident I can achieve this without too many sticking points.

                      Thanks for the tip about OBS. Since my initial demo is not intended to be polished in any way, I am going to try recording the ROLI and my monitor running UE4 in the same shot, and I have ordered an iPad tripod to achieve this, which will hopefully arrive before the weekend.
                      Last edited by SteveElbows; 05-24-2017, 05:26 PM.



                        #26
                        Can anyone help me? I have this MIDI device (http://www.akaipro.com/product/mpkmini#specs) and I followed all the steps from [MENTION=524867]dan.reynolds[/MENTION], using version 4.14. Now I have the BP and I placed it into the viewport, but I am clueless about how to make it work. I was thinking of triggering events the way you would with the computer keyboard. Let's say: press "G" > drop a cube at a certain point. But instead of the computer keyboard, I would like to use one of the pads on this MIDI device.



                          #27
                          Originally posted by Speck97 View Post
                          Can anyone help me? I have this MIDI device (http://www.akaipro.com/product/mpkmini#specs) and I followed all the steps from [MENTION=524867]dan.reynolds[/MENTION], using version 4.14. Now I have the BP and I placed it into the viewport, but I am clueless about how to make it work. I was thinking of triggering events the way you would with the computer keyboard. Let's say: press "G" > drop a cube at a certain point. But instead of the computer keyboard, I would like to use one of the pads on this MIDI device.
                          Assuming you've built it exactly like mine, and you're successfully receiving MIDI event data, then you can start experimenting with your device to understand what event data corresponds with your device input.

                          Hit a pad and then find out what kind of data you're receiving into your BP.

                          Once you know that, then you can set up BP code that receives that type of data and when the corresponding data is found, it does its thing.

                          So for example, let's say you hit a pad and it shows up as Note On data with Pitch 48.

                          Then you set up BP code that says: when I receive Note On data with Pitch 48, spawn a cube at this world location.
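As a rough sketch of that mapping (Python pseudocode; `handle_note_on` and the `actions` table are made-up names, and the real version is Blueprint nodes):

```python
def handle_note_on(pitch, velocity, actions):
    """Run the action bound to this pitch, if any.

    A Note On with velocity 0 is conventionally a Note Off, so it is ignored.
    Returns True if an action fired.
    """
    if velocity > 0 and pitch in actions:
        actions[pitch]()
        return True
    return False

spawned = []
actions = {48: lambda: spawned.append("cube at world location")}

handle_note_on(48, 100, actions)  # pad hit: spawns the cube
handle_note_on(48, 0, actions)    # velocity 0: treated as Note Off, ignored
handle_note_on(60, 100, actions)  # unmapped pitch: nothing happens
```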
                          Dan Reynolds
                          Technical Sound Designer || Unreal Audio Engine Dev Team



                            #28
                            Hi Dan Reynolds, thank you for your quick reply!

                            When I hit any pads or keys there is somehow no response like the ones you mentioned. It seems like the data stream is blocked for some reason?

                            (https://www.dropbox.com/sh/4i8cbj82f...GBG-6sMDa?dl=0)
                            Those are two screenshots, essentially all I did plus hitting pads.



                              #29
                              I guess there are one or more mistakes in the BP in my screenshot. Can you spot them?



                                #30
                                You left your MIDI Device Name blank. The node tries to match the name you give it against the available MIDI controllers; since you told it the name was nothing, it will fail every time:
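The matching step amounts to a lookup by exact name, something like this sketch (Python; `find_midi_device` and the device list are illustrative, not the plugin's actual API):

```python
def find_midi_device(devices, requested_name):
    """Return the ID of the device whose name matches, or None.

    An empty requested name matches nothing, so creation fails every time.
    """
    for device_id, device_name in devices:
        if device_name == requested_name:
            return device_id
    return None

devices = [(0, "MPK Mini"), (1, "loopMIDI Port")]
find_midi_device(devices, "")          # -> None (blank name never matches)
find_midi_device(devices, "MPK Mini")  # -> 0
```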


                                [Attachment: user_midi_error.JPG]
                                Dan Reynolds
                                Technical Sound Designer || Unreal Audio Engine Dev Team

