Synchronizing a trigger to Music/Volume/Frequency

Okay, so: I apologize for asking here before going through UE4’s full documentation and resources, and before familiarizing myself with the software, but I’m in the very early conceptual stages of a game project and still in the “pick your tools” phase.

Basically I want to know if the following is possible to do in UE4 Blueprints and if there’s a preferable method of doing so:

You have a piece of music.
You have an event: Shoot gun.

I want to trigger Shoot Gun to the exact rhythm of an instrument in that specific piece of music (instead of when “MouseButton1 is pressed”).

I imagine there are two ways of doing this, conceptually:

Either I make an “input sheet”, where I lay down each rhythmic step of the instrument manually, like a MIDI file or similar, and then feed it to Shoot Gun (see the sketch further down).
(Or would I have to synchronize everything manually with exact time markers? That would really take forever.)

Or I use some waveform-reading feature (I hear there’s an Audio Visualization plugin for UE4), filter the track’s frequencies to isolate the instrument as well as possible, set a volume threshold, and let it trigger whenever the signal passes that.
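To be concrete about the threshold idea: what I mean is a rising-edge check over the track’s loudness, so a sustained note only fires once instead of every frame it stays loud. A minimal plain-C++ sketch (all names invented; the sample windows would have to come from whatever audio access the engine or plugin provides):

```cpp
#include <cmath>
#include <vector>

// Rising-edge volume trigger: fires once when loudness crosses the
// threshold, instead of on every loud window. All names are invented.
struct VolumeTrigger {
    double Threshold = 0.2;   // tuned per song/instrument
    bool   bWasAbove = false;

    // RMS loudness of one short window of samples (roughly 10-20 ms of audio).
    static double Rms(const std::vector<float>& Window) {
        double Sum = 0.0;
        for (float Sample : Window) Sum += double(Sample) * Sample;
        return Window.empty() ? 0.0 : std::sqrt(Sum / Window.size());
    }

    // Returns true exactly once per threshold crossing.
    bool Update(const std::vector<float>& Window) {
        const bool bAbove = Rms(Window) > Threshold;
        const bool bFire  = bAbove && !bWasAbove;
        bWasAbove = bAbove;
        return bFire;
    }
};
```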

Out of the two, I’d prefer the manual input method, because this would be a very intricately scripted experience (each instrument would serve as a trigger for a behavior in-game, and it would be very closely related to how the song progresses).
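Conceptually, the “input sheet” would boil down to a sorted list of note timestamps checked against the song’s playback position every frame. A minimal C++ sketch of what I have in mind (all names invented; the playback time would come from the engine):

```cpp
#include <cstdio>
#include <utility>
#include <vector>

// One entry per note of the chosen instrument, in seconds from song start.
struct NoteEvent {
    double TimeSeconds;
};

class EventSheet {
public:
    explicit EventSheet(std::vector<NoteEvent> InNotes)
        : Notes(std::move(InNotes)) {}

    // Called every frame with the song's current playback position.
    // Fires ShootGun for each note timestamp passed since the last call.
    void Update(double SongTime) {
        while (NextIndex < Notes.size() &&
               Notes[NextIndex].TimeSeconds <= SongTime) {
            ShootGun();
            ++NextIndex;
        }
    }

private:
    void ShootGun() { std::printf("Bang!\n"); } // stand-in for the real event

    std::vector<NoteEvent> Notes; // must be sorted by TimeSeconds
    size_t NextIndex = 0;
};
```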

Is any of this possible in UE4 with Blueprints (without spending tens of hours synchronizing each song)?

I’d really appreciate any insight. Once more, sorry for asking about a fairly complex matter without delving deep into the source material, but from a time-efficiency perspective, knowing beforehand whether this is possible would be very, very useful to me.

Have a good one.

I don’t know if this is exactly what you’re looking for, but maybe: if you have the BPM of the part you want the gun to trigger to, you could set up a retriggering timer that fires at intervals derived from the BPM.
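In C++ that would be a single looping timer (the Blueprint “Set Timer by Event” node does the same thing). A rough sketch, where AMusicGunActor, BeatsPerMinute, BeatTimerHandle, and ShootGun are all invented names assumed to be declared in the actor’s header:

```cpp
// Fire ShootGun once per beat, starting when the actor begins play.
void AMusicGunActor::BeginPlay()
{
    Super::BeginPlay();

    const float SecondsPerBeat = 60.f / BeatsPerMinute; // e.g. 120 BPM -> 0.5 s
    GetWorldTimerManager().SetTimer(
        BeatTimerHandle,              // FTimerHandle member
        this,
        &AMusicGunActor::ShootGun,
        SecondsPerBeat,
        /*bLoop=*/true);
}
```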

Sadly that’s a bit too simple for what I’m aiming for. Though if it can be manipulated flexibly throughout the duration of the song, it might prove quite useful.

Thanks for replying!

I think the MIDI way might be the best, like you said. If you’re contracting a composer, consider making this part of the contract, because you’ll need the separate tracks for the different instruments.
The waveform transformation is called an FFT (it converts the signal from the time domain to the frequency domain); by itself it has limited capability unless you also have the spectral signature of the instrument, and it usually takes time to preprocess, as AudioSurf does.
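To make the time-domain/frequency-domain point concrete, here is a naive DFT in plain C++; an FFT computes the same bins in O(N log N) instead of O(N^2), but the output is the same spectrum:

```cpp
#include <cmath>
#include <complex>
#include <vector>

// Naive DFT: turns N time-domain samples into frequency-bin magnitudes.
// Bin k corresponds to frequency k * SampleRate / N.
std::vector<double> SpectrumMagnitudes(const std::vector<float>& Samples)
{
    const double Pi = 3.14159265358979323846;
    const size_t N = Samples.size();
    std::vector<double> Magnitudes(N / 2); // first half is unique for real input
    for (size_t k = 0; k < Magnitudes.size(); ++k) {
        std::complex<double> Sum(0.0, 0.0);
        for (size_t n = 0; n < N; ++n) {
            const double Angle = -2.0 * Pi * double(k) * double(n) / double(N);
            Sum += double(Samples[n]) *
                   std::complex<double>(std::cos(Angle), std::sin(Angle));
        }
        Magnitudes[k] = std::abs(Sum);
    }
    return Magnitudes;
}
```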

If you have a MIDI file, it would be easier to parse it into your own format and then trigger the event with a looped timer call, or actually generate a track for a Timeline and then use the Timeline tick to drive events (sketched below).
(Some people claim the Timeline is probably the most accurate node you can get on the Blueprint side. I have no experience trying to sync music and events, so I’ll let someone else answer that.)
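The Timeline route would look roughly like this in C++ (the Blueprint Timeline node wraps the same FTimeline type). The actor, NoteTimes, SongLengthSeconds, and ShootGun are placeholder names assumed to be declared in the header, and ShootGun would have to be a UFUNCTION for the delegate binding to work:

```cpp
#include "Components/TimelineComponent.h"

// Build one timeline event per note, then let the timeline drive them.
void AMusicGunActor::BeginPlay()
{
    Super::BeginPlay();

    FOnTimelineEvent NoteEvent;
    NoteEvent.BindUFunction(this, FName("ShootGun"));
    for (float Time : NoteTimes)          // timestamps parsed from the MIDI file
    {
        MusicTimeline.AddEvent(Time, NoteEvent);
    }
    MusicTimeline.SetTimelineLength(SongLengthSeconds);
    MusicTimeline.Play();                 // start in sync with the music
}

void AMusicGunActor::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);
    MusicTimeline.TickTimeline(DeltaTime); // FTimeline must be ticked manually
}
```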

A-HA!

Timelines! That’s what I’ve been looking for.

Getting hold of the MIDIs and/or isolated instruments is the easy part, as I’m making the music myself.
Now all I need to do is figure out an efficient workflow for converting MIDI to keyframes/events. I think I saw a MIDI-to-keystroke converter mentioned somewhere on the AnswerHub, though I’m not sure how that’ll work out.
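For what it’s worth, the core of that conversion is tiny: MIDI stores note times in ticks, and turning them into the seconds a Timeline expects only needs the file’s PPQ (ticks per quarter note) and the tempo meta event (microseconds per quarter note). A minimal sketch, assuming a single fixed tempo:

```cpp
// Convert a MIDI tick to seconds, assuming the tempo never changes.
double TicksToSeconds(long Ticks, int TicksPerQuarterNote,
                      long MicrosecondsPerQuarterNote)
{
    const double SecondsPerQuarter = MicrosecondsPerQuarterNote / 1000000.0;
    return Ticks * SecondsPerQuarter / TicksPerQuarterNote;
}
// Example: 480 PPQ at 500,000 us per quarter (120 BPM):
// tick 960 -> 960 * 0.5 / 480 = 1.0 second.
```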

Thanks a lot for the insight! Very much obliged!