Yeah, game-engine audio latency is always an issue. It's not optimized for input latency; it's optimized for performance, i.e. stability and rendering capability with minimal CPU impact.
Are you using the audio mixer? The synth actually works in the old audio engine too, so double-check that you're really running the audio mixer!
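(If you're on an engine version where the mixer isn't on by default, the way I remember enabling it is the -audiomixer command-line switch, or via the platform ini. Treat the exact key below as version-dependent:)

```ini
; WindowsEngine.ini (or your project's platform Engine ini)
; The old engine module is XAudio2; the audio mixer module is AudioMixerXAudio2.
[Audio]
AudioDeviceModuleName=AudioMixerXAudio2
```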
The issue is that BP code is executed on the game thread (GT) tick (so it can wait up to a full frame right there), then it hands off a message to the audio thread (AT, which is currently locked to the game thread update tick), and then the audio render thread (ART) consumes audio thread messages.
The worst-case output latency due to thread communication is then:
33 ms (GT and AT, assuming 30 FPS) + 23 ms (ART, 1024 frames at 44.1 kHz) = 56 ms
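Stated as a formula (just restating the arithmetic above: one game-thread-rate update plus one audio buffer):

$$T_{\text{worst}} \approx \frac{1}{\text{FPS}} + \frac{N_{\text{buffer}}}{f_s} = \frac{1}{30}\,\text{s} + \frac{1024}{44100}\,\text{s} \approx 33.3\,\text{ms} + 23.2\,\text{ms} \approx 56.6\,\text{ms}$$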
If you run in the editor (which your video indicates), where the AT doesn’t exist (because we write to UObjects in editor-mode), audio updates happen strictly after the GT update. In this case you’ll get an even worse worst-case latency:
33 ms x 2 + 23 ms = 89 ms <- this is super bad, of course!
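Same formula, but now with two serialized game-thread-rate updates in front of the buffer:

$$T_{\text{worst, editor}} \approx \frac{2}{\text{FPS}} + \frac{N_{\text{buffer}}}{f_s} \approx 66.7\,\text{ms} + 23.2\,\text{ms} \approx 89.9\,\text{ms}$$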
If you run with the AT (launch with -game) and can get your game running at 60 FPS, that'll cut the GT/AT latency in half.
If you also reduce the frame size to 512, that'll cut the ART latency roughly in half (see the ini sketch after the numbers below):
16 ms + 12 ms = 28 ms <- this is getting into the realm of acceptability.
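Depending on your engine version, the callback buffer size is exposed per platform in the project settings; in ini form it looks something like this (the section and key below are from newer 4.x versions, so check what your version actually exposes — this is just a sketch):

```ini
; DefaultEngine.ini -- exact section/key varies by engine version and platform
; 512 frames at 44.1 kHz is ~11.6 ms per buffer
[/Script/WindowsTargetPlatform.WindowsTargetSettings]
AudioCallbackBufferFrameSize=512
```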
Then you have to account for MIDI input latency on top of that; even with low-latency devices, it'll add to the total.
The problem then is jitter. As you know, low jitter is ultra important for MIDI performance, and we absolutely do not optimize for reducing jitter. To do that, we'd want to add a fixed amount of latency and schedule events against timestamps so they land consistently (see the sketch below).
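To illustrate the technique (a standalone sketch, not engine code — names like kSchedulingLatency are made up): timestamp every MIDI event when it arrives and trigger it at arrival time plus a fixed offset. Any delivery jitter smaller than that offset disappears, at the cost of always paying the offset:

```cpp
// Standalone sketch of latency-for-consistency scheduling (not UE API).
// Each event is played at arrival_time + kSchedulingLatency, so jitter in
// event delivery below that offset is absorbed by the schedule.
#include <chrono>
#include <cstdio>
#include <queue>
#include <vector>

using Clock = std::chrono::steady_clock;

struct ScheduledEvent
{
    Clock::time_point PlayAt; // arrival time + fixed offset
    int MidiNote;
    bool operator>(const ScheduledEvent& Other) const { return PlayAt > Other.PlayAt; }
};

// Hypothetical fixed offset: every event is delayed by the same amount.
static constexpr auto kSchedulingLatency = std::chrono::milliseconds(10);

// Min-heap ordered by play time. A real implementation would need a
// thread-safe (ideally lock-free) queue between input and render threads.
static std::priority_queue<ScheduledEvent, std::vector<ScheduledEvent>,
                           std::greater<ScheduledEvent>> Pending;

void OnMidiInput(int MidiNote)
{
    // Timestamp on arrival; schedule for a deterministic future time.
    Pending.push({Clock::now() + kSchedulingLatency, MidiNote});
}

void AudioTick()
{
    // Called regularly (e.g. once per render buffer): fire everything due.
    const auto Now = Clock::now();
    while (!Pending.empty() && Pending.top().PlayAt <= Now)
    {
        std::printf("note %d\n", Pending.top().MidiNote); // trigger the voice here
        Pending.pop();
    }
}

int main()
{
    OnMidiInput(60);                          // simulate a NoteOn arriving
    while (!Pending.empty()) { AudioTick(); } // poll until the event fires
}
```

The trade-off is explicit: you always pay the fixed offset, but every note lands the same distance in time from when you played it.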
If you play UE4 games, you’ll find that keyboard input (or controller input) to SFX output is in an acceptable range for games, but probably not for a musical performance.