[Support] Orquesta Cue

Hi,

from the interface/tooling side, Sound Cues were definitely the inspiration for how to set it up. It would have been awesome to just add quantization to Sound Cues, and my whole tool would have been obsolete :stuck_out_tongue: but it's not that easy. Because I don't have the resources to replicate all the Sound Cue features and "only" add quantization, I focused on music only.

Quick fire questions:

  • The Blueprint is like any other Blueprint. You can definitely call Subsystems.
  • I wouldn't put gameplay logic into the OrquestaBP; think of it more as a Director Blueprint for the OrquestaCue. You create the logic for musical transitions here, then create functions for them, which you call from an actor BP or a global MusicDirectorSubsystem.
  • I only have a Windows PC without VR, so I couldn't test it on other platforms, but it should run.

The OrquestaComponent used to play the assets is based on the engine's SynthComponent, which is the base for all procedural audio generators Epic created, e.g. the GranularSynth or the TimeSynth. Internally it uses an AudioComponent, so you can do everything an AudioComponent can do: SoundClasses, Submixes, etc. In more detail, the new AudioMixer allows you to hook into the audio stream and generate your own audio. This way, I can count the samples and control exactly when to start clips.
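To illustrate that sample-counting idea, here is a tiny standalone sketch (plain C++, no engine types; all names are made up for this example, this is not the plugin's actual code) of a generator that defers a trigger to the next beat boundary by counting rendered samples:

```cpp
#include <cstdint>

// Illustrative only: counts samples rendered by the audio thread and starts
// a pending clip exactly on the next beat boundary, independent of when the
// game thread requested the trigger.
struct QuantizedTrigger {
    double  sampleRate      = 48000.0;
    double  bpm             = 120.0;
    int64_t samplesRendered = 0;   // running sample counter
    int64_t pendingStart    = -1;  // absolute sample at which the clip starts

    // Samples per beat at the current tempo.
    int64_t SamplesPerBeat() const {
        return static_cast<int64_t>(sampleRate * 60.0 / bpm);
    }

    // Called from gameplay: schedule the clip for the next beat boundary.
    void TriggerOnNextBeat() {
        const int64_t spb = SamplesPerBeat();
        pendingStart = ((samplesRendered / spb) + 1) * spb;
    }

    // Called by the audio renderer for each block. Returns the offset inside
    // this block where the clip must start, or -1 if it doesn't start here.
    int OnGenerateAudio(int numSamples) {
        int offset = -1;
        if (pendingStart >= samplesRendered &&
            pendingStart < samplesRendered + numSamples) {
            offset = static_cast<int>(pendingStart - samplesRendered);
            pendingStart = -1;
        }
        samplesRendered += numSamples;
        return offset;
    }
};
```

The key point is that the start position is expressed in samples on the audio thread, so it doesn't matter how late the game thread's trigger call arrives within the current beat.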

To summarize, the keyword is quantization. Imagine you have two audio tracks that you want to play perfectly aligned. If you use the default AudioComponent and start them at the exact same time, they will be aligned. But if you want to trigger one track later, depending on what happens in your game, it is not possible to control the alignment perfectly. The reason is that the audio is rendered on a separate thread, not on the game thread like in the old days. So there are small latencies when you trigger an AudioComponent, which you can hear depending on the performance of your game. If you have a very low frame rate and you trigger at that moment, your music can be completely misaligned.
For audio that doesn't need this, e.g. ambient sounds, one-shot sounds, or music that is not layered and can be transitioned with a simple fade-in and fade-out, my plugin would be absolute overkill :-).
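To put a number on "completely misaligned": a quick back-of-the-envelope helper (the function is mine, just for illustration) converting trigger latency into musical drift:

```cpp
#include <cmath>

// How far off-beat a track lands if its trigger is delayed by `delayMs`:
// drift in beats = delay / (milliseconds per beat at the given tempo).
double BeatMisalignment(double delayMs, double bpm) {
    const double msPerBeat = 60000.0 / bpm;
    return delayMs / msPerBeat;
}
```

At 120 BPM a beat is 500 ms, so a 100 ms frame stall already puts the second track a fifth of a beat late, which is clearly audible against a running layer.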

(1)
What you plan to do is totally feasible with AudioComponents and Blueprints if you do not need this quantized alignment. You could use a GameInstanceSubsystem as your music director, which knows all the music-playing AudioComponents in your game, and get creative ;-).
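As a rough sketch of that music-director pattern (engine-free C++ so it stays self-contained; the class and method names are invented for this example, and in Unreal you would instead derive from UGameInstanceSubsystem and store UAudioComponent pointers, calling FadeIn/FadeOut on them):

```cpp
#include <functional>
#include <map>
#include <string>

// Illustrative music director: a registry of named tracks, each represented
// here by a fade callback instead of a real AudioComponent.
class MusicDirector {
public:
    using FadeFn = std::function<void(float /*targetVolume*/, float /*seconds*/)>;

    // Each music-playing component registers itself once.
    void RegisterTrack(const std::string& name, FadeFn fade) {
        tracks[name] = std::move(fade);
    }

    // Fade the chosen track in and every other registered track out.
    void TransitionTo(const std::string& name, float seconds) {
        for (auto& [trackName, fade] : tracks) {
            fade(trackName == name ? 1.0f : 0.0f, seconds);
        }
    }

private:
    std::map<std::string, FadeFn> tracks;
};
```

Gameplay code then only ever calls `TransitionTo("combat", 2.0f)` and never touches the components directly, which keeps the music logic in one place.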
(2)
Epic has developed a plugin for this: https://docs.unrealengine.com/en-US/…sia/index.html