I want to have a blueprint that can act as a machine which holds the time synth component and from here I can send the events to many many other actors that are connected by links. I have setup something very similar without the time synth component but the main appeal is being able to lean on the audio thread for the more accurate / “Correct” way of calling sounds in a quantized format.
Here’s a bit of an example of what i’ve done already so you can get an idea of why I am trying this madness.
Right now, I am having a rough time getting the audio thread to even activate. (I can force it by modifying the code, but that feels wrong.) So essentially the time synth doesn't seem to be working correctly, since an inactive audio thread will default to executing everything immediately on the game thread…
Is doing -audiomixer with the proper ini setups and plugins active not enough to get the audio thread to activate??
Am I missing a checkbox somewhere?
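For reference, the ini setup I mean is along these lines (my Windows setup, roughly; the module name differs per platform):

```ini
; Config/Windows/WindowsEngine.ini (platform-specific)
[Audio]
AudioMixerModuleName=AudioMixerXAudio2
```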
Also, from what I've seen, there is not just the audio thread but also the audio render thread. I can get the time synth to fire off its tick in the editor, but the **EventQuantizer** reports a playback time of 0.0 forever in the editor, whereas in-game it reports the actual game time. Anything using **SynthCommand** also seems to not be pumping its messages in the editor. So my suspicion is that both audio threads are inactive, which keeps me from calling and consuming any of this audio in the editor… which sounds… well, fun to go in and see if I can get it to happen. What would speed my research up dramatically is some clarification on your end as to how to activate these threads, whether they are even available for editor use, and whether I am even in the ballpark of solving this.
The final ideal workflow is to be able to live-edit the node structure and have the main machine blueprint just worry about calling the sounds needed based on the node information and the current playback time. That's the goal I am shooting for, at least… if it's not with the time synth, I'll try to find another way.
The Audio Thread is disabled on platforms that do not support threading, and in the Editor. This is because editing certain audio features at runtime on a separate thread (like editing the SoundCue Graph) is not thread-safe.
If you want to audition your game with the audio thread enabled, you can use the argument `-game`.
If you’re launching from the Editor, you can launch using standalone game.
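For example, a direct launch from the command line would look something like this (4.x-era editor binary; the project path is a placeholder):

```
UE4Editor.exe "MyProject\MyProject.uproject" -game -audiomixer
```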
Thanks @dan.reynolds! That makes a ton of sense. I'm digging a bit more and getting closer and closer, but I'm still fighting just to get the time synth component to make sound in the editor, similar to the preview button on audio actors. I've been hacking away at a few things and have confirmed that the audio thread is now working in-game. (I got it in the editor too by hacking away at LaunchEngineLoop.cpp line 2164… but given the information you just provided, I imagine that will be a crash-fest long term with things like the Sound Cue editor.) The event quantizer is also still outputting a playback time of 0, but that is because it relies on OnGenerateAudio running to update NotifyEvents, which derives the playback time from frame time. Can you think of anything else that is blocking the synth component's audio in-editor? Something seems to be blocked somewhere in there; I've slowly been hacking through the code to find it. I am determined to break things for sure and appreciate any help in doing so XD. I feel like I am actually making strides today vs yesterday, so that's… better.
Oh my god… I think I found it. After hours of digging today, I finally deduced that the issue was that the main mixer source manager (along with many other things) was not getting initialized in the editor. Tracking it down led me to AudioDevice.cpp:
`void FAudioDevice::Update(bool bGameTicking)`
So uhh, I forced bGameTicking to always be true and BAM, SOUND IN THE EDITOR! I am a bit scared about what kind of Pandora's box that is likely to unleash, but at the very least this is the road I am on right now. I'll update this if it ends up being horrible or not.
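Concretely, the hack looks roughly like this (a sketch against the engine source, clearly not a proper fix):

```
// AudioDevice.cpp -- engine source, hacked locally:
void FAudioDevice::Update(bool bGameTicking)
{
    bGameTicking = true;  // HACK: treat the editor as if the game were ticking,
                          // so the mixer source manager initializes and updates
    // ... rest of Update unchanged ...
}
```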
I'm running into a pretty fun issue… or maybe it's intentional given the way the system works. When I call Play Clip at 120 BPM with a 1/4 quantization delegate event, and the clip itself is also playing at 1/4 quantization, it always plays one beat after I call Play Clip. This seems to be a timing thing where the event calls are perfectly lined up with the quantization boundary, and I either need to offset them or… maybe there is already an offset I can utilize? I'm unsure, but this is definitely something I'm gonna need to solve so that I can consistently rely on those events as a heartbeat for all of these nodes… hopefully without making something like a "pre-quantization" event. Any input on this is appreciated.