Time Lagging in Quartz

Hey guys, and especially @MaxHayes, first of all thank you so much for taking the time to answer. We need a really accurate and stable clock for our project. We can tolerate an approximate maximum of 4 milliseconds of drift, so we thought we should drive things with the Quartz clock in C++. With some difficulty (admittedly mostly because of my inexperience with UE), we managed to get the clock running, and it mostly runs with microsecond accuracy. The issue is that we are experiencing occasional lags of up to 90 milliseconds between our delegate calls. I am measuring the elapsed time with std::chrono and will share the measurements below.

While reading the docs to get my delegate to SubscribeToAllQuantizationEvents() up and running, I noticed them stating that dynamic delegates are slower than regular delegates. Could this be the cause of the lags? If so, how could I work around having to use the FOnQuartzMetronomeEventBP delegate (the “BP” suffix kinda worried me too)? Or would you suggest anything else that would help with the lagging? Again, thank you so much for taking the time.

Chrono Tests in Seconds:


Hello! This is a fun topic :)

I apologize if this is verbose or covers information you are already aware of, but hopefully it's useful for multiple people.

With Quartz it is important to keep a few things in mind:

  • Blueprints run at the game frame rate (e.g. 60 frames/sec).
  • The Quartz Clock is ticked by the Audio Engine, which is generating audio in buffers.

At a high level, Quartz does 2 things:
1.) It sends events back to the game thread so VFX can line up w/ scheduled audio
2.) It allows for the sample-accurate scheduling of audio rendering

Neither of these things adheres to the “wall-clock” accuracy it seems you think you need.

for 1.) As soon as you’re receiving notifications back from Quartz on the Game Thread (i.e. in BP), you are dealing with error as high as tens of milliseconds. This is not avoidable. If you needed sub-4ms accuracy in BP, your game would have to maintain 250+fps.
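To put a number on that: the worst-case game-thread notification error is one full frame period, so the frame rate needed for a given tolerance is simple arithmetic. A minimal sketch (the helper name is made up):

```cpp
#include <cassert>

// Worst-case delay before the game thread digests a Quartz event is one
// full frame period: the event waits for the next game tick.
constexpr double FramePeriodMs(double Fps)
{
    return 1000.0 / Fps;
}
```

At 60 fps a frame lasts ~16.7 ms, already far past a 4 ms budget; the frame period only drops to 4 ms at 250 fps.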

for 2.) It is important for your use case to understand what Quartz is actually doing and what “sample accurate” means mechanically:

Say Quartz receives requests to play sounds on a certain boundary, and say it decides that boundary is 2058 audio frames in the future. And let's say the audio engine is generating buffers of 1024 audio frames.

The order of events is as follows:

  • Quartz receives a request to play sound X on the next “bar”.

  • Quartz calculates that this means the sound should start in 2058 audio frames.

  • The Audio Engine is about to render 1024 samples, so it ticks Quartz forward 1024 frames.

  • Quartz says: this sound should not play in this buffer (2058 > 1024), but now it should start in 1,034 frames (2,058 - 1,024).

  • Some time later, the Audio Engine is about to render the next 1024 samples, so it ticks Quartz forward another 1024 frames.

  • Quartz says: this sound should not play in this buffer (1,034 > 1,024) BUT we are getting close, so Quartz sends a command back to the game thread to trigger that delegate (so any VFX that want to appear in sync with the sound can start on the next video frame). The sound should now start in 10 audio frames (1,034 - 1,024).

  • Some time later, the Audio Engine is about to render the next 1024 samples, so it ticks Quartz forward another 1024 frames.

  • Quartz says: this sound SHOULD play in this buffer (10 < 1024), so it stages the sound for playback in the upcoming buffer render and sets it up to render through a 10-sample delay line (this is how the playback of the sound becomes SAMPLE accurate and not just BUFFER accurate).
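That per-buffer countdown can be sketched in plain C++ (illustrative only, not engine code; `PendingSound` and `TickBuffer` are invented names):

```cpp
#include <cassert>

// Illustrative sketch of the per-buffer countdown described above.
struct PendingSound
{
    int FramesUntilOnset;           // e.g. 2058 at schedule time
    bool bStaged = false;           // staged for the upcoming buffer render
    int SubBufferDelayFrames = -1;  // delay-line length once staged
};

// Called once per audio buffer render; returns true once the sound is staged.
bool TickBuffer(PendingSound& Sound, int BufferSizeFrames)
{
    if (Sound.bStaged)
    {
        return true;
    }

    if (Sound.FramesUntilOnset < BufferSizeFrames)
    {
        // The onset lands inside this buffer: render the sound through a
        // short delay line so playback is sample accurate, not just
        // buffer accurate.
        Sound.bStaged = true;
        Sound.SubBufferDelayFrames = Sound.FramesUntilOnset;
        return true;
    }

    // Onset is still beyond this buffer: count down and wait.
    Sound.FramesUntilOnset -= BufferSizeFrames;
    return false;
}
```

With the numbers from the example: starting at 2058 frames, the first tick leaves 1,034, the second leaves 10, and the third stages the sound with a 10-frame delay line.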

For the Metronome delegates: those are also fired when Quartz is ticked. When Quartz is notified that 1,024 samples are about to be played, it checks whether any metronome boundaries (e.g. a quarter note) “occur” during the chunk of time that next buffer represents. If multiple occurrences of a metronome boundary fall in that buffer, only one delegate will be fired. (So if your BPM is fast and/or you are subscribing to a small boundary such as a thirty-second note, you may not be getting all the delegates you are hoping for.)
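To see why fast subdivisions can collapse into a single delegate, you can count how many boundaries of a subdivision land inside one buffer. A sketch (invented names, not engine code):

```cpp
#include <cassert>

// How many metronome boundaries of a given subdivision fall inside the
// next buffer? Boundaries land every FramesPerBoundary frames; the buffer
// covers [StartFrame, StartFrame + BufferSizeFrames).
int NumBoundariesInBuffer(long long StartFrame, int BufferSizeFrames, int FramesPerBoundary)
{
    // First boundary at or after the buffer start.
    const long long FirstBoundary =
        ((StartFrame + FramesPerBoundary - 1) / FramesPerBoundary) * FramesPerBoundary;

    if (FirstBoundary >= StartFrame + BufferSizeFrames)
    {
        return 0; // no boundary crosses this buffer
    }

    return (int)((StartFrame + BufferSizeFrames - 1 - FirstBoundary) / FramesPerBoundary) + 1;
}
```

For example, at 48 kHz and 200 BPM a thirty-second note is 1800 frames; a 4096-frame buffer starting at frame 0 contains three such boundaries, but Quartz fires only one delegate for them.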

Additionally, the digestion of BP delegates is subject to fluctuations in the frame rate of your application. Any game-thread stutters or hangs will delay the digestion.

So, Quartz is not a wall-clock, end-all, time keeper that will notify your blueprint with <5 milliseconds of accuracy. It is a system that lets you interact w/ the audio engine in a sample-accurate way by scheduling ahead, and it will notify BP as accurately as possible so that gameplay and VFX can appear in sync with the sound that is playing.

I hope that is helpful and clears up temporal expectations. Feel free to follow up w/ more info on your specific needs, but at a high-level it sounds like your system design might need to shift from thinking the gameplay logic can be “real time” (not possible) to your gameplay logic “thinking ahead” and scheduling things with the audio engine.

If you need things accurately executed on the Audio Render Thread w/ Quartz, you can inherit from IQuartzQuantizedCommand and schedule it like any of the commands already provided in the engine.

Here is the engine's implementation of the Play Quantized command; you can see how it interacts w/ the Audio Mixer Source Manager to control the onset of the target sound:

	TSharedPtr<IQuartzQuantizedCommand> FQuantizedPlayCommand::GetDeepCopyOfDerivedObject() const
	{
		TSharedPtr<FQuantizedPlayCommand> NewCopy = MakeShared<FQuantizedPlayCommand>();

		NewCopy->OwningClockPtr = OwningClockPtr;
		NewCopy->SourceID = SourceID;

		return NewCopy;
	}

	void FQuantizedPlayCommand::OnQueuedCustom(const FQuartzQuantizedCommandInitInfo& InCommandInitInfo)
	{
		OwningClockPtr = InCommandInitInfo.OwningClockPointer;
		SourceID = InCommandInitInfo.SourceID;
		bIsCanceled = false;

		// access source manager through owning clock (via clock manager)
		FMixerSourceManager* SourceManager = OwningClockPtr->GetSourceManager();
		if (SourceManager)
		{
			SourceManager->PauseSoundForQuantizationCommand(SourceID);
		}
		else
		{
			// cancel ourselves (no source manager may mean we are running without an audio device)
			if (ensure(OwningClockPtr))
			{
				OwningClockPtr->CancelQuantizedCommand(TSharedPtr<IQuartzQuantizedCommand>(this));
			}
		}
		
	}

	void FQuantizedPlayCommand::OnFinalCallbackCustom(int32 InNumFramesLeft)
	{
		// Access source manager through owning clock (via clock manager)
		check(OwningClockPtr && OwningClockPtr->GetSourceManager());

		// This was canceled before the active sound hit the source manager.
		// Calling CancelCustom() makes sure we stop the associated sound.
		if (bIsCanceled)
		{
			CancelCustom();
			return;
		}

		// access source manager through owning clock (via clock manager)
		// Owning Clock Ptr may be nullptr if this command was canceled.
		if (OwningClockPtr)
		{
			FMixerSourceManager* SourceManager = OwningClockPtr->GetSourceManager();
			if (SourceManager)
			{
				SourceManager->SetSubBufferDelayForSound(SourceID, InNumFramesLeft);
				SourceManager->UnPauseSoundForQuantizationCommand(SourceID);
			}
			else
			{
				// cancel ourselves (no source manager may mean we are running without an audio device)
				OwningClockPtr->CancelQuantizedCommand(TSharedPtr<IQuartzQuantizedCommand>(this));
			}
		}

	}

	void FQuantizedPlayCommand::CancelCustom()
	{
		bIsCanceled = true;

		if (OwningClockPtr)
		{
			FMixerSourceManager* SourceManager = OwningClockPtr->GetSourceManager();
			FMixerDevice* MixerDevice = OwningClockPtr->GetMixerDevice();

			if (MixerDevice && SourceManager && MixerDevice->IsAudioRenderingThread())
			{
				// if we don't UnPause first, this function will be called by FMixerSourceManager::StopInternal()
				SourceManager->UnPauseSoundForQuantizationCommand(SourceID); // (avoid infinite recursion)
				SourceManager->CancelQuantizedSound(SourceID);
			}
		}
	}

	static const FName PlayCommandName("Play Command");
	FName FQuantizedPlayCommand::GetCommandName() const
	{
		return PlayCommandName;
	}

TLDR: Quartz is sample accurate relative to Audio Rendering, and only when on the Audio Render Thread (i.e. inside of a Quantized Command object).

The Blueprint side of Quartz is not sample accurate, and is limited by your game's frame rate.

Neither are “wall-clock” accurate, as audio is rendered in blocks of samples (of a few ms), and Quartz is updated at this rate.


Thank you so much, and the verbosity is much appreciated as there is not much information regarding how Quartz works in the docs.

Unfortunately I need a real-time metronome ticking at 128th-note resolution to check whether real-time input is on the beat, down to 32nd notes, so I need an accurate wall clock. I may be veering off topic here, but it would help me so much if you could point me in the right direction. Before I tried this with Quartz, I had implemented my own wall clock, but I ran into errors because the play functions were being called not from the game thread but from my own clock thread. And I suppose the errors would not stop at just the play-sound functions. How can I use my own clock to call various functions on the game thread and the audio thread? Even simple answers such as “read up on subsystems and try to implement one” would be greatly appreciated, as I feel a little lost right now with my rudimentary UE knowledge.

Certainly,

Just to understand your problem space a bit more, is your “real time input” being processed by Blueprint?

Or are you doing something specialized (using special hardware and/or a real-time operating system?).

from my post:

at a high-level it sounds like your system design might need to shift from thinking the gameplay logic can be “real time” (not possible) to your gameplay logic “thinking ahead” and scheduling things with the audio engine.

Maybe you can provide a hypothetical example of why this is not the case mechanically? The game thread is updated at your frame rate, and the audio engine is rendering chunks of audio in preparation for them to be submitted to the operating system for playback on request. This rendering is done on its own asynchronous thread.

This makes real-time sample/frame-accurate A/V sync in games complicated. These are not limitations of Unreal, this is the case for any application on a normal time-sharing operating system. (i.e. for a beat-matching type game it requires scheduling things ahead of time for the audio engine, and deciding how to interpret user input within a tolerance). This is exactly what Quartz was designed to facilitate.

Let me know if you're doing something highly specialized with specialized hardware / OS / user input. Otherwise my advice would be to make very sure Quartz as I have described it is not usable for your situation. If your user input is processed/interpreted on the Game Thread, your current expectation of “real time” is, by default, unobtainable, and it is worth fully understanding the physical computing limitations Quartz lets you work around.

The intro to the Quartz documentation talks a bit about these limitations as a pre-cursor to why it works the way it does.

If you can shed more light on how you're capturing user input outside of Blueprints / the Game Thread, I can see if I can point you in the right direction for your use case :)


I am actually in the very early stages of prototyping, but what I hope to do is this:

I am working on a music game that you can play freely, as long as you are playing on the beat, including subdivisions down to 32nd notes. My plan is to have an accurate clock count in roughly 128th notes (about 19 ms at 100 BPM) and set an isTime boolean true during the correct time intervals and false during the incorrect ones. Then, when the user presses the keys for the notes, I will check the boolean and reward or punish accordingly. Player inputs will be simple keyboard/gamepad key presses, and possibly MIDI device inputs in the future (which I guess Unreal will have no problem handling). I will be implementing the key-press logic in C++ instead of Blueprints, but I guess that was not what you were asking, as Blueprints can probably handle key-press events as accurately as C++.

I am aware that key presses are processed on the game thread and will be processed only as fast as my frame rate. But I did some calculations, and I think I can make it playable with some anticipatory wiggle room in the isTime intervals, taking into account the delay caused by the frame rate. Scheduling the sounds the player plays ahead of time makes a lot of sense, but as you can see, I need a really accurate clock to manage the isTime state in real time. I do not know if I can also schedule the state changes ahead of time, but I feel like even if I scheduled them, I would again need an accurate clock to check whether we have arrived at the scheduled time.

My current plan is to:
1- Run my clock on my thread, somehow pass the boolean created by the clock to the game thread
2- Process key presses on the game thread according to the boolean
3- Play the sounds via Quartz, detached from my own clock
4- Hope that there will not be much discrepancy between my clock and the Quartz clock
4.1- If there is much discrepancy, take steps to mitigate it until the game feels playable

I hope I managed to communicate my ordeal properly, and I hope my plan is not doomed to fail, or at least that you can suggest a way to save it if it is. I know I have thanked you too much already, but I really appreciate how much time an audio staff member spends on a UE beginner, so thanks again.

I see. Well you are asking the right questions early!

There is no benefit in introducing the complexity of an “accurate clock” running on its own thread, other than to satisfy the human desire to have something be “real.” There is no spoon, my friend :)

As soon as your accurate clock's flag is processed by the game thread, it is quantized, so its existence was futile. You can instead take high-resolution timestamps on the game thread and do the math yourself (like you were doing when trying to measure time deltas w/ Quartz). See FGenericPlatformTime.

You can then do the math to determine the delta (as far as you can tell on game tick) between when you received the event and the “perfect” 128th note.

Since you know your BPM, you could calculate out what CPU cycle each 128th note should land on from here to eternity. You don’t need a running “async wall clock” to keep track of that.
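For example, in plain C++ (working in seconds rather than CPU cycles; in-engine you would read FPlatformTime instead, and these helper names are made up):

```cpp
#include <cassert>
#include <cmath>

// A beat (quarter note) lasts 60/BPM seconds; a 128th note is 1/32 of a
// beat. At 100 BPM this is 0.01875 s, matching the ~19 ms mentioned above.
double SecondsPer128th(double Bpm)
{
    return 60.0 / Bpm / 32.0;
}

// Signed error (seconds) between an input timestamp and the nearest
// 128th-note boundary, given when the song started. No running clock
// thread is needed: the whole grid is determined by BPM and start time.
double DeltaToNearestBoundary(double InputSeconds, double SongStartSeconds, double Bpm)
{
    const double Grid = SecondsPer128th(Bpm);
    const double Elapsed = InputSeconds - SongStartSeconds;
    const double Nearest = std::round(Elapsed / Grid) * Grid;
    return Elapsed - Nearest;
}
```

An input 19 ms after the song start at 100 BPM comes out ~0.25 ms late relative to the first 128th-note boundary.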

The other issue you will probably run into is trying to schedule sounds w/ Quartz at the last second. Quartz lets you schedule ahead. If the sound is supposed to play right when you get user input (which is supposed to happen right on the beat), that might be too late for Quartz to hit the deadline. Especially if you're going to allow players to hit behind the beat (i.e. after the sound should have technically started).


I’m considering moving the rhythm game prototype I’m developing into UE and am concerned with the same questions being discussed here so I thought I’d jump in with two cents. I haven’t put much time into researching UE’s audio as frankly the documentation is rather opaque so forgive me if I’m getting the wrong end of the stick here.

There’s a tradition of rhythm game sync design (for the typical rhythm game that syncs user input with the playback of an audio file) which stresses against using any kind of system or game-wide clock as a sync source. Instead the “ground truth” of sync must come from the audio subsystem’s record of how long it has been playing the audio file (minus its reported audio latency). If the audio engine only updates this time in chunks, the time since the last update can be added to the reported time (this is how Godot does it).

This sync is constantly monitored by the game and used to generate a series of beat events which any user input can be measured against (usually you’d keep the song_position in seconds as a float and the song_position_in_beats as a floored integer, derived by dividing by secs_per_beat).
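That scheme might look like this (a sketch with invented names; `OutputLatencySeconds` stands in for whatever latency the audio subsystem reports):

```cpp
#include <cassert>
#include <cmath>

// Sketch of the classic rhythm-game sync scheme described above.
// ReportedAudioSeconds only advances in buffer-sized chunks, so we add
// the wall time elapsed since the last report to smooth it (as Godot
// does), and subtract the reported output latency.
double SmoothedSongPosition(double ReportedAudioSeconds,
                            double SecondsSinceLastReport,
                            double OutputLatencySeconds)
{
    return ReportedAudioSeconds + SecondsSinceLastReport - OutputLatencySeconds;
}

// song_position_in_beats as a floored integer, derived from secs_per_beat.
int SongPositionInBeats(double SongPositionSeconds, double Bpm)
{
    const double SecsPerBeat = 60.0 / Bpm;
    return (int)std::floor(SongPositionSeconds / SecsPerBeat);
}
```

User input and beat events are then measured against this ground-truth position rather than against any system or game-wide clock.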

So I’m wondering - does Quartz replace the need for generating a beat event stream by automatically integrating with the audio subsystem and maintaining a beat (while taking both subsystem time-chunking and latency into account)? Failing that, does UE5 offer a way of accessing the necessary audio playback timing info to do it in the usual way?

Looking at the one article in the docs, it does sound like the Quartz Metronome will provide an accurate record of audio events, but the article is focused on starting playback on a sample-accurate basis rather than providing a hyper-accurate measure of how far playback has progressed.

Once we have a “ground truth” of beat timing established, any user input can be measured against this and any synced animation can be lerped between states. All updates of animation positions etc need to be lerped or incremented with care taken to avoid letting frame timing have any creeping influence - see blogs below for more detailed and far superior explanations of this.

Below are some blog posts covering the logic of sync in rhythm games - how to create and maintain it as well as the pitfalls to avoid.

https://shinerightstudio.com/posts/music-syncing-in-rhythm-games/

This one is written by the maker of the excellent Rhythm Doctor:
https://www.reddit.com/r/gamedev/comments/2fxvk4/heres_a_quick_and_dirty_guide_i_just_wrote_how_to/


Sure thing! Yes, I would also advise against using a system clock as a sync source. My point was just if one were to insist on doing that, there is no benefit to doing it on a separate thread.

does Quartz replace the need for generating a beat event stream by automatically integrating with the audio subsystem and maintaining a beat (while taking both subsystem time-chunking and latency into account)? Failing that, does UE5 offer a way of accessing the necessary audio playback timing info to do it in the usual way?

Yes. There are two major components to Quartz: 1.) sample-accurate onsets of sounds, and 2.) communicating back to the game thread any timing information it needs (driven by the audio engine).

After you set the BPM on a clock and start it, there are notifications you can opt into through the Clock Handle (independent of playing any sounds).

These are called “Metronome Events,” and will call a Blueprint delegate every time one of these events transpires. These events can be used for your own musical time-keeping. The idea being that if you start your Audio clip on the beat w/ Quartz & “PlayQuantized,” then these events will allow you to track and respond to the musical duration you care about.

These events include a Transport Timestamp in the format {Bars : Beats : Beat Fraction}, which represents the precise musical time as of the last audio engine buffer render.
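As an illustration of what that timestamp encodes, here is how an absolute audio frame count maps to bars/beats/fraction, assuming 4/4 time and 1-based counting (invented names, not the engine's actual conversion code):

```cpp
#include <cassert>

struct FTransportTime
{
    int Bars;
    int Beats;
    double Fraction; // fraction of the current beat, 0..1
};

// Convert an absolute audio frame count to a musical transport time.
// Illustrative only; assumes a fixed BPM and 4/4-style BeatsPerBar.
FTransportTime FramesToTransport(long long Frames, double SampleRate,
                                 double Bpm, int BeatsPerBar = 4)
{
    const double FramesPerBeat = SampleRate * 60.0 / Bpm;
    const double TotalBeats = (double)Frames / FramesPerBeat;
    const long long WholeBeats = (long long)TotalBeats;

    FTransportTime T;
    T.Bars = (int)(WholeBeats / BeatsPerBar) + 1;  // 1-based bar count
    T.Beats = (int)(WholeBeats % BeatsPerBar) + 1; // 1-based beat in bar
    T.Fraction = TotalBeats - (double)WholeBeats;
    return T;
}
```

At 48 kHz and 120 BPM, frame 132,000 is 5.5 beats in, i.e. bar 2, beat 2, halfway through the beat.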

Quartz is actively being worked on and improved (we have big plans for it), but it is tightly integrated with the audio renderer, so if there is something else that would be helpful to expose for rhythm games, it shouldn’t be hard to do so.


Thanks for the reply!

So in theory, multiple queries of this value could return the same time? This is where a “time since last update” property would be useful, so it could be added to the reported time to get a more accurate position. Though I guess…if there’s an event for each chunk then I could measure that variable “manually”.

I have noticed in my research that there is a significantly higher bar in terms of input latency, audio output latency (and its predictability), and video output latency/consistency when it comes to “enthusiast-grade” and competitive rhythm games - a competitive game that doesn't get these things right won't be taken seriously by the community.

I’m not sure if my project will need to go to those lengths but if UE5 could meet that standard then it will be the first game engine to do so without modification and/or third party addons.


You don’t query this value, it’s more of a time stamp on the event.

You can track things yourself in between events on the game thread, and query the length of a musical duration in seconds.

In fact, we use Quartz for automatic weapons in Fortnite (sample-accurate and decouples fire-rate from gameplay frame rate). We had an issue where when the player stops firing, the Quartz-driven audio can stop right away, but the VFX might let another gunshot through (if the VFX already hit the GPU).

So we track (as a percentage of a beat) how close we are to the next “gunshot” (32nd note), and we have a configurable threshold to decide whether we think the VFX lag is going to let another shot through. We use that to decide whether we should cancel now, or after the next gunshot plays.
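That decision might be sketched like this (invented names and threshold value; not the actual Fortnite code):

```cpp
#include <cassert>

// Phase through the current "shot" interval, 0 = a shot just fired,
// approaching 1 = the next shot is imminent. ShotsPerBeat is 8 for
// 32nd notes (8 thirty-seconds per quarter-note beat).
double PhaseToNextShot(double SongPositionBeats, double ShotsPerBeat)
{
    const double ShotPos = SongPositionBeats * ShotsPerBeat;
    return ShotPos - (double)(long long)ShotPos;
}

// Past the threshold, assume the next shot's VFX is already committed to
// the GPU: let that shot play out and cancel afterwards, instead of
// cutting the audio early and leaving a silent muzzle flash.
bool ShouldCancelNow(double Phase01, double Threshold /* e.g. 0.8 */)
{
    return Phase01 < Threshold;
}
```

So a stop request arriving mid-interval cancels immediately, while one arriving just before the next shot waits until after it.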

Quick BP example to show what I’m talking about. You can replace those Play Quantized calls with whatever logic you want.

You’ll also want to tweak the audio project settings (buffer size & number of buffers to queue) to minimize latency more than a regular game would care about.

if UE5 could meet that standard then it will be the first game engine to do so without modification and/or third party addons.

That is the plan :) And like I said, if something is lacking to facilitate this, our tech is set up to expose whatever else we need quickly.


I found this topic while searching for Quartz issues. I am using a master clock in one BP and a slaved clock via clock handle in another BP. The point is to be able to sync several BPs to a master clock. All works well until the slaved metronome suddenly stops, even though the master clock is still running. Does anyone have an idea? Any help would be great!
In the second picture I subscribed to specific quantization events, but I also tried subscribe-to-all-quantization (like your metronome BP example).

Yup, the issue is in your second pic:
you need to store the clock handle as a variable in your Blueprint.

The issue is there are no active references to it, so eventually the Garbage Collector deletes it, which implicitly unsubscribes it on teardown.

Very very much appreciated! Thanks!


I may be wrong here, but as far as I know all player inputs are handled on the game thread, which depends on the FPS. So even if Quartz manages to keep really precise time with DSP-side scheduling, input handling would still quantize the inputs at the frame rate, which would probably not be acceptable in a competitive game, as players with higher FPS would get higher input time resolution.