FActiveSound PlaybackTime drifts out of sync over time

Hi,

I need to sync my game to the music. The problem is that after a while the two drift out of sync. I’m using ActiveSound->PlaybackTime to get the current playback time of the music track and trigger events when it reaches e.g. 1s, 2s, 3s, … of the track. I’m not sure what causes it, but after a few minutes PlaybackTime no longer represents the current position in the track and is off by half a second or so.
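Roughly what I’m doing every tick, simplified (GetMusicPlaybackTime() is just a placeholder for however I read the PlaybackTime value, and the class/function names are not my real ones):

    // Simplified version of my per-tick check; names are placeholders.
    void AMusicSyncActor::Tick(float DeltaSeconds)
    {
        Super::Tick(DeltaSeconds);

        // Placeholder: returns the music track's PlaybackTime in seconds.
        const float PlaybackTime = GetMusicPlaybackTime();

        // Fire one gameplay event per whole second of the track.
        const int32 CurrentSecond = FMath::FloorToInt(PlaybackTime);
        if (CurrentSecond > LastTriggeredSecond)
        {
            LastTriggeredSecond = CurrentSecond;
            OnMusicSecondReached(CurrentSecond);
        }
    }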

I can restart the audio component at the position the music should actually be at to resync music and gameplay, but you can hear the music skip for a moment when I do this.

I’m still on 4.7. Any ideas? Thanks!

Hey anteevy-

Can you post a screenshot of how your audio is being used either in code or blueprints? Can you also test a copy of your project in 4.11 and let me know if the error occurs there as well?

Hi! Here is a more detailed explanation of what I’m trying to do: ActiveSound->PlaybackTime out of sync after game loses focus - C++ - Unreal Engine Forums

Though since I posted this, I’ve noticed that on some PCs the sync issue also creeps in over time, not only when minimizing/maximizing the game. So my workaround doesn’t really work, as you can obviously hear it when I call Play() on the audio component while it’s already playing.

I’m not sure whether this only happens when the track automatically restarts at the end of a loop (it sounds like it loops fine), or whether there are small (inaudible) skips in the music now and then.

Hey anteevy-

Can you provide any reproduction steps or a sample project so that I can take a look at the exact issue you’re reporting? Also, if you open a copy of your 4.7 project in 4.11.2, do you still have the issue occurring there as well?

Moving this to 4.11 will take me a few days as there’s lots of stuff that doesn’t work anymore.

In the meantime - I noticed that the sync issue only happens when vsync is off and the game runs in true fullscreen. And even then mostly only when moving around the level, not when just standing still doing nothing.
Is the PlaybackTime variable frame-rate dependent? That could explain why it doesn’t fully represent the actual play time of the music track. Maybe using timelines or timers to keep track of the play time could solve this?

Or maybe you (or somebody else) have an idea of how to keep track of the actual current play time of a music track played from an audio component?

So the playback time is used to cull/stop active sounds in the audio engine update loop and is not an accurate representation of the audio-thread timing. There is currently no way to get the actual playback percent/time, since that information would have to come from an audio device thread callback. Work like that is not platform independent, would need to be done for every platform, and duplicates the work I’m currently doing (writing a multi-platform mixer), so we’ve decided to punt on this feature for the existing back-end audio engine. In my experience, the best way to get accurate timing for music is to 1) make sure your game frame rates are consistent (non-jittery), 2) use music with BPM timings that are a multiple of the frame time (e.g. 33 ms) so that the game thread update ticks align with your musical beats, and 3) retrigger music segments/loops manually at bar boundaries or via some other mechanism.
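To make 1) and 2) a bit more concrete, here is a rough sketch of a game-thread beat clock (plain C++, names made up): accumulate game time yourself and fire gameplay events on beat boundaries instead of asking the audio engine for a playback time.

    // Rough sketch of a game-thread beat clock (names made up).
    class FBeatClock
    {
    public:
        explicit FBeatClock(float BeatsPerMinute)
            : SecondsPerBeat(60.f / BeatsPerMinute)
        {}

        // Call once per game tick; returns true whenever a new beat starts.
        bool Tick(float DeltaSeconds)
        {
            AccumulatedTime += DeltaSeconds;
            if (AccumulatedTime >= SecondsPerBeat)
            {
                AccumulatedTime -= SecondsPerBeat;
                ++BeatIndex;
                return true;
            }
            return false;
        }

        int GetBeatIndex() const { return BeatIndex; }

    private:
        float SecondsPerBeat  = 0.f;
        float AccumulatedTime = 0.f;
        int   BeatIndex       = 0;
    };

If the frame rate is smooth and the beat length works out to a whole number of frames, the beat events this produces line up with the audible beats closely enough for gameplay.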

Thanks for the explanation, this was really helpful! Could you go a bit more into detail on your 3rd suggestion?

I’m currently resyncing the music to the gameplay when the user pauses and unpauses the game, as the music stops in the pause menu and I can silently resync without the user noticing. If I do it while the music is playing (e.g. every minute or so), you can obviously hear the music skip.
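For reference, the resync itself is basically just this (simplified; MusicComponent is the UAudioComponent playing the track and GameplayMusicTime is the position the music should be at according to the gameplay):

    // Simplified resync: restart the music at the offset the gameplay expects.
    #include "Components/AudioComponent.h"

    void ResyncMusicToGameplay(UAudioComponent* MusicComponent, float GameplayMusicTime)
    {
        if (MusicComponent)
        {
            MusicComponent->Stop();
            MusicComponent->Play(GameplayMusicTime); // Play() takes a start time in seconds
        }
    }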

Another idea I had was to resync while another sound is playing, so you don’t hear the resync - but that would only work if the user hasn’t muted the sound volume in the settings.

If there were a way to at least find out the moment the music track loops (I play it from an audio component via a loopable wave player in a sound cue), I could do it the other way around and resync the gameplay to the start of the music. But that’s also not possible, right? OnAudioFinished only fires when a non-looping sound ends.

Do you know if using the FMOD plugin could help me here? I found http://www.fmod.org/docs/content/generated/FMOD_Studio_EventInstance_GetTimelinePosition.html which looks like what I need, but maybe I’d just get the same results as with UE4’s PlaybackTime?
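For context, the call behind that page looks roughly like this in FMOD’s Studio C++ API (untested sketch on my side; how you actually get hold of the EventInstance through the UE4 plugin may differ):

    // Untested sketch against the FMOD Studio C++ API.
    #include "fmod_studio.hpp"

    int GetMusicTimelineMilliseconds(FMOD::Studio::EventInstance* MusicEvent)
    {
        int PositionMs = 0;
        if (MusicEvent)
        {
            // Timeline cursor position of the event, in milliseconds.
            MusicEvent->getTimelinePosition(&PositionMs);
        }
        return PositionMs;
    }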

Thanks again!

Yeah, it looks like at one point we had some sort of support for sending notifications when a sound loops, but I don’t think it’s currently hooked up to BP, and wave instances only ever set their loop mode to LOOP_Forever or LOOP_Never rather than LOOP_WithNotification.

You could try adding a feature to get notifications when a sound loops – there is some branched code in the XAudio2 implementation that looks like it was intended to eventually result in a delegate callback, but it’s not fully hooked up. To do that, I’d follow the existing OnAudioFinished code and add a counterpart like OnAudioLooped, then allow a new looping mode on Sound Waves (or some other way to tell the WaveInstance that you want a notification broadcast on loop boundaries).
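Very roughly, the shape of it would be something like this (pseudo-ish sketch of engine modifications, not existing API; it mirrors the OnAudioFinished pattern):

    // Pseudo-ish sketch, not existing engine API: mirror OnAudioFinished
    // with a "looped" notification.

    // In AudioComponent.h, next to OnAudioFinished:
    DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnAudioLooped);

    UPROPERTY(BlueprintAssignable)
    FOnAudioLooped OnAudioLooped;

    // Called from the audio update path when a wave instance that was set
    // to LOOP_WithNotification wraps around (the part you'd have to hook up):
    void UAudioComponent::PlaybackLooped()
    {
        OnAudioLooped.Broadcast();
    }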

I’m not 100% aware of the intricacies of FMOD’s UE4 plugin, but something like this should be doable in FMOD (as it’s doable in UE4 with some code spelunking).

If I were you, rather than playing longer files, I’d create a BP mechanism that plays smaller music fragments on bar boundaries (using a timer, a delay node, or something similar). With this, your music won’t drift too far out of sync, you can restart whenever you need, and you can switch to other music fragments, layer them, etc.
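In C++ terms (the BP version is just a looping timer node driving a Play Sound), the idea is roughly this (all names made up; the exact SetTimer signature varies a bit between engine versions):

    // Rough sketch: retrigger one-bar music fragments from a game-side
    // timer instead of relying on one long looping file. Assumes an actor
    // with a UAudioComponent* MusicComponent, a TArray<USoundBase*> Fragments
    // authored to be exactly one bar long, an FTimerHandle BarTimerHandle
    // and an int32 FragmentIndex member.

    void AMusicDirector::StartMusic()
    {
        const float BeatsPerBar    = 4.f;
        const float BeatsPerMinute = 120.f;
        const float SecondsPerBar  = BeatsPerBar * 60.f / BeatsPerMinute; // 2 s per bar at 120 BPM

        GetWorldTimerManager().SetTimer(
            BarTimerHandle, this, &AMusicDirector::PlayNextFragment,
            SecondsPerBar, /*bLoop=*/ true);
        PlayNextFragment(); // play the first bar immediately
    }

    void AMusicDirector::PlayNextFragment()
    {
        if (Fragments.Num() > 0 && MusicComponent)
        {
            MusicComponent->SetSound(Fragments[FragmentIndex]);
            MusicComponent->Play();
            FragmentIndex = (FragmentIndex + 1) % Fragments.Num();
        }
    }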

I tried out the LOOP_WithNotification idea and broadcast an OnAudioLooped event from inside FWaveInstance::NotifyFinished, which is called on loop when an FWaveInstance has LOOP_WithNotification as its looping mode. This worked, but I ran into two strange issues:

a) I had the idea of adding an inaudible “sync track” to the sound cue that plays the music, with a length of exactly one beat, so I get a loop callback on every beat and can sync the gameplay accordingly. Strangely, if a wave player’s track in a sound cue has a duration of 5s or less, it gets out of sync with the other wave players in that cue (I can test this e.g. by moving the UE4 editor window around while the cue is playing - the game and all waves pause except for the one with <= 5s duration).

b) The other idea was to simply resync the gameplay to the base music track on loop (the tracks are around 2-4 minutes long). When the wave has a duration > 5s, it stays in sync with the other wave players in the cue, but somehow NotifyFinished (and therefore my loop event) isn’t fired exactly on loop, but maybe half a second earlier.

Any ideas on what this 5-second limit is all about?


If I understand correctly, you’re suggesting splitting the music into several parts and playing them one after the other using a timer/delay node? I tried this, but I’m not sure it can be done seamlessly - every time a new part starts playing you can hear the transition, as the timer and the music are not perfectly in sync.

Yeah, that’s because sound files longer than 5 seconds use what’s called real-time (RT) decoding. In 4.7, RT decoding had a major flaw: decoded buffers were submitted to a playing source voice from the main thread. This means that any main-thread hitch or delay (e.g. grabbing the window and dragging it) results in buffer underruns for RT-decoded sources. This is a BAD thing and made it impossible to play music through loading screens, for example. The mechanism you suggest for timing is not a good one.

We’ve been doing some internal experiments with music syncing for an interactive music system, with good results. Our primary mechanism is to use a BP timer to retrigger/sequence music on beat boundaries (e.g. per measure). If your game runs at a relatively smooth frame rate, a bar-line sync point with music whose quarter-note duration is a whole-number multiple of the frame time (roughly 33 ms at 30 fps) usually does the trick.
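As a concrete example of the tempo math (plain arithmetic, nothing engine specific):

    // At 30 fps one frame is ~33.3 ms. At 120 BPM a quarter note is
    // 60000 / 120 = 500 ms, i.e. exactly 15 frames, so a 4-beat bar is
    // 60 frames and a bar-line retrigger lands on a frame boundary.
    const float FrameMs       = 1000.f / 30.f;           // ~33.3 ms per frame
    const float QuarterNoteMs = 60000.f / 120.f;          // 500 ms per beat at 120 BPM
    const float FramesPerBeat = QuarterNoteMs / FrameMs;  // 15 frames per beat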

“I tried this, but I’m not sure this can be done seamlessly? Every time a new part starts playing you can hear the transition as timer/music are not perfectly in sync.”

You’ll want to use multiple audio components so that as you start playing on a new audio component, you can fade out the old one. You’ll also need to author the music assets so that they work in such a system; chopping up a single long asset might not work.
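Roughly like this (names made up, fade times to taste), assuming two pre-created audio components that you swap between:

    // Rough sketch of the two-component crossfade (names made up).
    // CurrentMusic and NextMusic are UAudioComponent* members created up front.
    void AMusicDirector::CrossfadeTo(USoundBase* NextFragment, float FadeTime)
    {
        if (CurrentMusic)
        {
            CurrentMusic->FadeOut(FadeTime, 0.f);  // fade the old component to silence
        }
        if (NextMusic)
        {
            NextMusic->SetSound(NextFragment);
            NextMusic->FadeIn(FadeTime);           // fade the new fragment in
        }
        Swap(CurrentMusic, NextMusic);             // the next call reuses the now-silent component
    }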

Yikes. I’m using tracks with a length of 2-4 minutes (and they are all finished already), so as you say, chopping them up won’t work.

You said the RT decoding issue was in 4.7 - has it been fixed or improved in later versions? And is this also the reason why, with >5s tracks, NotifyFinished is called half a second too early? Any way around that?

Thanks again for your detailed answers, much appreciated!