Hi guys,
I’m having an issue synchronizing audio rendering in my project; let me explain.
We’ve built a project for a music video where our Blueprints react to the float value of a music track. We imported our music tracks and plugged the “On Audio Envelope Value” event into the scale of a mesh, creating a kind of music equalizer. We had to change the audio rendering backend in WindowsEngine.ini by adding “;” in front of the first line, “AudioDeviceModuleName=XAudio2”, and removing the “;” from the third line. It’s a trick I found online.
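For reference, the WindowsEngine.ini tweak described above (disabling the old XAudio2 backend and enabling the new audio mixer) looks roughly like this — the exact module names can vary by engine version, so treat this as a sketch rather than a copy-paste answer:

```ini
[Audio]
; old backend, now commented out with ";":
;AudioDeviceModuleName=XAudio2
; new audio mixer backend, ";" removed so it takes effect:
AudioDeviceModuleName=AudioMixerXAudio2
```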
All of that works fine in the Editor and even in PIE when we launch it, but we want to render a film with Sequencer and have the audio perfectly synchronized, just like in the Editor or in standalone.
When we render the movie out with Sequencer using the Master Audio Submix (I know it’s experimental), the audio plays in real time but the image sequence is out of sync.
I would like to capture exactly what we see in the Editor, where the audio and the Blueprints match up.
Any ideas? Another option would be to save the image sequence out from the Level Blueprint, since we only need the video from Unreal for our film, but I’m not too sure how to do that.
Thanks for your help!
All that stuff is built for real time, but Sequencer rendering is not real time like that, so you’ll have to redesign how you get the driving value. Since you don’t seem to be using spatialization or anything — just visualization for the song — I recommend the new Synesthesia plugin. It gives you baked loudness analysis for a sample (the song), so you can poll it with a timestamp for every frame Sequencer outputs and read the envelope data from there without having to play the sound. Completely decoupled from time etc. :)
The Synesthesia plugin arrived in 4.24, so if you are stuck on an earlier version you can do it yourself: record the envelope in PIE into a timestamped array/map/whatever, and store that dataset for later non-real-time use, like in Sequencer.
Hi, thanks for your reply.
I guess we were far too close to completing the project to undertake a massive restructure of the Blueprint system. It had to work with what we had already set up!
And it did! So we ran the project in real time, played it live at the event, and simply recorded my screen. A cheeky workaround and probably not the best solution, but it worked within the short time frame, and we’re pretty happy with the result. We’re really interested in the Synesthesia plugin though, maybe for the next project.
Thanks for your help anyway.
Here’s the video rendered and captured in Real Time:
Really cool video! Looks great, and lovely concept. Thanks for the update!