Rendering Sound Waves...

UE4 Editor has a class for rendering sound waves called USoundWaveThumbnailRenderer.
As a test, I used this to render a sound wave to the player’s HUD. Why the HUD? Because the implementation uses UCanvas.

I would like to bring it out of the HUD and into Slate/UMG, but it appears that UCanvas is a different system altogether. This post has a great breakdown of Slate vs. UMG vs. Canvas.

Now I’m thinking about alternatives and looking for ideas and suggestions…

Cheers.

You could use Synesthesia’s Non-RealTime analysis to generate data tables for a waveform widget.

Hey @dan.reynolds, thanks for the response. It’s not really a question about getting data, per se. I have access to the raw waveform data already. It’s more a question about the best way to render (potentially large) amounts of data.
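For reference, the standard trick for large buffers is to reduce the raw samples to one min/max pair per pixel column before drawing anything, so the draw cost scales with widget width rather than sample count. A minimal sketch in plain C++ (not engine code; all names here are my own):

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// Reduce a large sample buffer to one (min, max) pair per pixel column.
// Each pair later becomes a single vertical line in the paint pass.
std::vector<std::pair<float, float>> BuildPeaks(const std::vector<float>& Samples,
                                                std::size_t NumColumns)
{
    std::vector<std::pair<float, float>> Peaks;
    if (Samples.empty() || NumColumns == 0)
        return Peaks;

    Peaks.reserve(NumColumns);
    const std::size_t SamplesPerColumn =
        std::max<std::size_t>(1, Samples.size() / NumColumns);

    for (std::size_t Col = 0; Col < NumColumns; ++Col)
    {
        const std::size_t Begin = Col * SamplesPerColumn;
        if (Begin >= Samples.size())
            break;
        const std::size_t End = std::min(Begin + SamplesPerColumn, Samples.size());

        // Scan every sample in the bin so no transient is skipped.
        float Min = Samples[Begin];
        float Max = Samples[Begin];
        for (std::size_t i = Begin + 1; i < End; ++i)
        {
            Min = std::min(Min, Samples[i]);
            Max = std::max(Max, Samples[i]);
        }
        Peaks.emplace_back(Min, Max);
    }
    return Peaks;
}
```

The resulting pairs are renderer-agnostic: each one can be drawn as a vertical line via Canvas, a Slate `OnPaint` override, or an ImGui draw list.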

I gave Dear Imgui a try and here is the result:

And here’s the waveform drawn inside the player HUD, with an Ableton Live window for comparison:


I even thought about using the Curve Editor in UMG. An unsuccessful test:

But I should probably circle back to creating a waveform widget in Slate…

Here’s the result in Slate, with a comparison to the original asset in the Content Browser:

There seem to be artifacts in the render, but as far as I can tell, the code is identical to the thumbnail renderer. If I zoom in, you can really see it:
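One possible cause I should rule out (just a guess at this point): if the paint path picks one sample per column instead of scanning the whole bin, fast transients can fall between the picked samples and the render aliases. A toy comparison in plain C++:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Naive decimation: keep every Stride-th sample. Anything that happens
// between the kept samples is lost, which can show up as artifacts.
std::vector<float> DecimateNaive(const std::vector<float>& Samples, std::size_t Stride)
{
    std::vector<float> Out;
    for (std::size_t i = 0; i < Samples.size(); i += Stride)
        Out.push_back(Samples[i]);
    return Out;
}

// Bin-wise peak: scan every sample in each bin and keep the largest
// magnitude, so no transient can slip through.
std::vector<float> DecimatePeak(const std::vector<float>& Samples, std::size_t Stride)
{
    std::vector<float> Out;
    for (std::size_t i = 0; i < Samples.size(); i += Stride)
    {
        float Peak = 0.0f;
        for (std::size_t j = i; j < i + Stride && j < Samples.size(); ++j)
            Peak = std::max(Peak, std::fabs(Samples[j]));
        Out.push_back(Peak);
    }
    return Out;
}
```

With a single spike buried between stride points, the naive version renders silence while the peak version still shows the hit.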

Oh well, I’ll keep poking away at it. It was a fun journey!

Cheers.

So one of the big differences between Ableton's waveform renderer and the asset thumbnail renderer is that Ableton's displays waveform bias more accurately.

I think Ableton has created a mono channel for rendering, so the waveforms will look slightly different as well.
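To illustrate both points (toy C++, not engine code): a mono fold-down just averages the channels, and tracking min and max separately is what lets a DC-biased signal draw asymmetrically around the center line instead of being mirrored.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Fold an interleaved stereo buffer [L0, R0, L1, R1, ...] down to mono
// by averaging the two channels, as a DAW might do for display.
std::vector<float> StereoToMono(const std::vector<float>& Interleaved)
{
    std::vector<float> Mono;
    Mono.reserve(Interleaved.size() / 2);
    for (std::size_t i = 0; i + 1 < Interleaved.size(); i += 2)
        Mono.push_back(0.5f * (Interleaved[i] + Interleaved[i + 1]));
    return Mono;
}

// Track min and max separately. For a biased signal, OutMax != -OutMin,
// and a renderer that keeps both values shows that asymmetry; one that
// only keeps the absolute peak mirrors it away.
void MinMax(const std::vector<float>& Samples, float& OutMin, float& OutMax)
{
    OutMin = 0.0f;
    OutMax = 0.0f;
    for (float s : Samples)
    {
        OutMin = std::min(OutMin, s);
        OutMax = std::max(OutMax, s);
    }
}
```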

Sequencer uses a waveform renderer as well, called FAudioThumbnail, but it isn't publicly exposed.

Hey! Are you still trying out methods for rendering the waveform? I'm curious. I'm trying to do the same, but I don't know where to begin. Any hint would help, thanks!