Is there a way to render audio waves in real time?

Hey,

I’m really new to Unreal, and I was wondering if I could use it to render objects created from audio waves. Ideally I’d be able to load the audio from some source (a file, for example) and use it to create a wave, or even some other kind of object, whose shape changes depending on the properties of the audio. I hope that’s not too outlandish a concept.

I have ~5 years of experience programming in C++ so I’ll be fine if it requires coding, but I just have no idea where to get started.

Could anyone point me in the right direction?

Have you tried a forum search?
I know there is a thread somewhere with something that sounds exactly like what you are looking for.

Unreal actually already comes with this ability, via the Sound Visualization plugin. You can simply enable it from inside the editor and do exactly what you’re talking about. I made something a while back to help you achieve this.

Now, that plugin is extremely limited: it only exposes two kinds of data (amplitude and frequency spectrum), and it only works in the editor (it cannot be packaged into a distributable build). If you’re OK with that, then you’re done.
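
If it helps to see what the plugin actually gives you: it exposes a couple of Blueprint-callable helpers (the same data shows up as Blueprint nodes once the plugin is enabled). The sketch below shows roughly what calling them from C++ looks like; treat the function and parameter names as assumptions from memory and verify them against SoundVisualizationStatics.h in your engine version before relying on them.

```cpp
// Rough sketch only -- the names and parameters below are from memory.
// Check SoundVisualizationStatics.h in the SoundVisualizations plugin (and add
// the module to your Build.cs) for the exact signatures in your engine version.
#include "SoundVisualizationStatics.h"
#include "Sound/SoundWave.h"

void SampleWaveForVisualization(USoundWave* Wave, float PlaybackTime)
{
    // Frequency content of a 0.1s window starting at PlaybackTime,
    // split into 64 buckets.
    TArray<float> Spectrum;
    USoundVisualizationStatics::CalculateFrequencySpectrum(
        Wave, /*Channel*/ 0, PlaybackTime, /*TimeLength*/ 0.1f,
        /*SpectrumWidth*/ 64, Spectrum);

    // Overall loudness over the same window, split into 64 buckets.
    TArray<float> Amplitudes;
    USoundVisualizationStatics::GetAmplitude(
        Wave, /*Channel*/ 0, PlaybackTime, /*TimeLength*/ 0.1f,
        /*AmplitudeBuckets*/ 64, Amplitudes);

    // Drive mesh scales, spline points, material parameters, etc. from these.
}
```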

If you need a solution that can be packaged into a distributable build, you’ll likely have to code it yourself. I’ve seen some really good (private) implementations of this that are exposed to blueprints, so it’s certainly possible. Start with a search for “Fast Fourier Transform” algorithms, and that’ll point you in the right direction.
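
To make that concrete, here’s a minimal, engine-agnostic sketch of the idea: take a block of PCM samples and turn it into a handful of frequency-bin magnitudes that you can then map onto mesh scale, vertex offsets, material parameters, and so on. The function name and bucket count are just illustrative, and it uses a naive DFT for clarity; a real-time implementation would swap in a proper FFT library for speed.

```cpp
// Minimal, engine-agnostic sketch: turn one block of mono PCM samples
// (floats in [-1, 1]) into a small frequency spectrum with a naive DFT.
#include <cmath>
#include <cstddef>
#include <vector>

// Returns the magnitude of `numBins` frequency bins for one block of samples.
// O(N * numBins) -- fine for small blocks, but a real FFT is needed at scale.
std::vector<float> ComputeSpectrum(const std::vector<float>& samples, std::size_t numBins)
{
    const std::size_t n = samples.size();
    std::vector<float> spectrum(numBins, 0.0f);
    if (n == 0 || numBins == 0)
        return spectrum;

    const float twoPi = 6.283185307179586f;
    for (std::size_t bin = 0; bin < numBins; ++bin)
    {
        float re = 0.0f, im = 0.0f;
        for (std::size_t i = 0; i < n; ++i)
        {
            const float angle = twoPi * static_cast<float>(bin) *
                                static_cast<float>(i) / static_cast<float>(n);
            re += samples[i] * std::cos(angle);
            im -= samples[i] * std::sin(angle);
        }
        // Magnitude of this bin, normalized by the block length.
        spectrum[bin] = std::sqrt(re * re + im * im) / static_cast<float>(n);
    }
    return spectrum;
}
```

From there it becomes an ordinary Unreal problem: each tick, grab the latest block of samples from whatever is producing your audio, run it through something like this, and feed the resulting bins into whatever geometry or material you’re deforming.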