Audio Visualization WIP teaser vid

https://www.youtube.com/watch?v=p_31_v3gb8Y

Check out this teaser video of an audio visualization project I’ve been working on!

Supports Oculus Rift!

Music is “Animal Crossing” by Shiny Baubles

nice :smiley: looks awesome!

Very drippy. I have a feeling most Oculus games are going to have to come with some serious motion sickness and epilepsy warnings.

Sorry, that should have read trippy, not drippy. Ha ha!

Trippy was definitely the effect I was going for! lol And you’re right, I’ve made a couple of my friends a bit dizzy when showing them this project… I plan to release the download to my website here soon, so you can all try it for yourselves. Just need a bit more time to polish it up!

Is your audio coming from a file or the microphone? I’ve been racking my brain and digging in the source code trying to figure out how to get data from the microphone in Unreal. I’m thinking about converting my art project (Wall of Light — Hunter Luisi) from Processing to Unreal.

I’m pulling spectrum/amplitude information from the audio visualization plugin that came with UE4. As far as I know, the plugin only works with audio files (and only a single file, at that).

I don’t know of an easy way to record audio from a microphone in Unreal. Eventually I’d imagine someone will write a simple sound recorder using the low-level Windows API, wrap it up as a UE4 plugin, and expose all that data to Blueprints. But until that day comes…
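
Just to sketch the idea, here’s roughly what that low-level recording piece could look like. This is plain Win32 code, not an actual UE4 plugin, and the sample rate, buffer length, and names are placeholder choices of mine:

```cpp
// Bare-bones Win32 microphone capture with the legacy waveIn API.
// Not a UE4 plugin; just the raw recording piece such a plugin would wrap.
#include <windows.h>
#include <mmsystem.h>
#include <vector>
#pragma comment(lib, "winmm.lib")

int main()
{
    const int SampleRate = 44100;   // 44.1 kHz, 16-bit mono
    const int Seconds    = 2;       // capture a two-second chunk

    WAVEFORMATEX Fmt = {};
    Fmt.wFormatTag      = WAVE_FORMAT_PCM;
    Fmt.nChannels       = 1;
    Fmt.nSamplesPerSec  = SampleRate;
    Fmt.wBitsPerSample  = 16;
    Fmt.nBlockAlign     = Fmt.nChannels * Fmt.wBitsPerSample / 8;
    Fmt.nAvgBytesPerSec = Fmt.nSamplesPerSec * Fmt.nBlockAlign;

    HWAVEIN WaveIn = nullptr;
    if (waveInOpen(&WaveIn, WAVE_MAPPER, &Fmt, 0, 0, CALLBACK_NULL) != MMSYSERR_NOERROR)
        return 1;

    std::vector<short> Samples(SampleRate * Seconds);
    WAVEHDR Header = {};
    Header.lpData         = reinterpret_cast<LPSTR>(Samples.data());
    Header.dwBufferLength = static_cast<DWORD>(Samples.size() * sizeof(short));

    waveInPrepareHeader(WaveIn, &Header, sizeof(Header));
    waveInAddBuffer(WaveIn, &Header, sizeof(Header));
    waveInStart(WaveIn);

    // Poll until the driver marks the buffer as filled.
    while (!(Header.dwFlags & WHDR_DONE))
        Sleep(10);

    waveInStop(WaveIn);
    waveInUnprepareHeader(WaveIn, &Header, sizeof(Header));
    waveInClose(WaveIn);

    // Samples now holds raw 16-bit PCM that a plugin could expose to
    // Blueprints (e.g. chopped into amplitude buckets).
    return 0;
}
```

A real plugin would record continuously with a ring of buffers and a callback rather than a single polled buffer, but the calls involved are the same.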

Hey, could you please give us a pic of the Blueprint for the spectrum? I have been trying to figure this out without success. Thanks!

There is like zero documentation on the Audio Visualization plugin that I could find. The information I was able to get pretty much came from this thread, where some good work had been done. Now, I’m not 100% certain I’m doing this correctly, but this is what I ended up putting together:

[Screenshot of the spectrum Blueprint]

You can see here that I’ve set these nodes up to run every frame using the Event Tick node. Because I had ten basic objects to scale, I set both my Spectrum Width and Amplitude Buckets to 10, and I store the output of each node in its respective array. The Set Indexes node is a function call specific to my project, and the Calculate Equalizer node is the function where I multiply the frequency by the amplitude and scale the bars accordingly on their Z axis.
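
For anyone who prefers code to node graphs, here’s roughly what that Calculate Equalizer step boils down to, written as a hypothetical C++ helper. The names are made up (in the actual project this is all Blueprint), and the Spectrum and Amplitudes arrays are assumed to be the ones filled by the plugin’s Calculate Frequency Spectrum and Get Amplitude nodes:

```cpp
// Hypothetical C++ sketch of the "Calculate Equalizer" step (made-up names).
// Spectrum and Amplitudes: arrays filled by the visualization nodes.
// Bars: the ten bar actors being scaled.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

static void CalculateEqualizer(const TArray<float>& Spectrum,
                               const TArray<float>& Amplitudes,
                               const TArray<AActor*>& Bars,
                               float MaxBarScale = 10.f)
{
    for (int32 Index = 0; Index < Bars.Num(); ++Index)
    {
        if (!Spectrum.IsValidIndex(Index) || !Amplitudes.IsValidIndex(Index) || !Bars[Index])
        {
            continue;
        }

        // Frequency bin multiplied by its amplitude bucket, clamped so one
        // loud transient can't stretch a bar off the screen.
        const float Strength = FMath::Clamp(Spectrum[Index] * Amplitudes[Index], 0.f, MaxBarScale);

        // Scale only the Z axis of the bar.
        FVector Scale = Bars[Index]->GetActorScale3D();
        Scale.Z = 1.f + Strength;
        Bars[Index]->SetActorScale3D(Scale);
    }
}
```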

Great job, man! Maybe one day Unreal Engine will also be a music player for a jukebox!

Thanks, very much appreciated :slight_smile:
I will try it!

Rather zenful. I like it a lot :slight_smile:

I’m also working on a music visualizer with Unreal, but I’m not ready to show it yet. A tip for you guys: don’t use the Tick event, as it’s not going to give you proper timings. If anyone needs help with that, just ask :slight_smile:

Pretty sure I’ve noticed this, too, but I haven’t been able to come up with an appropriate solution. Using the GetAccurateRealTime node just crashes my editor. Any tips?

Hey, sorry for the late reply. Just make a loop with a Delay node after the spectrum node in the exec path; everything that has to be updated frequently should be in that loop. Here is a vid I made to check that everything is working nicely. Not pretty, but it does the trick :wink:
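
If you’d rather do the same thing in C++ than with a Delay loop, a looping timer gives you the same fixed-rate updates independent of frame rate. This is just a sketch under my own assumptions: AMySpectrumActor, UpdateSpectrum, and the 30 Hz interval are placeholder names and numbers, not code from the project above.

```cpp
// MySpectrumActor.h (sketch)
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MySpectrumActor.generated.h"

UCLASS()
class AMySpectrumActor : public AActor
{
    GENERATED_BODY()

public:
    virtual void BeginPlay() override;

private:
    void UpdateSpectrum();
    FTimerHandle SpectrumTimerHandle;
};

// MySpectrumActor.cpp (sketch)
#include "TimerManager.h"

void AMySpectrumActor::BeginPlay()
{
    Super::BeginPlay();

    // Re-sample the spectrum ~30 times per second, independent of frame rate;
    // the same idea as putting a Delay loop after the spectrum node.
    GetWorldTimerManager().SetTimer(
        SpectrumTimerHandle, this, &AMySpectrumActor::UpdateSpectrum,
        1.f / 30.f, /*bLoop=*/ true);
}

void AMySpectrumActor::UpdateSpectrum()
{
    // Pull the spectrum/amplitude data and rescale the bars here, exactly like
    // the body of the Delay loop in the Blueprint graph.
}
```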

That actually looks incredible… way smoother than what I’ve been able to get out of the nodes, and appropriately predictive. That’s some great work, well done! Thanks for the tips, I’m gonna mess with it tonight and see if I can get it working better.

Hi, and thanks! I have uploaded the vid to Vimeo because of the crappy YouTube quality.

And here is a screenshot of my Level_BP, because a picture is worth a thousand words :wink:

That spectrum video is awesome, great choice of song :smiley:

This is all amazing stuff, but I’m missing something. What are the numbers returned from Calculate Frequency Spectrum? Are they kHz, or dB, or what? I see some log functions in the source code, so maybe they’re dB, but I don’t know how to unwrap that back into something like a 0-1 value. Is Get Amplitude the same?

cheers,