Haha, that is very cool stuff! Great work! Would love to be able to experience something like this in a VR headset with a solid set of headphones on.
Any chance we could see another song in action? =)
Random side note: there are actually a few artists who have been using UE to create stage projections for their live shows, and they might get a pretty good kick out of this.
Great work. I’ve been wanting to use UE4 in some artistic/abstract way. I’m just starting to understand this fantastic tool, so could you share a bit about how you did this? Any pointer or direction helps.
Big thanks! And yes, maybe. I was actually not happy with this one, because the visuals aren’t properly synced with the audio.
So I really want to do something like that again, probably make some GPU particles dance. But at the moment I am working on something more serious, so I don’t know when I will tackle this again.
Of course, here are some hints:
The visuals are plain UE4 (no textures used). For the floor I just used a material with very low roughness plus some SceneReflectionCapture things.
The bars are just static meshes that are scaled in the Z direction based on the spectrum/volume values.
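In Blueprint terms that’s basically a SetActorScale3D per bar every update; the mapping itself is tiny. Here is a minimal Python sketch of the idea (the function name and clamp values are made up for illustration, this is not engine code):

```python
# Illustrative sketch: map spectrum magnitudes to per-bar Z scales.
# X/Y stay at 1.0 so only the bar height changes.

def bar_scales(spectrum, amplitude, min_z=0.05, max_z=10.0):
    """Return a (1, 1, z) scale tuple per bar from spectrum magnitudes."""
    scales = []
    for magnitude in spectrum:
        # Weight each frequency bin by the overall volume, then clamp so
        # bars never vanish completely or explode off-screen.
        z = magnitude * amplitude
        z = max(min_z, min(max_z, z))
        scales.append((1.0, 1.0, z))
    return scales
```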
The material on the bars is simple as well. I have two color parameters that I just lerp using a gradient mask. Then I multiply the top color with a float parameter to make it glow (basically any color value > 10 makes the material glow). Both color parameters and the multiplier parameter are then driven from Blueprints based on the spectrum data.
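The material math boils down to an emissive boost plus a lerp. A rough Python sketch of what the graph computes (names are hypothetical; in UE4 this lives in the material editor, not in code):

```python
# Illustrative sketch of the bar material: lerp bottom->top color by a
# gradient mask, after boosting the top color with a glow multiplier.

def bar_color(bottom, top, mask, glow):
    """Return an RGB tuple; glow-boosted values read as emissive in UE4."""
    # Multiply the top color by the glow parameter first (values well
    # above 1 push the material into bloom territory).
    boosted = tuple(c * glow for c in top)
    # Standard lerp: mask == 0 gives the bottom color, mask == 1 the top.
    return tuple(b * (1 - mask) + t * mask for b, t in zip(bottom, boosted))
```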
For the music visualization I just used the plugin “AudioVisualization” (I believe) that comes with UE4; you just have to enable it first. That can give you spectrum and amplitude information you can plug into the bars (or whatever you want to use to visualize it). Look up FFT spectra on Google for further information on that.
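For anyone curious what the plugin hands you: an FFT splits a block of audio samples into frequency bins, with one magnitude per bin. A naive DFT in pure Python computes the same result an FFT computes faster (illustrative only, not the plugin’s implementation):

```python
import cmath

def magnitude_spectrum(samples):
    """Naive DFT: one magnitude per frequency bin. An FFT computes the
    same values faster; only the first N/2 bins matter for real input."""
    n = len(samples)
    bins = []
    for k in range(n // 2):
        # Correlate the signal against a complex sinusoid at bin k.
        acc = sum(samples[i] * cmath.exp(-2j * cmath.pi * k * i / n)
                  for i in range(n))
        bins.append(abs(acc) / n)
    return bins
```

A pure tone lands almost entirely in one bin, which is why each bar can track one slice of the frequency range.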
Well, then I just put everything together in a blueprint and hooked up some custom timings to change the visuals so it doesn’t get boring over the song’s duration.
By the way, you used C++ to listen for and recognize the bass, mids, and highs of the song, right? And the effect is script-driven rather than hand-animated, right? Seriously, this technique is a sight for sore eyes!
Yes, I used a plugin to get the frequency spectrum and passed the data to the corresponding bars to visualize them, then I multiplied the values by the amplitude (volume). But the data is not really good. With a bit more math and window functions for the FFT (to filter high and low frequencies), you can get a better result.
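On the window-function point: a window tapers each block of samples before the FFT so the hard edges of the block don’t smear energy across the spectrum. A Hann window is one common choice; here is a sketch in pure Python (not necessarily what the plugin does internally):

```python
import math

def hann_window(samples):
    """Multiply a block of samples by a Hann window before the FFT to
    reduce spectral leakage at the block edges."""
    n = len(samples)
    # The taper is 0 at both ends of the block and 1 in the middle.
    return [s * 0.5 * (1 - math.cos(2 * math.pi * i / (n - 1)))
            for i, s in enumerate(samples)]
```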
Yes, the whole thing is a blueprint that just reads the frequency spectrum + amplitudes and updates every tick. I just hooked in a few custom events to start the particles and change the base color of the bars.
Very cool! At GDC, Notch, the man behind Minecraft, threw this awesome party with lots of screens all over the place, with cool effects playing to the music. What you’ve done here would have been right at home on one of those screens. Keep up the great creativity!
This is a really cool use of the engine, bravo DennyR! And thanks for dropping the hints about how the system works, very nice! I want to try this plugin now.