Struggling with Audio Reactivity

Hey everyone, I’ve been diving into Unreal lately and I’m really excited about the audio reactivity features. I’m working on making objects that respond to sound, moving or changing based on the audio input. Right now I’ve been struggling for days to get a sphere’s emissive color to react to audio, and I’m pretty stumped, so I’d love some help! Ideally, I’d like to use spectrum analysis to split the audio up, with different spheres reacting to bass, treble, mids, and highs, each one tied to its own frequency range. Any tips?

Unreal 5.5.3

Hi @D0rk80
Let’s see…

Audio Spectrum Analysis:
Use Unreal’s Get Spectrum Data node to analyze the audio and retrieve frequency data.
The data gives you an array of amplitudes for different frequency bands (see the sketch below).
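
If it helps, here’s a rough C++ sketch of that polling step. I’m assuming the submix spectral-analysis route (the Start Spectral Analysis / Get Magnitude For Frequencies nodes on a Sound Submix); the exact function signatures can vary between engine versions, and the submix assignment and frequency values below are just placeholders.

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Sound/SoundSubmix.h"
#include "AudioReactiveActor.generated.h"

UCLASS()
class AAudioReactiveActor : public AActor
{
    GENERATED_BODY()

public:
    AAudioReactiveActor()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    // Submix your music is routed through; assign it in the editor.
    UPROPERTY(EditAnywhere, Category = "Audio")
    USoundSubmix* AnalysisSubmix = nullptr;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (AnalysisSubmix)
        {
            // Kick off the FFT on the submix; FFT size, window type, etc. are left
            // at their defaults here and may need tweaking for your engine version.
            AnalysisSubmix->StartSpectralAnalysis(this);
        }
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (!AnalysisSubmix)
        {
            return;
        }

        // Sample the spectrum at a handful of representative frequencies (Hz).
        const TArray<float> Frequencies = { 60.f, 250.f, 1000.f, 4000.f, 12000.f };
        TArray<float> Magnitudes;
        AnalysisSubmix->GetMagnitudeForFrequencies(this, Frequencies, Magnitudes);
        // Magnitudes now holds one amplitude per requested frequency,
        // ready to feed into the band/material logic below.
    }
};
```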

Break the Spectrum into Ranges:
Split the spectrum data into frequency bands (e.g., bass, mids, highs, treble).
Calculate an average value for each range (a sketch of this follows below).
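
A quick sketch of that band-splitting step, assuming you have the raw spectrum as one amplitude per FFT bin with the lowest frequencies first. The band boundaries are arbitrary example fractions, not a standard; nudge them until the spheres feel right.

```cpp
#include "CoreMinimal.h"

// Averages a raw spectrum into { bass, mids, highs, treble } values.
TArray<float> AverageIntoBands(const TArray<float>& Spectrum)
{
    TArray<float> BandAverages;
    if (Spectrum.Num() == 0)
    {
        return BandAverages;
    }

    // Fraction of the bins at which each band ends (example values only).
    const float BandEnds[] = { 0.05f, 0.20f, 0.50f, 1.00f };

    int32 StartBin = 0;
    for (float End : BandEnds)
    {
        int32 EndBin = FMath::Min(int32(End * Spectrum.Num()), Spectrum.Num());
        EndBin = FMath::Max(EndBin, StartBin + 1);   // at least one bin per band
        EndBin = FMath::Min(EndBin, Spectrum.Num()); // but never past the end
        if (EndBin <= StartBin)
        {
            BandAverages.Add(0.f);                   // ran out of bins (tiny spectrum)
            continue;
        }

        float Sum = 0.f;
        for (int32 Bin = StartBin; Bin < EndBin; ++Bin)
        {
            Sum += Spectrum[Bin];
        }
        BandAverages.Add(Sum / float(EndBin - StartBin));
        StartBin = EndBin;
    }
    return BandAverages;
}
```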

Material Setup:
Create a material with an Emissive Color input.
Use the frequency data to drive the emissive color (e.g., map bass to red, treble to blue); see the sketch below.
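
For the material side, here’s a minimal sketch of pushing a band value into an emissive parameter from C++. “EmissiveColor” is just an example name for the vector parameter feeding your material’s Emissive Color pin, and the cached dynamic material instance would normally live on your actor.

```cpp
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Drives a sphere's emissive color from a single band amplitude.
void ApplyBandToSphere(UStaticMeshComponent* SphereMesh, UMaterialInstanceDynamic*& CachedMID,
                       const FLinearColor& BandColor, float BandAmplitude)
{
    if (!SphereMesh)
    {
        return;
    }

    // Create the dynamic material instance once (e.g., in BeginPlay) and cache it.
    if (!CachedMID)
    {
        CachedMID = SphereMesh->CreateAndSetMaterialInstanceDynamic(0);
    }

    if (CachedMID)
    {
        // Louder band -> brighter glow. "EmissiveColor" must match the parameter
        // name inside your material.
        CachedMID->SetVectorParameterValue(TEXT("EmissiveColor"), BandColor * BandAmplitude);
    }
}
```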

Use Multiple Spheres:
Spawn multiple spheres, one for each frequency range.
Each sphere reacts to its own range through changes in intensity or color (sketched below).
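
And a sketch of how the “one sphere per band” bookkeeping could look: keep the spheres in an array whose order matches the band order (bass, mids, highs, treble) and update them together every Tick. The struct and names are just illustrative; in a real actor you’d store these as UPROPERTYs so the garbage collector can see them.

```cpp
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// One entry per frequency band: which sphere shows it and what color it should glow.
struct FBandSphere
{
    UStaticMeshComponent* Mesh = nullptr;
    UMaterialInstanceDynamic* MID = nullptr;   // created lazily from the mesh's material
    FLinearColor Color = FLinearColor::White;
};

// Call each Tick with the per-band averages, in the same order as BandSpheres.
void UpdateBandSpheres(TArray<FBandSphere>& BandSpheres, const TArray<float>& BandAverages)
{
    const int32 Count = FMath::Min(BandSpheres.Num(), BandAverages.Num());
    for (int32 Index = 0; Index < Count; ++Index)
    {
        FBandSphere& Band = BandSpheres[Index];
        if (!Band.Mesh)
        {
            continue;
        }
        if (!Band.MID)
        {
            Band.MID = Band.Mesh->CreateAndSetMaterialInstanceDynamic(0);
        }
        if (Band.MID)
        {
            // Each sphere only ever sees its own band's amplitude.
            Band.MID->SetVectorParameterValue(TEXT("EmissiveColor"), Band.Color * BandAverages[Index]);
        }
    }
}
```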

Smooth and Refine:
Smooth the frequency data over time with FInterpTo or a similar node, as in the sketch below.
Adjust the sensitivity to emphasize certain ranges (like the bass).
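
Here’s a small sketch of that smoothing pass, using FMath::FInterpTo (the C++ counterpart of the FInterpTo node) plus an optional per-band sensitivity multiplier to emphasize, say, the bass. The interp speed and sensitivity values are just starting points.

```cpp
#include "CoreMinimal.h"

// Eases the displayed band values toward the freshly measured ones each frame.
void SmoothBands(const TArray<float>& RawBands, TArray<float>& SmoothedBands,
                 const TArray<float>& Sensitivity, float DeltaSeconds, float InterpSpeed = 8.f)
{
    SmoothedBands.SetNumZeroed(RawBands.Num());
    for (int32 Index = 0; Index < RawBands.Num(); ++Index)
    {
        // Per-band boost, e.g. { 2.f, 1.f, 1.f, 1.f } to accent the bass.
        const float Boost = Sensitivity.IsValidIndex(Index) ? Sensitivity[Index] : 1.f;
        SmoothedBands[Index] = FMath::FInterpTo(SmoothedBands[Index],
                                                RawBands[Index] * Boost,
                                                DeltaSeconds, InterpSpeed);
    }
}
```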

With the Get Spectrum Data node and its output mapped to each sphere’s material parameters, you can get an audio-reactive effect where different spheres respond to different frequency ranges.

Thanks man, I really appreciate the feedback. Looking into the stuff you mentioned now.

Can’t put the puzzle together; I find it really difficult working with Unreal. I’ve been trying to find someone to pay to help me, but I haven’t had any success posting briefs on Fiverr. Just completely lost, really.