Visualising sound

Hi all

I’m learning UE in order to build interactive art applications in VR. Previously I’ve worked almost exclusively with the C++ creative-coding platform openFrameworks. The two applications I’m planning both build abstract visuals from sound data.

The first application will take data from the microphone; the audio will drive abstract visuals and will also need to be recorded for later playback.

The plan for the second app is to take data from a Spotify playlist. I found this project, but it seems to simply control the Spotify app when it is open rather than streaming the audio into UE. Do you think it would be possible to stream the music from the Spotify app into UE and analyse the sound data?



On Windows, WASAPI loopback is the underlying technology I would use to get access to the system audio stream, although it captures everything the audio device is outputting rather than just the output of a single app.

So far I haven't found a trivial way to implement this in UE4, so I've been inclined to fudge a bridge between the two worlds: e.g. have a separate app running that does the WASAPI loopback capture and frequency analysis, then sends the results of that analysis (a small bunch of floats) to UE4 using Open Sound Control. This at least allows me to get going with prototyping and developing the visual side of things, and I can only hope a more direct method becomes available in future.