Accessing sound data while using audio middleware

Hey everyone,

I’m currently working on an audio-focused VR project.
The core mechanic is modifying the sounds around you, so we are using FMOD.

I was trying to use functions from the Sound Visualizations plugin to get the frequency and amplitude of each sound, so the environment can react to it.
Unfortunately, FMOD uses its own data types that the plugin doesn’t accept, so that doesn’t work.

Does anyone have an idea how to solve this?
Should we expose that data in FMOD, switch to another middleware that can do it, or is there some other option?

Thanks a lot for the help!

We solved it by generating a WAV file for each FMOD event.
You can read the current timeline position of each FMOD event and run the sound visualization functions on the related WAV file at that time.

It may seem like a workaround, but it also lets us smooth the waveform: since the WAV file is never actually played, we treat it purely as a data container, and the sound designers can manipulate it however they need.
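
For anyone who wants to try the same trick, here is a rough C++ sketch of the idea. It assumes the FMOD UE4 integration (UFMODAudioComponent) and the engine's Sound Visualizations plugin (USoundVisualizationStatics); the helper function, variable names, and window/bucket sizes below are just placeholders, and GetTimelinePosition is assumed to report milliseconds as in the FMOD Studio API:

```cpp
// Rough sketch: sample frequency/amplitude from a silent "mirror" WAV at the
// position the FMOD event is currently playing.
#include "FMODAudioComponent.h"          // FMOD UE4 integration
#include "SoundVisualizationStatics.h"   // Sound Visualizations engine plugin
#include "Sound/SoundWave.h"

static void SampleMirrorWave(UFMODAudioComponent* FMODComp, USoundWave* MirrorWave,
                             TArray<float>& OutSpectrum, float& OutAmplitude)
{
    OutAmplitude = 0.f;
    if (!FMODComp || !MirrorWave)
    {
        return;
    }

    // The FMOD event reports its timeline position in milliseconds.
    const float TimeSeconds = FMODComp->GetTimelinePosition() / 1000.f;

    // Analyse a short window of the mirror WAV at that position.
    // Channel index, window length, and bucket counts are arbitrary example values.
    const float WindowSeconds = 0.05f;

    USoundVisualizationStatics::CalculateFrequencySpectrum(
        MirrorWave, /*Channel=*/0, TimeSeconds, WindowSeconds, /*SpectrumWidth=*/64, OutSpectrum);

    TArray<float> Amplitudes;
    USoundVisualizationStatics::GetAmplitude(
        MirrorWave, /*Channel=*/0, TimeSeconds, WindowSeconds, /*AmplitudeBuckets=*/1, Amplitudes);

    if (Amplitudes.Num() > 0)
    {
        OutAmplitude = Amplitudes[0];
    }
}
```

You can then feed the spectrum and amplitude into whatever reacts to the sound (material parameters, particles, etc.) each tick. One caveat: as far as I know the visualization functions read the wave's raw PCM data, which may not be available in cooked builds, so you might have to account for that when packaging.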