Hey, my name is Arthur Kazakov. We are working on our VR game Proze.
We have just finished the first release version of our custom occlusion system and audio filters, built for maximum player immersion in the game world.
We use FMOD, GVRAudio, and Blueprints that wrap FMOD Events and calculate occlusion, player position, the attenuation radius and direction of the sound source, and the material and thickness of obstacles.
We are continuing to work on this technology, as there are many more features we can add to
maximize the part of the game experience that audio can convey, and we want to share our progress:
Enjoy!
Feel free to ask any questions. In the future we plan to post some tutorials on our blog, so stay tuned.
Curious why you are using FMOD for this? Have you checked out the native UE4 spatialization and occlusion plugin interface? Check out Steam Audio’s implementation.
Not to be facetious, but have you used FMOD? The FMOD Studio integration is a godsend. Native UE4 audio is really quite barebones - I'm not sure I've ever shipped a project that actually used it.
I actually built and submitted a similar system (Epic is still reviewing the code) which uses only native Unreal audio. FMOD can be attached, but it isn't required: since this is just occlusion, the collision traces run in the engine anyway, and in that setup FMOD only applies the volume/pitch/low-pass-filter changes, which can also be done with the native Unreal Sound Cue system.
Yes, I’ve shipped games with FMOD. I’ve also shipped games with other audio engines and have built several.
Actually, a lot of games ship with the native UE4 audio engine! At Epic, of course, we use it for all of our games.
Have you played Hellblade yet? It just won Best Sound Design this year, and might win more at GDC.
It uses native UE4 audio - and that was even before I rewrote the rendering backend.
As the programmer working on the audio engine at Epic and trying to improve the experience, I'm genuinely curious what features you're using in FMOD that you can't do in UE4.