Hi everyone,
I’m currently working on a project in Unreal Engine 5 and I’m running into an issue with spatialized audio that I haven’t been able to solve despite trying various approaches. I’m using Unreal’s built-in audio engine (not a third-party plugin like FMOD or Wwise).
Here’s the situation:
- I have multiple ambient sound cues placed in the level, each using attenuation settings for 3D spatialization.
- The attenuation comes from a custom Sound Attenuation asset, with Spatialization enabled, a Falloff Distance defined, and the Spatialization Method set to “Binaural” (i.e., HRTF).
- However, when I move or rotate the listener (the player pawn with a camera component), the audio doesn’t behave directionally. It sounds centered, as if plain stereo or mono, with no noticeable left/right or front/back change.
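One thing I wasn’t sure about while setting this up: my understanding is that binaural/HRTF output also requires a spatialization plugin to be assigned per target platform, not just the per-sound attenuation settings. In DefaultEngine.ini that assignment looks roughly like the following (Windows section shown; the plugin name string must match an enabled plugin, so treat the value below as a placeholder):

```ini
; DefaultEngine.ini (Windows shown; other platforms have their own sections)
[/Script/WindowsTargetPlatform.WindowsTargetSettings]
; Must name an enabled spatialization plugin; placeholder value below.
SpatializationPlugin=Built-in
```

If anyone knows the correct value for the stock engine spatializer here, that alone might explain what I’m seeing.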
I’ve double-checked the following:
- The sound cues are using spatialized wave assets.
- “Enable HRTF Spatialization” is turned on in the project audio settings.
- I’m using stereo headphones during testing.
- The audio listener follows the player pawn’s view as expected (I even tried overriding the listener to pin its position to the camera).
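For the listener override, here is roughly what I tried in my player controller (a sketch rather than my exact code; `AMyPlayerController` is a placeholder class name, while `SetAudioListenerOverride` is the stock `APlayerController` API):

```cpp
#include "GameFramework/PlayerController.h"
#include "Camera/CameraComponent.h"

void AMyPlayerController::BeginPlay()
{
    Super::BeginPlay();

    // Pin the audio listener to the pawn's camera so its position and
    // rotation track the camera exactly instead of the default view target.
    if (APawn* ControlledPawn = GetPawn())
    {
        if (UCameraComponent* Camera =
                ControlledPawn->FindComponentByClass<UCameraComponent>())
        {
            // Zero offset: listen from exactly where the camera is.
            SetAudioListenerOverride(Camera, FVector::ZeroVector, FRotator::ZeroRotator);
        }
    }
}
```

This compiles and runs without errors for me, but it makes no audible difference, which is what makes me suspect the listener isn’t the problem.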
Has anyone experienced similar issues? Are there any hidden gotchas I might be missing, such as engine-level settings, platform limitations, mono vs. stereo source assets, or quirks in how UE positions the listener?
I would really appreciate any guidance or suggestions. If needed, I can share screenshots of my settings or a short video clip demonstrating the issue.
Thanks in advance!