Sound and Hearing AI Perception

I have a 3D environment with different sound cues that use attenuation. As the player moves, he hears each sound based on the attenuation settings of the sound cue. I want to capture the loudness the player perceives while moving through the environment: when he moves closer to a sound cue, the sound gets louder and the loudness value should increase; when he moves away, the sound gets quieter and the loudness value should decrease; and when he is outside the sound cue's range, the loudness value should be zero. Can anyone please help with how I should proceed? Any Blueprint, YouTube tutorial, or idea would be helpful.

If you are properly using AI Perception, it would provide you with that information, but I don't think it's necessarily needed. It seems like you are basically looking at the player's proximity to objects to determine how "loud" they are.

As an example, if an object that makes sound has a max sound distance of 3000 units, you could subtract the player's distance to the object from that max distance, then divide by the max distance to get a loudness ratio between 0 and 1.

(max distance - player’s distance) / max distance

(3000 - 5000) / 3000 ≈ -0.67
If the player is further away than the max sound distance, the ratio will be negative; thus he cannot "hear" it, so clamp it to zero.

(3000 - 1500) / 3000 = 0.5
The object is half as loud as it can be when the player is halfway to its max sound distance.

(3000 - 0) / 3000 = 1
The object is fully loud when the player is at 0 distance.
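Put together, here's a minimal sketch of that calculation in Unreal C++ (the same math maps one-to-one onto Blueprint nodes). MaxSoundDistance is an assumed per-emitter value; in practice you would read it from the cue's attenuation settings:

```cpp
#include "CoreMinimal.h"

// Computes a 0-1 perceived-loudness ratio from player proximity.
// MaxSoundDistance is a hypothetical per-emitter value that you would
// normally pull from the sound cue's attenuation settings.
float ComputeLoudnessRatio(const FVector& PlayerLocation,
                           const FVector& EmitterLocation,
                           float MaxSoundDistance)
{
    const float Distance = FVector::Dist(PlayerLocation, EmitterLocation);

    // (max distance - player's distance) / max distance
    const float Ratio = (MaxSoundDistance - Distance) / MaxSoundDistance;

    // Negative means the player is beyond the max range -> clamp to 0.
    return FMath::Clamp(Ratio, 0.0f, 1.0f);
}
```

Taking the max (or a sum) of this ratio over all audible emitters each tick would give you a single loudness value for the frame.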

In case you wanted to go the AI Perception route, Epic's AI Perception documentation is thorough.
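For orientation, here is a rough C++ sketch of the hearing-sense setup, assuming the listener is an AIController (class and member names like AListenerController are hypothetical). One caveat: AI Perception hearing only reacts to noises that are explicitly reported via UAISense_Hearing::ReportNoiseEvent, and the Stimulus.Strength you receive is the Loudness value passed to that call, not something derived from your attenuation curves:

```cpp
#include "Perception/AIPerceptionComponent.h"
#include "Perception/AISenseConfig_Hearing.h"
#include "Perception/AISense_Hearing.h"

// The header is assumed to declare PerceptionComp as a UPROPERTY
// and OnNoiseHeard as a UFUNCTION() (required for AddDynamic).
AListenerController::AListenerController()
{
    PerceptionComp = CreateDefaultSubobject<UAIPerceptionComponent>(TEXT("Perception"));

    UAISenseConfig_Hearing* HearingConfig =
        CreateDefaultSubobject<UAISenseConfig_Hearing>(TEXT("HearingConfig"));
    HearingConfig->HearingRange = 3000.0f;

    PerceptionComp->ConfigureSense(*HearingConfig);
    PerceptionComp->SetDominantSense(UAISense_Hearing::StaticClass());

    PerceptionComp->OnTargetPerceptionUpdated.AddDynamic(
        this, &AListenerController::OnNoiseHeard);
}

void AListenerController::OnNoiseHeard(AActor* Actor, FAIStimulus Stimulus)
{
    // Strength reflects the Loudness passed to ReportNoiseEvent.
    const float Loudness = Stimulus.WasSuccessfullySensed() ? Stimulus.Strength : 0.0f;
    UE_LOG(LogTemp, Log, TEXT("Heard %s at loudness %f"), *GetNameSafe(Actor), Loudness);
}

// On the emitter side, each sound source would report itself, e.g.:
// UAISense_Hearing::ReportNoiseEvent(GetWorld(), GetActorLocation(),
//     /*Loudness=*/1.0f, /*Instigator=*/this, /*MaxRange=*/3000.0f);
```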

Hi, thanks for the knowledge, but the designer tells me the 3D environment map is baked in Project Acoustics and we can't get the distance data. Since distance can't be used to calculate loudness, I guess I have to go the AI Perception route.
Any YouTube tutorial or reference material on AI Perception sound would be helpful. I will go through the documentation. Once again, thanks for your help!