Looks awesome! However, it seems this is all applied to the sound source before it's played? I'm wondering if we'll one day be able to use a combination of DSP, material properties, colliders, and audio 'rays' (working much like light rays) to simulate reverb that responds to the virtual environment; this would be the audio equivalent of post-processing. It would be amazing to build a room of custom size and shape and have audio 'waves' bounce off its walls in realtime, producing a reverb that matches that space, governed by the room's material properties. For digital musicians it would be amazing to experiment with practicing, recording, or playing live in a custom-shaped room, studio, or cave that does not, or could not, exist in real life.
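For what it's worth, the core idea can be sketched in a few lines. This is just a toy illustration (not any engine's actual API): cast random rays inside a hypothetical box-shaped room, lose a fraction of each ray's energy at every wall hit according to an assumed uniform absorption coefficient, and average how long it takes the energy to die away. That average decay time is roughly what you'd feed into a reverb as its decay parameter.

```python
import math
import random

def estimate_decay_time(room=(8.0, 3.0, 5.0), absorption=0.3,
                        n_rays=200, speed_of_sound=343.0,
                        energy_floor=1e-6, seed=1):
    """Crude RT60-style decay estimate for a box room (dimensions in
    metres) by bouncing random rays off the walls. `absorption` is the
    fraction of energy lost per wall hit (hypothetical uniform material)."""
    rng = random.Random(seed)
    times = []
    for _ in range(n_rays):
        # Random start position inside the room and a random unit direction.
        pos = [rng.uniform(0.1, d - 0.1) for d in room]
        theta = rng.uniform(0.0, 2.0 * math.pi)
        phi = math.acos(rng.uniform(-1.0, 1.0))
        dirv = [math.sin(phi) * math.cos(theta),
                math.sin(phi) * math.sin(theta),
                math.cos(phi)]
        energy, dist = 1.0, 0.0
        while energy > energy_floor:
            # Distance along the ray to the nearest wall, checked per axis.
            t = min((room[i] - pos[i]) / dirv[i] if dirv[i] > 0
                    else -pos[i] / dirv[i]
                    for i in range(3) if abs(dirv[i]) > 1e-9)
            dist += t
            pos = [min(max(pos[i] + t * dirv[i], 0.0), room[i])
                   for i in range(3)]
            # Specular reflection off whichever wall(s) we hit,
            # paying the absorption cost once per bounce.
            for i in range(3):
                if pos[i] <= 1e-9 or pos[i] >= room[i] - 1e-9:
                    dirv[i] = -dirv[i]
            energy *= (1.0 - absorption)
        times.append(dist / speed_of_sound)
    return sum(times) / len(times)
```

Soft, absorbent walls should give a longer decay than hard ones, e.g. `estimate_decay_time(absorption=0.1)` comes out larger than `estimate_decay_time(absorption=0.6)`. A real implementation would of course trace against arbitrary collider geometry, use per-surface absorption, and drive an actual DSP reverb with the result, but the ray-bounce principle is the same.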