Audio Occlusion System using Breadth-First Traversal?

Hello! I am looking into creating an occlusion system using middleware and Unreal Engine 4. I recently saw a GDC talk by the audio programmer at IO Interactive, where they used a Breadth-First Traversal (BFS) algorithm to store values or information about rooms and environments in their game Hitman. I wanted to try and create something similar in UE4, but I have limited experience with Blueprints, so I thought I would ask if anyone knows how I could achieve this.

So the situation is this: Let’s say I have a room layout looking like this:


Where the circles/nodes are the rooms, the connecting edges are sound propagation paths, and V is a float value from 0 to 1 for each room, representing an occlusion value.
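To make the idea concrete, here is a minimal sketch of how that room graph could be stored, written in plain C++ rather than Blueprints (the names `RoomNode`, `OcclusionValue` and `ConnectedRooms` are hypothetical, not an existing UE4 or middleware type; in the engine you would likely use `TArray`/`TMap` and hang the data off an actor or subsystem):

```cpp
#include <vector>

// One node of the room graph: its own occlusion value V and the
// indices of the rooms it is connected to by a propagation path.
struct RoomNode
{
    float OcclusionValue = 0.0f;     // V, 0..1, how much this room occludes sound
    std::vector<int> ConnectedRooms; // edges to neighbouring rooms (by index)
};

// The whole level as a flat list of rooms; edges are stored per room.
std::vector<RoomNode> Rooms;
```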

The idea was that every 10-15 frames, we could run a traversal routine using BFS to get the value or information of each room relative to where the player is. So let’s say the player is in Room 1 (R1) and we do a BFS from there, where we first check the nodes connected to the room the player is in, and then the children of those connected nodes. The output/traversal order would look something like this:

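A basic BFS that produces that visit order could look roughly like this, assuming the `RoomNode` struct from the sketch above (the function name and signature are just illustrative):

```cpp
#include <queue>
#include <vector>

// Returns the room indices in breadth-first order, starting from the player's room.
std::vector<int> TraverseFromPlayerRoom(const std::vector<RoomNode>& Rooms, int PlayerRoom)
{
    std::vector<bool> Visited(Rooms.size(), false);
    std::vector<int> Order;
    std::queue<int> Frontier;

    Frontier.push(PlayerRoom);
    Visited[PlayerRoom] = true;

    while (!Frontier.empty())
    {
        const int Current = Frontier.front();
        Frontier.pop();
        Order.push_back(Current);

        // Enqueue every unvisited neighbour, so rooms closer to the player
        // are always processed before rooms further away.
        for (int Neighbour : Rooms[Current].ConnectedRooms)
        {
            if (!Visited[Neighbour])
            {
                Visited[Neighbour] = true;
                Frontier.push(Neighbour);
            }
        }
    }
    return Order;
}
```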
What we could then do is add the occlusion value of the rooms furthest away from the player to the value of their connecting rooms, but not the player’s own room. Something like this:

This would represent the acoustic and physical phenomenon that sound reaching the listener in Room 1 from, let’s say, Room 3 is affected not only by the walls of Room 3 but also by the walls of Room 2. If the combined value of Rooms 2 and 3 is above 1.0, the sound would be fully occluded and we would not hear the sound from Room 3. We could also set a limit and say that all rooms three or four nodes away from the listener’s room are fully occluded.
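Here is a rough sketch of how that accumulation could be folded into the BFS itself, again as plain C++ built on the `RoomNode` struct above. Each room’s result is its own V plus the accumulated value of the room it was reached through, clamped at 1.0, and rooms beyond a chosen hop limit are treated as fully occluded (the function name and the `MaxDepth = 3` default are assumptions, not anything from the talk):

```cpp
#include <algorithm>
#include <queue>
#include <vector>

// Computes a total occlusion value per room, relative to the player's room.
std::vector<float> ComputeOcclusion(const std::vector<RoomNode>& Rooms,
                                    int PlayerRoom, int MaxDepth = 3)
{
    std::vector<float> Occlusion(Rooms.size(), 1.0f); // unreached rooms stay fully occluded
    std::vector<int> Depth(Rooms.size(), -1);         // -1 means "not visited yet"
    std::queue<int> Frontier;

    Occlusion[PlayerRoom] = 0.0f; // the player's own room is excluded from the sum
    Depth[PlayerRoom] = 0;
    Frontier.push(PlayerRoom);

    while (!Frontier.empty())
    {
        const int Current = Frontier.front();
        Frontier.pop();

        for (int Neighbour : Rooms[Current].ConnectedRooms)
        {
            if (Depth[Neighbour] != -1)
            {
                continue; // already reached via an equally short or shorter path
            }
            Depth[Neighbour] = Depth[Current] + 1;

            if (Depth[Neighbour] >= MaxDepth)
            {
                // Too far from the listener: leave it at 1.0 (fully occluded)
                // and do not expand it, since anything deeper stays at 1.0 too.
                continue;
            }

            // The neighbour's occlusion is its own V plus the accumulated value
            // of the room it was reached through, capped at 1.0.
            Occlusion[Neighbour] = std::min(
                1.0f, Occlusion[Current] + Rooms[Neighbour].OcclusionValue);
            Frontier.push(Neighbour);
        }
    }
    return Occlusion;
}
```

The resulting per-room values could then be pushed to the audio middleware every 10-15 frames, as described above, to drive how occluded each room’s sounds are for the listener.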

Does anyone have an idea of how this could be done in Unreal Engine 4?