Interpreting level geometry into a node graph

I’m trying to build a system that lets me keep track of where NPCs are when neither they nor the chunk they’re in is being rendered. I’m using a Sims-inspired utility AI where most NPC interactions are coded into usable objects, and the core AI loop just consists of selecting an object to use, pathfinding to that object, and activating it. As such, I’m trying to create a framework that lets me query a) where in the world each NPC is, b) which NPCs are in the same room, and c) what usable objects are in that room. The approach I’ve been stumbling over uses a Room class as the fundamental unit of organization: every section of geometry belongs to a Room, and each Room keeps a list of its occupants, its objects, and any adjacent Rooms that can be reached from it. Organizing Rooms into a node graph would let me run pathfinding and convincingly fake NPC movement by teleporting NPCs room-to-room each tick, and retaining references to room occupants and usable objects gives me options to simulate more complicated behavior if I wish.
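To make the idea concrete, here is a minimal sketch of what that Room node graph and the room-to-room pathfinding could look like. All names here (`Room`, `connect`, `find_path`) are hypothetical, and BFS is used only because hop count is a reasonable cost when rooms are the unit of movement:

```python
from collections import deque

class Room:
    """One node in the room graph (hypothetical sketch)."""
    def __init__(self, name):
        self.name = name
        self.occupants = set()   # NPCs currently in this room
        self.objects = set()     # usable objects the utility AI can select
        self.neighbors = set()   # adjacent Rooms reachable from here

    def connect(self, other):
        # Adjacency is symmetric: a door works both ways.
        self.neighbors.add(other)
        other.neighbors.add(self)

def find_path(start, goal):
    """BFS over the room graph; returns a list of Rooms, or None if unreachable."""
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        room = frontier.popleft()
        if room is goal:
            # Walk the parent links back to the start, then reverse.
            path = []
            while room is not None:
                path.append(room)
                room = came_from[room]
            return path[::-1]
        for nxt in room.neighbors:
            if nxt not in came_from:
                came_from[nxt] = room
                frontier.append(nxt)
    return None
```

An off-screen NPC would then advance one entry along the returned path per simulation tick, moving between the `occupants` sets of consecutive Rooms, with no geometry-level pathfinding needed until the chunk is rendered again.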

The real rub is that my entire system relies on the AI manager being able to break the level down into Rooms and correctly infer how those rooms connect to each other, and as far as I can tell the only way to make that happen is to manually place trigger volumes around each section that I want a Room assigned to, which is extremely brute-force and prone to human error. Is there a vastly superior design pattern I’m missing here?