Hello, I’m trying to figure out if I should use the built-in PerceptionSystem. It seems to have most of what I would expect, but there doesn’t appear to be any way to record different types of data for different types of stimulus. For example, if I wanted to create a new sensor that sensed conversations, it might require several fields that are not part of the FAIStimulus structure (topic of conversation, subject of conversation, etc.).
Is there a way to deal with this sort of situation, other than to modify the engine code to include the superset of all possible fields different sorts of senses might require?
It seems like one possible solution might be to store an optional source event in the FAIStimulus structure, but ideally there would be a way to handle this situation without modifying the engine.
You can create any sort of sense; see the different senses already defined in the AIModule perception code. You would normally need an AISense class and an AISenseConfig class; the config provides setup data to the sense itself.
A sense registers itself with the sensory system and with the AIPerceptionComponent of each agent that wants to do the sensing. I don’t have the code in front of me right now, but basically the sense gets a callback from the sensory system to determine whether it has fired for a given object registered with the system (if you tick the option to auto-register actors, it will presumably pick them all up, provided they have an AIComponent).
So for your specific example of conversations, you’d make an AISense_Conversation class and an associated config class. Now, regarding the sense event itself: when you get the actor’s perception update, you receive an array of sensed updates (added by the senses when they decided a sense event had happened), and you can then query each FAIStimulus for information. You can look up the sense that triggered the stimulus by its ID value. But as you say, there’s no real way of binding the extra sensed information to the stimulus (e.g. “I heard a conversation happen and this was the subject they were talking about”).
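One workaround that doesn’t touch the engine is to keep the rich data in a side table owned by the sense itself, keyed by an event ID carried in the stimulus. Here’s a minimal plain-C++ sketch of that pattern; `StimulusRecord`, `ConversationPayload`, and `ConversationSense` are all hypothetical names standing in for FAIStimulus and a custom sense — none of this is actual engine API:

```cpp
#include <cstdint>
#include <map>
#include <string>
#include <utility>

// Stands in for FAIStimulus: only generic fields, no per-sense payload slot.
struct StimulusRecord {
    std::uint32_t SenseID;   // which sense fired (lookup key)
    std::uint32_t EventID;   // unique id for this particular sense event
    float Strength;
};

// The conversation-specific data FAIStimulus has nowhere to store.
struct ConversationPayload {
    std::string Topic;
    std::string Subject;
};

class ConversationSense {
public:
    // The sense records its extra data alongside the stimulus it emits.
    StimulusRecord Emit(std::uint32_t SenseID, float Strength,
                        ConversationPayload Payload) {
        const std::uint32_t Id = NextEventID++;
        Payloads[Id] = std::move(Payload);
        return StimulusRecord{SenseID, Id, Strength};
    }

    // The listener resolves a received stimulus back to the rich data.
    const ConversationPayload* Lookup(const StimulusRecord& S) const {
        auto It = Payloads.find(S.EventID);
        return It == Payloads.end() ? nullptr : &It->second;
    }

private:
    std::uint32_t NextEventID = 0;
    std::map<std::uint32_t, ConversationPayload> Payloads;
};
```

In the perception-update handler you’d check the stimulus’s sense ID, and if it matches your conversation sense, ask that sense object for the payload. The cost is that the sense has to manage the payload table’s lifetime itself (e.g. evicting entries for expired stimuli).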
It would make a nice addition to the sensory system. Another addition would be the ability to change a sense’s parameters on the fly, for instance a per-agent distance for sight senses rather than a single value for every agent using that sense.
I guess it’s probably a bit of a work in progress, but given the number of AI programmers, I wouldn’t expect any updates soon.
Yeah I probably should have been more clear. The AISense stuff seems nice and extensible. I was talking specifically about the (apparent) non-extensibility of the perceptual info stored in the PerceptionComponent.
Well, it’d be relatively easy to extend FAIStimulus to hold a TArray of properties that any sense could return. The main issue I have with it is that the config you provide when you create a sense is then used for ALL agents with that sense (it registers the sense with the AIPerceptionSystem), so senses that need per-agent dynamic parameters aren’t possible. The same goes for returning dynamic parameters (as in your case). It should be easy enough to extend it to allow dynamic parameter binding for both cases.
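To make the property-array idea concrete, here’s a rough sketch of what such an extended stimulus could look like. It’s plain C++ with `std::vector` standing in for TArray, string values standing in for whatever variant type the engine would actually use, and `ExtendedStimulus`/`StimulusProperty` as made-up names — a sketch of the suggestion, not engine code:

```cpp
#include <cstdint>
#include <optional>
#include <string>
#include <utility>
#include <vector>

// A named property a sense can attach to a stimulus.
struct StimulusProperty {
    std::string Key;
    std::string Value;  // a real version might use a variant/tagged union
};

// FAIStimulus plus a generic property bag any sense can fill in.
struct ExtendedStimulus {
    std::uint32_t SenseID = 0;
    float Strength = 0.0f;
    std::vector<StimulusProperty> Properties;

    void Set(std::string Key, std::string Value) {
        Properties.push_back({std::move(Key), std::move(Value)});
    }

    // Returns the value for Key, or empty if no sense set that property.
    std::optional<std::string> Get(const std::string& Key) const {
        for (const auto& P : Properties)
            if (P.Key == Key) return P.Value;
        return std::nullopt;
    }
};
```

A conversation sense would then do `Stim.Set("Topic", ...)` when it fires, and the listener would call `Stim.Get("Topic")` without the stimulus struct needing to know anything sense-specific.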