Currently the new AI perception system seems to be set up for a kind of on/off perception test: there is info on the timing and the gain/loss of perception of a particular actor, but nothing relating to the stimulus event itself. Say, for example, I wanted my AI to react to a noise event differently depending on what kind of sound it heard. Ideally I'd be able to attach some game-specific info to each generated stimulus event, which I could then access through the perception component or the update handler on my AI. This doesn't seem to be possible, though.
What would the suggested approach be for dealing with this at the moment? Is this potentially a direction in which the AI perception system could be expanded in the future?
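For reference, here's a minimal sketch of how I'm consuming perception updates at the moment (C++; the controller class and handler name are my own, the delegate and FAIStimulus fields are the engine's):

```cpp
#include "AIController.h"
#include "Perception/AIPerceptionComponent.h"

void AMyEnemyController::BeginPlay()
{
    Super::BeginPlay();

    // Handler must be a UFUNCTION to bind to this dynamic delegate.
    GetPerceptionComponent()->OnTargetPerceptionUpdated.AddDynamic(
        this, &AMyEnemyController::HandlePerceptionUpdated);
}

void AMyEnemyController::HandlePerceptionUpdated(AActor* Actor, FAIStimulus Stimulus)
{
    if (Stimulus.WasSuccessfullySensed())
    {
        // All the stimulus tells me is where/how strongly it was sensed --
        // nothing describing the event itself, so I can't tell a footstep
        // from a gunshot here.
        const FVector Where = Stimulus.StimulusLocation;
        const float Strength = Stimulus.Strength;
    }
}
```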
Yes, we plan to add a way to associate extra, game-specific data with noise events so the AI can react differently depending on the sound. Currently there's no way to get this kind of functionality in the engine. Sorry!
Okay, no worries, I assumed it was on the to-do list. Can I suggest that, when you come to implement this, you try to keep it as generic as possible? Rather than just extending the MakeNoise functionality, it would be great if data could be attached to any stimulus event (for any sense), so we can easily extend the system as needed. Cheers.
Any update on this? The "Report Noise Event" node has a Tag property now. I'm not sure whether that existed when this thread was created, but it can be used to distinguish between different sound types.
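In C++ the same thing is exposed through UAISense_Hearing::ReportNoiseEvent, which takes an FName tag, and FAIStimulus carries it through to the perception handler. A rough sketch (class names are placeholders; ReportNoiseEvent and FAIStimulus::Tag are engine API):

```cpp
#include "Perception/AISense_Hearing.h"

// On the sound-emitting actor: report the noise with a descriptive tag.
void AMyWeapon::Fire()
{
    UAISense_Hearing::ReportNoiseEvent(
        GetWorld(), GetActorLocation(),
        /*Loudness=*/1.f, /*Instigator=*/GetOwner(),
        /*MaxRange=*/0.f,            // 0 = no range cap
        /*Tag=*/TEXT("Gunshot"));
}

// In the AI controller's perception handler: branch on the tag.
void AMyEnemyController::HandlePerceptionUpdated(AActor* Actor, FAIStimulus Stimulus)
{
    if (Stimulus.Tag == FName(TEXT("Gunshot")))
    {
        // react to gunfire
    }
    else if (Stimulus.Tag == FName(TEXT("Footstep")))
    {
        // react to footsteps
    }
}
```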
The sight and damage stimuli don't have the Tag property, though. When using the "Break AIStimulus" node, the Tag is always "None" for sight and damage.
Maybe I’m missing something. Is it possible to set a tag for the sight stimulus?
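In the meantime I'm working around it by keying off the sense class and reading game-specific data from the perceived actor itself. Sketch only; GetSenseClassForStimulus is engine API, but the IMyThreatInfo interface is my own game-side type:

```cpp
#include "Perception/AIPerceptionSystem.h"
#include "Perception/AISense_Sight.h"

void AMyEnemyController::HandlePerceptionUpdated(AActor* Actor, FAIStimulus Stimulus)
{
    // Resolve which sense produced this stimulus.
    const TSubclassOf<UAISense> SenseClass =
        UAIPerceptionSystem::GetSenseClassForStimulus(this, Stimulus);

    if (SenseClass == UAISense_Sight::StaticClass())
    {
        // The sight stimulus carries no tag, so query the perceived actor
        // instead, e.g. via a game-specific interface it implements.
        if (IMyThreatInfo* Threat = Cast<IMyThreatInfo>(Actor))
        {
            // react based on Threat->GetThreatType() or similar
        }
    }
}
```

It works, but it'd obviously be nicer to have the data on the stimulus itself, as discussed above.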