Using two senses with AIPerceptionComponent

Hi all,

I’m working with blueprints and I’m using an AIPerceptionComponent inside an AIController.

OnPerceptionUpdated gives me an array of actors. Then, for each perceived actor, I can get the ActorPerceptionBlueprintInfo and its associated AIStimulus entries.

When I have one sense (sight), the boolean SuccessfullySensed on the AIStimulus tells me whether a player has become visible to or hidden from my AI.

But if I add a second sense (hearing) to the same AIPerceptionComponent, how do I know which sense triggered the AIStimulus?

I tried adding two AIPerceptionComponents to the same AIController, one for each sense, but no luck…

I just asked a very similar question, but it’s too early for an answer yet. However, it seems like you’ve made it slightly further. Is there any way you could post a screenshot of your Perception-related scripting?

Here is my initial setup:

As seen in this cool video with @MieszkoZ, I’m using the boolean SuccessfullySensed to know if a previously seen actor is now out of sight.
But as soon as I add another sense, I can’t find a way to check which sense was triggered.

[Somebody found a hacky way to do that][2]: setting a “Noise” tag when playing a sound. I can’t believe there is no other way…

I previously used PawnSensingComponent, and it worked well. But I heard [it was deprecated][3], so I tried the new AIPerception system. Right now, I find it a bit confusing.

Ah, I see. This is where I’m stuck as well. I’m specifically interested in a way to trigger an event once the player leaves the outer ring of the AI’s vision. I suppose I could put in age checks, it just seems like there would be a cleaner way to do it.

Edit: found this https://answers.unrealengine.com/questions/249501/detect-when-ai-perception-loses-sight-of-the-playe.html

It helps with sight at least. Not sure about hearing yet.

So found something that works. Definitely not an ideal solution, but it seems to work for now.

What I’m doing is calling Get Sense Class for Stimulus and checking whether the result is “AISense_Sight”. If it isn’t, I assume it’s sound.

The only problem is that when the AI perceives a sound, the class comes back as “None”. I’m not sure whether this is a bug, but it seems like one. I haven’t tried other senses to see what class they return, but if they come back as “None” too, I don’t see how you could have more than two senses.
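For reference, the branch described above can be sketched in plain C++. A stand-in string models the class name that Get Sense Class for Stimulus returns (in the engine it’s a TSubclassOf of UAISense), so this runs outside the engine; the names `ClassifyStimulus` and `PerceivedBy` are mine, not the engine’s:

```cpp
#include <string>

// Stand-in for the Blueprint branch: a plain string models the name of the
// sense class returned by Get Sense Class for Stimulus.
enum class PerceivedBy { Sight, Hearing, Unknown };

PerceivedBy ClassifyStimulus(const std::string& SenseClassName)
{
    if (SenseClassName == "AISense_Sight")   return PerceivedBy::Sight;
    if (SenseClassName == "AISense_Hearing") return PerceivedBy::Hearing;
    // The "None" case discussed above lands here instead of being
    // silently treated as sound.
    return PerceivedBy::Unknown;
}
```

Matching each sense class explicitly, rather than assuming “not sight means sound”, also leaves room for a third sense later.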

Here’s a screenshot of what I have; the second bool differentiates between the player being seen and the AI losing sight of the player.

Edit: So it looks like there’s another bug, though it may be in my Blueprint, I’m not sure. Now when I walk into the AI’s sight and make a sound, it registers the class as “AISense_Sight” instead of “None”. No idea why…

Hi!

I tried your solution yesterday and it’s working fine.
To get rid of the “None” class, you have to handle all the stimuli, not only the first one (Get(0)). The first one appears to have a “None” class associated with it, but that’s not the case for the others :slight_smile:
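That fix can be sketched the same way in plain C++. The struct below is an illustrative stand-in for one entry of an actor’s sensed-stimuli array, with the sense class name already resolved; it is not the engine’s actual type, and `WasHeard` is a hypothetical helper name:

```cpp
#include <string>
#include <vector>

// Illustrative stand-in for one perceived stimulus: the class name as
// returned by Get Sense Class for Stimulus, plus SuccessfullySensed.
struct StimulusInfo
{
    std::string SenseClassName;
    bool bSuccessfullySensed;
};

// Walk every stimulus for a perceived actor instead of only Get(0):
// index 0 may carry a "None" class while a later entry holds the real sense.
bool WasHeard(const std::vector<StimulusInfo>& Stimuli)
{
    for (const StimulusInfo& S : Stimuli)
    {
        if (S.SenseClassName == "AISense_Hearing" && S.bSuccessfullySensed)
        {
            return true;
        }
    }
    return false;
}
```

In Blueprint terms, this is a ForEach over the stimuli array instead of a single Get(0), with the class check inside the loop body.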

We finally did it.

I’m now wondering whether this is the right way to go. We’re relying on an event (push) instead of polling at a fixed interval for which actors are visible or heard (pull). If an event is missed, my AI will never know about it…

Awesome! I’ll have to try this out.

Found a problem with Sight. Have you run into this problem? https://answers.unrealengine.com/questions/381894/ai-perception-sight-and-character-rotation-inconsi.html