
Problem with AI remembering/forgetting perception

When I set the AI Perception Max Age to a finite value, the AI forgets the player after that time. Even if it later sees the player again, it will only perceive him if he moves or makes a noise; if he stands still, he's completely invisible, which is very strange.

On the other hand, setting Max Age to infinite just makes the AI chase the player forever.

Is there a way to resolve this and make the AI respond to the player as long as it’s in line of sight?

Hi, could you show an image of your perception logic and how you set the relevant blackboard keys (especially the player position)?

Whether he stands still or not doesn't matter for the sight sense. Of course, the sight sense will only give you an update when it starts or stops seeing something. You can always access the latest sight stimulus, though (GetActorsPerception).

Could you specify what you want? Do you want to constantly access the latest sight stimulus location?

Here are the images, hope they help.

[Screenshots of the AI controller and perception Blueprints]

Part of the reason the player is "invisible" when standing still is that he's not making noise, so the AI doesn't receive any hearing stimulus. But the AI should also see him when he's within line of sight instead of ignoring him completely.

So right now: the AI chases the player, the player runs into a room, breaks line of sight, stands still, and makes no sound. If the AI hasn't entered that room after 10 seconds, it should lose track of the player and go back to idle; but if the AI does enter the room, it should see the player and keep chasing him, even if he's making no noise.

The problem is that unless the player is constantly walking/running/making noise, the AI forgets he's there after 10 seconds, even if the AI was supposedly looking at him at the time. (I set this Max Age value because I don't want the AI chasing the player all the time; the player should get a chance to hide.) So I guess what I'm trying to tell the AI is: if you either see OR hear the player, chase him; if you stop seeing AND hearing him for 10 seconds, go back to idle.

One problem is in your image_aiController. OnTargetPerceptionUpdated fires per sense: if the AI stops hearing something, it will fire an update with SuccessfullySensed being false, even if that something is still in plain sight.

There is a HasAnyActiveStimulus function inside the perception component, but it doesn't seem to be exposed to Blueprints. As a workaround, you could use GetActorsPerception and loop through the latest stimuli; if SuccessfullySensed is true for any of them, then at least one sense is currently perceiving that actor.
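To make the loop explicit, here is a minimal, engine-free C++ sketch of that workaround. `StimulusEntry` is a simplified stand-in for the per-sense stimulus entries that GetActorsPerception returns; the real engine types are richer, but the OR-accumulation logic is the point.

```cpp
#include <cassert>
#include <vector>

// Engine-free stand-in for one per-sense stimulus entry, as returned
// (per sense) by GetActorsPerception in the real engine.
struct StimulusEntry {
    bool bSuccessfullySensed; // mirrors the stimulus' SuccessfullySensed flag
};

// Blueprint-side replacement for the unexposed HasAnyActiveStimulus:
// true if at least one sense currently perceives the actor.
bool HasAnyActiveStimulus(const std::vector<StimulusEntry>& LastSensedStimuli)
{
    for (const StimulusEntry& Stimulus : LastSensedStimuli)
    {
        if (Stimulus.bSuccessfullySensed)
        {
            return true; // sight OR hearing OR any other sense is active
        }
    }
    return false; // no sense currently perceives the actor
}
```

The important detail is that the result is an OR over all senses, not just the flag from the last loop iteration.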

Also you can get the sense class that triggered the update via GetSenseClassForStimulus, if you want to do different things depending on what sense triggered the update.

Is this right? I’m not sure which event should call it and whether it should replace my existing nodes or run alongside them. I tried a few things like running it every tick and executing on perception updated but they don’t work.

Also, how does the sight sense determine whether something is successfully sensed? Hearing returns false when there's no noise within a set radius, but why doesn't sight return true when the player is standing in plain sight? In that situation, shouldn't the sight sense update again after the hearing sense returns false?

Yes, that’s the function I meant.

When do you want DetectPlayer? to be true and when should it be false?

Distance (view distance), then dot (view angle), then line trace.
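Spelled out, that gating order looks roughly like this. This is a hedged, engine-free sketch: `bHasLineOfSight` stands in for the engine's visibility line trace, and all vector math is done with a tiny local `Vec3` rather than engine types.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double X, Y, Z; };

static Vec3 Sub(const Vec3& A, const Vec3& B) { return {A.X - B.X, A.Y - B.Y, A.Z - B.Z}; }
static double Dot(const Vec3& A, const Vec3& B) { return A.X * B.X + A.Y * B.Y + A.Z * B.Z; }
static double Length(const Vec3& V) { return std::sqrt(Dot(V, V)); }

// Sketch of the sight-sense gating order: view distance first, then view
// angle (dot product against the AI's forward vector), then line of sight.
bool CanSee(const Vec3& AIPos, const Vec3& AIForward, const Vec3& TargetPos,
            double SightRadius, double HalfAngleDegrees, bool bHasLineOfSight)
{
    const Vec3 ToTarget = Sub(TargetPos, AIPos);
    const double Distance = Length(ToTarget);
    if (Distance > SightRadius)
        return false; // 1. too far away
    if (Distance > 0.0)
    {
        const Vec3 Dir = {ToTarget.X / Distance, ToTarget.Y / Distance, ToTarget.Z / Distance};
        const double CosHalfAngle = std::cos(HalfAngleDegrees * 3.14159265358979323846 / 180.0);
        if (Dot(AIForward, Dir) < CosHalfAngle)
            return false; // 2. outside the view cone
    }
    return bHasLineOfSight; // 3. trace blocked or clear
}
```

Each check is cheaper than the next, which is why the distance test comes before the dot product and the dot product before the trace.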

A sight stimulus will be SuccessfullySensed as long as the actor is in sight. OnTargetPerceptionUpdated, though, only fires when that state changes. It doesn't matter what the other senses do; they all act independently of each other, but they all send their updates through OnTargetPerceptionUpdated. You can always access the latest sight stimulus via GetActorsPerception, though, like you're doing above.

In your image above you're looping through the latest stimuli of all senses, so you end up setting DetectPlayer? to whether the last sense in that loop (which in your case should be the hearing sense) successfully sensed the actor. Also, you need to plug the actor whose perception you want to query into GetActorsPerception, otherwise the array will be empty.

And if you want to know what sense a stimulus belongs to, you can use GetSenseClassForStimulus and compare it.
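As an illustration of that branching, here is a small, engine-free sketch. `SenseTag` is a hypothetical stand-in for the sense class that GetSenseClassForStimulus would return (sight vs. hearing), and the returned strings are just placeholder reactions, not anything from the thread's Blueprints.

```cpp
#include <cassert>
#include <string>

// Hypothetical stand-in for the sense class returned by
// GetSenseClassForStimulus (sight vs. hearing).
enum class SenseTag { Sight, Hearing };

// Branch on which sense triggered the perception update and whether the
// stimulus was successfully sensed; the returned strings are placeholders.
std::string ReactToStimulus(SenseTag Sense, bool bSuccessfullySensed)
{
    if (Sense == SenseTag::Sight)
    {
        // Sight lost -> search where the player was last seen.
        return bSuccessfullySensed ? "chase" : "search last seen location";
    }
    // Hearing: only react to actual noise events.
    return bSuccessfullySensed ? "investigate noise" : "ignore";
}
```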

As mentioned, if the AI either sees OR hears the player, it should chase him; if it stops seeing AND hearing him for 10 seconds, it should go back to idle.
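That requirement can be sketched as a tiny forget-timer, shown here as engine-free C++ rather than Blueprints. The 10-second window matches the Max Age value from the thread; `Tick` would be driven from whatever update loop the AI uses, and the flags would come from the per-sense stimuli.

```cpp
#include <cassert>

// Sketch of the desired behavior: chase while any sense is active, and only
// drop back to idle after MaxAge seconds of perceiving nothing.
struct ChaseState
{
    double MaxAge = 10.0;          // matches the thread's 10-second Max Age
    double TimeSinceSensed = 0.0;  // seconds since the player was last perceived
    bool   bChasing = false;

    void Tick(double DeltaSeconds, bool bSeesPlayer, bool bHearsPlayer)
    {
        if (bSeesPlayer || bHearsPlayer) // see OR hear -> chase
        {
            TimeSinceSensed = 0.0;
            bChasing = true;
        }
        else
        {
            TimeSinceSensed += DeltaSeconds;
            if (TimeSinceSensed >= MaxAge) // neither for 10 s -> idle
            {
                bChasing = false;
            }
        }
    }
};
```

Note that the timer resets whenever either sense fires, so a player who stays in sight (or keeps making noise) is never forgotten, while a hidden, silent player is dropped after 10 seconds.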

I'll try getting a specific index from the stimuli array (instead of the whole array), checking whether each individual stimulus has been sensed, plugging that into a branch, and seeing how it goes.

Also, thanks for the suggestions so far; they're helping me troubleshoot this thing even though I haven't found a solution yet.