I want AIPerception to recognize an actor once it comes within a certain distance and to forget it once it moves out of that range, but this simple thing just doesn’t work.
Even when a stimulus is clearly within the configured range, the time it takes to be recognized is random and inconsistent. This randomness becomes even more apparent when multiple stimuli are present: some stimulus sources are noticed immediately when they come into range, while others are not noticed at all until contact is made, despite there being no obstructions and a 360-degree field of view.
And once something is recognized, it is never forgotten, even when it clearly goes out of range. I’ve checked several of the events that are supposed to notify me of AIPerception updates, but none of them seem to fire.
Does anyone know anything about this?
Hi, the sight sense performs a maximum number of line traces per tick, prioritized by distance. You can change that number in the config file, or you can reduce the work required by using team affiliation, or by registering only the pawns that actually need to be detected as stimuli sources instead of auto-registering all of them (which is the default behavior). So detection slows down if you have too many actors; that’s why it takes time for the detection to trigger and looks inconsistent.
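For illustration, a minimal C++ sketch of the “register only the pawns you need” idea using the engine’s UAIPerceptionStimuliSourceComponent (the class name AMyDetectablePawn is a placeholder):

```cpp
#include "GameFramework/Pawn.h"
#include "Perception/AIPerceptionStimuliSourceComponent.h"
#include "Perception/AISense_Sight.h"
#include "MyDetectablePawn.generated.h"

UCLASS()
class AMyDetectablePawn : public APawn
{
	GENERATED_BODY()

public:
	AMyDetectablePawn()
	{
		StimuliSource = CreateDefaultSubobject<UAIPerceptionStimuliSourceComponent>(TEXT("StimuliSource"));
	}

	virtual void BeginPlay() override
	{
		Super::BeginPlay();
		// Register this pawn as a stimuli source for the sight sense only.
		StimuliSource->RegisterForSense(UAISense_Sight::StaticClass());
	}

private:
	UPROPERTY(VisibleAnywhere, Category = "AI")
	UAIPerceptionStimuliSourceComponent* StimuliSource;
};
```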
For the second part, a stimulus is not forgotten just because it is no longer successfully sensed. What you want sounds like checking whether it is currently successfully sensed, not whether it has been forgotten. So you could use that ‘OnPerceptionUpdated’ event and check for each actor whether it is successfully sensed. Also, in your current setup you will add every actor to Found Targets whether it comes into sight or goes out of sight; you would need to check whether Successfully Sensed is true or false. And keep in mind that you have an Auto Success Range of 100 meters: IIRC, even if the target leaves the sight radius it will still count as successfully sensed until it moves more than those 100 meters away from the last successful location.
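In C++ the same per-actor check would look roughly like this, using OnTargetPerceptionUpdated and FAIStimulus::WasSuccessfullySensed (AMyAIController, HandleTargetPerceptionUpdated and FoundTargets are placeholder names; the perception component is assumed to already be set up on the controller):

```cpp
#include "AIController.h"
#include "Perception/AIPerceptionComponent.h"
#include "MyAIController.generated.h"

UCLASS()
class AMyAIController : public AAIController
{
	GENERATED_BODY()

public:
	virtual void BeginPlay() override
	{
		Super::BeginPlay();

		// Bind to the per-actor update; it passes the stimulus so we can
		// tell "came into sight" apart from "went out of sight".
		if (UAIPerceptionComponent* Perception = FindComponentByClass<UAIPerceptionComponent>())
		{
			Perception->OnTargetPerceptionUpdated.AddDynamic(this, &AMyAIController::HandleTargetPerceptionUpdated);
		}
	}

private:
	UFUNCTION()
	void HandleTargetPerceptionUpdated(AActor* Actor, FAIStimulus Stimulus)
	{
		if (Stimulus.WasSuccessfullySensed())
		{
			FoundTargets.AddUnique(Actor);   // entered sight
		}
		else
		{
			FoundTargets.Remove(Actor);      // lost sight (left radius and auto-success range)
		}
	}

	// Placeholder array mirroring the "Found Targets" list from the Blueprint setup.
	UPROPERTY()
	TArray<AActor*> FoundTargets;
};
```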
Currently it is detecting 5 pawns, but it still takes a few seconds to detect them all, and certain pawns are never detected unless I get very close. The total number of actors in the level is 35. Is the default performance really that bad?
And I tried using “Successfully Sensed”, but nothing changed. Even if I move a million units outside the detection range, I still won’t be removed from perception.
35 actors all having an AIPerception component with sight, and all acting as sources, would result in slow updates. A single actor with an AIPerception component and only 5 actors acting as sources should update without any noticeable delay. As a test you could increase the number of traces done per tick in the DefaultGame.ini file (but only use such a large number for testing; it will slow the game down if it actually performs 2000 traces per tick):
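Something like this, assuming the sight sense property is still called MaxTracesPerTick in your engine version:

```ini
; DefaultGame.ini -- test value only, revert after testing
[/Script/AIModule.AISense_Sight]
MaxTracesPerTick=2000
```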
And I tried using “Successfully Sensed”, but nothing changed. Even if I move a million units outside the detection range, I still won’t be removed from perception.
Did you try to do a print directly after the False branch?
Perception has 5 detection targets. 35 is the number of objects in the level, including the skylight and the landscape.
When I debug, the InRange count goes up and down, so I think the component is working according to distance, but for some reason the event is not being executed properly.
Did you try to do a print directly after the False branch?
I think I may have found the reason why AIPerception is not responding well.
My pawns have their origins at the same height as their feet, so when the terrain is uneven the detection event may not be triggered because the origin is judged to be below the mesh.
Sure enough, when I placed a pawn in the air, it was detected instantly.
This is a real problem. From what I’ve searched, I can’t find a way to configure in detail what AIPerception “sees”. There is no way to trace against a specific collision channel, set the eye position, or set the stimulus position, at least not in Blueprints.
My pawns have their origins at the same height as their feet, so when the terrain is uneven the detection event may not be triggered because the origin is judged to be below the mesh.
If the line trace hits the landscape or an underlying mesh instead of the pawn itself, then the pawn won’t be detected. By default the line trace is aimed at the origin of the target pawn, and if I remember correctly it also starts from the origin of the pawn doing the detection.
There is no way to trace against a specific collision channel
You can set that in the project settings under ‘Default Sight Collision Channel’, though I don’t see how that would help.
The easiest way would be to just move the origin of the pawn to around the center of its mesh. Otherwise you would need to use C++ and implement the CanBeSeenFrom function of the IAISightTargetInterface in the pawn that should be detected. Inside that function you do the line trace yourself, so instead of tracing against the origin of the pawn you can trace against some other location.
And you can also set the eyes location that the trace starts from, but again only in C++ (I don’t remember the exact function name).
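For the eyes location, here is a minimal sketch of what I mean, assuming the perception component queries GetActorEyesViewPoint for the listener location, which on pawns uses GetPawnViewLocation (AMyObserverPawn and the 80-unit offset are placeholders; the CanBeSeenFrom signature on the target side varies between engine versions, so it isn’t shown here):

```cpp
#include "GameFramework/Pawn.h"
#include "MyObserverPawn.generated.h"

UCLASS()
class AMyObserverPawn : public APawn
{
	GENERATED_BODY()

public:
	// APawn::GetActorEyesViewPoint returns this location, so the sight
	// trace no longer starts at the foot-level origin.
	virtual FVector GetPawnViewLocation() const override
	{
		// Arbitrary example: raise the "eyes" 80 units above the origin.
		return GetActorLocation() + FVector(0.f, 0.f, 80.f);
	}
};
```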
You can set that in the project settings under ‘Default Sight Collision Channel’, though I don’t see how that would help.
No way! Here’s the answer!
Creating a dedicated trace channel for sight and setting its default response to Ignore allows pawns to be detected through all objects. If you want something to block detection, set that object’s response on the channel to Block.
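Creating the channel through Project Settings ends up in DefaultEngine.ini as something like this (the ECC_GameTraceChannel slot depends on the project), and you then point ‘Default Sight Collision Channel’ at the new channel:

```ini
[/Script/Engine.CollisionProfile]
+DefaultChannelResponses=(Channel=ECC_GameTraceChannel1,DefaultResponse=ECR_Ignore,bTraceType=True,bStaticObject=False,Name="Sight")
```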
It really helped me a lot. Thank you very very much!