Pawn Sensing

I am using the PawnSensingComponent in Unreal Engine and have a question about its sensing logic. The component appears to implement a built-in flip-flop behavior: the sensing state (e.g., bSeePlayer or equivalent) becomes true when a pawn is detected and automatically switches to false when the target moves out of sight.

Is there a specific design reason why these internal states cannot be set manually? I would like to maintain my own boolean variables to accurately track whether the AI pawn is still actively sensing the player. This would allow me to cleanly trigger different behaviors the moment sensing is lost, rather than being limited to the component's default on/off logic. I am aware of common workarounds such as performing a line trace or checking sight distance, but both have limitations in practice:

  • A line trace is too narrow: even minor player movement breaks it instantly, despite the target remaining well within the PawnSensing radius.

  • Relying purely on distance requires the player to move quite far away before sensing stops, which is especially problematic when the AI pawn is actively rotating toward the player.

As a result, increasing distance often becomes the only reliable way to exit the sensing state. Is there a recommended way to achieve more flexible, manual control over PawnSensing states, or a better pattern for handling “lost sight” logic with this component?
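For context, what I would like to replicate manually is roughly the component's own test: within the sight radius AND within the view cone, re-evaluated on a timer, rather than a single narrow trace or a distance-only check. A minimal engine-free sketch of that check (names, the 2D math, and thresholds are all illustrative; in UE the positions would come from GetActorLocation(), the forward vector from GetActorForwardVector(), plus a visibility trace for occlusion):

```cpp
#include <cmath>

// Simplified 2D stand-in for actor positions/directions.
struct Vec2 { float X, Y; };

// Returns true if the target is both within the sensing radius and
// inside the AI's view cone -- roughly the combined test I'd like to
// drive my own "still sensing" boolean with.
bool IsStillSensing(Vec2 AIPos, Vec2 AIForward, Vec2 TargetPos,
                    float SightRadius, float HalfFOVDegrees) {
    const float DX = TargetPos.X - AIPos.X;
    const float DY = TargetPos.Y - AIPos.Y;
    const float Dist = std::sqrt(DX * DX + DY * DY);
    if (Dist > SightRadius) {
        return false;                      // outside sensing radius
    }
    if (Dist < 1e-4f) {
        return true;                       // effectively on top of the AI
    }
    // Cosine of the angle between forward vector and direction to target,
    // compared against the half field of view.
    const float Dot = (DX * AIForward.X + DY * AIForward.Y) / Dist;
    const float HalfFOVRad = HalfFOVDegrees * 3.14159265f / 180.f;
    return Dot >= std::cos(HalfFOVRad);    // inside the view cone
}
```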

If you need more control over the senses, you might want to try out the AIPerception component for AI controllers. It fires off perception update events you can use to decide what the pawn should do with the info. Don't want pawns to see you on Tuesdays? Branch on the day after a successful sense update. Want to give them a 10 s window before they forget you? Start a timer and only clear their target variable once it finishes.
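The "forget after 10 seconds" idea boils down to: on a failed stimulus, arm a timer instead of clearing the target immediately, and cancel the timer if the target is re-sensed first. Here's an engine-free sketch of that logic (in a real project this would sit in an AIController, driven by UAIPerceptionComponent's OnTargetPerceptionUpdated delegate and the world timer manager; times here are plain doubles in seconds so the pattern stands on its own):

```cpp
// Tracks a single target with a grace period before it is forgotten.
class TargetMemory {
public:
    explicit TargetMemory(double ForgetDelaySeconds)
        : ForgetDelay(ForgetDelaySeconds) {}

    // Call this from the perception-updated event.
    void OnSightUpdate(bool bSensedSuccessfully, double Now) {
        if (bSensedSuccessfully) {
            bHasTarget = true;
            bForgetPending = false;     // cancel any running forget timer
        } else if (bHasTarget) {
            bForgetPending = true;      // lost sight: start the grace period
            ForgetAt = Now + ForgetDelay;
        }
    }

    // Poll (or call from a timer callback) to see whether the AI still
    // considers the player a target.
    bool HasTarget(double Now) {
        if (bForgetPending && Now >= ForgetAt) {
            bHasTarget = false;         // grace period elapsed: forget
            bForgetPending = false;
        }
        return bHasTarget;
    }

private:
    double ForgetDelay = 10.0;
    double ForgetAt = 0.0;
    bool bHasTarget = false;
    bool bForgetPending = false;
};
```

The nice part is that a successful re-sighting during the grace period silently cancels the forget, so the AI never "blinks" when the player briefly breaks line of sight.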

I was working on an RPG-style stealth mechanic recently where I intercepted successful sightings with a distance test and a check for whether the player (or another NPC) was crouching, so you could sneak up to within ~5 m even if they were facing you. It's a bit easier to debug too, since it has a built-in debug mode that draws spheres on sensed actors (or on locations, if the stimulus was a sound report).
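The interception itself was just an extra filter run before committing to the target whenever a sight stimulus reported success. A rough engine-free sketch (names and the 5 m threshold are illustrative; in UE this check would live in the OnTargetPerceptionUpdated handler, with the distance from FVector::Dist and the crouch state from the sensed character's movement component):

```cpp
// Filters successful sight stimuli: crouched targets beyond a radius
// are ignored, so they can sneak up even in the AI's field of view.
struct SightingFilter {
    float CrouchDetectRadius = 500.f;   // ~5 m in Unreal units (cm)

    // Returns true if the AI should actually register the target.
    bool AcceptSighting(bool bStimulusSuccess,
                        bool bTargetIsCrouched,
                        float DistanceToTarget) const {
        if (!bStimulusSuccess) {
            return false;               // failed stimulus: never accept
        }
        if (bTargetIsCrouched && DistanceToTarget > CrouchDetectRadius) {
            return false;               // crouched and far enough away: ignore
        }
        return true;                    // standing, or crouched but close
    }
};
```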