UAIPerceptionComponent vs. UPawnSensingComponent


    Up to now, I've been using UPawnSensingComponent to allow the game's AI-controlled characters to see and hear the player and other events in the world. Today, I was browsing through the docs trying to find an answer to something and noticed UAIPerceptionComponent. Looking at the descriptions of the two classes, as well as their properties and functions, I'm having a really hard time understanding the distinction between these two components. I notice that AAIController comes with an instance of UAIPerceptionComponent automatically, whereas UPawnSensingComponent has to be added manually. The former also seems to be a touch more sophisticated, with built-in functionality that goes beyond UPawnSensingComponent, including support for more sensory input besides sight and sound. I don't see any evidence of UPawnSensingComponent being deprecated, though.

    If anyone has any insight or can demystify these two components for me, I'd appreciate it.

    #2
    I did come across a bit of info somewhere, though I don't remember where now.
    Anyway, the gist was that, yes, UPawnSensingComponent is pretty basic, and UAIPerceptionComponent is what you'll eventually want to use if you want to do anything more complex. I'm not sure, but given the lack of info I would guess that the latter is currently still very much a work in progress. I haven't attempted to make use of it as yet.

    Comment


      #3
      Originally posted by kamrann View Post
      I did come across a bit of info somewhere, though I don't remember where now.
      Anyway, the gist was that, yes, UPawnSensingComponent is pretty basic, and UAIPerceptionComponent is what you'll eventually want to use if you want to do anything more complex. I'm not sure, but given the lack of info I would guess that the latter is currently still very much a work in progress. I haven't attempted to make use of it as yet.
      Thank you. That's essentially what I was kinda guessing. I'm a little skittish about trying to swap it out now without more robust documentation, but maybe if I have some time, I'll poke around the engine code and see if I can figure out whether UAIPerceptionComponent is ready for prime time. I'm always happy when I can delete my code in favor of Epic's, and it looks like UAIPerceptionComponent would let me trash some of my custom controller code.

      Comment


        #4
        Originally posted by jeff_lamarche View Post
        Thank you. That's essentially what I was kinda guessing. I'm a little skittish about trying to swap it out now without more robust documentation, but maybe if I have some time, I'll poke around the engine code and see if I can figure out whether UAIPerceptionComponent is ready for prime time. I'm always happy when I can delete my code in favor of Epic's, and it looks like UAIPerceptionComponent would let me trash some of my custom controller code.
        I'm wondering about this as well. I'll look into it a little when I have time and post what I find. If you find out anything else about it, I'd really appreciate it if you could post it here. Thanks!

        Comment


          #5
          I've been using the perception component for the past few months. It's part of a more advanced sensing system that is still a work in progress by Mieszko. There are still some active bugs and things that have to be changed for it to be fully functional, but overall it's great to use if you are a heavy user and/or comfortable with C++. This reminds me that I should submit some pull reqs on it.

          For the component/system, you define a set of specific senses for a given AI's perception. Each AI and each sense can have unique sense configurations. Anyone in the world can fire a sense event for a given team type for things such as a noise, an ally saying they saw an enemy, an ally being damaged, or you or an ally colliding with something hostile. Senses can be defined in Blueprint, but the API isn't there yet and I haven't made any. There is only one active instance of a given sense in the system, so Sense_Hearing processes all noise events from the world, Sense_Sight processes all sight queries from perception listeners to their sight targets, and so on. Some senses are also time-sliced; the sight sense currently has a hardcoded maximum number of traces per frame, for example.
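          To make the configuration side concrete, here's a rough sketch of what setting up a sight sense on an AI controller can look like in C++. This is a hedged example, not Epic's documented canonical usage: the controller class name (AMyAIController) is hypothetical, the system is still WIP, and the exact parameter names may differ between engine versions.

          ```cpp
          // Hypothetical AMyAIController constructor. Assumes "AIModule" is in the
          // project's Build.cs dependencies; exact headers/APIs may vary by version.
          #include "AIController.h"
          #include "Perception/AIPerceptionComponent.h"
          #include "Perception/AISenseConfig_Sight.h"

          AMyAIController::AMyAIController()
          {
              // AAIController has a PerceptionComponent slot; create one if needed.
              PerceptionComponent =
                  CreateDefaultSubobject<UAIPerceptionComponent>(TEXT("Perception"));

              // Per-AI sense configuration: each controller can tune its own
              // sight parameters independently of other AIs.
              UAISenseConfig_Sight* SightConfig =
                  CreateDefaultSubobject<UAISenseConfig_Sight>(TEXT("SightConfig"));
              SightConfig->SightRadius = 1500.f;
              SightConfig->LoseSightRadius = 2000.f;
              SightConfig->PeripheralVisionAngleDegrees = 70.f;
              SightConfig->DetectionByAffiliation.bDetectEnemies = true;

              // Register the configured sense with this listener's perception.
              PerceptionComponent->ConfigureSense(*SightConfig);
              PerceptionComponent->SetDominantSense(SightConfig->GetSenseImplementation());
          }
          ```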

          The AIPerceptionComponent fires a delegate when any of its senses has been updated with a successful stimulus. The component keeps track of all sense targets and the success state of recent queries.
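          The consuming side can be sketched like this (again hedged: AMyAIController and AMyNoisyActor are illustrative names, and the delegate/report signatures shown here are from the perception API as it exists in recent engine versions and may change while the system is in flux):

          ```cpp
          #include "Perception/AIPerceptionComponent.h"
          #include "Perception/AISense_Hearing.h"

          void AMyAIController::BeginPlay()
          {
              Super::BeginPlay();
              // Fires whenever any configured sense registers a successful stimulus.
              PerceptionComponent->OnPerceptionUpdated.AddDynamic(
                  this, &AMyAIController::HandlePerceptionUpdated);
          }

          void AMyAIController::HandlePerceptionUpdated(const TArray<AActor*>& UpdatedActors)
          {
              for (AActor* Actor : UpdatedActors)
              {
                  UE_LOG(LogTemp, Log, TEXT("Perception updated for %s"),
                         *GetNameSafe(Actor));
              }
          }

          // Elsewhere in the world, anything can report a hearing stimulus that the
          // single active Sense_Hearing instance will process:
          void AMyNoisyActor::MakeSomeNoise()
          {
              UAISense_Hearing::ReportNoiseEvent(GetWorld(), GetActorLocation(),
                                                 /*Loudness=*/1.f, /*Instigator=*/this);
          }
          ```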
          Last edited by dzeligman; 02-11-2015, 12:28 AM.

          Comment


            #6
            Thanks for the info. Definitely sounds like it's worth checking. I've heard that Mieszko guy knows a thing or two about Game AI.

            Comment
