Newbie: AI questions

Hiya,

I am new to Unreal Engine, so please pardon my vocabulary, as I am still learning some of the terminology. I have a pet AI character that follows the player character around. What I want to add is the ability for the pet to detect/sense an object, run to it, and alert the player that it has found that object. For example, the pet AI could help the player find a health berry bush that, when collected, improves the player character's health and stamina. What are the steps involved in adding this kind of behavior to the AI character? Also, if the player character decides not to bother interacting with the object, how do I get the pet AI to go back to following the character? I hope that makes sense… :slight_smile:

From a design standpoint, I would use a state to determine what the AI should be doing. If the player moves more than X distance away from the AI, the AI should switch states to return to the player. If the AI comes within X distance of a bush, it should switch states to alert the player by moving towards it.
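If it helps, here is roughly what that state idea could look like in C++ (a sketch only: the enum, the class name, and the State member are placeholders I made up; in Blueprint you would just keep an enum variable on the pet):

```cpp
// PetCharacter.h (sketch) -- all names here are made up for illustration.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "PetCharacter.generated.h"

// The possible things the pet can be doing.
UENUM(BlueprintType)
enum class EPetState : uint8
{
	Following,      // trail the player
	Investigating,  // run toward a detected object
	Alerting        // wait at the object and signal the player
};

UCLASS()
class APetCharacter : public ACharacter
{
	GENERATED_BODY()

public:
	// Checked each tick (or by a behavior tree) to decide what to do next.
	UPROPERTY(BlueprintReadWrite, Category = "Pet")
	EPetState State = EPetState::Following;
};
```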

For determining the AI's distance from the bush, you could use a collision sphere whose OnOverlap event checks whether the overlapped actor can be cast to the class you are looking for. You could also use a custom collision channel for overlapping with interactable objects in the world, especially if there will be other kinds of objects the pet should also “find”.
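In C++ that overlap check could look something like this (again just a sketch: ABerryBush, FoundObject, and the handler name are placeholders, and the handler needs a UFUNCTION() declaration in the header so AddDynamic can bind it):

```cpp
// In the pet's constructor (or BeginPlay), bind the sphere's overlap event:
//   DetectionSphere->OnComponentBeginOverlap.AddDynamic(this, &APetCharacter::OnDetectionOverlap);

void APetCharacter::OnDetectionOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                                       UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                                       bool bFromSweep, const FHitResult& SweepResult)
{
	// The cast only succeeds if the overlapped actor is the class we care about.
	if (ABerryBush* Bush = Cast<ABerryBush>(OtherActor))
	{
		FoundObject = Bush;                  // remember what we found (made-up member)
		State = EPetState::Investigating;    // switch the state from the sketch above
	}
}
```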

AI distance from the player can be checked by getting the vector length of the difference between the two actors' world locations.
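As a tiny sketch (PlayerPawn and MaxFollowDistance are variables you would keep on the pet yourself):

```cpp
// Straight-line distance between the pet and the player.
// FVector::Dist(A, B) is the same as (A - B).Size().
const float DistToPlayer = FVector::Dist(GetActorLocation(), PlayerPawn->GetActorLocation());

if (DistToPlayer > MaxFollowDistance)
{
	State = EPetState::Following;   // too far away -- go back to the player
}
```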

You can use behavior trees and the Environment Query System.
https://docs.unrealengine.com/latest/INT/Engine/AI/BehaviorTrees/index.html
https://docs.unrealengine.com/latest/INT/Engine/AI/EnvironmentQuerySystem/index.html

With BT and EQS, you can do everything you just described.
I will warn you that BT has a learning curve, so take your time.

If the AI won't get any more complex than this, I'd just use a simple Blueprint check via trigger events (as Envis10n mentioned). If your AI ever gets more complex, I'd recommend you start learning about Behavior Trees. They are frustrating at first, but they have quite interesting capabilities and are much easier to debug than plain Blueprint logic.

The process would be as follows:

  1. Detect a nearby object. There are two simple ways to do this:
    a) Collision: Add a Collision Sphere to your Pet BP. To keep things tidy, I would add a new collision channel that only overlaps with your detectable items (you can set this up in Project Settings > Engine > Collision). Then, in your Pet BP, add a new event from the CollisionSphere component (On Begin Overlap). Now, any time an item overlaps the Collision Sphere, the OnBeginOverlap event fires with a reference to the item your pet overlapped.
    b) AI Perception: This is a more complex feature, but if you take the time to learn it, you'll probably like it. It takes a bit longer to explain, and there are some videos showing it. Basically, you add an AIPerception component to your pet (or its AI Controller), configure the senses you want (in this case, Sight), and then add an “AIPerceptionStimuliSource” component to each item. This lets your pet perceive items: any time the pet perceives one, an OnPerceptionUpdated event fires with a reference to that item. (There is a rough C++ sketch of this setup after the list.)

  2. To make the AI move to a location, you can use the “AI Move To” node in BP. It's quite simple and has built-in pathfinding. Please note that your level must contain a NavMeshBoundsVolume for this to work. (There is also a C++ sketch of steps 2-4 after the list.)

  3. After the pet reaches the object (the “AI Move To” node has an “On Success” pin that fires once the destination is reached), you could smoothly rotate it toward the player and play some kind of alert animation.

  4. After the animation plays, you could set up a delay of X seconds. Once the delay passes (if the player is not interested) or the object is picked up, the pet can forget about the item and go back to its player-following behavior.
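Going back to 1b for a moment, here is a rough C++ equivalent of that perception setup. Treat it as a sketch: the controller class, the Perception member, the handler name, and the numbers are all placeholders, and the handler must be declared as a UFUNCTION() so AddDynamic can bind it. In Blueprint you would set the same values in the AIPerception component's details panel.

```cpp
#include "Perception/AIPerceptionComponent.h"
#include "Perception/AISenseConfig_Sight.h"

APetAIController::APetAIController()
{
	Perception = CreateDefaultSubobject<UAIPerceptionComponent>(TEXT("Perception"));

	UAISenseConfig_Sight* Sight = CreateDefaultSubobject<UAISenseConfig_Sight>(TEXT("SightConfig"));
	Sight->SightRadius = 1500.f;                            // how far the pet can spot items
	Sight->LoseSightRadius = 2000.f;
	Sight->PeripheralVisionAngleDegrees = 90.f;
	Sight->DetectionByAffiliation.bDetectNeutrals = true;   // berry bushes are neutral, not enemies

	Perception->ConfigureSense(*Sight);
	Perception->SetDominantSense(Sight->GetSenseImplementation());

	// Fires every time an actor with a stimuli source is seen (or lost).
	Perception->OnTargetPerceptionUpdated.AddDynamic(this, &APetAIController::OnItemPerceived);
}

void APetAIController::OnItemPerceived(AActor* Actor, FAIStimulus Stimulus)
{
	if (Stimulus.WasSuccessfullySensed())
	{
		// Actor is the item the pet just noticed -- hand it to your state logic or behavior tree.
	}
}
```

On the item side, the “AIPerceptionStimuliSource” component mentioned above just needs the Sight sense registered so the pet can actually see the item.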
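And here is a rough C++ sketch of steps 2-4 on the pet's AI Controller (FoundObject, GiveUpTimer, and the function names are made up; in Blueprint this maps to the “AI Move To” node, “Find Look at Rotation”, a Delay, and then your normal follow logic):

```cpp
#include "AIController.h"
#include "Navigation/PathFollowingComponent.h"
#include "Kismet/KismetMathLibrary.h"
#include "TimerManager.h"

void APetAIController::RunToFoundObject(AActor* Item)
{
	FoundObject = Item;
	// Built-in pathfinding -- the level needs a NavMeshBoundsVolume for this to work.
	MoveToActor(Item, /*AcceptanceRadius=*/80.f);
}

void APetAIController::OnMoveCompleted(FAIRequestID RequestID, const FPathFollowingResult& Result)
{
	Super::OnMoveCompleted(RequestID, Result);

	if (Result.IsSuccess() && FoundObject && GetPawn())
	{
		// Step 3: face the player and play your "I found something" animation here.
		if (APawn* PlayerPawn = GetWorld()->GetFirstPlayerController()->GetPawn())
		{
			const FRotator LookAt = UKismetMathLibrary::FindLookAtRotation(
				GetPawn()->GetActorLocation(), PlayerPawn->GetActorLocation());
			// Snap to the yaw; interpolate in Tick instead if you want it smooth.
			GetPawn()->SetActorRotation(FRotator(0.f, LookAt.Yaw, 0.f));
		}

		// Step 4: if the player ignores the find, give up after a few seconds.
		GetWorld()->GetTimerManager().SetTimer(GiveUpTimer, this,
			&APetAIController::ReturnToFollowing, /*InRate=*/5.f, /*bLoop=*/false);
	}
}

void APetAIController::ReturnToFollowing()
{
	FoundObject = nullptr;
	// Switch the state (or behavior tree branch) back to following the player.
}
```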

Hope this makes sense!

Thanks for the responses! I’ll read more into the collision sphere and behavior tree.