AI Behavior Tree - More accurately following detected player

I started working with behavior trees about a week ago. They’re pretty straightforward, but I feel like every tutorial I read walks through the same basic steps and calls it a day. It’s left me with a number of questions.

I have AI Perception enabled with sight on my AI controller. On detecting a player, a blackboard key (bool) is set (or unset).

In my behavior tree, I use this key to drive the functionality of Sequence nodes. If the player is not detected, get a random reachable location in range and move there. If the player is detected, break out of the “roam” sequence tree and move to a “pursuit” tree. This sequence tree will use the AI Perception to get a list of sensed actors, find the closest one then store it as the “Move Location” blackboard key. Then it runs a “Move” to that location.
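For anyone following along, the roam/pursuit switch above boils down to a selector keyed on that blackboard bool. Here’s a minimal, engine-free sketch of that branching logic (this is NOT the actual UE API — all of these names are hypothetical stand-ins):

```cpp
#include <cassert>
#include <string>

// Hypothetical stand-in for the blackboard described above.
struct Blackboard {
    bool playerDetected = false;  // set/unset by AI Perception
};

// Stand-ins for the two sequence branches in the tree.
std::string RunRoamSequence()    { return "roam"; }     // random reachable point
std::string RunPursuitSequence() { return "pursuit"; }  // chase closest sensed actor

// Selector: when the key flips, the roam branch is abandoned
// and the pursuit branch takes over (decorator-style abort).
std::string TickTree(const Blackboard& bb) {
    return bb.playerDetected ? RunPursuitSequence() : RunRoamSequence();
}
```

In the editor this is the “Blackboard” decorator on the roam sequence with observer aborts enabled, rather than hand-written code.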

The above (aside from getting the closest perceived actor) is pretty much what every single Behavior Tree tutorial walks you through.

The problem is that if the player moves while the AI is moving towards their location, the AI ends up where the player WAS, not where the player IS. It ends up looking ridiculous and is made even worse if you implement “Focus”… then the AI just stares at the player while walking sideways to their old location.

I had implemented a solution where I obtained the player’s location in a service and had the pursuit sequence restart if the player’s location changed. The problem with this is that calling “Move To” repeatedly causes stuttering in the velocity and animation. Not to mention it just felt like a really messy implementation.

As I type this, I am realizing I may have a better solution. I am currently using a click-to-move setup in a multiplayer environment which required some “custom” functionality: effectively just a loop to say “While I am not at the clicked location, add movement towards the clicked location”.

I think I can implement a very similar Behavior Tree task to say “Get the focused player’s current location and add movement towards that location”. Using “Add Movement” should keep the animations smooth and allow the AI to path smoothly towards the player if they move.
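To sketch what I mean (engine-free, plain vectors instead of UE types — in the actual task the normalized direction would be fed into “Add Movement Input” each tick, and these names are just illustrative):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

struct Vec2 { float x, y; };

// One simulation tick of "add movement toward the target's CURRENT
// location". Re-reading the target every tick is what keeps the path
// tracking a moving player instead of a stale snapshot.
Vec2 StepToward(Vec2 self, Vec2 target, float speed, float dt) {
    float dx = target.x - self.x, dy = target.y - self.y;
    float dist = std::sqrt(dx * dx + dy * dy);
    if (dist < 1e-4f) return self;             // already on target
    float step = std::min(speed * dt, dist);   // don't overshoot
    return { self.x + dx / dist * step, self.y + dy / dist * step };
}
```

The key difference from the “restart Move To” approach is that nothing restarts: the direction is simply recomputed each tick, so velocity stays continuous and the animation shouldn’t hitch.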

Of course, it’s possible there’s some other issue here I’m not seeing and I won’t be able to test this for another 8 hours, so if anyone has any input like "Yeah, but that won’t work because … " or can point me towards some more thorough/advanced behavior tree tutorials, I’d appreciate it. If not, I’ll just leave my ramblings here in case it happens to help anyone else in the future.

You can use “Move To” to move toward an object (the player pawn) rather than a vector. That way, turning will be smoother.

Ah, but within a blueprint graph, correct? I think the “Move To” that exists within the behavior tree accepts only a vector.

Also, I believe there are issues using “Move To” in a multiplayer environment which is what caused me to use “Add Movement Input” in the first place, but I’m not certain…

The MoveTo task in the behavior tree can move to an actor, not just a vector. Chasing a moving actor will be smooth, but you’ll notice a stutter when it reaches the acceptance radius. There are ways to improve the visuals of this… a quick way would be to use a very small acceptance radius. Or maybe set your LookTarget on the controller toward the actor being chased and allow the bot to strafe.

You could also create your own move-to behavior by making a BTTask_BlueprintBase. Create the desired functionality in there and use that in place of the default MoveTo.
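A rough engine-free sketch of what such a custom task’s per-tick logic might look like (all names here are hypothetical; in UE this would live in a BTTask_BlueprintBase or a C++ UBTTaskNode whose tick calls AddMovementInput and reports success back to the tree):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Latent-task result, like a BT task reporting InProgress vs Succeeded.
enum class TaskResult { InProgress, Succeeded };

struct ChaseTask {
    float speed;             // movement units per second
    float acceptanceRadius;  // finish when this close to the target

    // Called once per tick with the target's CURRENT location, so a
    // moving player is tracked instead of a stale snapshot.
    TaskResult Tick(float& x, float& y, float tx, float ty, float dt) {
        float dx = tx - x, dy = ty - y;
        float dist = std::sqrt(dx * dx + dy * dy);
        if (dist <= acceptanceRadius)
            return TaskResult::Succeeded;      // close enough, task done
        // Step toward the target, stopping exactly at the radius.
        float step = std::min(speed * dt, dist - acceptanceRadius);
        x += dx / dist * step;
        y += dy / dist * step;
        return TaskResult::InProgress;
    }
};
```

Because the task stays “in progress” and steers every tick, there’s no repeated MoveTo restart to cause the stutter discussed above.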