I've got an idea I'd like to share: navigation for bots.

It might work like this: you start with a camera, trace from the camera to get a hit result, and the hit result gives you a line. Then you measure the length of that line.
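In rough C++ terms, that single measurement might look something like this (just a sketch; the world pointer, start point, direction and max distance would come from wherever your bot's camera is, and MeasureDistance is a made-up helper name, not an engine function):

```cpp
#include "Engine/World.h"
#include "Engine/EngineTypes.h" // FHitResult, ECC_Visibility

// Trace a straight line from Start along Direction and return how far it got:
// the distance to the first blocking hit, or MaxDistance if nothing was hit.
static float MeasureDistance(UWorld* World, const FVector& Start,
                             const FVector& Direction, float MaxDistance)
{
    FHitResult Hit;
    const FVector End = Start + Direction * MaxDistance;
    const bool bHit = World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility);
    return bHit ? Hit.Distance : MaxDistance;
}
```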

See the diagram for how that works when I click the left mouse button:


Now the idea part.

With multiple cameras, you can get multiple hit results.

Just point the cameras in different directions, offset from each other by an angle.

The bot then moves forward, and each step it measures the lengths of the hit results and goes toward the longest one.
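A rough C++ sketch of that step (the names and values here are assumptions, not engine functions; in practice this would be Blueprint nodes doing the same math):

```cpp
// Trace in NumDirections directions spread evenly around the up (Z) axis
// and return the direction whose trace travelled the furthest.
static FVector PickLongestDirection(UWorld* World, const FVector& Start,
                                    const FVector& Forward,
                                    int32 NumDirections, float MaxDistance)
{
    FVector BestDirection = Forward;
    float BestDistance = -1.f;

    for (int32 i = 0; i < NumDirections; ++i)
    {
        const float AngleDeg = (360.f / NumDirections) * i;
        const FVector Direction = Forward.RotateAngleAxis(AngleDeg, FVector::UpVector);

        FHitResult Hit;
        const FVector End = Start + Direction * MaxDistance;
        const bool bHit = World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility);
        const float Distance = bHit ? Hit.Distance : MaxDistance;

        if (Distance > BestDistance)
        {
            BestDistance = Distance;
            BestDirection = Direction;
        }
    }
    return BestDirection;
}
```

Each step the bot would feed the result into its movement, e.g. something like AddMovementInput(PickLongestDirection(...)) on a pawn.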

Now that I've shared my idea, can Epic make this easy with one click in an update? :slight_smile:

Because then that would mean a Get Camera Forward Vector node, right? And there is no "get camera number something forward vector" node right now…


You can use Get Actor Forward Vector. If the camera is a component, you can use Get Forward Vector (the plain Get Forward Vector, without the Actor version).
For offsetting the vector there's a "Rotate Vector" node. You can use it to easily offset/rotate a vector (by degrees).
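In C++ those map to roughly the following (Bot and CameraComponent are assumed pointers you'd already have; the 1-degree value is just an example):

```cpp
// Forward vector of the actor (the "Get Actor Forward Vector" node).
const FVector Forward = Bot->GetActorForwardVector();

// Or, if the camera is a component, its own forward vector (the "Get Forward Vector" node).
const FVector CameraForward = CameraComponent->GetForwardVector();

// Offset/rotate the vector by some degrees around the up (Z) axis.
const FVector Offset = Forward.RotateAngleAxis(1.f, FVector::UpVector);
```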


Thank you!

FYI, I made a Blueprint that fires traces at multiple spots at one time and finds the distance of each hit. I'll share it here:


Following up on my previous post showing how I shot multiple times:

I could now make 360 sequences and then 360 macros, each macro being one shot/hit.

Is there a better way to do this that isn't so time-consuming as making 360 sequences, then 360 macros, and manually adjusting the Z axis by +1 for each new macro?

Here is how I get the longest distance from three sequences; for 360 sequences it would be a really large graph:


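For reference, in code terms that comparison is just a running max, and the same pattern covers 3 or 360 samples (a sketch; Distances is a hypothetical TArray<float> holding each trace's hit distance):

```cpp
// Find the index of the longest trace distance out of however many were fired.
int32 BestIndex = 0;
for (int32 i = 1; i < Distances.Num(); ++i)
{
    if (Distances[i] > Distances[BestIndex])
    {
        BestIndex = i;
    }
}
// With a 1-degree step per trace, BestIndex is then the yaw offset (in degrees) to turn toward.
```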
I think the problem with this approach is that it'd get very expensive on resources pretty quickly compared to a nav mesh or just blindly moving.

Feel free to prove me wrong, but I suspect you'd notice it with as few as 15 AI.

Something like a navmesh is mostly pre-determined, with some amount of dynamic rebuilding to account for things moving, whereas this line-trace approach would be an almost constant calculation.