Blend between cameras based on location of pawn

Hey all, I am creating a designer-placed camera system. We call the placement blueprint objects “camera nodes”. Then we have a camera node manager object that finds all camera nodes within its sphere collision and sorts the resulting array of camera nodes by distance from the pawn's location, roughly as sketched below.
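
For concreteness, here is a minimal sketch of that gather-and-sort step in plain C++. The `Vec3` and `CameraNode` types are just stand-ins for `FVector` and our blueprint data, and all the names are made up for illustration:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float X = 0, Y = 0, Z = 0; };

float Dist(const Vec3& A, const Vec3& B) {
    const float DX = A.X - B.X, DY = A.Y - B.Y, DZ = A.Z - B.Z;
    return std::sqrt(DX * DX + DY * DY + DZ * DZ);
}

// Stand-in for one placed node: where the designer put it, plus the
// camera position it wants when it is fully in control.
struct CameraNode {
    Vec3 NodeLocation;   // placement in the world (what distance is measured to)
    Vec3 CameraLocation; // where the camera should sit for this node
};

// The manager gathers the nodes overlapping its sphere, then sorts
// them nearest-first relative to the pawn.
void SortNodesByDistance(std::vector<CameraNode>& Nodes, const Vec3& PawnLocation) {
    std::sort(Nodes.begin(), Nodes.end(),
        [&](const CameraNode& A, const CameraNode& B) {
            return Dist(A.NodeLocation, PawnLocation) < Dist(B.NodeLocation, PawnLocation);
        });
}
```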

Our goal is to place camera nodes around the world and have the cameras blend into each other based on the player's position relative to them, creating predictable, smooth blending of the targeted camera rotations and positions relative to the player pawn. I have tried many things, but I think this is more math-heavy than I am capable of working out. We get about 80% of the way there, but the camera always pops to odd locations and doesn't maintain the relationship between all of the nodes. Any ideas?

My main thought is to compute a value that represents a closeness percentage to a target node, by further sorting by distance and direction, then averaging the three nearest nodes' data based on this alpha value. Even if that approach is right, though, I'm not sure how the math would work out. The sketch below is the rough shape of what I mean.
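
To make that concrete, here is roughly the weighting I have in mind, sketched in plain C++ (reusing the includes and types from the sketch above; the falloff curve and the single shared radius are guesses on my part):

```cpp
// Weight for one node: eases smoothly to zero at Radius, so a node
// entering or leaving the blend contributes nothing at the boundary.
// I suspect our popping came from nodes joining/leaving the nearest-three
// set while they still carried a non-zero weight.
float NodeWeight(float Distance, float Radius) {
    const float T = std::max(0.0f, 1.0f - Distance / Radius);
    return T * T; // squared for a softer ease-out; pure guesswork on the curve
}

Vec3 BlendCameraLocation(const std::vector<CameraNode>& Nodes,
                         const Vec3& PawnLocation, float Radius) {
    Vec3 Blended;
    float TotalWeight = 0.0f;
    for (const CameraNode& Node : Nodes) {
        const float W = NodeWeight(Dist(Node.NodeLocation, PawnLocation), Radius);
        Blended.X += Node.CameraLocation.X * W;
        Blended.Y += Node.CameraLocation.Y * W;
        Blended.Z += Node.CameraLocation.Z * W;
        TotalWeight += W;
    }
    if (TotalWeight > 0.0f) { // normalize so the weights sum to 1
        Blended.X /= TotalWeight;
        Blended.Y /= TotalWeight;
        Blended.Z /= TotalWeight;
    }
    return Blended; // caller keeps the previous camera if TotalWeight was 0
}
```

Rotations would presumably need the same weights run through quaternions (e.g. chained FQuat::Slerp) rather than averaging rotator components directly, which wraps badly around 180°.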

Could you have a look at:

Would this be suitable at all? Or is this a very different camera setup / blending style? Using a spline would take most of the math out of it, and it makes it designer-friendly: quick to update and iterate.

Rather than having just a node, you’d have a node with a spline that dictates camera behaviour in the node’s proximity.
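
In engine terms, each node would own a USplineComponent, and every tick you'd call FindLocationClosestToWorldLocation on it to get the camera's target position for the pawn, then ease the camera toward that point. The underlying query is simple enough to sketch standalone; this is just a polyline approximation of the real curve, with illustrative types:

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float X = 0, Y = 0, Z = 0; };

Vec3  operator+(const Vec3& A, const Vec3& B) { return {A.X + B.X, A.Y + B.Y, A.Z + B.Z}; }
Vec3  operator-(const Vec3& A, const Vec3& B) { return {A.X - B.X, A.Y - B.Y, A.Z - B.Z}; }
Vec3  operator*(const Vec3& A, float S)       { return {A.X * S, A.Y * S, A.Z * S}; }
float Dot(const Vec3& A, const Vec3& B)       { return A.X * B.X + A.Y * B.Y + A.Z * B.Z; }

// Closest point to the pawn on a polyline approximation of the spline:
// project onto each segment, clamp into the segment, keep the nearest hit.
// Assumes Points has at least one entry.
Vec3 ClosestPointOnSpline(const std::vector<Vec3>& Points, const Vec3& Pawn) {
    Vec3 Best = Points.front();
    float BestDistSq = Dot(Pawn - Best, Pawn - Best);
    for (std::size_t i = 0; i + 1 < Points.size(); ++i) {
        const Vec3 AB = Points[i + 1] - Points[i];
        const float LenSq = Dot(AB, AB);
        float T = (LenSq > 0.0f) ? Dot(Pawn - Points[i], AB) / LenSq : 0.0f;
        T = (T < 0.0f) ? 0.0f : ((T > 1.0f) ? 1.0f : T);
        const Vec3 Candidate = Points[i] + AB * T;
        const float DistSq = Dot(Pawn - Candidate, Pawn - Candidate);
        if (DistSq < BestDistSq) { Best = Candidate; BestDistSq = DistSq; }
    }
    return Best;
}
```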

Sorry for the delayed reply. I appreciate your help. We have since moved to another camera system. However, I did find an algorithm that was able to do what we wanted, but never implemented it. If I ever get back to it and implement it, I will come back and provide more info.