Unreal has camera tracking options built in to look at and/or track focus on a specific actor. Unfortunately, these assume the pivot point is the point to track. For a human the pivot point is at the feet, whereas what you really want to track is the face. So the user has to manually adjust the tracking offset every time they want to track a different character. And if that character is animated (such as bending down), the tracking offset has to be keyframed. And don’t even think about auto tracking with root motion.
Suggestion: add a default tracking point option to actors. If none is set, the pivot point is used as it is today, but for characters it could be the head, face, or nose. Bonus points for using the eye closest to the camera for precise focus. As Unreal delivers MetaHumans, these capabilities will become even more important. Minimizing common, repetitive manual steps the user has to make is always better.
Another potential win would be the ability to Pilot the MetaHuman. Piloting currently relies on the pivot point, but since that’s either at the feet or the center of the object, it’s not useful for things like humans. A view from the head/eyes, though, would make it possible to verify and adjust the character’s eyeline to make sure they’re looking at the other human or object. And the same information could set the eye convergence on the MetaHuman to make sure they’re not staring off into space.