I’ve been looking into giving several of the AI characters in our project the ability to sense the player character, by way of Blueprints (if for no other reason than rapid prototyping).
I’d like to get the sensing to be oriented to a particular bone or socket for the animation being used.
I started with PawnSensing, even though I’m aware it’s intended to be phased out.
However, I can’t seem to change the socket it should follow (the editor complains about not being able to change it on inherited components). Specifically:

- If it’s meant to follow a socket of a particular name, it’s not clear what that name should be, and it doesn’t appear to be following any particular bone.
- It doesn’t appear to expose a transform I can alter programmatically at the Blueprint level.
- It doesn’t appear to offer a way to programmatically change the socket to follow.
- I can’t attach it to some sort of secondary scene component that does have a transform.

Short of attaching it to an entirely separate actor that I invisibly park at the first actor’s eyes, I don’t seem to have any way to alter the orientation, and that solution feels like swatting a fly with a Buick.
I then went to check out the AI Perception system and ran into similar problems, so I thought I’d stop spinning my wheels and post here.
What is the canonical way to reorient one of the two perception systems? Part of my problem is compensating for skeletons that are oriented strangely in their animations relative to the actor’s orientation, but it’d also be nice to have the perception move around with the head, eyestalk, floating nimbus cloud, whatever.
(I realize I may gain more control by abandoning Blueprints and heading to C++ land, but I thought I’d ask about the Blueprint side anyway… any advice for either language would be appreciated.)
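For reference, here’s the sort of C++ override I imagine might do it, on the assumption (my reading, please correct me if it’s wrong) that both PawnSensing and AI Perception’s sight sense derive their view origin from `AActor::GetActorEyesViewPoint`. `AMyMonster` and the `EyeSocket` name are placeholders of my own, not anything real in the project:

```cpp
// Sketch only: assumes the sensing systems query GetActorEyesViewPoint.
// AMyMonster (an ACharacter subclass) and "EyeSocket" are hypothetical names.
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"

void AMyMonster::GetActorEyesViewPoint(FVector& OutLocation, FRotator& OutRotation) const
{
    const USkeletalMeshComponent* MeshComp = GetMesh();
    if (MeshComp && MeshComp->DoesSocketExist(TEXT("EyeSocket")))
    {
        // Report the socket's animated transform as the "eyes", so sight
        // checks would originate from wherever the head/eyestalk currently is,
        // regardless of how oddly the skeleton is oriented relative to the actor.
        OutLocation = MeshComp->GetSocketLocation(TEXT("EyeSocket"));
        OutRotation = MeshComp->GetSocketRotation(TEXT("EyeSocket"));
        return;
    }

    // Fall back to the default behavior (roughly actor location + BaseEyeHeight).
    Super::GetActorEyesViewPoint(OutLocation, OutRotation);
}
```

Is that the intended hook, or is there a Blueprint-side equivalent I’m overlooking?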