How can I control a MetaHuman's look direction at runtime?

I tried to make a MetaHuman's eyes follow my player at runtime, but I can't get it to work. Even moving the eye bones in the skeletal mesh doesn't move the eyes.
I want to give a MetaHuman a world look-at location that the eyes will follow. This was easy with traditional character meshes that have eye bones.
What's the trick to doing the same with MetaHumans?

I was just about to create the same topic, since I have the same problem. I think the way to do it is through the Control Rig, but I'm only able to move the eye in the Control Rig editor, not in real time. I have tried the same logic on the head, and that works fine, but the eye bones don't.
The only topic on this I have found out there is:
In the comment section he said he took the logic from the Control Rig.

Yeah, I have been asking this question for a while in multiple places without a single answer, and I can't find a single bit of useful info on this topic.

I’m having the same issue. I’ve been trying to figure this out for weeks. Would love to find documentation or guidance on this.


I can move the eye in real time through the control (not the bone). I still have not figured out how to move the eye toward the player. The value I want to put in the vector is between -1 and 1 for X and Y.
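The mapping from a desired eye angle to that -1..1 control value could be sketched like this. This is an assumption about how the face board's eye control is scaled: `MAX_EYE_ANGLE` is a hypothetical tuning constant, not a documented MetaHuman value, and the function name is made up for illustration.

```python
# Hypothetical helper: map a desired look direction (yaw/pitch in
# degrees, relative to the head's forward axis) onto the -1..1 range
# that the eye control on the face board expects.
MAX_EYE_ANGLE = 40.0  # degrees; assumed angle at the control's extreme

def angles_to_control_value(yaw_deg, pitch_deg, max_angle=MAX_EYE_ANGLE):
    """Clamp and normalize yaw/pitch into (-1..1, -1..1) control space."""
    x = max(-1.0, min(1.0, yaw_deg / max_angle))
    y = max(-1.0, min(1.0, pitch_deg / max_angle))
    return x, y
```

For example, with a 40-degree limit, looking 20 degrees right and 40 degrees down gives `(0.5, -1.0)`, and anything past the limit clamps to the extreme of the control's range.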


Thanks, I will try this after work and see if I can get something working.

The magic part could be the CTRL_LookAtSwitch.
If we could just set its world location at runtime to, say, another character's head bone, everything would be cool.
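The head-bone idea above boils down to standard look-at math: take the vector from the eye to the target and convert it into yaw/pitch angles that could then drive the eye control. A minimal sketch, assuming Unreal's convention of forward being +X and ignoring head roll (a real implementation would first transform the target into the head bone's local space):

```python
import math

def look_at_angles(eye_pos, target_pos):
    """Compute yaw/pitch (degrees) from eye_pos toward target_pos.

    Assumes the eye's rest forward axis is world +X and ignores head
    roll; positions are (x, y, z) tuples in world space.
    """
    dx = target_pos[0] - eye_pos[0]
    dy = target_pos[1] - eye_pos[1]
    dz = target_pos[2] - eye_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch
```

A target 100 units forward and 100 units to the right at eye height comes out as 45 degrees of yaw and 0 degrees of pitch.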
I'm almost ready to give up entirely on MetaHumans, as this seems like top-secret info that either Epic doesn't want to share or explain, or nobody knows, which means the documentation is too poor.
I can't even get the MetaHuman's face to move at the moment with the sliders on the face control board.

Has anyone figured this out?

Anyone looking for this solution can find it here!

So, does anyone have a free solution for this problem?


I can do something similar with the Python interface. Is that helpful? It's probably not hard to translate it back to Blueprints.

A couple of months ago I made a tutorial on how to control the eyes' look-at behaviour in real time; here is the link.