I’ve been working for a while on a project to synthesize realistic character eye movement. It’s still very much a work in progress, but I figure I’ll share what I’ve found useful so far. Please jump in with any thoughts or contributions - I’d love for this to be a community thing, since it’s something anyone doing characters in VR scenes is going to need.
The first thing we need is the location of the player camera. We get the camera’s base location and then apply an offset for the HMD’s positional tracking. For this, I’ve created a blueprint function library and added a pure function named GetPlayerCameraLocation, since I’ll be calling it from several other blueprints.
Here’s the signature:
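In Blueprint this is a node graph, but for anyone following along in C++, a rough equivalent declaration might look like the sketch below. The file, class, and category names are placeholders of my own, not from the project:

```cpp
// EyeGazeFunctionLibrary.h (file and class names are placeholders)
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "EyeGazeFunctionLibrary.generated.h"

UCLASS()
class UEyeGazeFunctionLibrary : public UBlueprintFunctionLibrary
{
    GENERATED_BODY()

public:
    // Pure function: world-space location of the player's viewpoint,
    // including the HMD's positional-tracking offset.
    UFUNCTION(BlueprintPure, Category = "VR",
              meta = (WorldContext = "WorldContextObject"))
    static FVector GetPlayerCameraLocation(UObject* WorldContextObject);
};
```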
And its implementation:
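Again as a hedged C++ sketch rather than the actual graph: this version assumes the camera manager’s reported location does not already include the HMD’s positional tracking, and uses the pawn’s rotation to bring the tracking offset into world space. Both assumptions vary by engine version and camera setup:

```cpp
// EyeGazeFunctionLibrary.cpp
#include "EyeGazeFunctionLibrary.h"
#include "Kismet/GameplayStatics.h"
#include "HeadMountedDisplayFunctionLibrary.h"

FVector UEyeGazeFunctionLibrary::GetPlayerCameraLocation(UObject* WorldContextObject)
{
    APlayerCameraManager* Camera =
        UGameplayStatics::GetPlayerCameraManager(WorldContextObject, 0);
    if (!Camera)
    {
        return FVector::ZeroVector;
    }

    // Start from the camera manager's reported location...
    FVector Location = Camera->GetCameraLocation();

    // ...then add the HMD's positional-tracking offset, rotated from
    // tracking space into world space. The pawn's rotation stands in for
    // the tracking-space orientation here; the right transform depends
    // on how your camera rig is set up.
    APawn* Pawn = UGameplayStatics::GetPlayerPawn(WorldContextObject, 0);
    if (Pawn && UHeadMountedDisplayFunctionLibrary::IsHeadMountedDisplayEnabled())
    {
        FRotator HMDRotation;
        FVector HMDPosition;
        UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HMDRotation, HMDPosition);
        Location += Pawn->GetActorRotation().RotateVector(HMDPosition);
    }

    return Location;
}
```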
One improvement I’d like to make to this method is to read the IPD (interpupillary distance) and return, in addition to the camera location, the offset location of each eye. As it stands, the method returns a point between the viewer’s eyes, and when you look closely at a character in the scene who’s gazing at that point, it can be apparent that he or she is looking between your eyes rather than into one or the other, as people actually do.
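The math for that improvement is simple once you have the IPD: offset the returned location by half the IPD along the camera’s right vector, in each direction. Here’s a hypothetical helper along those lines - the IPD is passed in as a parameter, since how you read it from the HMD varies by engine version:

```cpp
#include "CoreMinimal.h"

// Hypothetical helper (not part of the library above): given the viewpoint
// location/rotation and the IPD in Unreal units (cm), compute approximate
// world-space locations for the left and right eye.
static void GetEyeLocations(const FVector& CameraLocation,
                            const FRotator& CameraRotation,
                            float IPD,
                            FVector& OutLeftEye,
                            FVector& OutRightEye)
{
    // Offset half the IPD along the camera's right vector, in each direction.
    const FVector Right = FRotationMatrix(CameraRotation).GetUnitAxis(EAxis::Y);
    const FVector HalfOffset = Right * (IPD * 0.5f);
    OutLeftEye  = CameraLocation - HalfOffset;
    OutRightEye = CameraLocation + HalfOffset;
}
```

A character’s gaze logic could then pick one of the two returned points to fixate on, rather than the midpoint.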