I am using the InCameraVFX template which contains nDisplay and LiveLink.
This template is generally used for virtual production, and LiveLink's role is to take VR input (SteamVR controllers, etc.) and drive the frustum cameras that the project includes.
I am wondering if there is a way I could use the head XYZ coordinates from the LiveLink Face app to control the camera. Essentially: open the app, and with that input have the camera pan left and right and tilt up and down.
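To make it concrete, this is roughly what I am imagining on the UE side, written as C++ instead of Blueprint just so I can spell it out. It is only a sketch: the subject name "iPhoneFace", the curve names, and the AFaceDrivenCamera actor with its CameraComponent member are placeholders from my setup, not something the template provides.

```cpp
// Rough sketch (UE C++, needs the "LiveLinkInterface" module in Build.cs).
// AFaceDrivenCamera is a hypothetical camera actor with a UCameraComponent* CameraComponent.
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Roles/LiveLinkBasicRole.h"
#include "Features/IModularFeatures.h"
#include "Camera/CameraComponent.h"

void AFaceDrivenCamera::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return;
    }
    ILiveLinkClient& Client = Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // "iPhoneFace" is whatever the Face subject is called in the Live Link window.
    FLiveLinkSubjectFrameData Subject;
    if (!Client.EvaluateFrame_AnyThread(FName("iPhoneFace"), ULiveLinkBasicRole::StaticClass(), Subject))
    {
        return; // subject not currently streaming
    }

    const FLiveLinkBaseStaticData* StaticData = Subject.StaticData.Cast<FLiveLinkBaseStaticData>();
    const FLiveLinkBaseFrameData* FrameData = Subject.FrameData.Cast<FLiveLinkBaseFrameData>();
    if (!StaticData || !FrameData)
    {
        return;
    }

    // Look a curve up by name. HeadYaw/HeadPitch are what my Face subject lists;
    // double-check the exact names in the Live Link UI.
    auto GetCurve = [&](const FName& Name) -> float
    {
        const int32 Index = StaticData->PropertyNames.IndexOfByKey(Name);
        return (Index != INDEX_NONE && FrameData->PropertyValues.IsValidIndex(Index))
            ? FrameData->PropertyValues[Index]
            : 0.f;
    };

    const float HeadYaw = GetCurve(FName("HeadYaw"));     // roughly -1..1
    const float HeadPitch = GetCurve(FName("HeadPitch")); // roughly -1..1

    // Scale to degrees and apply as camera pan/tilt; signs and ranges need tuning.
    CameraComponent->SetRelativeRotation(FRotator(HeadPitch * -45.f, HeadYaw * 45.f, 0.f));
}
```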
Ideally, though, I would love to do it without LiveLink and just use any webcam. Since the project already includes LiveLink, it seems the easiest way to go, but it also makes me wonder whether I could bypass it.
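This is the kind of thing I mean by bypassing it: a small standalone program (plain C++ with OpenCV, outside Unreal) that watches any webcam and streams the normalized face-centre offset over UDP, and then something in UE (a UDP/OSC plugin or a small socket reader) turns those two floats into pan/tilt. It is only a sketch; the port, message format, and cascade path are arbitrary, and the sockets here are POSIX, so Windows would need Winsock instead.

```cpp
// Rough sketch: webcam face tracking -> UDP, outside Unreal.
#include <opencv2/objdetect.hpp>
#include <opencv2/videoio.hpp>
#include <opencv2/imgproc.hpp>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <string>
#include <vector>

int main()
{
    cv::VideoCapture cam(0);                  // default webcam
    cv::CascadeClassifier face;
    // The Haar cascade ships with OpenCV; adjust the path to your install.
    if (!cam.isOpened() || !face.load("haarcascade_frontalface_default.xml"))
    {
        std::fprintf(stderr, "webcam or cascade not available\n");
        return 1;
    }

    const int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in dst{};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(7777);               // arbitrary port, match it on the UE side
    inet_pton(AF_INET, "127.0.0.1", &dst.sin_addr);

    cv::Mat frame, gray;
    while (cam.read(frame))
    {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> faces;
        face.detectMultiScale(gray, faces, 1.2, 5);
        if (!faces.empty())
        {
            // Normalized offset of the face centre from the image centre, in -1..1.
            const cv::Rect& r = faces.front();
            const float x = ((r.x + r.width  * 0.5f) / frame.cols - 0.5f) * 2.f;
            const float y = ((r.y + r.height * 0.5f) / frame.rows - 0.5f) * 2.f;
            const std::string msg = std::to_string(x) + "," + std::to_string(y);
            sendto(sock, msg.c_str(), msg.size(), 0, (sockaddr*)&dst, sizeof(dst));
        }
    }
    close(sock);
    return 0;
}
```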
I have learnt how to move a camera by building a Blueprint that reads tracking data from my Vive Tracker. When I attach it to a camera in UE, it works without any problem. It is rather simple, though, and not as precise as the InCameraVFX setup, because the camera has no real reference for where it is in the space.
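For reference, this is more or less what that Blueprint does, written out as C++. The "Special_1" motion source is just what my tracker happens to show up as, so treat the names as placeholders.

```cpp
// Rough C++ equivalent of my tracker Blueprint: a camera parented to a
// MotionControllerComponent that follows the Vive Tracker.
// Needs the "HeadMountedDisplay" module in Build.cs.
#include "GameFramework/Actor.h"
#include "MotionControllerComponent.h"
#include "Camera/CameraComponent.h"
#include "TrackedCameraRig.generated.h"

UCLASS()
class ATrackedCameraRig : public AActor
{
    GENERATED_BODY()

public:
    ATrackedCameraRig()
    {
        Root = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
        SetRootComponent(Root);

        Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
        Tracker->SetupAttachment(Root);
        Tracker->MotionSource = FName(TEXT("Special_1")); // whatever your tracker is assigned

        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(Tracker);
    }

    UPROPERTY(VisibleAnywhere) USceneComponent* Root;
    UPROPERTY(VisibleAnywhere) UMotionControllerComponent* Tracker;
    UPROPERTY(VisibleAnywhere) UCameraComponent* Camera;
};
```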
I have tried many things, but I know I must be missing something. Getting XYZ coordinates from facial mocap to drive the camera doesn't seem overly complicated, but I just haven't found the right way to do it.
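If it helps to show where I am stuck, the mapping step I imagine is something like this: take the latest head yaw/pitch values (from the LiveLink curves or from the UDP stream above) and smooth them into a camera rotation each tick. The function name, scales, and interp speed are mine, not from the template.

```cpp
// Sketch of the mapping step on the same hypothetical AFaceDrivenCamera actor:
// HeadYaw/HeadPitch are the latest values in roughly -1..1.
void AFaceDrivenCamera::ApplyHeadInput(float HeadYaw, float HeadPitch, float DeltaSeconds)
{
    const FRotator Target(HeadPitch * -30.f, HeadYaw * 45.f, 0.f); // pitch, yaw, roll
    const FRotator Smoothed = FMath::RInterpTo(
        CameraComponent->GetRelativeRotation(), Target, DeltaSeconds, 5.f /*interp speed*/);
    CameraComponent->SetRelativeRotation(Smoothed);
}
```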