I’m developing a project where a user will be wearing full-body mocap gear (either Vive Trackers or a mocap suit), and what I need is basically a virtual camera that I can move around in real life using a physical device such as an iPad/iPhone.
I also need custom controls on the camera built with UMG, such as changing the focus, zoom, and so on.
From what I’ve seen, the alternatives are either the Virtual Camera plugin with Unreal Remote (tested, works OK) or Pixel Streaming (never tested).
I also need to take into account what will be shown on the PC screen, which should be what the tracked iPad sees, but without showing the UMG controls (if that’s possible).
About the tracking: ARKit works OK, but since I need to track a blank room (a school room) with a person moving around, I’m worried about jittering or easily losing tracking. That’s why I was thinking of relying on the Vive Trackers to track the iPad…but then how would I get the video feed from the PC to the iPad? Pixel Streaming?
Any advice on what could be the best solution would be appreciated.
I know that @Greg.Corson has experimented quite a bit with this entire setup, so maybe he could point me towards a useful solution.