Tracking external "Camera" using an iPad/iPhone as viewfinder

Hi all,

I’m developing a project where a user will be wearing full-body mocap gear (either Vive Trackers or a mocap suit), and what I basically need is a virtual camera that I can move around in real life, such as an iPad/iPhone.

I also need custom controls on the camera using UMG, such as changing the focus, zoom, and so on.

From what I’ve seen, the alternatives are either the Virtual Camera plugin with Unreal Remote (tested, works OK) or the Pixel Streaming option (never tested).

I also need to take into account what will be shown on the PC screen, which would be what the tracked iPad sees, but without showing the UMG controls (if this is possible).
About the tracking: ARKit works OK, but considering that I need to track a blank room (a school room) with a person moving around, I’m worried about jitter or losing tracking easily. That’s why I was thinking of relying on the Vive Trackers to track the iPad…but then how will I get the video feed from the PC to the iPad? Pixel Streaming?

Any advice on what could be the best solution?

Any useful advice will be appreciated.
I know that @Greg.Corson experimented quite a bit with the entire setup, so maybe he could point me towards a useful solution.

Regards,

Nicolas

So the user is in mocap gear and also holding the iPad/virtual camera, but they are not wearing a VR headset? Sorry, I’m just a little unclear on what kind of setup you want. You are saying the iPad will show a virtual world from Unreal (CG only) and not an AR (CG + live video) view? And you want the PC to show the same view?

I have not done that much work with the iPad virtual camera; if you are concerned about jitter in tracking, you should probably just give it a test in your target room and see. In my environment (an office) the ARKit tracking seems pretty stable. In a room with blank walls, hard to say…if it jitters a lot you could try adding some features to the walls (e.g. stick a bunch of Post-its on them), which might help.

If you are able to build an Unreal iPhone app (requires Apple developer status), you can build the AR sample project as another way to test how well tracking works.

My experience with VIVE trackers has been pretty good, but they do jitter a bit in my setup…though this might be partly because our building is very “flexible”: when someone walks by, I can feel my desk moving a bit.

If you can describe your application in a bit more detail, I might be able to help more. Is the mocap on the player animating a character? Is this all CG or is it mixed reality, etc.?

Greg

Hi Greg,

Thanks for the reply.

The setup is like this:

A user is wearing the mocap gear (body + facial tracking), sending all the animation data into UE4 at runtime and controlling a character inside a scene.
He will move around, talk, and so on; that’s all he’ll do.

Another user will use an iPad with ARKit as a virtual camera: pointing the iPad towards the “mocap user” lets him use it as a viewfinder (so he can do a body shot, mid shot, or close-up), and also (possibly) zoom, focus, and so on, acting exactly like a real camera in real life.
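Just to make the controls part concrete, this is roughly the logic I have in mind, written as a C++ sketch (the function names are placeholders of mine; in practice I’d bind the OnValueChanged events of two UMG sliders to something like this):

```cpp
// Rough sketch of the UMG camera controls I have in mind (names are placeholders).
// Called from the OnValueChanged events of two UMG sliders and applied to the
// CineCameraComponent on the virtual camera pawn.

#include "CineCameraComponent.h"

// Zoom: a CineCamera's zoom is expressed as focal length in millimeters.
void ApplyZoom(UCineCameraComponent* Camera, float FocalLengthMm)
{
    if (Camera)
    {
        Camera->CurrentFocalLength = FocalLengthMm;
    }
}

// Focus: switch to manual focus and set the focus distance (Unreal units, i.e. cm).
void ApplyFocus(UCineCameraComponent* Camera, float FocusDistanceCm)
{
    if (Camera)
    {
        Camera->FocusSettings.FocusMethod = ECameraFocusMethod::Manual;
        Camera->FocusSettings.ManualFocusDistance = FocusDistanceCm;
    }
}
```

The same thing could of course be done with Blueprint nodes; the point is just that the UMG widgets only need to drive a couple of CineCamera properties.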

The alternative solution for me would be to use a Vive Tracker on the iPad, so that I can track the iPad’s position, but the issue is that I need to send the video feed from UE4 to the iPad, and I’m not sure how to do that.
ARKit does that by default, but in my case I’d be using external tracking, and I do need the video feed in order to understand what I’m pointing at and to use the additional features (zoom, focus and so on).

ARKit sounds like the best choice, but I would like to know if there is also an alternative solution using external tracking. Vive Trackers are quite robust in terms of tracking, and since I’ll be using 4 Base Stations 2.0 the tracking area will be quite big, so I’m worried about losing tracking with ARKit.

According to some of the docs, the iPad virtual camera app for Unreal supports using a VIVE tracker. In the settings there is a spot where you can select “input”, which defaults to ARKit but can supposedly be changed. I’ll see if I can try this out with my setup.

When you are running the virtual camera app, a PIE window shows up on the host PC that shows whatever the camera sees…though it still has the GUI on it.

However, you are supposed to be able to record “takes” (camera motion and all the character motion) in Sequencer and then play them back on the host PC without the virtual camera GUI.

If you need the host PC to display another 3D window without the GUI, I’m not sure how to do that. One option might be to set up nDisplay on a second PC and have its camera follow the virtual camera? Again, I haven’t tried it, so I don’t know how hard it is to do.

I looked at this a bit and I get the impression that VIVE tracker support used to be there but was removed for some reason. If you have a look at the project, you can probably rewire it a bit to make it work. You would just need to create a motion controller object tied to the VIVE tracker you are using, with a blueprint that copies its pose over to the camera. I didn’t have time to try this and was having trouble figuring out how the UI code worked…maybe someone else could have a look?

If you look at my project (GitHub - MiloMindbender/UE4VirtualProduction: An example Unreal Engine Virtual Production Project), it has a blueprint that copies a motion controller’s pose over to a camera; feel free to steal that setup. My YouTube channel https://www.youtube.com/user/GregCorson has a video explaining the setup of the project.
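In C++ terms, that pose-copy blueprint boils down to roughly the sketch below (in the project it’s plain Blueprint nodes; the class name and the “Special_1” motion source are just illustrative placeholders, since the actual motion source depends on how SteamVR enumerates your tracker):

```cpp
// Rough C++ equivalent of the pose-copy blueprint. In the actual project this is
// done with Blueprint nodes; names here are only for illustration.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneComponent.h"
#include "MotionControllerComponent.h"
#include "CineCameraComponent.h"
#include "TrackedCameraRig.generated.h"

UCLASS()
class ATrackedCameraRig : public AActor
{
    GENERATED_BODY()

public:
    ATrackedCameraRig()
    {
        PrimaryActorTick.bCanEverTick = true;

        RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

        // Follows the VIVE tracker. "Special_1" is an example motion source name;
        // the real one depends on how SteamVR assigns your tracker.
        Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
        Tracker->SetupAttachment(RootComponent);
        Tracker->MotionSource = FName(TEXT("Special_1"));

        Camera = CreateDefaultSubobject<UCineCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(RootComponent);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Copy the tracker's pose over to the camera every frame.
        Camera->SetWorldTransform(Tracker->GetComponentTransform());
    }

    UPROPERTY(VisibleAnywhere)
    UMotionControllerComponent* Tracker;

    UPROPERTY(VisibleAnywhere)
    UCineCameraComponent* Camera;
};
```

You could also just parent the camera component to the motion controller component and skip the per-tick copy; the end result is the same.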

Hey Greg,

Thanks for the update.

I think that an alternative solution would be to still use ARKit as the camera, simply record the camera movement (if possible) using Sequencer, including all the other features (zoom, focus), and then use that data separately, so that I can still use “realtime camera data” that was previously recorded.

I briefly looked at your project, but I still have to fully test it…by the way, I was curious about your green screen setup material, but I haven’t found it…maybe I’m missing something?

“You can use a Vive tracker on the physical camera and use that to track the movement of the camera in physical space.”

I’m not clear on what you mean by this. I am getting a video ready that shows some easy ways to put up a green screen and VIVE trackers at home, or attach them to a suspended ceiling in an office. It’s been really rainy here since Thanksgiving and too dark to film some parts of it; I hope to put something up soon.

The setup of the green-screen keying in Unreal is in my sample project, and I go over it in the tutorial video. I’m using the “Composure” plugin that is part of Unreal.

By the way, I haven’t tried it yet, but the iPhone/iPad virtual camera app does record the camera motion in Sequencer. Epic did an hour-long stream about it a while back; it should be on Twitch/YouTube on their channel.

If you’re looking for something else, let me know.

My bad, I downloaded the project but hadn’t checked the tutorial. After watching it, I now see how the green-screen setup works (I was using a simple material before, without Composure), and the frame delay on the camera feed is also very helpful.

About recording the camera movement: in the past I’ve used both IKinema Orion and other mocap gear and recorded the realtime animation at the same time using Sequence Recorder. I also did a very quick experiment using a webcam with a Vive Tracker on top of it, moving the webcam around as a virtual camera and looking at the PC screen to see what I was pointing at while recording the Vive Tracker movement. The annoying part was the long cable for the webcam, and of course not having a small screen below the webcam (like an iPhone/iPad) to see what I was recording while moving it around.

So the main difference between your sample project and my setup is that I’m not using live footage from a camera; I just move around in the real world while filming the virtual set I created in UE4, which will mainly be a dude with the mocap gear walking around. And since I would like to have a big space (using 4-8 Lighthouse 2.0 base stations) for the person to walk around in, I need a realtime video feed.

I just saw some options for turning the iPhone into a second screen (Duet Display); hopefully the lag won’t be a problem, but at least I’ll be able to do what I have in mind.

Thanks for the advice, and also thanks for the project and tutorial. I hope to experiment a bit further with your setup :slight_smile: