My Virtual Production Sample Project and Tutorial

Hi Everyone,

I finally got a basic Virtual Production setup running using VIVE trackers. There seemed to be a lot of interest in how to do this, so I thought I would start a fresh post on my example project and tutorial videos to make it easier for other people to get started. The base setup for the project works with a VIVE headset and VIVE tracking pucks. The project can be used with almost any webcam or video input device that works with Unreal; right now it is set up for the AJA KONA-HDMI but should not be too hard to modify to use another camera. It could also be modified to work without any tracking gear, which would make the setup really cheap, but without a tracker you can't move the camera while filming.

The project handles delaying the tracker data to sync it up with the video, which is usually several frames behind. It also includes a tracked object (kind of a big cartoon hammer) that follows a second VIVE tracker you can hold in your hand. You could attach the tracker to a handle or selfie stick to make it easier to hold.
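
If you are curious how a frame delay like this can work, here is a minimal C++ sketch of the idea (the names and structure are just an illustration, not the actual blueprint from the project): keep a short FIFO of recent tracker poses and drive the CG camera with the pose from a few frames back so it lines up with the delayed video.

```cpp
// Minimal sketch of the frame-delay idea (illustrative names, not the project's actual blueprint).
#include "CoreMinimal.h"

struct FDelayedTrackerPose
{
    FVector Location = FVector::ZeroVector;
    FRotator Rotation = FRotator::ZeroRotator;
};

class FTrackerDelayBuffer
{
public:
    explicit FTrackerDelayBuffer(int32 InDelayFrames) : DelayFrames(InDelayFrames) {}

    // Call once per tick: push the newest tracker pose, get back the pose from DelayFrames ago.
    FDelayedTrackerPose PushAndGetDelayed(const FDelayedTrackerPose& Newest)
    {
        Samples.Add(Newest);
        const int32 WindowSize = DelayFrames + 1;
        if (Samples.Num() > WindowSize)
        {
            Samples.RemoveAt(0, Samples.Num() - WindowSize);
        }
        return Samples[0]; // oldest pose still inside the delay window
    }

private:
    int32 DelayFrames;
    TArray<FDelayedTrackerPose> Samples;
};
```

In practice the number of delay frames is tuned by eye until tracked CG objects stop sliding against the live video.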

It is set up to do background removal using a green screen so you can put yourself in the middle of a totally virtual set. There is a tracked garbage matte, so no matter how you move the camera, only the real-world area covered by the green screen is visible.

The project only uses the "3rd person" starter map so it is really small and quick to download. But it works just as well with a more detailed map. The same setup can also be used to insert virtual characters and objects into the real world, like a CG character sitting on your living room couch.

When you look at my stuff, please remember these are just starter projects; in many cases the lighting is kind of bad and there is no color correction. I'll be showing how to improve this in future videos.

Below is the whole setup I have; you can use cheaper (or more expensive) gear than this. You can modify the project for almost any kind of tracker, or no tracker if you don't want the camera to move. It can also be modified to work with nearly any type of camera and video capture card, or even a webcam. So it is possible to get started pretty cheaply.

PC with NVIDIA 2080 TI card (the example uses less than 20% of this)
VIVE PRO vr bundle
2 VIVE tracking pucks
AJA KONA-HDMI 4 input HDMI capture card
Full-frame camera with 35mm lens and HDMI output
Green screen (I started with a 10 foot wide one from Amazon, my current one is bigger)

Happy to answer questions, and if you have any suggestions they are welcome. I'll be doing more videos on this in the future.
Please post your own work too, I want to see it!

Project (UE 4.23) is here: GitHub - MiloMindbender/UE4VirtualProduction: An example Unreal Engine Virtual Production Project
Youtube examples and tutorials here: https://www.youtube.com/user/GregCorson


Hi Greg,

Do you have any examples of the camera moving while you're in the scene, to show parallax/movement within the space you're shooting? I started a project like this a while back but wasn't able to get camera motion working correctly. The virtual studio example on your YouTube channel, with camera motion added, is what I'm looking to accomplish. Thanks so much for posting your work, I'm looking forward to diving into it when I get some free time from client work.

that's pretty sweet

LFedit,

My problem is I'm mostly working by myself, so there's nobody to operate the camera for me. That's why my samples have that chair in them, it's a stand-in for a person! In the last demo I did, I do move the camera around before I walk into frame, and I think the chair is tracking pretty well; the perspective seems right when I walk into frame. I'll have to try doing a sample where I walk around some more. I think the perspective is OK because my virtual and real sets are both at exactly the same scale with the same camera setup, so as long as I haven't messed up something like the measurements between the camera and the tracker, the perspective should just be right.

The main problem I'm having is that there is a bit of jitter in the VIVE tracking that you can sometimes see; I need to see if I can come up with a way to smooth that out a bit. The other issue with the VIVE is that the tracker data doesn't have any timestamps on it and my camera doesn't have timecode, so I can't precisely sync the two. I'm pretty close, within about 1/60th of a second, but the tracker and camera don't run at the same frame rate so there could be some wobble there.

Right now I'm mainly trying to fix up that horrible lighting, it was easier a few months ago when there was a ton of light coming through the windows behind me all day long :wink:

Greg

Thanks for the reply Greg, I've also had issues with jittering in the VIVE tracker. I've looked into the high-end trackers for this type of work, but they start at around $60k, so it looks like the VIVE trackers are one of the only "low budget/accessible" solutions. I've gotten around the jitter issues by exporting the FBX data from the tracker and post-processing it in DCC applications like Cinema 4D and Blender. It does work, but it certainly is a long process to add post-production steps to this workflow, and it defeats the real-time aspect.

The chair does seem to stick into the virtual scene well. Thanks for pointing that out. I'm exploring this technique for a client project, and some of the issues you're finding are what kind of derailed me from pursuing it further. I'll keep exploring, and I hope you keep posting your findings and examples. It's inspiring to see you pioneering the accessible options for virtual production.

Cheers.

LFedit,

Something I have not had time to try yet, but which could solve the jitter problem: I'm thinking about writing a blueprint that takes some number of tracker samples and applies some kind of smoothing function to them. The blueprint would be very similar to the one I use to delay the tracker data to sync it up with the video. Before I do this, I'm going to write a blueprint to actually measure the jitter over time and spit out some numbers. Hopefully I can get a bunch of other people to run this as a test so we can see if everyone's setup is about the same or if some are better.
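
For anyone who wants to experiment before that blueprint exists, here is a rough C++ sketch of both ideas (an assumed helper class, not anything from the project): estimate jitter as the RMS deviation over a sliding window of tracker positions, and smooth by averaging the same window.

```cpp
// Rough sketch of jitter measurement and smoothing over a sliding window of tracker positions.
#include "CoreMinimal.h"

class FTrackerJitterStats
{
public:
    explicit FTrackerJitterStats(int32 InWindowSize = 120) : WindowSize(InWindowSize) {}

    void AddSample(const FVector& Location)
    {
        Window.Add(Location);
        if (Window.Num() > WindowSize)
        {
            Window.RemoveAt(0);
        }
    }

    // Smoothed position: a plain moving average of the window.
    FVector GetSmoothed() const
    {
        FVector Sum = FVector::ZeroVector;
        for (const FVector& V : Window) { Sum += V; }
        return Window.Num() > 0 ? Sum / Window.Num() : Sum;
    }

    // Jitter estimate: RMS distance of the samples from the window mean, in cm.
    float GetJitterRMS() const
    {
        if (Window.Num() < 2) { return 0.f; }
        const FVector Mean = GetSmoothed();
        float SumSq = 0.f;
        for (const FVector& V : Window) { SumSq += FVector::DistSquared(V, Mean); }
        return FMath::Sqrt(SumSq / Window.Num());
    }

private:
    int32 WindowSize;
    TArray<FVector> Window;
};
```

A bigger window smooths more but adds lag, so there is a trade-off against the latency sync described earlier.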

The other thing that is really needed is shadows... I need to figure out how to transfer shadows falling on the green-screen into the virtual set so the talent can cast a shadow on the floor. Same thing the other way around, need to get a setup so the CG objects can cast shadows on a real-world floor.

Of course, you can always "cheat" and set up the lighting so nothing casts much of a shadow, but that doesn't look as realistic.

Thank you very much @Greg.Corson

Hi Greg.
Thanks a lot for publishing your tests, they are very inspiring.
A couple of questions:

  • I see the scene is tracking fairly accurately with the chair. The question is how accurate can this be? Since the tracker is not exactly where the camera lens is, there should be a little drift. Not important if you don't see the feet of the character, but it can be a problem if you see the ground. Is there a way of calibrating the camera?
  • Regarding the jitter, that's scary. However, if you can really create expressions in Unreal (can you?), it could be smoothed. I work in visual effects compositing for big productions and sometimes we use Python expressions to do precisely that. I can share the expression here if it's useful.
  • For shadows I can only imagine using your footage on a card, although that's kind of cheap. Maybe digi doubles?

For LFedit, what high-end camera tracking solutions did you find? Has anyone tried other solutions apart from VIVE, like Vicon, Rokoko, etc.?

Happy new year!

In my project, the VIVE tracker data comes in, but the Unreal camera is offset from that tracker location by the real-world distance from the tracker to the film plane. This seems to work well, but I'm not sure whether the offset should be to the film plane or to the nodal point of the lens.

The main thing the VIVE setup currently lacks is the ability to exactly set the video latency. I think the tracker data comes in at over 100 Hz and the video is 60 Hz, so there could be some rounding wobble in my "delay by # of frames" solution. I understand LiveLink timestamps incoming tracker data and interpolates it to the exact video frame time to avoid this, but VIVE data doesn't come in this way. I'm looking for a fix for this but it may take a while.
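
Here is a small sketch of that interpolation idea (assumed types and function names; this is not LiveLink's actual API): timestamp each tracker sample as it arrives, then blend the two samples that straddle the video frame time so the pose lands exactly on the frame instead of being rounded to the nearest tracker tick.

```cpp
// Sketch: interpolate two timestamped tracker samples to an exact video frame time.
#include "CoreMinimal.h"

struct FStampedPose
{
    double TimeSeconds = 0.0;
    FVector Location = FVector::ZeroVector;
    FRotator Rotation = FRotator::ZeroRotator;
};

FStampedPose InterpolateToFrameTime(const FStampedPose& Before, const FStampedPose& After, double FrameTime)
{
    const double Span = After.TimeSeconds - Before.TimeSeconds;
    const float Alpha = Span > 0.0
        ? FMath::Clamp(float((FrameTime - Before.TimeSeconds) / Span), 0.f, 1.f)
        : 0.f;

    FStampedPose Out;
    Out.TimeSeconds = FrameTime;
    Out.Location = FMath::Lerp(Before.Location, After.Location, Alpha);
    Out.Rotation = FQuat::Slerp(Before.Rotation.Quaternion(), After.Rotation.Quaternion(), Alpha).Rotator();
    return Out;
}
```

The hard part with the VIVE is getting trustworthy timestamps on the samples in the first place; the blend itself is simple.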

You can create algorithms (in blueprints or C++ code) in Unreal to do smoothing, but I haven't tried to do it yet. I noticed the other day that unplugging one of my VIVE trackers reduced the jitter, so I'm thinking it may be picking up some vibration from the part of the ceiling it's mounted to; I need to do some research on this. When the mounting is solid, the tracking seems quite stable, so having it on a solid weighted base instead of the ceiling might help. When I had the setup in my apartment it was mounted to tension poles pressed against the floor and ceiling and I hardly saw any jitter.

For green-screen shadows, I have seen some systems that extract shadows on the green floor and turn them into alpha-channel images that you can use to create shadows of the talent on your "virtual" floor. I'm just not sure how to do this in Unreal yet. It would probably involve taking the inverse of the green screen key (the green area) and doing some processing on it.

A couple of people I've been talking to have tried OptiTrack to track the camera and say it works well. I have an OptiTrack but it isn't set up in my studio yet; I'll let you know how it goes.

P.S. I just put up a couple of quick studio setup tips on my channel. One is how to use tension poles to set up trackers, lights and green screens in a room with hard ceilings. The other shows the magnetic hooks and mounts I used to quickly attach my green screen and trackers to the suspended ceiling in my larger studio. https://www.youtube.com/user/GregCorson

[QUOTE=Greg.Corson]

A couple of people I've been talking to have tried OptiTrack to track the camera and say it works well. I have an OptiTrack but it isn't set up in my studio yet; I'll let you know how it goes.

[/QUOTE]

I'd be very interested in your experience with OptiTrack. From my research it does seem to be the next jump in quality and price from the VIVE for this type of virtual production. Thanks again for posting all your findings.

Hi Greg. @Greg.Corson
I've been checking out the videos that you've created, great stuff, and you seem to have a good grasp on how to set up the Vive trackers in Unreal.
So I'm really hoping you can help me out here, and it's probably quite simple compared to the complexities of what you have created.
I'm banging my head on the desk here and I'm relatively new to Unreal.

Third person view in Unreal
I have a Vive tracker. I create a pawn, add a motion controller, then a camera and a stretched-out cube mesh pointing out of the front of the camera to show the line the lens is facing.
When I play in the view port, the camera / cube is always pointing upwards towards the sky and not parallel to the ground.

I've tried all sorts to offset it, but cannot get it to work.

I can see from one of your videos where you have the tracker attached to the physical camera and you mention offsets.

Is there any chance you could point me in the right direction for getting the tracker into the correct orientation?
Blueprint Example would be great.
Any help is greatly appreciated and gratefully received.
Regards
Rich

First off, a lot of people have been asking how to migrate my virtual production template over into another UE project. It turns out this wasn't easy to do (even for me!) so I spent some time working out what was going wrong and now it's very easy. If you grab the latest project from GitHub - MiloMindbender/UE4VirtualProduction: An example Unreal Engine Virtual Production Project you can look at Readme_2 for a description of how to move it. It took some time to figure this out, but now you can migrate everything to a new project in a couple of minutes.

The bad news is they have to do some construction in my building so my studio will probably be down for about a week. The good news is I'm moving to a new location that has a proper light grid in the ceiling, so I should be able to hook up the OptiTrack and see how that works.

@LFedit It will probably take me 1-2 weeks at least to get the OptiTrack set up because of the studio move; when it is working, I'll put up some video and project updates for it. Need to see if I can get a 4.24.1 version of the OptiTrack plugin to test too.

@Tricky_3D Grab my project and take a look at "CompCameraRig" in it. Under "details", the SceneComponent location/rotation is the position of the camera (in my case, relative to the "talentmarkerseparate"). If you look at the CameraComponent, its location/rotation is the offset from the tracker to the camera film plane. You said you had seen the picture of my camera rig; for that rig the offset is about -9, 0, -11.7 with a Y rotation of -90. There is also a "cone" mesh that represents the camera view... the location/rotation of this had to be set as well to get the point of the cone to be (approximately) where the camera lens is. If you are wondering where my motion controller is, it is inside "talentmarkerseparate", which contains blueprint code that delays and copies the position of the tracker over to CompCameraRig.

Once you set the offsets right, the method you describe for making a pawn SHOULD work. To make debugging easier I suggest you go into your motion controller and check "display device model" so you can see the tracker inside Unreal. However, be aware the origin of the tracker model in Unreal (last time I checked anyway) is not exactly right. It seems to be in the center of the model and not at the base of the tripod screw. This doesn't affect what offsets you use; these should still be based on real-world measurements (in cm). Just be aware that even after you get the offsets right, the model of the tracker may not sit exactly on the origin of that cube.
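
As a rough illustration of that offset step (just a sketch, not the actual blueprint code; the offset numbers are the example values for my rig above, so measure your own in cm), the delayed tracker pose can be composed with the tracker-to-film-plane transform and pushed onto the camera actor each tick:

```cpp
// Sketch: apply a measured tracker-to-film-plane offset to a tracked camera actor.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

void ApplyTrackerPoseToCamera(AActor* CameraActor, const FVector& TrackerLocation, const FRotator& TrackerRotation)
{
    // Real-world measurements from the rig, in cm and degrees (example values from this thread).
    const FVector FilmPlaneOffset(-9.f, 0.f, -11.7f);
    const FRotator MountRotation = FRotator::MakeFromEuler(FVector(0.f, -90.f, 0.f)); // the "-90 Y rotation"

    const FTransform TrackerTransform(TrackerRotation, TrackerLocation);
    const FTransform OffsetTransform(MountRotation, FilmPlaneOffset);

    // The offset is expressed relative to the tracker, so compose it in tracker space.
    const FTransform CameraTransform = OffsetTransform * TrackerTransform;
    CameraActor->SetActorLocationAndRotation(CameraTransform.GetLocation(), CameraTransform.GetRotation());
}
```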

One other thing to watch out for if you are using COMPOSURE. The process you describe will make a tracked camera that works, but Composure will not recognize it. Composure will not recognize a camera component down inside another actor; it only recognizes a camera actor. You need to create a camera actor, then copy the location/rotation into it like I do, and then Composure will see the camera (this drove me crazy for a while until I figured it out!). I'm trying to figure out a better way to do this, but for now the way it's done in my project works.

Hope this helps!

I encourage everyone to post their experiences publicly! I've found a lot of weird little things that drove me crazy for a while until I figured them out, so posting your fails/successes will help other people and help Epic see where improvements are needed.

Greg

New video! This time I'm in two new virtual TV studio sets. Both are from Epic's "virtual studio" sample project with my virtual production template copied over. This time there is sound, I've gotten rid of the window title bar, and the lighting is better.

The color balance and exposure on the live camera is much better, but still needs work. The slight green cast you see is NOT from the green screen; it's from two large fluorescent fixtures right over my head that can't be turned off, clashing with the mostly daylight-balanced lights. Instead of using OBS, I recorded a fullscreen PIE window from one of my graphics card's HDMI outputs on a Ninja V HDMI recorder. This takes load off the PC and records to ProRes, so there is no quality loss when you edit and recompress the video later.

As I mentioned earlier, there are some issues with the Steam beta 1.10.1 where trackers just don't work, so use the release version 1.9.16 for now. Also, with Unreal 4.24.1 you need to set up bindings in Steam for your trackers; Readme_2 in my project explains this.

The VIVE trackers are performing very well as you can see when the camera moves. There is a slight jitter in the tracking from one of my VIVE base stations, I think this is because of vibration (there is a big HVAC fan in the ceiling near it). I'll be moving to a new studio area this week and we'll see if this fixes it.

Hi Greg, @Greg.Corson
Your time and effort in replying is very much appreciated.
Over the coming week or so I'll be digging through your project to help get my tracker etc. working how I want it.
I'll be more than happy to share my results, no problem.

Thanks again for a great starting point, and I'm liking the new video too...

All the best with the new studio move also.
Regards
Rich

By the way, if anyone knows how to map a tracker device ID to a specific tracker role, please tell me. The new Steam input setup in 4.24.1 seems to handle binding a specific tracker role like "right foot" to something like "Special_1", but I don't see a way to get the device ID of Special_1 inside a blueprint so I can do "get tracked device position and orientation". It seems I can enumerate all the device IDs, and I can enumerate a list of which names like Special_1 are currently tracking, but I don't see how to find the device ID for "Special_1".

I would like to be able to do the equivalent of a "get tracked device position and orientation" on Special_2, for example, and not have to determine its device ID by trial and error.

This would let me spawn a bunch of tracked objects based on the trackers that are connected.
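
For reference, the part that does work can be done with the SteamVR function library; the sketch below just enumerates the generic tracker device IDs and polls each pose. The missing piece described above, mapping a role name like "Special_1" back to one of these IDs, is still not covered here.

```cpp
// Sketch: enumerate generic VIVE tracker device IDs and poll each pose via the SteamVR plugin.
#include "CoreMinimal.h"
#include "SteamVRFunctionLibrary.h"

void PollAllTrackers()
{
    TArray<int32> DeviceIds;
    // Vive pucks show up as the "Other" tracked device type.
    USteamVRFunctionLibrary::GetValidTrackedDeviceIds(ESteamVRTrackedDeviceType::Other, DeviceIds);

    for (int32 DeviceId : DeviceIds)
    {
        FVector Position;
        FRotator Orientation;
        if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(DeviceId, Position, Orientation))
        {
            UE_LOG(LogTemp, Log, TEXT("Tracker %d at %s"), DeviceId, *Position.ToString());
        }
    }
}
```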

Greg

Greetings Greg, @Greg.Corson

I'm starting to get some success with going through your blueprint. Got the tracker working in the right orientation.

I'm now trying to replicate the CompCameraRig (actor) in another project. I see that the TalentMarkerSeparate references the CompCameraRig.

However, the CompCameraRig (self) Blueprint states that the SceneComponent is 'inherited' along with the CameraComponent 'inherited', and the Variable Category is CameraActor.
As shown below.

Please could you tell me how you got this Blueprint to inherit the components? I've tried all sorts of ways but am not getting the result; the Blueprint doesn't match yours and I get errors.

Again, many thanks for your time.
Regards
Rich

Hmmm, I don't recall doing anything special... CompCameraRig is just an asset I built that is then dragged into the level. I believe it was constructed by creating a pawn or actor of type CineCameraActor, but I'll have to go back and check. Unfortunately I had to tear down my studio today prior to moving, so I won't be able to do much till at least Friday or Monday. I'll try to have a look on one of my other computers.

@Tricky_3D Sorry for the delay, finally got my studio back together and am working again. Regarding "CompCameraRig", I am pretty sure it was created like this...

  1. In the content browser click "add new" and select blueprint class.
  2. In the dialog, expand "all classes" at the bottom and type "cine".
  3. From the list, select "CineCameraActor".

This should give you what you want.

It is important to select "CineCameraActor" and NOT "CineCameraComponent". Composure does not seem to recognize "components" as valid cameras, only actors.

ā€”Everyone elseā€”

I checked in an improvement on Friday that makes the garbage matte for the green screen much easier and faster to set up. The main difference is that the origin of the matte is now at the center of the line where the wall meets the floor. This makes it easy to set the distance from the talent to the wall and the height/width of the green screen. Working on a document now, but the basics are:

Before continuing, make sure you have calibrated the VIVE: set it in the center of your green screen, however far from the screen you want your talent to stand, and do a "room setup". Using a metric tape measure (Unreal is metric), measure the width and height of the screen, the width and length of the portion that is on the floor, and the distance of the talent mark from the screen. In my example the floor is 3 meters and the wall is 2.7, both are 5.2 meters wide. The talent mark is 1.1 meters from the wall. (A worked example of how these measurements map to the settings follows the steps below.)

  1. Go into the "FlatGreenScreen" actor and set Location X to the distance of the talent mark from the green screen in cm (110 in my example).
  2. In "FlatGreenScreen->AdjustFloor" set scale X and Y to the length and width of the floor area in meters (3.0 and 5.2 in my example).
  3. In "FlatGreenScreen->AdjustWall" set scale X and Y to the height and width of the wall area in meters (2.7 and 5.2).
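
As a worked example of how those measurements map to the settings (an illustrative helper only, not part of the project): Location X is entered in Unreal units (cm), while the AdjustFloor/AdjustWall scales are entered directly in meters.

```cpp
// Illustrative conversion of real-world measurements (meters) to the garbage matte settings.
#include "CoreMinimal.h"

struct FGarbageMatteSettings
{
    float LocationX_cm;     // FlatGreenScreen Location X, in cm
    FVector2D FloorScale;   // AdjustFloor scale X (depth) and Y (width), in meters
    FVector2D WallScale;    // AdjustWall scale X (height) and Y (width), in meters
};

FGarbageMatteSettings FromMeasurements(float TalentToWall_m, float FloorDepth_m, float WallHeight_m, float ScreenWidth_m)
{
    FGarbageMatteSettings Settings;
    Settings.LocationX_cm = TalentToWall_m * 100.f;               // 1.1 m  -> 110
    Settings.FloorScale = FVector2D(FloorDepth_m, ScreenWidth_m); // 3.0, 5.2
    Settings.WallScale  = FVector2D(WallHeight_m, ScreenWidth_m); // 2.7, 5.2
    return Settings;
}
```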

To fine-tune the position so it matches your setup, the easiest way I have found is to have your video feed running, go into the "live video plate", and uncheck the Enabled box under chroma keying. This should allow you to see the green screen and the edges of the garbage matte. This is quickest if you can see the whole green screen area at once, but if you can't get that far back, position the camera so you can see one edge of the screen and the fold where it meets the floor.

  1. The corners of the green screen area should match the point where the green screen fold meets the floor; if they do not, adjust Location X in the FlatGreenScreen till they do.
  2. Now go into "FlatGreenScreen->AdjustFloor" or AdjustWall and adjust the X and Y scales till you don't see anything but your green screen.

Hope this makes sense, I will shoot a video that will show the process better.

Greg

Hey everyone,

I finally started to do "releases" on GitHub, so if you grab the latest release it guarantees there won't be any "work in progress" stuff in what you get. If you check out straight from the GitHub repo, you may end up getting untested stuff, so use the releases if you can.

The latest release isn't anything big; I just redesigned the asset used to create the garbage matte. The new asset has its origin set up a bit differently so the bottom center of the screen is at 0,0. It's a lot easier to set the width and height of the wall and floor section, and you can easily set the position of the back wall to the distance from the talent's feet.

I wanted to put out a video showing how to use it, but someone else needed to use my studio this week; as soon as I get it back I'll post something.

Greg

Hi Everyone!

So after finally getting my studio back together, I was able to continue with another tutorial on how to set up my garbage matte assets for your own studio setup. I made some changes in the assets to make them much easier to set up right than the old ones. All you need is a metric tape measure to make it easy. This setup assumes a simple green screen that is flat on the wall and continues out onto the floor. If you have a different setup like a 3-sided corner or partial box, you might want to build your own to-scale model of your green screen for a more exact match.

I also made a small change to the instructions for migrating the project. I recommend you get my example project working with your video setup before you try to migrate the assets into your own project. This will make sure all the assets are set up right for you and prevent any migration problems if you need to use different video plugins than I do.

The latest release on GitHub is here: Releases · MiloMindbender/UE4VirtualProduction · GitHub
And this is the garbage matte tutorial video