    My Virtual Production Sample Project and Tutorial

    Hi Everyone,

    I finally got a basic virtual production setup running using VIVE trackers. There seemed to be a lot of interest in how to do this, so I thought I would start a fresh post with my example project and tutorial videos to make it easier for other people to get started. The base setup for the project uses a VIVE headset and VIVE tracking pucks. The project can be used with almost any webcam or video input device that works with Unreal; right now it is set up for the AJA KONA-HDMI but should not be too hard to modify for another camera. It could also be modified to work without any tracking gear, which would make the setup really cheap, but without a tracker you can't move the camera while filming.

    The project handles delaying the tracker data to sync it up with the video, which is usually several frames behind. It also includes a tracked object (kind of a big cartoon hammer) that follows a second VIVE tracker you can hold in your hand. You could attach the tracker to a handle or selfie stick to make it easier to hold.
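
    For anyone who prefers reading code to Blueprints, here is a rough C++ sketch of the delay-by-N-frames idea (this is illustrative only, not the actual Blueprint from the project, and all of the names in it are made up):

    // Minimal sketch: keep a short history of tracker transforms and read back
    // the one from N frames ago so the CG camera lines up with the delayed video.
    #include "CoreMinimal.h"

    struct FTrackerDelayBuffer
    {
        TArray<FTransform> History;   // most recent sample is last
        int32 FramesToDelay = 4;      // tune this until CG and video line up

        void AddSample(const FTransform& TrackerPose)
        {
            History.Add(TrackerPose);
            const int32 MaxSamples = FramesToDelay + 1;
            if (History.Num() > MaxSamples)
            {
                History.RemoveAt(0, History.Num() - MaxSamples);
            }
        }

        FTransform GetDelayedSample() const
        {
            if (History.Num() == 0)
            {
                return FTransform::Identity;
            }
            const int32 Index = FMath::Max(0, History.Num() - 1 - FramesToDelay);
            return History[Index];
        }
    };

    Each tick you would feed the newest tracker transform in with AddSample and apply GetDelayedSample to the CG camera.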

    It is set up to do background removal using a green screen so you can put yourself in the middle of a totally virtual set. There is a tracked garbage matte, so no matter how you move the camera, only the real-world area where the green screen is will be seen.

    The project only uses the "3rd person" starter map so it is really small and quick to download. But it works just as well with a more detailed map. The same setup can also be used to insert virtual characters and objects into the real world, like a CG character sitting on your living room couch.

    When you look at my stuff, please remember these are just starter projects; in many cases the lighting is kind of bad and there is no color correction. I'll be showing how to improve this in future videos.

    Below is the whole setup I have; you can use cheaper (or more expensive) gear than this. You can modify the project for almost any kind of tracker, or no tracker if you don't want the camera to move. It can also be modified to work with nearly any type of camera and video capture card, or even a webcam, so it is possible to get started pretty cheaply.

    PC with an NVIDIA RTX 2080 Ti card (the example uses less than 20% of this)
    VIVE Pro VR bundle
    2 VIVE tracking pucks
    AJA KONA-HDMI 4-input HDMI capture card
    Full-frame camera with a 35mm lens and HDMI output
    Green screen (I started with a 10-foot-wide one from Amazon; my current one is bigger)

    Happy to answer questions, and if you have any suggestions they are welcome; I'll be doing more videos on this in the future.
    Please post your own work too, I want to see it!

    The project (UE 4.23) is here: https://github.com/MiloMindbender/UE4VirtualProduction
    YouTube examples and tutorials are here: https://www.youtube.com/user/GregCorson



    #2
    Hi Greg,

    Do you have any examples of the camera moving while you're in the scene, to show the parallax/movement within the space you're shooting? I started a project like this a while back but wasn't able to get camera motion working correctly. The virtual studio example on your YouTube channel, with camera motion added, is what I'm looking to accomplish. Thanks so much for posting your work; I'm looking forward to diving into it when I get some free time from client work.



      #3
      that's pretty sweet



        #4
        LFedit,

        My problem is I'm mostly working by myself, so there is nobody to operate the camera for me. That's why my samples have that chair in them; it's a stand-in for a person! In the last demo I did, I move the camera around before I walk into frame, and I think the chair tracks pretty well, and the perspective seems right when I walk into frame. I'll have to try doing a sample where I walk around some more. I think the perspective is OK because my virtual and real sets are both at exactly the same scale with the same camera setup, so as long as I haven't messed up something like the measurements between the camera and the tracker, the perspective should just be right.

        The main problem I'm having is that there is a bit of jitter in the VIVE tracking that you can sometimes see; I need to see if I can come up with a way to smooth that out a bit. The other issue with the VIVE is that the tracker data doesn't have any timestamps on it and my camera doesn't have timecode, so I can't precisely sync the two. I'm pretty close, like within 1/60th of a second, but the tracker and camera don't run at the same frame rate, so there could be some wobble there.

        Right now I'm mainly trying to fix up that horrible lighting; it was easier a few months ago when there was a ton of light coming through the windows behind me all day long ;-)

        Greg



          #5
          Thanks for the reply, Greg. I've also had issues with jittering in the VIVE tracker. I've looked into the high-end trackers for this type of work, but they start at around $60k, so it looks like the VIVE trackers are one of the only "low budget/accessible" solutions. I've gotten around the jitter issues by exporting the FBX data from the tracker and post-processing it in DCC applications like Cinema 4D and Blender. It does work, but it certainly is a long process when you have to add post-production work to this workflow, and it defeats the real-time aspect.

          The chair does seem to stick into the virtual scene well; thanks for pointing that out. I'm exploring this technique for a client project, and some of the issues you're finding are what kind of derailed me from pursuing it further. I'll keep exploring, and I hope you keep posting your findings and examples. It's inspiring to see you pioneering the accessible options for virtual production.

          Cheers.



            #6
            LFedit,

            Here's something I have not had the time to try yet, but it could solve the jitter problem: I'm thinking about writing a Blueprint that takes some number of tracker samples and applies some kind of smoothing function to them. The Blueprint would be very similar to the one I use to delay the tracker data to sync it up with the video. Before I do this, I'm going to write a Blueprint to actually measure the jitter over time and spit out some numbers. Hopefully I can get a bunch of other people to run this as a test so we can see if everyone's setup is about the same or if some are better.
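
            Here is a rough C++ sketch of the kind of smoothing and jitter measurement I have in mind (illustrative only; the real thing would be a Blueprint, and none of these names exist in the project):

            // Sketch: average the last N tracker positions to smooth jitter, and report
            // the RMS spread of those samples (in cm) as a crude jitter number.
            #include "CoreMinimal.h"

            struct FTrackerSmoother
            {
                TArray<FVector> Samples;
                int32 WindowSize = 8;   // more samples = smoother but laggier

                FVector AddAndSmooth(const FVector& NewPosition)
                {
                    Samples.Add(NewPosition);
                    if (Samples.Num() > WindowSize)
                    {
                        Samples.RemoveAt(0, Samples.Num() - WindowSize);
                    }
                    FVector Sum = FVector::ZeroVector;
                    for (const FVector& P : Samples)
                    {
                        Sum += P;
                    }
                    return Sum / Samples.Num();
                }

                // Run this with the tracker sitting still to get a jitter number to compare.
                float MeasureJitter() const
                {
                    if (Samples.Num() < 2)
                    {
                        return 0.0f;
                    }
                    FVector Mean = FVector::ZeroVector;
                    for (const FVector& P : Samples) { Mean += P; }
                    Mean /= Samples.Num();

                    float SumSq = 0.0f;
                    for (const FVector& P : Samples)
                    {
                        SumSq += FVector::DistSquared(P, Mean);
                    }
                    return FMath::Sqrt(SumSq / Samples.Num());
                }
            };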

            The other thing that is really needed is shadows... I need to figure out how to transfer shadows falling on the green screen into the virtual set so the talent can cast a shadow on the floor. The same is needed the other way around: a setup so the CG objects can cast shadows on a real-world floor.

            Of course, you can always "cheat" and set up the lighting so nothing casts much of a shadow, but that doesn't look as realistic.



              #7
              Thank you very much Greg.Corson



                #8
                Hi Greg.
                Thanks a lot for publishing your tests; they are very inspiring.
                A couple of questions:
                - I see the scene is tracking fairly accurately with the chair. The thing is, how accurate can this be? Since the tracker is not exactly where the camera lens is, there should be a little drift. That's not important if you don't see the feet of the character, but it can be a problem if you see the ground. Is there a way of calibrating the camera?
                - Regarding the jitter, that's scary. However, if you can really create expressions in Unreal (can you?), it could be smoothed. I work in visual effects compositing for big productions, and sometimes we use Python expressions to do precisely that. I can share the expression here if it's useful.
                - For shadows I can only imagine using your footage on a card, although that's kind of cheap. Maybe digi-doubles?

                For LFedit: what high-end camera tracking solutions did you find? Has anyone tried other solutions apart from VIVE, like Vicon, Rokoko, etc.?

                Happy new year!



                  #9
                  In my project the VIVE tracker data comes in, but the Unreal camera is offset from that tracker location by the distance from the real-world tracker to the film plane. This seems to work well, but I'm not sure if the offset should be to the film plane or to the nodal point of the lens.

                  The main thing the VIVE setup currently lacks is the ability to exactly set the video latency. I think the tracker data comes in at over 100 Hz and the video is 60 Hz, so there could be some rounding wobble in my "delay by # of frames" solution. I understand LiveLink timestamps incoming tracker data and interpolates it to the exact video frame time to avoid this, but VIVE data doesn't come in this way. I'm looking for a fix for this, but it may take a while.
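
                  If I end up doing the timestamp fix, it would look roughly like this in C++ (a sketch only, assuming I record my own timestamp, for example with FPlatformTime::Seconds(), when each VIVE sample arrives; none of this is in the project yet):

                  // Sketch: given tracker samples stamped on arrival, find the two samples
                  // that bracket the video frame time and blend between them.
                  #include "CoreMinimal.h"

                  struct FStampedPose
                  {
                      double TimeSeconds;
                      FVector Position;
                      FQuat Rotation;
                  };

                  FTransform SampleAtTime(const TArray<FStampedPose>& History, double VideoFrameTime)
                  {
                      if (History.Num() == 0) { return FTransform::Identity; }

                      for (int32 i = 1; i < History.Num(); ++i)
                      {
                          if (History[i].TimeSeconds >= VideoFrameTime)
                          {
                              const FStampedPose& A = History[i - 1];
                              const FStampedPose& B = History[i];
                              const double Span = B.TimeSeconds - A.TimeSeconds;
                              const float Alpha = Span > 0.0 ? FMath::Clamp(float((VideoFrameTime - A.TimeSeconds) / Span), 0.0f, 1.0f) : 0.0f;

                              const FVector Pos = FMath::Lerp(A.Position, B.Position, Alpha);
                              const FQuat Rot = FQuat::Slerp(A.Rotation, B.Rotation, Alpha);
                              return FTransform(Rot, Pos);
                          }
                      }
                      // The video time is newer than every sample we have; use the latest one.
                      const FStampedPose& Last = History.Last();
                      return FTransform(Last.Rotation, Last.Position);
                  }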

                  You can create algorithms (in Blueprints or C++ code) in Unreal to do smoothing, but I haven't tried to do it yet. I noticed the other day that unplugging one of my VIVE trackers reduced the jitter, so I'm thinking it may be picking up some vibration from the part of the ceiling it's mounted to; I need to do some research on this. When the mounting is solid the tracking seems quite stable, so having it on a solid weighted base instead of the ceiling might help. When I had the setup in my apartment it was mounted to tension poles pressed against the floor and ceiling, and I hardly saw any jitter.

                  For green-screen shadows, I have seen some systems that extract shadows on the green floor and turn them into alpha-channel images that you can use to create shadows of the talent on your "virtual" floor. I'm just not sure how to do this in Unreal yet. It would probably involve taking the inverse of the green-screen key (the green area) and doing some processing on it.

                  A couple of people I've been talking to have tried OptiTrack to track the camera and say it works well. I have an OptiTrack but it isn't set up in my studio yet; I'll let you know how it goes.

                  P.S. I just put up a couple of quick studio setup tips on my channel. One is how to use tension poles to set up trackers, lights, and green screens in a room with hard ceilings. The other shows the magnetic hooks and mounts I used to quickly attach my green screen and trackers to the suspended ceiling in my larger studio. https://www.youtube.com/user/GregCorson




                    #10
                    Originally posted by Greg.Corson:
                    "A couple of people I've been talking to have tried OptiTrack to track the camera and say it works well. I have an OptiTrack but it isn't set up in my studio yet; I'll let you know how it goes."

                    I'd be very interested in your experience with OptiTrack. From my research it does seem to be the next jump up in quality and price from the VIVE for this type of virtual production. Thanks again for posting all your findings.



                      #11
                      Hi Greg. Greg.Corson
                      I've been checking out the videos that you've created; great stuff, and you seem to have a good grasp on how to set up the Vive trackers in Unreal.
                      So I'm really hoping you can help me out here; this is probably quite simple compared to the complexities of what you have created.
                      I'm banging my head on the desk here and I'm relatively new to Unreal.

                      Third person view in Unreal
                      I have a Vive tracker. I create a pawn, add a motion controller, then a camera and a stretched-out cube mesh pointing out of the front of the camera to show the line the lens is facing.
                      When I play in the viewport, the camera/cube is always pointing upwards towards the sky and not parallel to the ground.

                      I've tried all sorts to offset it, but cannot get it to work.

                      I can see from one of your videos that you have the tracker attached to the physical camera, and you mention offsets.

                      Is there any chance you could point me in the right direction for getting the tracker into the correct orientation?
                      A Blueprint example would be great.
                      Any help is greatly appreciated and received.
                      Regards
                      Rich
                      Last edited by Tricky_3D; 01-09-2020, 08:08 PM.



                        #12
                        First off, a lot of people have been asking how to migrate my virtual production template into another UE project. It turns out this wasn't easy to do (even for me!), so I spent some time working out what was going wrong, and now it's very easy. If you grab the latest project from https://github.com/MiloMindbender/UE4VirtualProduction you can look at Readme_2 for a description of how to move it. It took some time to figure this out, but now you can migrate everything to a new project in a couple of minutes.

                        The bad news is they have to do some construction in my building so my studio will probably be down for about a week. The good news is I'm moving to a new location that has a proper light grid in the ceiling, so I should be able to hook up the Optitrack and see how that works.

                        LFedit It will probably take me 1-2 weeks at least to get the OptiTrack set up because of the studio move; when it is working, I'll put up some video and project updates for it. I need to see if I can get a 4.24.1 version of the OptiTrack plugin to test, too.

                        Tricky_3D Grab my project and take a look at "CompCameraRig" in it. Under "Details", the SceneComponent location/rotation is the position of the camera (in my case, relative to the "talentmarkerseparate"). If you look at the CameraComponent, its location/rotation is the offset from the tracker to the camera's film plane. You said you had seen the picture of my camera rig; for that rig the offset is about -9, 0, -11.7 with a Y rotation of -90. There is also a "cone" mesh that represents the camera view... the location/rotation of this had to be set as well to get the point of the cone to be (approximately) where the camera lens is. If you are wondering where my motion controller is, it is inside "talentmarkerseparate", which contains Blueprint code that delays and copies the position of the tracker over to CompCameraRig.

                        Once you set the offsets right, the method you describe for making a pawn SHOULD work. To make debugging easier, I suggest you go into your motion controller and check "display device model" so you can see the tracker inside Unreal. However, be aware that the origin of the tracker model in Unreal (last time I checked anyway) is not exactly right; it seems to be in the center of the model and not at the base of the tripod screw. This doesn't affect what offsets you use, which should still be based on real-world measurements (in cm). Just be aware that even after you get the offsets right, the model of the tracker may not sit exactly on the origin of that cube.
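
                        If you'd rather see the offset idea as code than as a Blueprint, it is roughly this (illustrative C++ only, not from the project; the numbers are the ones from my rig above, and I'm assuming the editor's "Y rotation" maps to Pitch in an FRotator):

                        // Sketch: turn the (delayed) tracker pose into a camera pose by applying
                        // the measured tracker-to-film-plane offset, in cm, in the tracker's space.
                        #include "CoreMinimal.h"

                        FTransform GetCameraPoseFromTracker(const FTransform& DelayedTrackerPose)
                        {
                            // Real-world measurements from the tracker to the film plane (my rig only).
                            const FVector TrackerToFilmPlane(-9.0f, 0.0f, -11.7f);
                            // FRotator is (Pitch, Yaw, Roll); the "Y rotation of -90" goes in Pitch.
                            const FRotator TrackerToCameraRot(-90.0f, 0.0f, 0.0f);

                            const FVector CameraLocation = DelayedTrackerPose.TransformPosition(TrackerToFilmPlane);
                            const FQuat CameraRotation = DelayedTrackerPose.GetRotation() * TrackerToCameraRot.Quaternion();
                            return FTransform(CameraRotation, CameraLocation);
                        }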

                        One other thing to watch out for if you are using Composure: the process you describe will make a tracked camera that works, but Composure will not recognize it. Composure will not recognize a camera component down inside another actor; it only recognizes a camera actor. You need to create a camera actor and then copy the location/rotation into it like I do; then Composure will see the camera (this drove me crazy for a while until I figured it out!). I'm trying to figure out a better way to do this, but for now the way it's done in my project works.
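
                        The workaround itself is small; in C++ it would be roughly this (a sketch, not the project's actual Blueprint; it assumes you already have references to the Composure camera actor and the tracked camera component):

                        // Sketch: Composure only picks up a real camera actor, so every tick copy the
                        // tracked pose onto one.
                        #include "Camera/CameraActor.h"
                        #include "Components/SceneComponent.h"

                        void SyncComposureCamera(ACameraActor* ComposureCamera, const USceneComponent* TrackedCameraComponent)
                        {
                            if (!ComposureCamera || !TrackedCameraComponent)
                            {
                                return;
                            }
                            ComposureCamera->SetActorLocationAndRotation(
                                TrackedCameraComponent->GetComponentLocation(),
                                TrackedCameraComponent->GetComponentRotation());
                        }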

                        Hope this helps!

                        I encourage everyone to post their experiences publicly! I've found a lot of weird little things that drove me crazy for a while till I figured them out, so posting your failures/successes will help other people and help Epic see where improvements are needed.

                        Greg



                          #13

                          New video! This time I'm in two new virtual TV studio sets. Both are from Epic's "Virtual Studio" sample project with my virtual production template copied over. This time there is sound, I've gotten rid of the window title bar, and the lighting is better.

                          The color balance and exposure on the live camera are much better, but they still need work. The slight green cast you see is NOT from the green screen; it's from two large fluorescent fixtures right over my head that can't be turned off, clashing with the mostly daylight-balanced lights. Instead of using OBS, I recorded the fullscreen PIE window from one of my graphics card's HDMI outputs onto a Ninja V HDMI recorder. This takes load off the PC and records to ProRes, so there is no quality loss when you edit and recompress the video later.

                          As I mentioned earlier, there are some issues with the SteamVR beta 1.10.1 where trackers just don't work, so use the release version 1.9.16 for now. Also, with Unreal 4.24.1 you need to set up bindings in SteamVR for your trackers; Readme_2 in my project explains this.

                          The VIVE trackers are performing very well, as you can see when the camera moves. There is a slight jitter in the tracking from one of my VIVE base stations; I think this is because of vibration (there is a big HVAC fan in the ceiling near it). I'll be moving to a new studio area this week, and we'll see if that fixes it.



                            #14
                            Hi Greg, Greg.Corson
                            Your time and effort in replying are very much appreciated.
                            Over the coming week or so I'll be digging through your project to help get my tracker etc. working how I want it.
                            I'll be more than happy to share my results, no problem.

                            Thanks again for a great starting point; I'm liking the new video too.

                            All the best with the new studio move also.
                            Regards
                            Rich



                              #15
                              By the way, if anyone knows how to map a tracker device ID to a specific tracker role, please tell me. The new Steam input setup in 4.24.1 seems to handle binding a specific tracker role like "right foot" to something like "Special_1", but I don't see a way to get the device ID of Special_1 inside a Blueprint so I can do "get tracked device position and orientation". It seems I can enumerate all the device IDs, and I can enumerate a list of which names like Special_1 are currently tracking, but I don't see how to find the device ID for "Special_1".

                              I would like to be able to do the equivalent of a "get tracked device position and orientation" on Special_2, for example, and not have to determine its device ID by trial and error.

                              This would let me spawn a bunch of tracked objects based on the trackers that are connected.
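
                              For reference, the enumerate-and-spawn part looks roughly like this in C++ using the SteamVR plugin's function library (a sketch only; it still doesn't tell you which device ID is "Special_1", whether the pucks enumerate as "Other" may depend on your setup, and MyTrackedObjectClass is just a placeholder):

                              // Sketch: spawn an actor for every SteamVR tracked device we can find.
                              #include "SteamVRFunctionLibrary.h"
                              #include "Engine/World.h"
                              #include "GameFramework/Actor.h"

                              void SpawnObjectsForTrackers(UWorld* World, TSubclassOf<AActor> MyTrackedObjectClass)
                              {
                                  TArray<int32> DeviceIds;
                                  // Pucks usually show up under "Other"; this may vary with your bindings.
                                  USteamVRFunctionLibrary::GetValidTrackedDeviceIds(ESteamVRTrackedDeviceType::Other, DeviceIds);

                                  for (int32 DeviceId : DeviceIds)
                                  {
                                      FVector Position;
                                      FRotator Orientation;
                                      if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(DeviceId, Position, Orientation))
                                      {
                                          World->SpawnActor<AActor>(MyTrackedObjectClass, Position, Orientation);
                                      }
                                  }
                              }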

                              Greg

