
My Virtual Production Sample Project and Tutorial


    Hi Everyone,

    I finally got a basic Virtual Production setup running using VIVE trackers. There seemed to be a lot of interest in how to do this, so I thought I would start a fresh post with my example project and tutorial videos to make it easier for other people to get started. The base setup for the project works with a VIVE headset and VIVE tracking pucks. The project can be used with almost any webcam or video input device that works with Unreal; right now it is set up for the AJA KONA-HDMI, but it should not be too hard to modify it for another camera. It could also be modified to work without any tracking gear, which would make the setup really cheap, but without a tracker you can't move the camera while filming.

    The project handles delaying the tracker data to sync it up with the video, since the video usually arrives several frames behind the tracking. It also includes a tracked object (kind of a big cartoon hammer) that follows a second VIVE tracker you can hold in your hand. You could attach the tracker to a handle or selfie stick to make it easier to hold.
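    If you want to see the idea in code, here's a minimal C++ sketch of the delay (the project itself does this with a Blueprint; the component and property names below are just for illustration):

```cpp
// TrackerDelayComponent.h
// Minimal sketch of delaying tracker data to match video latency. Each tick you push
// the live tracker transform and get back the one from DelayFrames ticks ago, which is
// what actually drives the virtual camera.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "TrackerDelayComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UTrackerDelayComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // How many frames the video lags behind the tracker (tune by eye or with a clap test).
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Sync")
    int32 DelayFrames = 4;

    // Push the newest tracker transform, get back the delayed one for the camera.
    UFUNCTION(BlueprintCallable, Category = "Sync")
    FTransform PushAndGetDelayed(const FTransform& LiveTrackerTransform)
    {
        Buffer.Add(LiveTrackerTransform);

        // Keep only the last DelayFrames + 1 samples; the oldest one is the delayed pose.
        while (Buffer.Num() > DelayFrames + 1)
        {
            Buffer.RemoveAt(0);
        }
        return Buffer[0];
    }

private:
    TArray<FTransform> Buffer;
};
```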

    It is set up to do background removal using a green screen, so you can put yourself in the middle of a totally virtual set. There is a tracked garbage matte, so no matter how you move the camera, only the real-world area covered by the green screen is shown.
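    To give a rough idea of what the tracked garbage matte amounts to (this is just an illustration, not exactly how the project implements it): the matte is basically a plane placed where the physical green screen sits, so compositing only keeps keyed pixels inside it as the tracked camera moves. The helper name below is hypothetical, and it assumes the engine's basic 1 m x 1 m plane mesh with measurements in centimeters (Unreal units):

```cpp
#include "CoreMinimal.h"

// Build the transform for the matte plane from real-world measurements of the green screen.
FTransform MakeGarbageMatteTransform(const FVector& ScreenCenter,   // measured center of the screen
                                     const FRotator& ScreenFacing,  // which way the screen faces
                                     float ScreenWidthCm,
                                     float ScreenHeightCm)
{
    // Scale the 100 cm x 100 cm basic plane up to the real screen size.
    const FVector Scale(ScreenWidthCm / 100.f, ScreenHeightCm / 100.f, 1.f);
    return FTransform(ScreenFacing, ScreenCenter, Scale);
}
```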

    The project only uses the "3rd person" starter map so it is really small and quick to download. But it works just as well with a more detailed map. The same setup can also be used to insert virtual characters and objects into the real world, like a CG character sitting on your living room couch.

    When you look at my stuff, please remember these are just starter projects, in many cases the lighting is kind of bad and there is no color correction. I'll be showing how to improve this in future videos.

    Below is the whole setup I have; you can use cheaper (or more expensive) stuff than this. You can modify the project for almost any kind of tracker, or no tracker if you don't want the camera to move. It can also be modified to work with nearly any type of camera and video capture card, or even a webcam. So it is possible to get started pretty cheaply.

    PC with NVIDIA RTX 2080 Ti card (the example uses less than 20% of this)
    VIVE Pro VR bundle
    2 VIVE tracking pucks
    AJA KONA-HDMI 4-input HDMI capture card
    Full-frame camera with 35mm lens and HDMI output
    Green screen (I started with a 10-foot-wide one from Amazon; my current one is bigger)

    Happy to answer questions, and suggestions are welcome. I'll be doing more videos on this in the future.
    Please post your own work too, I want to see it!

    Project (UE 4.23) is here: https://github.com/MiloMindbender/UE4VirtualProduction
    YouTube examples and tutorials are here: https://www.youtube.com/user/GregCorson



    #2
    Hi Greg,

    Do you have any examples of the camera moving while you're in the scene, to show parallax/movement within the space you're shooting? I started a project like this a while back but wasn't able to get camera motion working correctly. The virtual-studio example on your YouTube channel, with camera motion added, is what I'm looking to accomplish. Thanks so much for posting your work; I'm looking forward to diving into it when I get some free time from client work.



      #3
      That's pretty sweet.



        #4
        LFedit,

        My problem is that I'm mostly working by myself, so there's nobody to operate the camera for me. That's why my samples have that chair in them; it's a stand-in for a person! In the last demo I did, I do move the camera around before I walk into frame, and I think the chair is tracking pretty well, and the perspective seems right when I walk into frame. I'll have to try doing a sample where I walk around some more. I think the perspective is OK because my virtual and real sets are in exactly the same scale with the same camera setup, so as long as I haven't messed up something like the measurements between the camera and the tracker, the perspective should just be right.
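        If it helps, here's a rough C++ sketch of what that camera-to-tracker measurement feeds into (the project itself is Blueprint based; the function name and the offset numbers are placeholders for your own measurements):

```cpp
#include "CineCameraComponent.h"
#include "Components/SceneComponent.h"

// The VIVE puck is mounted on the camera rig, so the virtual camera has to be offset
// from the puck by the physically measured distance to the camera's sensor.
void SetupCameraOffset(USceneComponent* TrackedPuck, UCineCameraComponent* VirtualCamera)
{
    // Parent the virtual camera to the tracked puck so it follows the tracked motion.
    VirtualCamera->AttachToComponent(TrackedPuck,
                                     FAttachmentTransformRules::KeepRelativeTransform);

    // Measured with a tape: e.g. the puck sits 12 cm behind and 5 cm above the sensor,
    // so the camera is 12 cm forward and 5 cm down from the puck. If these numbers are
    // off, the parallax will look wrong as soon as the camera moves.
    VirtualCamera->SetRelativeLocation(FVector(12.f, 0.f, -5.f));
    VirtualCamera->SetRelativeRotation(FRotator::ZeroRotator);
}
```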

        The main problem I'm having is that there is a bit of jitter in the VIVE tracking that you can sometimes see; I need to see if I can come up with a way to smooth that out a bit. The other issue with the VIVE is that the tracker data doesn't have any timestamps on it and my camera doesn't have timecode, so I can't precisely sync the two. I'm pretty close, like within 1/60th of a second, but the tracker and camera don't run at the same frame rate, so there could be some wobble there.

        Right now I'm mainly trying to fix up that horrible lighting; it was easier a few months ago when there was a ton of light coming through the windows behind me all day long ;-)

        Greg



          #5
          Thanks for the reply, Greg. I've also had issues with jitter in the VIVE tracker. I've looked into the high-end trackers for this type of work, but they start at around $60k, so it looks like the VIVE trackers are one of the only "low budget/accessible" solutions. I've gotten around the jitter issues by exporting the FBX data from the tracker and post-processing it in DCC applications like Cinema 4D and Blender. It does work, but needing that post-production step makes the workflow a long process, and it defeats the real-time aspect.

          The chair does seem to stick into the virtual scene well. Thanks for pointing that out. I'm exploring this technique for a client project, and some of the issues you're finding are what kind of derailed me from pursuing it further. I'll keep exploring, and I hope you keep posting your findings and examples. It's inspiring to see you pioneering the accessible options for virtual production.

          Cheers.



            #6
            LFedit,

            Something I haven't had the time to try yet, but it could solve the jitter problem: I'm thinking about writing a Blueprint that takes some number of tracker samples and applies some kind of smoothing function to them. The Blueprint would be very similar to the one I use to delay the tracker data to sync it up with the video. Before I do this, I'm going to write a Blueprint to actually measure the jitter over time and spit out some numbers (a rough sketch of both ideas is below). Hopefully I can get a bunch of other people to run this as a test, so we can see if everyone's setup is about the same or if some are better.
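            As a starting point, here's a minimal C++ sketch of both ideas (the real thing would be a Blueprint; the simple standard-deviation metric and exponential filter are just one way to do it):

```cpp
#include "CoreMinimal.h"

// (1) Jitter metric: standard deviation (in cm) of the tracker position over a short
// window of recent samples, taken while the tracker is sitting still.
float MeasureJitterCm(const TArray<FVector>& RecentPositions)
{
    if (RecentPositions.Num() < 2)
    {
        return 0.f;
    }

    FVector Mean = FVector::ZeroVector;
    for (const FVector& P : RecentPositions) { Mean += P; }
    Mean /= RecentPositions.Num();

    float SumSq = 0.f;
    for (const FVector& P : RecentPositions)
    {
        SumSq += FVector::DistSquared(P, Mean);
    }
    return FMath::Sqrt(SumSq / (RecentPositions.Num() - 1));
}

// (2) Exponential smoothing: blend the previous smoothed pose toward the newest sample.
// Alpha near 0 means heavy smoothing (more lag); near 1 means light smoothing (less lag).
FTransform SmoothTrackerSample(const FTransform& Smoothed, const FTransform& NewSample, float Alpha)
{
    const FVector Pos = FMath::Lerp(Smoothed.GetLocation(), NewSample.GetLocation(), Alpha);
    const FQuat Rot = FQuat::Slerp(Smoothed.GetRotation(), NewSample.GetRotation(), Alpha);
    return FTransform(Rot, Pos, NewSample.GetScale3D());
}
```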

            The other thing that is really needed is shadows. I need to figure out how to transfer shadows falling on the green screen into the virtual set so the talent can cast a shadow on the floor. Same thing the other way around: I need a setup so the CG objects can cast shadows on a real-world floor.

            Of course, you can always "cheat" and set up the lighting so nothing casts much of a shadow, but that doesn't look as realistic.
