Virtual Production and Virtual Set samples really needed!

    I've been trying to do a very basic demo of a virtual set and have really been struggling due to the lack of documentation/samples for the latest features. While searching for answers I've noticed a lot of other requests for help going unanswered too, so I'm not the only one.

    It would be really helpful if the UE team could provide a virtual set sample with a moving camera (Logitech 920 webcam, or a GoPro/DSLR on a capture device) tracked by a VIVE tracker or other motion controller.

    It's taken me forever to get something pretty basic almost working, so even a REALLY SIMPLE example would be a huge help! For example, having one of the CG Paragon characters standing next to a person in live video, or having a CG car inserted into live video where the camera moves around an empty garage.

    Obviously, this kind of example could get pretty involved, but it's the basics that seem to be killing people, so just skip the fancy stuff and do a basic example of inserting a CG object into live moving-camera video, delaying the tracker data so it matches up with the video. Don't worry about things like perfect lighting, shadow catching, etc. Do something basic and save the rest for a follow-on. It would help a lot of people.

    I managed to get something very basic going, but it took a lot of time to figure out and I still don't know how to delay the tracker data.

    You've been pushing virtual production really hard lately, so please supply some BASIC samples and docs to help the rest of us get started.

    #2
    Hi Greg.Corson!

    At the moment there have been discussions about expanding our current Virtual Studio sample project, or making a new sample that covers a lot of use cases; however, there is no ETA or commitment that I'm aware of to that end just yet.

    For documentation around Virtual Production and our Composure system, another member of our team and I share a lot of the documentation in this area. I can't necessarily speak to all elements of virtual production, but with regards to Composure, once I ship all my documentation updates for the 4.23 release, I'm planning to get back to work on the Composure documentation to clean it up, organize it, and expand it. Understandably, there are a lot of use cases that can be done but aren't documented, so I'll need to set aside a good bit of time with our engineers figuring out what our documentation needs are to cover all these use cases, and then set out goals and plans for what will go into that doc set and where to start, based on the list we come up with.

    Virtual Production, while still a new set of features, is rapidly evolving and growing, and I understand how important an area it is for our customers. It's high on my priority list of things I want to address post-4.23, but I'll have to wait until I can actually get it on my schedule and can sit down and start chatting with engineers to get things in motion. Hopefully soon, though!

    In the meantime, please list and organize any requests you have in this post. I'll add them to my list when I start discussing this work with the engineers and setting priorities.

    Tim Hobson | Learning Resources | Epic Games
    UE4 Documentation



      #3
      Tim,

      Right now my biggest question is how to delay the Vive Tracker data so it will sync up with the video capture, which is several frames behind.
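
      Conceptually, what I picture is buffering timestamped tracker transforms and applying the one from roughly (now - delay) to the camera each tick. Here's a rough, untested sketch of the idea; VideoDelaySeconds and the component variables are placeholders, not anything built in:

          #include "MotionControllerComponent.h"

          // Rough sketch (untested): keep a short history of tracker transforms and
          // apply the sample closest to (Now - VideoDelaySeconds) to the camera.
          struct FStampedTransform
          {
              double Time;
              FTransform Transform;
          };

          TArray<FStampedTransform> History;
          float VideoDelaySeconds = 0.1f; // ~3 frames at 30 fps; tune per capture device

          void ApplyDelayedTracking(UMotionControllerComponent* Tracker, USceneComponent* Camera)
          {
              const double Now = Tracker->GetWorld()->GetTimeSeconds();
              History.Add({ Now, Tracker->GetComponentTransform() });

              // Drop samples older than we will ever need again.
              while (History.Num() > 0 && History[0].Time < Now - VideoDelaySeconds - 1.0)
              {
                  History.RemoveAt(0);
              }

              // Apply the newest sample at or before the delayed timestamp.
              const double Target = Now - VideoDelaySeconds;
              for (int32 i = History.Num() - 1; i >= 0; --i)
              {
                  if (History[i].Time <= Target)
                  {
                      Camera->SetWorldTransform(History[i].Transform);
                      break;
                  }
              }
          }

      Interpolating between the two samples that bracket the target time would be smoother, but even a nearest-sample delay like this would show whether the sync problem is fixed. Is something like this the intended approach, or is there a built-in way to do it?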

      A second request would be recommendations for how to create a properly rigged object in UE that takes into account the difference in position/orientation between the camera and the tracker. I currently use a rig like the one in the attached picture, but haven't tried compensating for the couple of inches of offset between the two devices yet. I've also seen posts where people attach the VIVE tracker directly to a DSLR's hot shoe, but they are also having trouble adjusting the coordinate system correctly.
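
      For the offset, I assume the fix is composing a fixed, measured tracker-to-camera transform with the live tracker transform every frame. A minimal untested sketch (the offset numbers are made up; you'd measure your own rig):

          // Rough sketch (untested). TrackerToCamera is the fixed offset from the
          // tracker's origin to the camera's optical center, measured on the rig (cm).
          const FTransform TrackerToCamera(FRotator::ZeroRotator, FVector(0.f, 5.f, -7.f));

          void UpdateCameraFromTracker(UMotionControllerComponent* Tracker, USceneComponent* Camera)
          {
              // In UE, (Local * Parent) composes a local transform with a world transform.
              Camera->SetWorldTransform(TrackerToCamera * Tracker->GetComponentTransform());
          }

      Attaching the camera component as a child of the tracker component with that same relative transform should accomplish the same thing without per-frame code, but it would be good to have the official word on it.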

      It would also be very nice if you could have cine camera presets for the Logitech 920 pro webcam and the different GoPro models. I think I've got the parameters right, but it would be nice if these common cameras were built in.
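
      The closest I've come on the 920 is picking an arbitrary 16:9 filmback and deriving the focal length from the camera's horizontal field of view. An untested sketch, assuming CineCamera is a UCineCameraComponent* and using the 4.23-era FilmbackSettings member (the ~70 degree figure is approximate; measure your own unit):

          const float SensorWidthMm = 6.4f;
          const float SensorHeightMm = 3.6f;   // 16:9; only the aspect/FOV ratio matters
          const float HorizontalFovDeg = 70.f; // approximate for the Logitech 920

          CineCamera->FilmbackSettings.SensorWidth = SensorWidthMm;
          CineCamera->FilmbackSettings.SensorHeight = SensorHeightMm;
          // Focal length that reproduces the desired horizontal FOV on that filmback.
          CineCamera->CurrentFocalLength =
              SensorWidthMm / (2.f * FMath::Tan(FMath::DegreesToRadians(HorizontalFovDeg) / 2.f));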

      For doing virtual sets I didn't find the Composure sample or the Virtual Studio sample very helpful, because neither one included a live tracked/movable camera, and that was where I had most of my problems.

      A really simple example using a Logitech 920 pro webcam (the one you recommend for mixed reality) and a VIVE tracker would be great. I'd be very happy with something that just inserted a few CG rendered cubes into the live video feed from the camera so that they stay on the same "real world" spot when the camera is moved. In other words, the CG object/character is the foreground and the video feed of the room is the background (and the camera moves).

      I'm sure someone who knows what they are doing could put a sample like this together really quickly. Just do a basic example, and save the hard stuff like color correction, lighting matching, chroma key, and other more involved things for a follow-on example.




        #4
        One other item that just came up: how can UE be frame-synced/genlocked to a video source other than a Blackmagic or AJA card? Is it possible to frame sync with a webcam or other video capture device?



          #5
          Tim,
          After thinking about it a bit, here are some of the things I would really like to see documents/samples (or built-in solutions) for:

           1. Doing this with WMF or other webcam/capture devices as well as PRO cards.
           2. How to sync the Unreal framerate to a video source without needing to use a PRO card (see the sketch after this list).
           3. The proper way to enumerate/configure the video so it will work right when the hardware being used (webcam, capture card, etc.) is changed.
           4. The proper way to get input from multiple trackers and route them to different pawns/cameras.
           5. A clean way to delay ALL the tracker data so it is in sync with the video (i.e., compensate for latency/delay in the video capture).
           6. How to properly set up the Cine camera to emulate some common cameras and fix up their lens distortion (e.g., Logitech 920 webcam, GoPro, etc.).
           7. Best practices for recording the output from Unreal (OBS sometimes records black frames or tooltip text instead of the PIE or editor viewport).
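
          On #2, the closest thing I've found without a pro card is locking the engine to the capture device's rate. That isn't true genlock, but it keeps the two from drifting. A minimal untested sketch; the same settings are exposed in Project Settings under Engine > General Settings > Framerate:

              // Rough sketch (untested): lock the engine tick to the capture rate.
              // Equivalent to enabling "Use Fixed Frame Rate" in Project Settings.
              GEngine->bUseFixedFrameRate = true;
              GEngine->FixedFrameRate = 30.f; // match the webcam/capture device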

          Hope this helps.



            #6
            Hi all - I use UE as a VP engine with streaming serial data from a camera system, giving position data to UE.

            Regarding incoming video to UE, one very important issue I found is sync. Since there is normally no sync input or output on webcams and the like, I can recommend using a capture card like the "cheap" BMDs - not crazy expensive, and you can get a one-way card for, I think, 150 dollars. This also helps keep latency low, since sourcing through serial can be a pain sometimes. UE has done a good job on the BMD plugin, which is now native, and latency can be controlled with a Blueprint as far as I understand.

            VIVE trackers are an economical tracker, but look out for frame drops. Check out how MoSys and Stargate Studios use UE and a pro tracker to drive the parallax on a monitor. However, when you want lens data like zoom and focus, VIVE will fall short, as there is no input for these parameters.

            Regarding lens distortion: it's one of the areas that the pro virtual studio providers keep close to the body. To do it properly you will need a proper lens distortion mapping, and there are several ways to achieve this. For a "quick and dirty" way with fixed lenses, you can shoot a square grid at medium-range focus and distort the UE-generated image (or undistort the lens image) before overlaying. It's not always necessary, though.

            Of course, field of view is important. That is easy to calculate if you can measure the distance to a wall with a vertical line, pan the camera to each edge, get the angles, and then use the FOV formula.
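
            As a rough sketch of that formula, one way is to frame a span of known width on the wall so it exactly fills the image at a measured distance:

                // One way to get horizontal FOV: a span of width W (same units as D)
                // exactly fills the frame at distance D from the lens.
                float HorizontalFovDegrees(float W, float D)
                {
                    return 2.f * FMath::RadiansToDegrees(FMath::Atan((W * 0.5f) / D));
                }
                // e.g. a 2.5 m span filling the frame at 1.8 m gives about 70 degrees.

            Just my 2 cents on the subject. Cheers, Allan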

