Mixed Reality Capture Feedback for Epic

    #31
    I've spent several hours trying to get a calibration to work, with little luck. A few questions for you all: attached is my camera setup; is this right? The pilot light of the tracker is on the top arm. Does it have to be in a particular orientation to work?
    [Attached image: IMG_2607.jpg]
    The GoPro Hero 4 is set to "linear" mode, which is supposed to remove the worst of the fish-eye distortion, but I can't get the reprojection error below about 1.5 after multiple tries. The camera is looking at my green screen, but there is a fair amount of "not green" around all the edges.
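For context on what that number means: reprojection error is the RMS pixel distance between where the fitted camera model predicts each calibration point should land and where it was actually detected, so ~1.5 means the model is off by about a pixel and a half on average. A minimal pinhole-camera sketch in plain Python, with made-up focal length and point values (this is not the Epic tool's code, just the definition of the metric):

```python
import math

def project(point3d, fx, fy, cx, cy):
    """Project a 3D point (camera coordinates) to pixels with an
    ideal pinhole model (no lens distortion term)."""
    X, Y, Z = point3d
    return (fx * X / Z + cx, fy * Y / Z + cy)

def rms_reprojection_error(points3d, observed_px, fx, fy, cx, cy):
    """RMS distance (pixels) between where the model says each point
    should appear and where it was actually detected."""
    total = 0.0
    for p3, (u_obs, v_obs) in zip(points3d, observed_px):
        u, v = project(p3, fx, fy, cx, cy)
        total += (u - u_obs) ** 2 + (v - v_obs) ** 2
    return math.sqrt(total / len(points3d))

# Hypothetical example: three points, each detection off by 1.5 px.
pts = [(0.1, 0.0, 1.0), (0.0, 0.1, 1.2), (-0.1, -0.05, 0.9)]
fx = fy = 800.0
cx, cy = 640.0, 360.0
ideal = [project(p, fx, fy, cx, cy) for p in pts]
noisy = [(u + 1.5, v) for (u, v) in ideal]
print(rms_reprojection_error(pts, noisy, fx, fy, cx, cy))  # 1.5
```

Residual fish-eye distortion that the pinhole model can't express shows up directly as this error, which is why the GoPro lens mode matters.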

    I go through the calibration process with the controllers; on my 50" TV monitor, the circles and crosses indicating alignment are usually an inch or two apart. When I get to the 4th step of the guide https://docs.unrealengine.com/en-US/...ool/index.html I can't seem to get the images of the controller to align; some of them are always way out of place. For this step, are you supposed to use the same controller for all the boxes?

    When I get to step 5, it doesn't look even remotely like the picture in the guide. Do you need to uncover and be wearing the HMD for things to look like they do in the guide? I don't see the controllers being replaced by the hand models, for instance. I didn't do much more experimenting because it was getting dark.

    I guess my main questions are:
    1. Does the camera/tracker mount look OK? The guide doesn't mention any particular orientation being needed.
    2. Could the GoPro "linear" mode be messing things up? Should I run it in "medium" instead?
    3. How close have you managed to get the circles and crosses to line up in step 4? For me they seem to be farther apart than in the guide.

    Just as feedback to Epic, the calibration tool UI is pretty rough. If you don't have time to improve it right now, better documentation or a video of someone doing the process would really help to make sure the usage is understood!

    I've used this same setup with LIV VR/mixed reality, which uses the camera tracker and calibrates with just 3 controller clicks. It works perfectly; I'm not sure I understand why this process is so much more involved. I'm probably missing something obvious here.



      #32
      I tried and failed to get a successful calibration many times with this plugin; it really needs work. However, I have gotten Mixed Reality virtual production working pretty well without it. You can see some samples and an example project elsewhere on the forums (search for greg.corson) and on my YouTube channel. https://www.youtube.com/user/GregCorson

      Just for general mixed reality and virtual production work, Epic really needs a good solution for syncing up and matching the different latencies of video, webcams, trackers, etc. Their current solution only seems to work with the "pro" video cards and LiveLink trackers.
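The usual technique for matching delayed video with low-latency tracker data is to keep a short history of timestamped poses and, for each arriving video frame, look up the pose recorded at (arrival time minus the measured video latency). A sketch of that buffering idea in plain Python, with hypothetical numbers (a 90 Hz tracker and ~120 ms of video latency); this is the general approach, not Epic's or LiveLink's actual API:

```python
from bisect import bisect_left
from collections import deque

class TrackerHistory:
    """Ring buffer of timestamped tracker poses, so a delayed video
    frame can be matched with the pose from when it was captured."""

    def __init__(self, max_samples=512):
        self.times = deque(maxlen=max_samples)
        self.poses = deque(maxlen=max_samples)

    def push(self, t, pose):
        # Samples must arrive in time order for bisect to work.
        self.times.append(t)
        self.poses.append(pose)

    def sample_at(self, frame_arrival_time, video_latency):
        """Return the stored pose nearest the frame's capture time."""
        target = frame_arrival_time - video_latency
        times = list(self.times)
        i = bisect_left(times, target)
        if i == 0:
            return self.poses[0]
        if i == len(times):
            return self.poses[-1]
        # Pick whichever neighbouring sample is closer in time.
        if target - times[i - 1] <= times[i] - target:
            return self.poses[i - 1]
        return self.poses[i]

# Hypothetical numbers: 90 Hz tracker, video arriving ~120 ms late.
h = TrackerHistory()
for k in range(90):
    h.push(k / 90.0, {"t": k / 90.0})
pose = h.sample_at(0.5, 0.120)  # pose nearest capture time 0.38 s
```

In practice the hard part is measuring the latency per device, which is presumably why Epic's current solution leans on genlocked "pro" capture cards.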



        #33
        Hello. My configuration (920 + HTC Vive controller) works very well in the tool, but there isn't any camera tracking in my project, and the camera spawns at the same position as the vr_pawn, at floor level. I also tested the Virtual Reality Spectator Screen, and it works well.
        Do I need the Vive Tracker?



          #34
          Originally posted by marcomanni View Post
          Hello. My configuration (920 + HTC Vive controller) works very well in the tool, but there isn't any camera tracking in my project, and the camera spawns at the same position as the vr_pawn, at floor level. I also tested the Virtual Reality Spectator Screen, and it works well.
          Do I need the Vive Tracker?
          OK, this is on Unreal Engine 4.22. It worked very well in 4.20.



            #35
            Hi,
            I am trying to use the tool with a Magewell USB Capture device, but I do not get any valid video input.
            Does anybody know if there is a chance to get access to the source code to extend it to further devices?
            Thanks for your support and stay healthy!



              #36
              Hi! Great topic, I got a lot of the info I needed for my work! For one, I was wondering why my camera orientation was way off (while using a Vive Tracker), but apparently positioning it "on its back" behind the camera was the reason. I haven't yet tried re-calibrating with it attached at a 90-degree angle, though.

              I agree that the calibration process itself isn't simple to get done. I had a lot of problems with lens calibration (I eventually got it on the 6th try, with a pretty OK result). At that point I just wanted to get it done to some extent, to get at least some results and an idea of how to improve. As many have mentioned, the documentation could be more comprehensive.


              I had a question about the MRC tool’s output in the actual project, but I’ll make a separate topic for it since it's not really feedback for the devs.

