
Mixed Reality Capture Feedback for Epic


    I've spent several hours trying to get a calibration to work, with little luck. A few questions for you all... Attached is my camera setup; is this right? The pilot light of the tracker is on the top arm. Does it have to be in a particular orientation to work?
    [Attached image: IMG_2607.jpg, showing the camera/tracker mounting]
    The GoPro Hero 4 is set to "linear" mode, which is supposed to remove the worst of the fish-eye distortion, but I can't get the reprojection error below about 1.5 after multiple tries. The camera is looking at my green screen, but there is a fair amount of "not green" around all the edges.

    I go through the calibration process with the controllers; on my 50" TV monitor the circles and crosses indicating alignment are usually an inch or two apart. When I get to the 4th step of the guide I can't seem to get the images of the controller to align; some of them are always way out of place. For this step, are you supposed to use the same controller for all the boxes?

    When I get to step 5, it doesn't look even remotely like the picture in the guide. Do you need to uncover and be wearing the HMD for things to look like they do in the guide? I don't see the controllers being replaced by the hand models, for instance. I didn't do much more experimentation because it was getting dark.

    I guess my main questions are:
    1. Does the camera/tracker mount look OK? The guide doesn't mention any particular orientation being needed.
    2. Could the GoPro's "linear" mode be messing things up? Should I run it in "medium" instead?
    3. How close have you managed to get the circles and crosses to line up in step 4? For me they seem to be farther apart than in the guide.

    Just as feedback to Epic, the calibration tool UI is pretty rough. If you don't have time to improve it right now, better documentation or a video of someone going through the process would really help make sure the usage is understood!

    I've used this same setup with LIV (VR/mixed reality capture software), which uses the same camera tracker and calibrates with just three controller clicks. It works perfectly; I'm not sure I understand why this process is so much more involved. I'm probably missing something obvious here.


      MRC could be a huge thing for business users, but I'm a bit worried that Epic basically farted out some kind of beta implementation and forgot about it. I have successfully used MRC with a real client, but the setup process is too slow, unreliable, and far from user-friendly.

      Does anyone know if it's possible to rotate the virtual mixed reality capture camera around the SteamVR origin point? That way it could be calibrated as a fixed camera without a tracker, but rotated from code to show more of the surroundings.
      Last edited by tmammela; 12-15-2019, 10:09 AM.


        I tried and failed to get a successful calibration many times with this plugin; it really needs work. However, I have gotten Mixed Reality virtual production working pretty well without it. You can see some samples and an example project elsewhere on the forums (search for greg.corson) and on my YouTube channel.

        For general mixed reality and virtual production work, Epic really needs a good solution for syncing up and matching the different latencies of video, webcams, trackers, etc. Their current solution only seems to work with the "pro" video cards and LiveLink trackers.