Mixed Reality Capture Feedback for Epic

It seems the tracker has to be in a certain orientation to match the live video correctly. In my case I’m using a Logitech C920 with a Vive tracker and a 3D-printed adapter that mounts the tracker just above the camera; with LIV this setup never gave me any weird issues. With the Unreal MR plugin, however, the tracker has to be rotated 90 degrees in Z (which is odd, since that doesn’t match Vive’s tracker documentation).
Since the .sav file generated by the calibration tool isn’t editable as plain text, I now have to redo the whole calibration process just because I needed to adjust the tracker’s rotation axis. I know this tool is still a work in progress, but it would help a lot if the documentation were clearer about all of this.
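To illustrate, here is a minimal sketch of the kind of fixed 90-degree Z (yaw) offset I mean, written as plain Python rather than actual Unreal code (the function name and the sign of the offset are hypothetical; the right sign would depend on your rig):

```python
def compose_yaw(tracker_yaw_deg: float, offset_deg: float = 90.0) -> float:
    """Compose a fixed Z-axis (yaw) offset with the tracker's reported yaw,
    wrapped into the [-180, 180) range Unreal uses for rotator components."""
    yaw = tracker_yaw_deg + offset_deg
    return (yaw + 180.0) % 360.0 - 180.0

# A tracker reporting 135 degrees of yaw, corrected by +90, wraps to -135.
print(compose_yaw(135.0))  # -135.0
```

If the tool exposed an offset like this (or made the .sav file editable), the tracker’s mounting orientation could be corrected without redoing the whole calibration.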

Here is a video of the issue I’m having with the tracker being rotated inside the editor, even after a successful calibration:

I already rebuilt a custom rig that sets the tracker in the proper direction relative to the camera, and it’s working OK. I still have a couple of issues: the 3D hand models don’t appear in the mixed reality output, and there is still a significant offset between grabbed objects and the controllers. Is there any way to adjust the controller offset without redoing the whole calibration process? (By the way, during the calibration process the hands/controllers were perfectly aligned…)
Thanks in advance!

Will there be any new release of the calibration tool? We recently tried to use it on a customer project and just couldn’t get it calibrated in the customer’s green room.

We ran into multiple issues, from not getting the first value below 1.8 to inverted controls, etc.


I tested the plugin and it worked fine! What I’m looking at now is moving the camera, and I have two questions:

1. If I attach a Vive tracker to the camera with the proper mount, will it work plug-and-play, or does something need to be configured in the plugin? The tutorial shows the tracker attached to the camera but says nothing about the configuration.

2. Where in the editor can I find the mixed reality camera, so I can move it using the keyboard?

Many thanks

I have had pretty much the same experience. Quite easy to set up, BUT the virtual spectator camera often ends up 90 degrees off from the expected position. This often happens when the pawn is moved inside the level (by teleporting) or when the player changes level.

Some documentation on how to perform movement without messing up the mixed reality camera would be highly appreciated.
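To illustrate what I mean, here is a minimal yaw-only sketch (hypothetical, plain Python rather than Unreal code) of how the spectator camera’s world pose should be derived: as the pawn’s world pose composed with the fixed, calibrated local offset. If a teleport recomputes that local offset from stale world data instead of keeping it fixed, the camera ends up rotated relative to where it should be:

```python
import math

def camera_world(pawn_pos, pawn_yaw_deg, local_pos, local_yaw_deg):
    """Compose the pawn's world pose with the calibrated camera-local pose
    (2D positions, yaw in degrees)."""
    rad = math.radians(pawn_yaw_deg)
    cx = pawn_pos[0] + local_pos[0] * math.cos(rad) - local_pos[1] * math.sin(rad)
    cy = pawn_pos[1] + local_pos[0] * math.sin(rad) + local_pos[1] * math.cos(rad)
    return (cx, cy), pawn_yaw_deg + local_yaw_deg

# Calibrated local offset: 1 unit in front of the pawn, no extra yaw.
# As long as the local offset stays fixed, the camera follows any teleport.
before = camera_world((0.0, 0.0), 0.0, (1.0, 0.0), 0.0)
after = camera_world((10.0, 5.0), 90.0, (1.0, 0.0), 0.0)
```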

Hi, thanks for the tool. I’m using it with the Oculus; I have two Touch controllers and a third one set up as a VR Object. How can I use the VR Object as the tracked object?
The documentation only says what to use for the Vive:
For the HTC Vive, the first tracker will be named “Special_1” in the attachments list.
Also, for the next release of the tool, maybe you could move the tracked-object detection before the Lens Calibration or the Alignment Calibration steps.

Sorry for the bad English.

Hi, is this thread related to AR too, like the AR sample? Or is it only for more complex MRC?

I purchased a Magewell USB Capture HDMI 4K Plus and connected my DSLR camera through the capture card on my desktop computer, and it doesn’t work: it shows only a blank white screen. I tested on another similar configuration and it won’t work there either. After I connected it to my Dell M6700 laptop, however, it captures the video from the DSLR. I need help solving this issue.

I have tried the Magewell Pro Capture PCI card, an Avermedia 4K Pro PCI card, an Avermedia LGX2 USB, and an Elgato Cam Link 4K USB. So far only the Elgato works. The Avermedia devices give me a “can not open” error, and the Magewell opens with over 300 formats listed, but there doesn’t seem to be video in any of them. I also tried a Blackmagic Intensity Pro 4K card, which doesn’t want to work with my cameras at all; I suspect the card itself is defective.

I’m wondering whether I’m using the right Mixed Reality setup. I’m looking to do something like a virtual set piece. For example, suppose you have a live video feed of a room and you want to make a model of a sports stadium appear on the floor in front of the presenter. For simplicity, the presenter is NOT going to walk in front of or occlude the model, so no green screen or multi-layer compositing should be needed. Are MRCalibrate and the mixed reality plugin still the way to go?

I’m also wondering why the chroma-key is set up in MRCalibrate. There you have to enter colors as numbers, which can be difficult. Why isn’t the chroma-key setup done in Composure, where you have a GUI and color pickers to adjust things? Related to that, how can you set up a multi-layer composite with some rendered objects behind the live video and others in front?
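To show what I mean by “colors as numbers”, here is a hypothetical sketch of what a numeric chroma-key entry amounts to: a key color as normalized RGB floats plus a tolerance, with a pixel keyed out when its distance to the key color falls inside the tolerance. The actual MRCalibrate keyer math may differ; this just illustrates why typing these numbers by hand is harder than using a color picker:

```python
def is_keyed(pixel, key=(0.0, 1.0, 0.0), tolerance=0.35):
    """True if the pixel is close enough (Euclidean distance in RGB)
    to the key color to be removed by the keyer."""
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key)) ** 0.5
    return dist <= tolerance

print(is_keyed((0.1, 0.9, 0.1)))  # True  (close to pure green: keyed out)
print(is_keyed((0.8, 0.2, 0.2)))  # False (reddish tone: kept)
```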

Maybe I’m missing something because I’m new to this and haven’t gotten the whole MRCalibrate thing to work yet. I think I may have been hitting the problem mentioned earlier in the thread, where the reprojection calibration doesn’t work right in front of a green screen. I had a camera setup that saw almost 100% green screen in the background and couldn’t get the error below 5 to save my life. Later I used a wider-angle lens that only had the green screen in the center, with a lot of non-green visible on all four sides, and this calibrated on the first try. I need to go back and retry calibration of the first lens against some background other than the green screen and see what happens.

I am using a Vive tracker on the camera. The camera and the tracker are mounted to an L-bracket, so the tracker sits several inches behind the camera where it can see the Vive lighthouses.

Are there any good MR sample projects out there? I’ve been looking but haven’t found a good one (yet).

I’ve spent several hours trying to get a calibration to work, with little luck. A few questions for you all… Attached is my camera setup; is this right? The pilot light of the tracker is on the top arm. Does it have to be in a particular orientation to work?

The GoPro Hero 4 is set to “linear”, which is supposed to get rid of the worst of the fish-eye distortion, but I can’t get the reprojection error better than about 1.5 after multiple tries. The camera is looking at my green screen, but there is a fair amount of non-green around all the edges.
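For anyone else chasing these numbers, my understanding is that the reprojection error is an RMS pixel distance: how far, on average, the calibrated camera model projects each sample point from where that point was actually detected in the video frame. A hypothetical pure-Python sketch of that computation:

```python
import math

def rms_reprojection_error(projected, detected):
    """RMS pixel distance between model-projected points and the
    positions actually detected in the camera image."""
    sq = [(px - dx) ** 2 + (py - dy) ** 2
          for (px, py), (dx, dy) in zip(projected, detected)]
    return math.sqrt(sum(sq) / len(sq))

# A perfect fit gives 0; one point off by a 3-4-5 pixel error raises the RMS.
print(rms_reprojection_error([(0, 0), (3, 4)], [(0, 0), (0, 0)]))
```

If uncorrected distortion remains at the edges of a wide lens, the individual point errors there go up and drag the RMS with them, which may be why “linear” mode still leaves me around 1.5.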

I go through the calibration process with the controllers; on my 50" TV monitor the circles and crosses indicating alignment are usually an inch or two apart. When I get to the fourth step of the guide, How To Use The Mixed Reality Capture Calibration Tool | Unreal Engine Documentation, I can’t seem to get the images of the controller to align; some of them are always way out of place. For this step, are you supposed to use the same controller for all the boxes?

When I get to step 5, it doesn’t look even remotely like the picture in the guide. Do you need to uncover and be wearing the HMD for things to look like they do in the guide? I don’t see the controllers being replaced by the hand models, for instance. I didn’t do much more experimentation because it was getting dark.

I guess my main questions are:

  1. Does the camera/tracker mount look OK? The guide doesn’t mention any particular orientation being needed.
  2. Could the GoPro “linear” mode be messing things up? Should I run it in “medium” instead?
  3. How close have you managed to get the circles and crosses to line up in step 4? For me they seem to be farther apart than in the guide.

Just as feedback to Epic, the calibration tool UI is pretty rough. If you don’t have time to improve it right now, better documentation, or a video of someone doing the process, would really help make sure people understand how to use it!

I’ve used this same setup with LIV VR/mixed reality, which uses the camera tracker and calibrates with just three controller clicks. It works perfectly, so I’m not sure I understand why this process is so much more involved. I’m probably missing something obvious here.

I tried and failed to get a successful calibration many times with this plugin; it really needs work. However, I have gotten mixed reality virtual production working pretty well without it. You can see some samples and an example project elsewhere on the forums (search for greg.corson) and on my YouTube channel.

Just for general mixed reality and virtual production work, Epic really needs a good solution for syncing up and matching the different latencies of video, webcams, trackers, etc. Their current solution only seems to work with the “pro” video cards and LiveLink trackers.

Hello. My configuration (C920 + HTC Vive controller) works very well in the tool, but there isn’t any camera tracking in my project, and the camera spawns in the same position as the vr_pawn, at floor level. I also tested the Virtual Reality Spectator Screen and it works well.
Do I need the Vive tracker?

OK, it’s Unreal Engine 4.22; it works very well in 4.20.

I am trying to use the tool with a Magewell USB Capture device, but I do not get any valid video input.
Does anybody know if there is a chance to get access to the source code, so it could be extended to support further devices?
Thanks for your support, and stay healthy!

Hi! Great topic, I got a lot of the info I needed for my work! For one, I was wondering why my camera orientation was way off (while using a Vive tracker), but apparently positioning it “on its back” behind the camera is the reason. I haven’t yet tried recalibrating with it attached at a 90-degree angle, though.

I agree that the calibration process itself isn’t that simple to get done. I had a lot of problems with the lens calibration (I eventually got it done on the sixth try, with a pretty OK result). At that point I just wanted to get it done to some extent, to get at least some results and from there figure out how to improve. As many have mentioned, the documentation could be more comprehensive.

I had a question about the MRC tool’s output in the actual project, but I’ll make a separate topic out of it, since it isn’t really feedback for the devs.