
Mixed Reality Capture Feedback for Epic


    #16
    I loved how easy it was to integrate this into a project for an exhibition, but I'm having a bit of trouble configuring things to get better performance. I would like to be able to change resolutions and maybe other quality settings. (I haven't tried using it with nDisplay, but running the VR on one PC and the AR on another would also be great.)
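
    On the performance question, a minimal sketch of one approach, assuming the MRC/spectator output follows the engine's normal resolution and scalability console variables (worth verifying against the plugin; AMyGameModeBase and the chosen values are illustrative):

        // Hedged sketch: lower the HMD render resolution and overall screen
        // percentage via standard UE4 console variables. Whether the MRC
        // plugin's composited output respects these is an assumption.
        #include "Kismet/KismetSystemLibrary.h"

        void AMyGameModeBase::ApplyQualitySettings()
        {
            // Render the HMD eye buffers at 80% resolution.
            UKismetSystemLibrary::ExecuteConsoleCommand(this, TEXT("vr.PixelDensity 0.8"));
            // Drop the scene render resolution for the spectator view.
            UKismetSystemLibrary::ExecuteConsoleCommand(this, TEXT("r.ScreenPercentage 75"));
            // Lower shadow quality one scalability notch.
            UKismetSystemLibrary::ExecuteConsoleCommand(this, TEXT("sg.ShadowQuality 1"));
        }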

      #17
      Originally posted by josgelissen View Post
      Hi Everyone,

      We have been testing the MRC plugin with an Oculus Rift & Touch controller setup.
      Setup and calibration went well; no problems that couldn't be overcome.


      But in further use, we can't find any documentation on camera control, how to follow the player, etc.

      The only thing i could find was this from oculus: https://developer.oculus.com/documen...ts/unreal-mrc/
      But that seems to be out of date; the mentioned Oculus MR Casting Camera Actor is nowhere to be found.

      Can someone help us out?

      I now understand that the Spectator Screen is the feed used for the MRC, but the documentation on the Spectator Screen is not very helpful for a relative newbie like myself.

      I still don't understand how to align my map/scene to position the (spectator) camera right where I want it to be.
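
      For anyone else digging into the Spectator Screen, a minimal sketch of driving it from C++. The function-library calls are standard UE4 HeadMountedDisplay API; that MRC reads whatever the spectator screen shows is this thread's working assumption, and SetupSpectatorFeed is an illustrative name:

          // Hedged sketch: switch the spectator screen to show a custom
          // texture instead of the default eye mirror.
          #include "HeadMountedDisplayFunctionLibrary.h"

          void SetupSpectatorFeed(UTexture* CompositedTexture)
          {
              UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);
              UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(CompositedTexture);
          }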

        #18
        Originally posted by josgelissen View Post

        I now understand that the Spectator Screen is the feed used for the MRC, but the documentation on the Spectator Screen is not very helpful for a relative newbie like myself.

        I still don't understand how to align my map/scene to position the (spectator) camera right where I want it to be.
        From what I could gather, the camera is put in place in the calibration tool so the player is shown exactly where the pawn is in the game, so the only way to align the map to the camera is to physically move the camera and recalibrate, or you can use a tracker on it, which I haven't tested yet.

          #19
          Originally posted by vshade View Post

          From what I could gather, the camera is put in place in the calibration tool so the player is shown exactly where the pawn is in the game, so the only way to align the map to the camera is to physically move the camera and recalibrate, or you can use a tracker on it, which I haven't tested yet.
          Thanks for the reply, vshade.
          Hm, yeah, OK: the map isn't loaded until after the calibration. But what I'm reading from your comment is that the pawn position and orientation are, at least in part, responsible for the scene setting.
          So fiddling with that would give me control over the X and Y camera position and the horizontal rotation?
          And the physical camera position would give me control over the Z position?
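
          If that reading is right, a minimal sketch of nudging the calibrated setup by moving the VR pawn (untested against MRC; RealignPlaySpace and the numbers are illustrative):

              // Hedged sketch: translate and yaw the VR pawn, which should
              // carry the tracking origin (and with it the calibrated
              // camera) along. Z stays physical, as guessed above.
              #include "GameFramework/Pawn.h"

              void RealignPlaySpace(APawn* VRPawn)
              {
                  const FVector NewLocation(1000.f, 500.f, VRPawn->GetActorLocation().Z);
                  const FRotator NewRotation(0.f, 90.f, 0.f); // yaw only
                  VRPawn->SetActorLocationAndRotation(NewLocation, NewRotation);
              }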

            #20
            Originally posted by eh.cin View Post
            We went to an agriculture exhibition with MRC and didn't have any real problems with it. Calibration didn't work against the green screen again, but we were able to circumvent that (read the earlier post).

            We used MRC to bring a real field (about 2 hectares or 5 acres in size) to the showroom. A 3D model of the field was made from drone images (hence the bad resolution). We also had lots of data from the past summer, including index layers (NDVI) from satellite images, a 3D model of barley made from a real plant (with a photogrammetry technique), and some farm machinery (from manufacturers' CAD models, converted with Unreal Datasmith). We also had a real weather station in our virtual field in the presentation and a real (green!) plant, and we managed to keep it visible in the video feed. We used real data and models made from real things almost exclusively; nothing major was made by artists. The video feed on the big screen had two sources: mixed reality using the external camera, and the presenter's view from the headset (with virtual hands). A director switched between those two video sources based on what was happening in the presentation.

            The show started with a small introduction speech by the presenter (an industry specialist), and then he literally walked into the virtual field and put the headset on. The presenter presented lots of information about cultivation methods and results from the past summer. Viewers were mainly farmers, and we used mixed reality as a presentation tool, not a plaything. That large screen was visible through three exhibition halls and worked as a lure to bring people to watch some educational content. The pillar of earth shown in the image is actually a hole in the ground of the real field (there were four holes/pillars in total). We just inverted the basic idea and used real images from the hole as the texture for a pillar that ascends from the ground. There was also a panel discussion at the end where the participants were standing in the virtual field. A single presentation lasted about 40 to 50 minutes, and we ran it a total of 10 times during three exhibition days. There were no issues related to Unreal Engine or MRC. We had to scale our presentation down (to one person and one camera) because we fried one of our PCs, but that was just a hardware issue.

            EDIT: The carpet was standard cheap, thin exhibition carpet in a light green color, glued to the floor as usual; nothing special. We had exactly the same carpet in other areas, but in different colors. We had tested a small piece of the carpet at the office in advance. It seems that MRC works well even with a carpet that has some grooves.

            This is what happens when Unreal Engine, Unreal Datasmith, and MRC are used outside of the gaming context. Thank you, Epic, for the great tools you've made available; this was a fun project!
            eh.cin, very nice work! Could you expand on how you managed to make it work? Thanks in advance!

              #21
              It seems the tracker has to go in a certain orientation to match the live video correctly. In my case I'm using a Logitech C920 with a Vive Tracker and a 3D-printed adapter that puts the tracker just above the camera; with LIV I never had any weird issues. With the Unreal MR plugin, though, the tracker has to be rotated 90 degrees in Z (weird, since that's not correct if you check Vive's tracker documentation).
              And since the .sav file generated by the calibration tool isn't editable as plain text, I now have to redo the whole calibration process just because I needed to adjust the tracker rotation axis. I know this tool is still a WIP, but it would help a lot if the documentation were clearer about all this.
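
              Until the tool exposes that axis, one possible workaround is to insert a corrective rotation between the tracker and whatever it drives on the engine side. A minimal sketch, with the caveat that whether the MRC plugin consults an actor-side component hierarchy at all is an open question (the actor class and the 90-degree value are illustrative):

                  // Hedged sketch, inside the constructor of an illustrative
                  // actor: track the Vive tracker via its "Special_1" motion
                  // source and add a yaw-corrected child for attachments.
                  #include "MotionControllerComponent.h"
                  #include "Components/SceneComponent.h"

                  AMyCameraRig::AMyCameraRig()
                  {
                      RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

                      UMotionControllerComponent* Tracker =
                          CreateDefaultSubobject<UMotionControllerComponent>(TEXT("CameraTracker"));
                      Tracker->SetupAttachment(RootComponent);
                      Tracker->MotionSource = FName(TEXT("Special_1"));

                      USceneComponent* Offset =
                          CreateDefaultSubobject<USceneComponent>(TEXT("TrackerOffset"));
                      Offset->SetupAttachment(Tracker);
                      Offset->SetRelativeRotation(FRotator(0.f, 90.f, 0.f)); // yaw correction
                  }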

                #22
                Here is a video of the issue I'm having with the tracker being rotated inside the editor, after a successful calibration has been done:

                https://drive.google.com/file/d/1YNb...ew?usp=sharing

                  #23
                  I already rebuilt a custom rig to set the tracker in the proper direction relative to the camera, and it's working OK. I still have a couple of issues: the 3D hand models don't appear in the mixed reality view, and there is still a significant offset between grabbed objects and the controllers. Is there any way to adjust the offset of the controllers without redoing the whole calibration process? (BTW, during the calibration process the hands/controllers were perfectly aligned...)
                  Thanks in advance!

                    #24
                    Will there be a new release of the calibration tool? We recently tried to implement this in a customer project and just couldn't get it calibrated in the customer's green room.

                    We ran into multiple issues, from not being able to get the first value below 1.8 to inverted controls, etc.

                      #25
                      Hello

                      I tested the plugin and it worked fine! What I'm looking at now is moving the camera, and I have two questions:

                      1. If I attach a Vive Tracker to the camera with the proper mount, will it work plug-and-play, or does something need to be configured in the plugin? The tutorial shows the tracker attached to the camera, but nothing is said about the configuration.

                      2. Where in the editor can I find the mixed reality camera, so I can move it using the keyboard? (One possible approach is sketched below.)

                      Many thanks
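
                      On question 2: the plugin itself doesn't obviously expose a camera actor to grab, but one approach is to drive your own spectator camera pawn from the keyboard. A minimal sketch (ASpectatorCameraPawn is an illustrative name, and the "MoveForward" axis must be mapped in the project's input settings):

                          // Hedged sketch: a pawn that nudges itself along its
                          // forward axis while a mapped key is held.
                          #include "GameFramework/Pawn.h"
                          #include "Components/InputComponent.h"

                          void ASpectatorCameraPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
                          {
                              Super::SetupPlayerInputComponent(PlayerInputComponent);
                              PlayerInputComponent->BindAxis("MoveForward", this, &ASpectatorCameraPawn::MoveForward);
                          }

                          void ASpectatorCameraPawn::MoveForward(float Value)
                          {
                              // 5 units per frame at full axis deflection.
                              AddActorWorldOffset(GetActorForwardVector() * Value * 5.f);
                          }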

                        #26
                        I have had pretty much the same experience. Quite easy to set up, BUT the virtual spectator camera often ends up 90 degrees off from the expected position. This often happens when the pawn is moved inside the level (by teleporting) or when the player changes levels.

                        Some documentation on how to perform movement without messing up the mixed reality camera would be highly appreciated.
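
                        I haven't verified this against MRC, but if the 90-degree jump comes from the teleport rotating the tracked space, a translation-only teleport might preserve the calibration. A sketch under that assumption (TeleportPreservingCalibration is an illustrative name):

                            // Hedged sketch: move the VR pawn without touching
                            // its yaw, on the guess that rotating the tracking
                            // space is what breaks the calibrated camera.
                            #include "GameFramework/Pawn.h"

                            void TeleportPreservingCalibration(APawn* VRPawn, const FVector& Destination)
                            {
                                VRPawn->SetActorLocation(Destination, /*bSweep=*/false,
                                                         /*OutSweepHitResult=*/nullptr,
                                                         ETeleportType::TeleportPhysics);
                            }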

                          #27
                          Hi, thanks for the tool. I'm using it with the Oculus; I have 2 Touch controllers and a 3rd one set up as a VR Object. How can I use the VR Object as the tracked object?
                          The documentation only says what to use for the Vive:
                          For the HTC Vive, the first tracker will be named “Special_1” in the attachments list.
                          And for the next release of the tool, maybe you could move the tracked-object detection before the Lens Calibration or the Alignment Calibration step.

                          Sorry for the bad English.
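
                          One way to find out what the Oculus runtime calls the VR Object is to enumerate the motion source names it actually reports (IMotionController::EnumerateSources is standard UE4 API from 4.20; whether the calibration tool accepts the name you find is an assumption):

                              // Hedged sketch: log every motion source name the
                              // active XR runtime exposes.
                              #include "Features/IModularFeatures.h"
                              #include "IMotionController.h"

                              void LogMotionSources()
                              {
                                  TArray<IMotionController*> Controllers =
                                      IModularFeatures::Get().GetModularFeatureImplementations<IMotionController>(
                                          IMotionController::GetModularFeatureName());

                                  for (IMotionController* Controller : Controllers)
                                  {
                                      TArray<FMotionControllerSource> Sources;
                                      Controller->EnumerateSources(Sources);
                                      for (const FMotionControllerSource& Source : Sources)
                                      {
                                          UE_LOG(LogTemp, Log, TEXT("Motion source: %s"),
                                                 *Source.SourceName.ToString());
                                      }
                                  }
                              }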

                            #28
                            Hi, is this thread related to AR too, like the AR sample? Or only to more complex MRC?

                              #29
                              Hi,
                              I purchased a Magewell USB Capture HDMI 4K Plus and connected my DSLR camera through the capture card to my desktop computer, and it doesn't work: it shows only a blank white screen. I tested it on another, similarly configured machine, and it didn't work there either. But when I connected it to my Dell M6700 laptop, it captured the video from the DSLR. I need help solving this issue.

                                #30
                                I have tried the Magewell Pro Capture PCI card, an AVerMedia 4K Pro PCI card, an AVerMedia LGX2 USB, and an Elgato Cam Link 4K USB. So far only the Elgato device works. The AVerMedia devices give me a "cannot open" error, and the Magewell opens with over 300 formats shown, but there doesn't seem to be video in any of them. I also tried a Blackmagic Intensity Pro 4K card, which doesn't want to work with my cameras at all; I suspect the card itself is defective.
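
                                For diagnosing which of those devices the engine's media framework can actually see, a minimal sketch using UMediaBlueprintFunctionLibrary from the MediaAssets module (the logging wrapper is illustrative, and whether MRC uses the same enumeration path is an assumption):

                                    // Hedged sketch: list video capture devices
                                    // and their URLs as UE4 sees them.
                                    #include "MediaBlueprintFunctionLibrary.h"

                                    void LogCaptureDevices()
                                    {
                                        TArray<FMediaCaptureDevice> Devices;
                                        UMediaBlueprintFunctionLibrary::EnumerateVideoCaptureDevices(Devices, -1);

                                        for (const FMediaCaptureDevice& Device : Devices)
                                        {
                                            UE_LOG(LogTemp, Log, TEXT("%s -> %s"),
                                                   *Device.DisplayName.ToString(), *Device.Url);
                                        }
                                    }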

                                I'm wondering if I'm using the right mixed reality setup or not. I'm looking to do something like a virtual set piece. For example, suppose you have a live video feed of a room and you want to make a model of a sports stadium appear on the floor in front of the presenter. For simplicity, the presenter is NOT going to walk in front of or occlude the model, so no green screen or multi-layer compositing should be needed. Are MRCalibrate and the Mixed Reality plugin still the way to go?

                                I'm also wondering why the chroma-key setup is done in MRCalibrate. In MRCalibrate you have to enter colors as numbers, which can be difficult. Why isn't the chroma-key setup done in Composure, where you have a GUI and color pickers to adjust things? Related to that, how can you set up a multi-layer composite with some rendered objects behind the live video and others in front?

                                Maybe I'm missing something, because I'm new to this and haven't gotten the whole MRCalibrate thing to work yet. I think I may have been having the problem mentioned earlier in the thread about the reprojection calibration not working right in front of a green screen. I had a camera setup that saw almost 100% green screen in the background and couldn't get the error below 5 to save my life. Later I had the camera set up with a wider-angle lens that only had the green screen in the center and saw a lot of "non-green" on all four sides; this calibrated on the first try. I need to go back and retry calibration of the first lens against some background other than the green screen and see what happens.

                                I am using a Vive Tracker on the camera. The camera and the tracker are mounted to an L-bracket, with the tracker several inches behind the camera so it can see the Vive Lighthouses.

                                Are there any good MR sample projects out there? I've been looking but haven't found a good one (yet).
