How to set up a VR flythrough

Need some guidance on how to set up a VR flythrough for an arch viz project I am working on.


Without mincing words, this question is too generic and open-ended to answer :slight_smile:

Apologies for the vagueness, I was in a rush. I need to set up a camera that follows a certain path through my project, using Matinee I presume. This camera should act as the active player view when the simulation starts. It is intended for the Oculus Rift. I barely use Matinee, so I am clueless as to what to do. Do you have any concrete advice, or know of any tutorials that cover how to set something like this up?

Thanks :)!

Gotcha. These docs would be a good place to start:

I don’t have any specific examples of it being used for VR, though. Maybe someone else does.

I found a solution thanks to help from Rich at Hammerhead VR. The workflow below worked for me; I started with 6 cameras and upped it to 18 for the best result in combination with Autopano Video:

A 120° FOV should work.
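As a back-of-the-envelope check on camera count versus FOV, the overlap at each seam for cameras spaced evenly around a single 360° ring is simply FOV minus the angular spacing. The real rig covers a sphere rather than a ring, so treat this as a rough guide only (the helper function below is my own, not part of any UE4 or Autopano API):

```python
def seam_overlap_deg(num_cameras: int, fov_deg: float) -> float:
    """Horizontal overlap (degrees) between adjacent cameras
    spaced evenly around a full 360-degree ring."""
    spacing = 360.0 / num_cameras
    return fov_deg - spacing

# 6 cameras at 120 deg FOV leave 60 deg of overlap per seam;
# 18 cameras at the same FOV give 100 deg, which gives the
# stitcher far more shared pixels to match on.
print(seam_overlap_deg(6, 120.0))   # 60.0
print(seam_overlap_deg(18, 120.0))  # 100.0
```

A negative or near-zero result means gaps or no shared coverage, which the stitcher cannot recover from.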

In order to achieve the desired result we will need to create a template stitch, since our standard UE4 scene isn't detailed enough for the automatic stitching algorithms to work. I'm afraid I've lost the screenshots of this, but I'll try to explain it and find an example image on Google.

Essentially, what we want is a perfect stitch from UE4 using the exact same camera setup used in our Matinee, and to splice that footage into the first few frames of each matching video.

The only way I’ve been able to do this so far is to create a cube with inverted normals (or BSP) and apply a unique, high-fidelity photograph to each face, then capture just a few frames of the Matinee inside it. The detail in those photographs means the Autopano / video stitch algorithms will be able to recreate the image. Splice those frames into the beginning of each video and stitch from that frame; the remainder of the footage in your video will then be perfectly stitched without any extra work.

This process is repeatable as well: once you have your template frames, you can use them over and over again on new shots. You may like to label each face of the cube with a number in order to keep track of which is which, which way is up and down, orientation, etc. I had to create a cardboard net out of a cereal box, number the sides and keep it on my desk in order to keep track of how my cube was arranged.
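If a cereal-box net feels too analogue, the same bookkeeping can be written down as yaw/pitch pairs for the six axis-aligned face directions. A sketch (the face names and the yaw = 0 "front" convention are my own assumptions, so match them to however your cube is actually labelled):

```python
# Yaw/pitch (degrees) a centred camera needs to look at each
# face of an axis-aligned cube. Purely a labelling aid -- the
# naming convention here is assumed, not a UE4 standard.
CUBE_FACES = {
    "front": (0, 0),
    "right": (90, 0),
    "back":  (180, 0),
    "left":  (270, 0),
    "up":    (0, 90),
    "down":  (0, -90),
}

for name, (yaw, pitch) in CUBE_FACES.items():
    print(f"{name:>5}: yaw={yaw:4d}  pitch={pitch:4d}")
```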

Please see the attached image examples of a cube room and an HDR image used as a texture in that room.

When working with 360 video in real life, people generally download a template stitch for their 360Heros or Freedom360 GoPro rig in order to speed up processing or solve technical challenges. Since we don’t have a standard rig, that won’t work for us, but the benefit is that we can place an unlimited number of cameras on the exact same point, so there should be less ghosting; of course, each extra video increases rendering time and workload. This is already an incredibly labour-intensive process, so be careful.

I think this technique will work best on static images, since when the camera moves there is a higher chance of seams. You should also be aware of automatic exposure control on your UE4 cameras, since this will affect the continuity of your footage between the individual shots.

One problem you’ll run into is that with 6 cameras you may not have enough overlap between shots to create a completely smooth stitch. The best way to get around this is to put a fish-eye warp on your lens: you could either create this as a post-process on-camera effect or reverse engineer the Oculus integration.

When rendering in Matinee, for some reason the renderer takes a little while to spin up and produce high-quality footage. It would be difficult to render each video individually this way and make them sync up, so instead we should sequence each camera in the director group and then split the JPEG frames apart into folders. Be aware that Matinee sometimes loses frames for no reason, so it is trickier / slower for us to import them into Premiere as a frame sequence.
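Splitting the frame dump apart by camera is easy to script. A minimal sketch, assuming your exported filenames contain a camera token like `Cam03` (e.g. `Shot_Cam03_0001.jpg` — that naming pattern is an assumption, so adjust the regex to whatever your Matinee export actually produces):

```python
# Sort a flat Matinee JPEG frame dump into one folder per camera,
# keyed on a "CamNN" token in the filename (assumed pattern).
import re
import shutil
from pathlib import Path

def split_frames(dump_dir: str, pattern: str = r"(Cam\d+)") -> None:
    dump = Path(dump_dir)
    for frame in sorted(dump.glob("*.jpg")):
        match = re.search(pattern, frame.name)
        if not match:
            continue  # skip files that don't name a camera
        cam_dir = dump / match.group(1)
        cam_dir.mkdir(exist_ok=True)       # one folder per camera
        shutil.move(str(frame), str(cam_dir / frame.name))
```

Each resulting folder can then be imported into Premiere as its own frame sequence; gaps in the frame numbering within a folder are a quick way to spot the dropped frames mentioned above.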