Hello, my goal is to create a simple 360° stereo panorama viewer in VR.
I’m currently using an Oculus Quest 2 and the Oculus VR plugin. I have many panorama pictures that were captured for stereo viewing (a left-eye and a right-eye image per panorama). In my UE level, I have two large spheres with inverted normals; I place the spheres and apply the panorama photos as textures so that they look like skyboxes, just not infinitely far away.
My issue is that I want each sphere to render only to its corresponding eye: the sphere textured with the left-eye panorama rendered only to the left eye of the VR head-mounted display (Oculus Quest 2), and the sphere textured with the right-eye panorama rendered only to the right eye. It sounds simple enough, but I can’t figure it out for the life of me.
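For reference, one approach I’ve seen suggested on forums is to skip the two-sphere setup entirely: use a single sphere with one material, and have the material pick the left or right texture based on which eye is being rendered. This is only a sketch under assumptions I haven’t verified myself — in particular, that `ResolvedView.StereoPassIndex` is exposed to material Custom nodes in UE 4.27 (it reportedly reads 0 on the left-eye pass and 1 on the right-eye pass):

```hlsl
// Body of a material Custom node (Output Type: CMOT Float1, no inputs).
// Assumption: ResolvedView.StereoPassIndex is available to Custom nodes
// in UE 4.27 with the Quest's stereo rendering path -- verify in your build.
// Returns 0.0 for the left-eye pass, 1.0 for the right-eye pass.
return ResolvedView.StereoPassIndex;
```

The Custom node’s output would then drive the Alpha of a Lerp between the two panorama Texture Samples (left texture at Alpha 0, right texture at Alpha 1), plugged into Emissive Color on an unlit material applied to the single inverted-normal sphere.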
I’m guessing I will need two cameras on the pawn I create (which will be set as the Default Pawn Class in my GameMode). I tried putting two cameras on a pawn, but UE doesn’t automatically assign one to each eye, and I’m not sure where to go from there.
I’m aware there’s a $250 plugin for 360 stereo panoramas, but I’d rather not buy a plugin if possible. I’ve found some related questions, but they didn’t really answer what I needed. I’ve also tried messing around with UE’s Stereo Layers, but I couldn’t really get them to work: a single texture on a Stereo Layer rendered fine on the HMD (though the same texture went to both eyes), and once I added a second texture, nothing rendered to the HMD at all. I feel like this would be a good learning experience as well. I’ve accomplished this before in Unity 3D with a bit of shader work, so I assume there is a way to do it in Unreal Engine 4.27.
Thanks for any help.