I’m trying to implement a render texture / render target in a VR project: mapping a camera pointed at another part of the level onto a texture.
The issue is that when running normally (i.e. with no HMD), the texture looks fine on the monitor and updates as expected.
When you look at it in an HMD, though, each eye seems to see a different image, almost as if it’s trying to render stereoscopically but with the offset way off. It’s so bad you can’t look at the object/texture without it almost hurting your eyes. If you close one eye at a time, you can clearly see a correct image on the texture; it’s just that the two images don’t align properly in stereo.
Does anybody know if there is an easy way to fix this?
Ideally it would render correctly in stereo, so that it actually had depth, but if it had to be completely flat, that wouldn’t be the end of the world for me.