Also, I tried different IPD values (even zero), but it doesn't work: I always get the same stereo disparity. I tested both OnAxis and Parallel, but the final stereo render still looks the same.
Are there any other settings I need to modify besides this one in 6-task-solution > Covert Settings?
Hi!
I will check and write to you!
You are right: for 2D I didn't add IPD. You can change the variable manually here →
Open the Camera360v2 actor and find 2D stereo. Change the IPD location.
You can also shift the camera slightly to achieve the desired result.
It's much easier to do this for 2D.
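In case it helps to picture the manual shift: each eye simply moves half the IPD along the camera's right vector. A minimal sketch in plain Python (function and parameter names are mine, not the plugin's; Unreal units are centimeters):

```python
def stereo_eye_positions(cam_pos, right_vec, ipd_cm=6.4):
    """Offset a 2D stereo pair from a center camera: each eye moves
    half the IPD along the camera's right vector (Unreal units: cm)."""
    half = ipd_cm / 2.0
    left = tuple(p - half * r for p, r in zip(cam_pos, right_vec))
    right = tuple(p + half * r for p, r in zip(cam_pos, right_vec))
    return left, right

# A camera at the origin looking down +X (right vector +Y):
# left eye at (0.0, -3.2, 0.0), right eye at (0.0, 3.2, 0.0).
```

The same half-IPD offset, applied per eye, is what any manual 2D stereo shift boils down to.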
Thank you for your reply! Question: is the rotation value for Stereo Type: OnAxis? If I don't add any rotation there's no parallax. Is it possible to render Stereo Type: Parallel instead?
If you don't change the rotation, you get Parallel.
I have a question: how do I render from a frustum projected from the VR180 view plane?
I mean, I want to obtain the view plane on screen while keeping the barrel-effect curvature in the frustum area from the VR180 capture.
Thanks, but I don't think the cylindrical projection is the solution I'm looking for: although it preserves the horizontal curvature, it keeps vertical lines straight, so the curvature is lost there.
The cut of the frustum would be more like the following example:
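In projection terms, a cut that curves in both axes is a fisheye (e.g. equidistant) sub-frustum rather than a cylindrical one: the radial mapping r = f·θ bends straight lines vertically as well as horizontally. A toy sketch in plain Python (names are mine, not from the plugin):

```python
import math

def fisheye_ray(x, y, f):
    """Equidistant fisheye: a pixel at radius r from the image center
    maps to the angle theta = r / f, so off-center lines curve in BOTH
    axes (a cylindrical projection only curves them horizontally).
    Returns the unit view ray (dx, dy, dz), with +z the optical axis."""
    r = math.hypot(x, y)
    if r == 0.0:
        return (0.0, 0.0, 1.0)
    theta = r / f
    s = math.sin(theta) / r
    return (x * s, y * s, math.cos(theta))
```

A rectilinear projection uses r = f·tan(θ) instead, which is exactly what keeps straight lines straight and removes the barrel curvature you want to preserve.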
I also wanted to comment that I found a projection error in your 360 video below: Electric Dreams Env VR 360 Stereo/RV/Unreal Engine 5.2 /Camera360v2 - YouTube
Although the views are 360, they are static, which produces an uncomfortable effect: they do not rotate as the parallel axis between the two eyes rotates. I'm sharing a drawing of what I'm referring to:
Basically, the 3D 360 capture was taken with two cameras in a static position in the engine, separated by an average IPD. But when someone watching the video rotates their head inside it, the viewpoints of the two cameras stay in the same place and are no longer aligned with the head's axis of rotation. This produces a strange perceptual distortion: the eyes seem to lose coherence between the stereoscopic view and the angle of separation between them, because even though the head rotates, the capture cameras always remain in the same place.
It seems a solution would be to take 360 captures, i.e. one per degree of rotation around the head's axis, with the eyes moved in parallel at each angle, and interpolate between them in real time to improve smoothness, but that does not seem feasible.
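For what it's worth, this per-angle idea is essentially omni-directional stereo (ODS): instead of two fixed cameras, each output column is rendered from an eye position that rotates with the viewing yaw on a circle of diameter IPD. A minimal sketch in plain Python (names and conventions are mine):

```python
import math

def ods_eyes(yaw_deg, ipd_m=0.064):
    """Omni-directional stereo: for a given viewing yaw, the left and
    right eye centers sit on a circle of diameter IPD, offset
    perpendicular to the viewing direction, so the stereo baseline
    rotates with the head. Returns (left_xy, right_xy)."""
    yaw = math.radians(yaw_deg)
    r = ipd_m / 2.0
    # Unit vector pointing to the viewer's left of the view direction.
    left_dir = (-math.sin(yaw), math.cos(yaw))
    left = (r * left_dir[0], r * left_dir[1])
    right = (-r * left_dir[0], -r * left_dir[1])
    return left, right
```

Rendering one narrow segment per yaw from these moving eye positions bakes the per-angle viewpoints into a single panorama at capture time, which is the interpolation you describe without needing runtime interpolation.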
I am having this weird lighting issue: I'm trying to use the capture in viewport mode, and it seems the cameras are using auto exposure. In Camera Rec 360 I chose manual exposure and tried several different exposure values, but it doesn't seem to make any difference. Any idea what could be wrong?
Hmm, I use the correct stereo and it meets the standards. I don’t quite understand.
Example on the internet:
Solution from Paul Bourke (example of cylindrical stereo):
My solution in 360 stereo:
This solution works in Camera360v2.
Hi!
Of course, the best solution is to send me your scene for verification by email: Lenina62-ivan@mail.ru . If you are using Dynamic SkyLight:
[HELP WIDGET]CAMERA 360 v2
If you use Lumen, unfortunately it is not always possible to get good results, since Camera360v2 can only support a correctly configured Lumen setup.
Looks like lumen was causing it, thanks for the help!
Good afternoon. When rendering, the colors come out very dark, even though the project's exposure is at its maximum and a normal render comes out well lit. But the 6-layer solution renders very dark. I use Lumen and Ultra Dynamic Sky in the project.
Hi!
Initially, I recommend making your 2D render, setting up the final image, and only then proceeding to 360 rendering.
360 rendering fully corresponds to the output from CineCamera, since it renders from that particular camera.
Try checking this solution; maybe it helps you?
And please attach your 2D image here, along with your problematic result.
Okay, that's the right way, but remember that to build the 360 scene you are taking two pictures of 180 degrees each: one facing forward and one in the opposite direction. The problem is that if it is 360 3D, that is, stereo 360, then you have a static camera position with two capture points (forward and backward), yet the viewer can turn their head, which completely changes the perspective. So the geometric projection gets distorted, as I showed in the previous example. The only solution would be to take many shots at different angles (possibly 360 of them, one per degree, each covering 180 degrees) so that the projection stays consistent with the user's viewing angle. I don't know if I'm making myself clear, or do you still not get the idea?
Here you are generating two equirectangular captures, each representing one camera position of the 360-degree stereo view. The only problem is that the two parallel camera positions do not rotate the way the user's eyes and head rotate while viewing the scene.
I understand you. I have two branches: Camera 360 v1 and Camera360v2. I thought you were writing about Camera360v2.
Previously, Camera 360 v1 did use a position-offset approach, but in Camera360v2 [SUPPORT]CAMERA 360 v2 segments are formed by the system and an average pixel is taken, which eliminates data inaccuracy; then stitching and mixing into a single stereo image takes place. I don't have two 180 images stitched into one; otherwise the stereo would be incorrect.
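Without knowing Camera360v2's internals, the "average pixel" mixing between adjacent segments can be pictured as a cross-fade over the overlapping columns. A toy sketch in plain Python, operating on one scanline of brightness values (all names are mine):

```python
def blend_segments(seg_a, seg_b, overlap):
    """Cross-fade the last `overlap` columns of segment A with the
    first `overlap` columns of segment B, a simple stand-in for the
    'average pixel' mixing between adjacent capture segments."""
    out = list(seg_a[:-overlap])
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # ramp from A toward B
        out.append((1 - w) * seg_a[len(seg_a) - overlap + i] + w * seg_b[i])
    out.extend(seg_b[overlap:])
    return out

# Two flat segments meeting at a seam:
# blend_segments([1, 1, 1], [3, 3, 3], 1) -> [1, 1, 2.0, 3, 3]
```

Averaging over the overlap hides the seam between segments, at the cost of slight softening right at the boundary.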
No, I don't think you understand me yet. By stereo I mean that in your camera the two eyes are placed side by side; you render each eye, join the two images, and get the stereo scene. The problem with that approach is that it only works well for VR180 scenes, where your point of view stays on the same area. When you change the direction of the head, you are rotating the two eyes horizontally, but the capture is static; it does not follow the rotation of the eyes. That generates an uncomfortable distortion for the brain. The images I sent you show the problem: you need to render light fields, not flat cubemaps.
Thank you for writing to me in more detail; I need to get acquainted with this information!
I want to make a first-person camera that has a custom frustum, with the natural projection shown in the following video: Unrecord - Official Early Gameplay Trailer (youtube.com)
That is, replace this frustum:
With this:
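One common way to approximate that look in a standard pipeline is a post-process barrel warp: render a rectilinear frame with extra FOV, then for every output pixel sample the source at a radially pushed-out coordinate. A toy sketch of the remap in plain Python; the one-coefficient model and the k1 value are illustrative assumptions, not anything from the engine:

```python
def barrel_source_coord(u, v, k1=0.25):
    """For an output pixel (u, v) in normalized [-1, 1] coordinates,
    return where to sample the rectilinear source frame. Pushing the
    sample radially outward (k1 > 0) compresses the edges into the
    frame, bowing straight lines outward (the barrel look)."""
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2
    return (u * scale, v * scale)

# The image center is untouched; a corner pixel samples well outside
# the unit square, so render with extra FOV to avoid black borders:
# barrel_source_coord(0.0, 0.0) -> (0.0, 0.0)
# barrel_source_coord(1.0, 1.0) -> (1.5, 1.5)
```

In practice the same remap would live in a post-process material or shader rather than Python; the geometry of the lookup is the whole trick.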