I’m hoping that I can get some help on solving some problems I’m having with sky spheres. My goal is to have multiple nested sky spheres that have transparent areas on their surface. The sky spheres need to exhibit parallax so that you will see shifting of the surface objects relative to each sky sphere when moving inside the interior space. I’ve constructed many sky spheres using various techniques and have not found one that works for my application. Here is a summary of the problems I’ve encountered:
- The texture image that is mapped to the surface stays fixed to the screen/camera when moving around inside (or outside) the sky sphere. This prevents parallax from happening.
- The texture image does not map correctly to the sphere: the UV mapping is not applied correctly, and the upper and lower hemispheres are mirrored.
Below are screenshots of the various attempts I’ve made to create the sky spheres and the problems with each of them:
First, I created a material using the current best-practice method of a ReflectionVectorWS driving a TextureSampleParameterCube (I also tried a CameraVector, which had exactly the same behavior). The texture used was an equirectangular-mapped image in HDR format. The sphere was a StaticMesh sphere with the normals inverted and collision set to NoCollision. The preview box looked correct and there were no errors. A copy of the texture with the colors inverted was used as an opacity mask. So far, everything seemed okay and the transparency mask was working correctly. One surprising phenomenon was that the texture stayed locked to the screen (didn't move) when I panned the blueprint canvas around on the screen. I believe this is a significant clue for understanding the parallax issue.
Inside the sphere, everything looks correct when looking around with the camera. The geometry is correct (it's a geodesic dome and looks like one). But when moving right and left inside the sky sphere, the texture image does not move and instead stays locked to the screen/camera view.
Looking at the sky sphere from a distance outside, the texture looks exactly the same as it does from inside, with the same behavior of staying locked to the screen/camera. The sky sphere is not acting like an actual sphere with an image mapped onto its 3D surface, and no parallax effect is possible while it behaves this way.
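If I'm reading that clue correctly, the issue is that a cubemap sample driven by CameraVector or ReflectionVectorWS is keyed purely by a direction; the world position of the surface being shaded never enters the lookup. Here's a rough sketch (plain C++, not engine code, with made-up names) of what that means for the pixel at the center of the screen: translating the camera changes which surface point is hit, but not the direction fed to the sampler, so the same texel always lands in the same place on screen. That's the classic "infinitely far away" skybox behavior, and it rules out parallax by construction.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Direction from the shaded surface point back toward the camera
// (roughly what the CameraVector node outputs, in world space).
static Vec3 CameraVectorWS(Vec3 surfacePoint, Vec3 cameraPos) {
    return Normalize({ cameraPos.x - surfacePoint.x,
                       cameraPos.y - surfacePoint.y,
                       cameraPos.z - surfacePoint.z });
}

int main() {
    const float radius = 10000.0f;   // sky sphere radius; camera looks down +X

    // Camera at the sphere's center: the screen-center ray hits the sphere at +X.
    Vec3 camA = { 0.0f, 0.0f, 0.0f };
    Vec3 hitA = { radius, 0.0f, 0.0f };
    Vec3 dirA = CameraVectorWS(hitA, camA);

    // Camera translated 500 units sideways, still looking down +X: the ray hits a
    // different point on the sphere, but the direction back to the camera is the same.
    Vec3 camB = { 0.0f, 500.0f, 0.0f };
    Vec3 hitB = { std::sqrt(radius * radius - 500.0f * 500.0f), 500.0f, 0.0f };
    Vec3 dirB = CameraVectorWS(hitB, camB);

    std::printf("lookup direction A: (%.3f, %.3f, %.3f)\n", dirA.x, dirA.y, dirA.z);
    std::printf("lookup direction B: (%.3f, %.3f, %.3f)\n", dirB.x, dirB.y, dirB.z);
    // Both print (-1.000, 0.000, 0.000): same lookup direction, same texel, no parallax.
    return 0;
}
```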
Next I made a material using a TextureSampleParameter2D driven by a ReflectionVectorWS through a MaterialFunctionCall (LongLatToUV). I had to use LongLatToUV to get the UV mapping to work correctly and have the image geometry look right on the sphere. This time the texture used was an equirectangular-mapped image in PNG format, with the transparency in the image used to drive Opacity. Again, the preview box looked correct, there were no errors, and (unfortunately) the texture was still locked to the screen when panning the canvas.
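As far as I can tell, LongLatToUV just converts a unit direction into equirectangular UVs with atan2/asin, roughly like the sketch below (the exact axis and V-flip conventions in the engine's function may differ). The input is still just a direction, so swapping the cubemap for a 2D texture changes the projection but not the view-locked behavior.

```cpp
#include <cmath>
#include <cstdio>

constexpr float kPi = 3.14159265358979f;

struct Vec2 { float u, v; };
struct Vec3 { float x, y, z; };

// Unit direction -> equirectangular (long/lat) UV in [0,1] x [0,1].
// U wraps around the equator, V runs from the top pole to the bottom pole.
Vec2 LongLatToUV(Vec3 dir) {
    float longitude = std::atan2(dir.y, dir.x);   // -pi .. pi
    float latitude  = std::asin(dir.z);           // -pi/2 .. pi/2
    return { longitude / (2.0f * kPi) + 0.5f,
             0.5f - latitude / kPi };
}

int main() {
    Vec2 uv = LongLatToUV({ 0.0f, 0.0f, 1.0f });    // straight up
    std::printf("UV = (%.3f, %.3f)\n", uv.u, uv.v); // prints (0.500, 0.000): top edge
    return 0;
}
```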
The advantage with this method is that only one image is needed to create transparent areas in the sky sphere, and the file size is much smaller in PNG format.
Inside the sky sphere, everything behaved exactly as it did in the HDR version.
Same thing outside the PNG-based sky sphere: the texture is locked to the screen/camera. I am still unable to see parallax, one of my requirements.
The next thing I tried was a material using a TextureSample driven by a TextureCoordinate. The texture was the same PNG image from before. The geometry looked correct in the preview window with the default UTiling and VTiling values of 1.0 and 1.0.
In the main viewport the sky sphere surface geometry was wrong, with the texture mirrored between the upper and lower hemispheres. BUT, a big improvement was that the sky sphere was now behaving like a 3D sphere and I was seeing parallax and image movement when moving around in the interior and exterior of the sky sphere.
I tried changing the UTiling and VTiling values in the TextureCoordinate but nothing fixed the geometry. Settings of 1.0 and 0.5 gave me this result.
This is what the sky sphere looked like in the viewport with settings of 1.0 and 0.5.
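My takeaway from this attempt: with a TextureCoordinate node the sampler reads whatever UVs are baked into the StaticMesh, so the lookup finally depends on the surface point being shaded, which is why parallax appears. It also seems to mean the hemisphere mirroring comes from the mesh's UV layout rather than from the material, and UTiling/VTiling can only scale those UVs, not re-unwrap them. The sketch below (plain C++, axis conventions are my assumption) is the per-vertex mapping an equirectangular texture expects, for comparison against the mesh's actual UV layout:

```cpp
#include <cmath>
#include <cstdio>

constexpr float kPi = 3.14159265358979f;

struct Vec2 { float u, v; };
struct Vec3 { float x, y, z; };

// Position on the unit sphere -> the equirectangular UV that vertex would need,
// using the same convention as the LongLatToUV sketch above.
Vec2 SphereVertexUV(Vec3 p) {
    float u = std::atan2(p.y, p.x) / (2.0f * kPi) + 0.5f;
    float v = 0.5f - std::asin(p.z) / kPi;   // 0 at the top pole, 1 at the bottom
    return { u, v };
}

int main() {
    Vec2 top    = SphereVertexUV({ 0.0f, 0.0f,  1.0f });
    Vec2 bottom = SphereVertexUV({ 0.0f, 0.0f, -1.0f });
    std::printf("top pole    V = %.2f\n", top.v);     // 0.00
    std::printf("bottom pole V = %.2f\n", bottom.v);  // 1.00
    // A UV layout that mirrors the hemispheres would give both poles the same V,
    // which no UTiling/VTiling value can undo.
    return 0;
}
```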
Here is where things get weird. I have another image I used as a texture in PNG format. It's identical to the previous image in every way except for the image content itself. When I create a material using this image, set up the same as the last one with UTiling = 1.0 and VTiling = 0.5, the preview window looks right.
Inside the sky sphere everything looks correct and I see parallax and image shifting when I move right and left.
Outside the sky sphere nearly everything is perfect and it looks and acts like a 3D sphere.
The only thing not perfect is that the far side of the sphere overlaps the near side. I think this is because the normals are inverted on the StaticMesh sphere used for the texture.
Since some images work okay and some don’t, I still don’t have a workable way of creating nested sky spheres. I’m really hoping that someone can help me solve this problem.