I get the impression that mips actually do have a lot to do with the problem. My images are rendered at either 2048x2048 or 1536x1536 per cube face. That's more than enough.
The problem is, with mip mapping switched off the aliased edges are visible on Gear VR, no matter the resolution of the image I import. If I switch mip mapping on, the mip that ends up displayed is always way too blurry.
In other words, an un-mip-mapped texture always produces jaggies, while switching mip mapping on causes a high-resolution texture to lose too much definition.
Do you, or anyone reading this post, have any experience adjusting the resolution of cubemaps so they look sharp on displays with different pixel densities? I am working with a Samsung S7 at the moment, trying to replicate the sharpness of the Gear VR's native picture viewer.
What I was saying is to render your cubemap at 4k and then resize it to 1k. That anti-aliases the edges, so in VR it looks as if AA was applied. (When you render spherical panoramas or cubemaps in Blender, AA is not applied, so when I render 1k cubes I get jaggy edges; instead I render at 4k, downsample to 1k in GIMP and get no jaggies.)
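The 4k-to-1k resize described above is effectively supersampling: every output pixel averages a 4x4 block of rendered pixels, which is what smooths the edges. GIMP's scaler uses a better filter than a plain box average, but the idea can be sketched in pure Python on a grayscale image stored as a list of rows (illustrative only, not anyone's actual pipeline):

```python
def downscale_box(img, factor):
    """Average each factor x factor block of pixels into one output pixel.
    img: 2D list of grayscale values; dimensions must be divisible by factor."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard black/white diagonal edge rendered at "4x" resolution becomes a
# smooth gradient after the downscale -- that gradient is the anti-aliasing.
edge = [[255 if x >= y else 0 for x in range(8)] for y in range(8)]
print(downscale_box(edge, 4))
```

The diagonal blocks land on intermediate values instead of pure 0/255, which is exactly why the 1k result looks anti-aliased while a direct 1k render does not.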
Also, you don’t want to use non-power-of-2 textures (1536^2 is fine for viewing in Oculus Photos, but not good for use in real-time apps/games).
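For context on the power-of-2 point: mip chains are built by repeated halving, and a power-of-2 size halves cleanly all the way down to 1, which is what GPUs and compressed formats expect. A quick check (1536 fails the test, which is consistent with UE4 treating those strips oddly, as reported later in the thread):

```python
def is_pow2(n):
    """True when n is a power of two (single bit set)."""
    return n > 0 and (n & (n - 1)) == 0

def mip_chain(size):
    """Successive mip resolutions down to 1x1, halving and rounding down."""
    sizes = [size]
    while size > 1:
        size = max(1, size // 2)
        sizes.append(size)
    return sizes

print(is_pow2(2048), is_pow2(1536))  # True False
print(mip_chain(1536)[:5])           # [1536, 768, 384, 192, 96]
```

Note how the 1536 chain quickly hits odd sizes (..., 6, 3, 1), whereas 2048 halves cleanly to 1.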
Thanks for clarifying, Motorsep.
Interestingly enough I have found that 1536x1536 textures look slightly better than 2048x2048 in terms of jagged edges.
1536x1536x6 can still be compressed no problem.
As for rendering in 4K and down-scaling, there must be a more efficient way of doing this. Rendering a 6x strip at 24576x4096 just to emulate anti-aliasing seems like serious overkill to me.
I am using 3DS Max and getting anti-aliased cubemaps doesn’t pose a problem.
I believe this is also somehow related to how the displayed image is rescaled for different devices. An app using the exact same cubemaps looks relatively OK on a Gear VR headset, but the aliasing is much more visible when the same app is compiled for Google Cardboard.
I’m having exactly the same problem. @motorsep, the source cubestrip isn’t the problem; I’m sure the OP tried the same strip in the Oculus 360 viewer and saw everything rendered properly.
@OP and anyone else interested, the reason you see jagged lines that look like poor AA is that without any mips UE4 cannot filter your textures. Without filtering, the render engine cannot map the textures to the phone’s screen pixels properly, so you get that look.
Other observations I’ve made: UE4 doesn’t play well with 1536 cubemaps and reports them as imported at 2048, and these 1536 textures always default to a lower mip. If you scale them up to 2048 in Photoshop and save as .dds, those strips display well. The problem in my S7 testing is that no matter the pool size or the "never stream" option, 2048 textures always default to the 1024 LOD. And with 1024 strips the quality doesn’t even match the Oculus 360 viewer app displaying the same resolution, because of the compression UE4 applies to mobile textures, which ends up blurrier.
I’m very interested if anyone managed to fix these issues. I’m using 4.14 engine version for all the above.
@motorsep, if you own a Gear VR you can send me your osig file and I’ll package a sample build, so you can better understand what happens when an unfiltered image is mapped onto a virtual cube on a screen whose pixels you can literally see, as is the case with all mobile VR systems available today. If you have a Cardboard I can share a version for that. The image looks fairly bad; it’s not like on a desktop monitor.
Not only do I own a Gear VR, I develop for it. And I have no issues with my sky sphere without mipmaps.
Here is the thing. Mipmaps were invented to avoid texture aliasing as the camera moves away from a surface. They were also used back in the day to drive “texture quality” settings. However, skyboxes were never mipmapped. I’ve worked with the Quake and Doom games (modding and whatnot) and none of their skies had mipmaps.
In those games skybox brushes could be far away from you or right in your face (you could be in a 1m x 1m room with windows, seeing sky far away, yet the actual brushes with the sky material were right in your face). So if they had used mipmaps, brushes with the sky material far from the camera would be blurry while brushes with sky close up would be nice and crisp.
The same goes for UE4: if your sky sphere texture has mipmaps and the sphere is tiny, but the draw order puts it behind all of the level geometry, then you will always be at mipmap 0, which is your original image. If your sky sphere is huge, it will be far from the camera at all times, so a lower mipmap level will always be rendered, making it always blurry. And if there were a way to force mipmap level 0, you’d end up with the original image anyway.
All that considered, why on earth do you people keep trying to mipmap skies?! You will either end up always blurry or always at the original image. So to solve the issue, render it out at high resolution and downsample it, and keep skies at either 2k or 4k resolution in-game.
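The "always blurry or always mip 0" behaviour falls out of how hardware picks a mip: roughly level = log2(texels covered per screen pixel), clamped into the available chain. A small sketch of that rule (a simplification of the real GPU LOD computation, which uses screen-space derivatives):

```python
import math

def mip_level(texels_per_pixel, num_mips):
    """Approximate mip level the hardware selects: log2 of the
    screen-space texel footprint, clamped into the mip chain."""
    level = math.log2(max(texels_per_pixel, 1e-6))
    return min(max(level, 0.0), num_mips - 1)

# One screen pixel covering <= 1 texel of the sky: mip 0, the original image.
# One screen pixel covering 2 texels: already mip 1, i.e. half resolution.
print(mip_level(0.5, 11))  # 0.0
print(mip_level(2.0, 11))  # 1.0
```

Since a sky is always rendered at roughly the same apparent distance, the footprint barely changes, so you sit permanently on one side of that threshold: either mip 0 (no benefit from mips) or a lower mip (permanently blurry).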
I don’t think people want skies per se; at least I don’t. The goal is to replicate the quality of the Oculus 360 photo viewer app. You can load stereo 12x1536px panoramas into it, which can effectively be replicated in UE4 by loading two cubemaps, one for each eye. Alternatively you could map a physical cube. With no mips you get no proper filtering of the textures mapped to a cube (virtual or not), which means you see skewed texels on a screen whose pixels you can make out, so the image looks very jagged at over 1K pixels per face. If you’re a Gear VR developer, I encourage you to replicate the Oculus app, as it’s very likely a client will require you to render a high-quality panorama on the Gear in the future.
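For anyone slicing such a stereo cubestrip into the two per-eye cubemaps, the face offsets are simple arithmetic. This sketch assumes the twelve 1536px faces sit side by side in one horizontal strip with the left-eye faces first (the exact face order and eye layout depend on how the panorama was exported, so treat this as illustrative):

```python
def stereo_strip_offsets(face=1536, faces_per_eye=6):
    """Horizontal pixel offsets of each face in a 12x1 stereo cubestrip.
    Assumes the left-eye faces occupy the first half of the strip."""
    left = [i * face for i in range(faces_per_eye)]
    right = [(faces_per_eye + i) * face for i in range(faces_per_eye)]
    return left, right

left, right = stereo_strip_offsets()
print(left)                 # [0, 1536, 3072, 4608, 6144, 7680]
print(right[0], right[-1])  # 9216 16896
```

The full strip is 12 x 1536 = 18432px wide, which is why the last right-eye face starts at 16896.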
Oh, I was under the impression you needed it for a skybox (sky sphere).
I’m afraid you can’t; I don’t think UE4 includes the same rendering functionality the Mobile SDK does. I don’t even know whether the Mobile SDK, used in standalone apps (non-UE4/non-Unity based), can render the same way as the Oculus 360 viewer does.
The Oculus 360 viewer is really good; the best I’ve managed was getting close to it.
I never considered Stereo Layers for this… definitely worth looking into. At the end of the day it all comes down to how optimally the app can run alongside everything else it’s processing.
This is a very helpful tip, thank you! I’ll keep this thread updated with my progress, or perhaps I’ll make a new one if it turns out well.