Looking for a tutorial on making a hotspot cubemap (360 stereo panoramic)

I rendered a stereo cubemap (360 panoramic) with 3ds Max and I would like to make an interactive application that connects this 3D panoramic with hotspots.
I’m a beginner in Unreal Engine, but could you give me any suggestions or tutorials about stereo 360 panoramas in UE?
Thank you in advance.

Hi, Tonino!

If you make a simple realtime solution for this in UE, I believe you’ll be able to sell it for millions to any production house :slight_smile:

But, seriously, I would advise you to check these topics:

Thank you for your answer.

So there is no way to render a stereoscopic cubemap in UE4? No way to render an object or texture to only one eye? (Apart from some serious messing with the source.)

Do you mean displaying a prerendered cubemap within UE4, as opposed to generating the stereo cubemap within UE4?

If that is the case, and assuming the format is two separate cubemaps (I have no experience with stereo cubemaps), then I reckon this would do the job.


Looks like it would do the trick! Thank you!

Do not forget that a stereo 360 render is not simply 2 cubemaps.

If you do it that way, you will lose stereo parallax when turning your head left and right. Even worse: after 180 degrees, the left and right eyes will have swapped positions.

So you need to take many renders to cubemaps, with the cameras placed on a circle, and then composite them into one frame.

But that is what you get from the current 360 stereo cameras, right?

Can you elaborate a bit on this?

I haven’t tried the plugins Roadstar linked to, but they appear to do what he’s suggesting. Blender’s new multiview stereo cameras do this automatically when set up as equirectangular panoramic (I’m told, haven’t tried it yet), though they don’t output a cubemap by default. This is what OTOY’s tool does, presumably.

I think the point is that you can’t simply capture a 360-degree cubemap from one location for the left eye, then capture it again by moving your stereo camera to a new position. You have to at the very least render each face of the cubemap with both cameras in a different position, and ideally you’d reposition both cameras continuously for each column or patch of pixels.

If you capture a panorama from one location, then move the camera a couple of cm over to one side and capture a panorama from that location, it looks great in stereo when you look straight ahead. But if you look to the left (relative to the “forward” direction of the rendering), each eye is seeing almost the same thing, so there’s no binocular disparity for depth (essentially one camera is looking at the other). If you face backwards relative to the rendering, your right eye is seeing the view from the perspective of the “left” camera (it’s now on your right because you turned around). Though I suppose if you stood on your head it would look right…
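To make the “reposition both cameras continuously for each column” idea concrete, here is a minimal Python sketch (not UE4 or Blender API — just the geometry). It assumes the two eyes sit on a circle of diameter equal to the interpupillary distance, and for each output column at longitude theta the eye pair is rotated so the offset stays perpendicular to that column’s viewing direction. The `IPD` value and function name are illustrative assumptions.

```python
import math

IPD = 0.064        # assumed interpupillary distance in metres (illustrative)
r = IPD / 2.0      # each eye sits on a circle of radius IPD/2

def eye_origins(theta):
    """For the output column at longitude theta (radians), return the
    (left, right) camera origins on the viewing circle (x/z plane, y up).
    The viewing direction for this column is (sin theta, 0, cos theta)."""
    # offset direction, perpendicular to the viewing direction
    ox, oz = math.cos(theta), -math.sin(theta)
    return (-r * ox, 0.0, -r * oz), (r * ox, 0.0, r * oz)

# usage: for every column the eye separation stays IPD, and the baseline
# stays perpendicular to that column's viewing direction
left, right = eye_origins(math.pi / 2)
```

This is why a single pair of fixed capture positions can’t work: only `eye_origins(0)` matches them, and every other looking direction needs a different, rotated pair.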

Ok, now I get what you mean.

What I’m concerned with is not the capturing, but how to display a pre-rendered, static VR scene in UE4.

This is a good theory article:


Thank you, dgsharp, for the explanation! :slight_smile:

In short: every frame you have to rotate your pair of cameras (“left” and “right”) n times (n depends on the quality you want: more rotations give better stitching), and then stitch thin vertical lines of pixels into one stereo shot.
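The stitching step above can be sketched in a few lines of Python (not a UE4 API — a toy model where each “render” is a list of pixel columns captured at one rotation, and the panorama keeps one thin slice per rotation; the function name is illustrative):

```python
def stitch_slits(renders):
    """renders[i] is the frame captured at rotation i * 360/n degrees;
    the panorama keeps one thin vertical slice (here, one column) per frame."""
    panorama = []
    for frame in renders:
        centre = len(frame) // 2      # index of the centre pixel column
        panorama.append(frame[centre])
    return panorama

# toy usage: 8 captures, each 5 columns wide; a "column" is tagged (capture, col)
renders = [[(i, c) for c in range(5)] for i in range(8)]
panorama = stitch_slits(renders)      # one centre slice per rotated capture
```

In a real pipeline you would do this once per eye (two rings of captures), and with more rotations each slice covers a narrower angle, so the seams between slices shrink — which is why more rotations give better stitching.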