Training Stream - Stereo Panoramic Plugin - Oct 18th - Live from Epic HQ

So, this must be great! Thanks!

For whatever reason, I can’t get SP.PanoramicScreenshot to work.
I have enabled the stereo capture plugin, restarted, and even set SP.OutputDir, but nothing happens after I execute the command.
I’m currently using 4.12.5.
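For reference, this is the console sequence I would expect to trigger a capture (typed into the in-game console while playing). The output path is just an example, and the cvar names are from my memory of the 4.12-era Stereo Panoramic Capture plugin, so double-check them against autocomplete in your console:

```
SP.OutputDir D:/PanoCaptures
SP.HorizontalAngularIncrement 5
SP.VerticalAngularIncrement 60
SP.CaptureHorizontalFOV 30
SP.PanoramicScreenshot
```

If nothing appears in the output directory, it may also be the pawn issue mentioned below in the thread.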

Hey Sam,

I would love to know how you set up the whole viewer system, where you can cycle through the different panos.
I would then like to do it with a look-at trigger, so the user can look at an area in the 360 and then “teleport” there. That way I can use 360s rendered with V-Ray and create a fairly simple tour of the building. Appreciate all the training, thanks!

It looks like stereo capture doesn’t work when using a VR pawn.
I switched to an FPS pawn and it worked.

Can we get the image viewer and video viewer setup? Or could we have a sample UE4 project where these things are implemented?

I got the material built and working, and would love to have the viewer blueprints and maps mentioned in the stream as well. I could try to rebuild them from screenshots, but if you have them handy, that’d be great!

Sam, did you manage to fix the encoding problem you had at the time of the stream? I fought with it for a long time and managed to get some captured videos to play on Gear VR. It has something to do with the bit rate: when you use CRF 18 with H.264 in the FFmpeg options, you end up with a fairly suitable video (if you use the resolution Ninja Theory used, 4096x2048, or lower). If the bit rate is too high, the videos don’t even play in VLC.

I have only managed to get a fluid result using a mono panoramic video at 3840x1920; any higher and it freezes from time to time. But that resolution in stereo (something like 3840x3840) would not be enough for a crisp video.

I think it is something about the bit rate: only videos under 20,000 kbps play smoothly. If I go above that, I always get a choppy result on Gear VR or even the Oculus Rift.
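For anyone else fighting the encode, this is roughly the FFmpeg command I would try. The file names are placeholders, and the 18M/20M figures are just my attempt at staying under the ~20,000 kbps ceiling that played smoothly for me:

```shell
# Re-encode a 3840x1920 mono panorama for Gear VR playback.
# Input/output names are placeholders; the bit-rate cap keeps the
# stream under ~20,000 kbps, which is what played smoothly for me.
ffmpeg -i pano_capture.mp4 \
  -c:v libx264 -preset slow -profile:v high -level 5.1 \
  -b:v 18M -maxrate 20M -bufsize 36M \
  -pix_fmt yuv420p -movflags +faststart \
  -an \
  pano_gearvr.mp4
```

Capping with -maxrate/-bufsize instead of pure CRF gives a predictable ceiling, which seems to matter more than average quality on these mobile players.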

Do you think it is possible to play an even higher resolution on an Android phone using Unreal Engine as the player, or will it face the limitation that all Android players have? (I had read there is a limit on the vertical resolution.)

If you could post the material used to render the video, it would be a great help.

Hello Sam
Watching the stream now… the screen time for the viewer was not long or clear enough for me to recreate it.
I came to the forums hoping to see the viewer files uploaded, as you said you would. A few months have gone by… just wanted to remind you about the files.

Having them would be a great help.

Thanks

Yes a great tutorial. Also missing the viewer files! Could you please share them?

Like others, viewer BP would be great.

Hey, any update on this? The viewer would be a lifesaver.

Hello, what about rendering stereo images for a 3D TV or something similar? I need two side-by-side HD or 2K images, one for each eye, without the ability to look around, because I have a defined shot with my camera. I want to make a simple stereoscopic movie. Thanks

Following the sample, I have a bug with the Oculus Quest 2 and Unreal 4.26. When I launch the sample on the device, only a strip in the middle of the screen is stereo and the rest of the screen is mono. Really weird. It seems like something is going wrong with this stereo material implementation on the Oculus Quest 2. I started the project with the VR Game template and the MotionControllerPawn BP. Help!

Same here. Have you found a solution?

Thanks!