Stitching screen recordings into a cubemap for VR output

I’m going to try to simplify this as best I can, so I apologize in advance if this is in the wrong section.

I want to screen record 6 x 90° angles (front, left, right, back, up and down). I will achieve this with an in-game replay viewer, which helps enormously here; without it, I doubt this idea would be possible.

I then want to stitch these six synchronized recordings together into a cubemap, using tools I have purchased for After Effects.
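For anyone curious what the stitch is actually doing under the hood, here is a minimal sketch of the standard cubemap lookup: given a view direction, pick which of the six 90° faces it lands on and where inside that face. This is not the After Effects tool's internal method, just the general idea; the face names are illustrative.

```python
import math

def direction_to_face(yaw_deg, pitch_deg):
    """Return (face, u, v) with u, v in [0, 1] for a view direction."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Unit direction vector: +z forward, +x right, +y up
    x = math.cos(pitch) * math.sin(yaw)
    y = math.sin(pitch)
    z = math.cos(pitch) * math.cos(yaw)
    ax, ay, az = abs(x), abs(y), abs(z)
    # The dominant axis decides the face; the other two axes give the
    # position on that face.
    if az >= ax and az >= ay:
        face, u, v, m = ("front", x, y, az) if z > 0 else ("back", -x, y, az)
    elif ax >= ay:
        face, u, v, m = ("right", -z, y, ax) if x > 0 else ("left", z, y, ax)
    else:
        face, u, v, m = ("up", x, -z, ay) if y > 0 else ("down", x, z, ay)
    # Project onto the face plane and remap from [-1, 1] to [0, 1]
    return face, (u / m + 1) / 2, (v / m + 1) / 2
```

Tools like ffmpeg's `v360` filter do this same projection math over whole frames when converting between cubemap and equirectangular layouts.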

The math checks out for everything but I think there may be hiccups.

For starters: has anyone on the forums ever done this?

Secondly: even with an in-game FOV of 90° for exact 360° coverage, would the resolution affect my outcome?
I figure the actual 4K resolution used in VR is typically the texture standard of 4096x4096 and NOT the commercial standard… So, will screen recording at 1080p (1920x1080) give me incorrect and effectively non-connected recordings? At the end of the day, the six recordings need to cover every visible angle at any given time, so this may be an issue for me.
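To put a number on that worry: assuming the engine's FOV setting is horizontal (as it typically is in Unreal), a 16:9 frame with a 90° horizontal FOV does not cover 90° vertically, so six such recordings leave gaps above and below the horizon. A quick sanity check, plain trigonometry with no engine-specific assumptions:

```python
import math

def vertical_fov(hfov_deg, width, height):
    """Vertical FOV implied by a horizontal FOV and a frame aspect ratio."""
    half = math.radians(hfov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half) * height / width))

print(round(vertical_fov(90, 1920, 1080), 1))  # ~58.7 degrees, not 90
print(round(vertical_fov(90, 1080, 1080), 1))  # a square frame gives 90.0
```

Capturing square frames, or adding extra overlapping angles, are both ways to close those gaps.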

Thirdly: the Unreal Engine based game itself does not expose advanced XYZ position information to the end user. So, as long as I cover every angle (as discussed in the second point), will the starting location matter?

If anyone has ANY information that could help me with this very complex task, it will be sincerely appreciated, and the outcome will likely be a very popular YouTube channel in which you will be thanked.

I will attempt the six screen recordings tomorrow as a test to see whether they stitch together correctly in 1080p, and I will of course post my results.

Thank you for reading this lengthy topic and again, thank you to anyone who participates.

It turns out this actually works!

I ended up jumping through hoops researching how to do it and finally put it into action.
Instead of the six angles I initially planned, I ended up recording around 14 to ensure I covered every possible angle.
This ensured that when I stitched the footage together there were no missing areas of the 360x180 spherical footage.

I will post some of the steps I took and an example in a few days time for anyone interested in doing this themselves.

Are you recording through Matinee/Sequencer? :slight_smile:
I tried this before, but it’s not perfect, because I think it’s only stitching two cameras, which is not enough for stereo in all directions.

Good luck

[QUOTE]
It turns out this actually works!

I ended up jumping through hoops researching how to do it and finally put it into action.
Instead of the six angles I initially planned, I ended up recording around 14 to ensure I covered every possible angle.
This ensured that when I stitched the footage together there were no missing areas of the 360x180 spherical footage.

I will post some of the steps I took and an example in a few days time for anyone interested in doing this themselves.
[/QUOTE]

Would be great to see a step-by-step for seeing how this is done! :slight_smile:

@TroJanVirus: There is a lot of info about stereoscopic 360 deg. rendering here: Stereoscopic Panoramics

Thanks for that! While going through all of this I unintentionally learned a lot, such as degrees of focal length and the different outputs VR can use, which has been incredibly insightful! Thanks for your help :smiley:

I will have a tutorial up soon, I’m attending a funeral today and I will be back tomorrow afternoon. I can begin structuring a thread then and post a link here!

I am recording through OBS, purely because it tends to give me optimised frame rates.

As you can imagine, a drop or rise of even 1 FPS could destroy this entire project!

That looks very interesting, though. Any idea how to get that running on an Unreal game without any particular debugging/console access whatsoever?

Here is the video to show what I’ve accomplished: