360 video in-game quality

Hi there,
I’m currently developing an interactive 360 film on Oculus Go.
Can someone tell me which settings and/or compression to use in order to get the best video quality during play?
The video is displayed on a sphere scaled to 400, using an unlit material that samples a media texture (itself fed by a Media Player playing the file).
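In case it helps to see the wiring spelled out, here is a minimal C++ sketch of that setup (the class name, the “MediaTex” parameter name and the asset references are only placeholders for illustration, not my actual project names):

```cpp
// VideoSphereActor.h -- minimal sketch (UE 4.23-era C++) of the playback setup
// described above. All asset references are assumed to be assigned in the editor.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "MediaPlayer.h"
#include "MediaTexture.h"
#include "FileMediaSource.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "VideoSphereActor.generated.h"

UCLASS()
class AVideoSphereActor : public AActor
{
	GENERATED_BODY()

public:
	AVideoSphereActor()
	{
		// The background sphere (the mesh asset and the 400x scale are set in the editor).
		BackgroundSphere = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("BackgroundSphere"));
		RootComponent = BackgroundSphere;
	}

	UPROPERTY(VisibleAnywhere)
	UStaticMeshComponent* BackgroundSphere;

	// Unlit material with a texture parameter (called "MediaTex" here).
	UPROPERTY(EditAnywhere)
	UMaterialInterface* SphereMaterial;

	UPROPERTY(EditAnywhere)
	UMediaPlayer* MediaPlayer;

	UPROPERTY(EditAnywhere)
	UMediaTexture* MediaTexture;

	// File media source pointing at the H.265 .mp4.
	UPROPERTY(EditAnywhere)
	UFileMediaSource* VideoSource;

	virtual void BeginPlay() override
	{
		Super::BeginPlay();

		// Feed the media texture from the player, then open the video file.
		MediaTexture->SetMediaPlayer(MediaPlayer);
		MediaTexture->UpdateResource();
		MediaPlayer->OpenSource(VideoSource);

		// Point the unlit material's texture parameter at the media texture
		// and apply it to the sphere.
		UMaterialInstanceDynamic* MID = UMaterialInstanceDynamic::Create(SphereMaterial, this);
		MID->SetTextureParameterValue(TEXT("MediaTex"), MediaTexture);
		BackgroundSphere->SetMaterial(0, MID);
	}
};
```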
At the moment I use 20 MBps, 4096x4096 stereo H.265 MP4 videos, but the result looks pixelated (I tried 30 MBps as well as 60, no difference). H.264 videos show up as a uniform green screen, although the sound plays.
I’m currently on version 4.23.

Here is a screenshot of the rendered quality. The footage was shot with an Insta360 at 6400x6400 (6K 3D).

Please note that while the image is pixelated, so are the 3D widgets placed in front of it, but NOT the controller mesh.

For those looking for this information:
The issue comes from the stretching of the UVs in the material. To map only half of the texture onto the entire background sphere for each eye, the UVs are scaled by (1, 0.5). To avoid this, I made a sphere in 3ds Max and unwrapped it onto the upper half of the UV square. This way the unstretched texture is mapped onto the entire sphere, and you only have to add a V offset for the right eye.
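To make the per-eye mapping explicit, here is a tiny sketch of the UV math, assuming a top/bottom stereo layout with the left eye in the top half (in the material it is nothing more than TexCoord plus a constant V offset for the right eye; names like EyeVOffset are placeholders):

```cpp
// Illustration of the per-eye UV logic. The sphere is unwrapped so its
// V coordinates only cover [0, 0.5] (the upper half of the UV square),
// which already lines up with the left-eye half of a top/bottom stereo frame.
struct FUV { float U; float V; };

FUV StereoSampleUV(FUV SphereUV, bool bRightEye)
{
	// SphereUV.V is expected in [0, 0.5] thanks to the half-square unwrap,
	// so no scaling is needed; the right eye only gets a constant offset.
	const float EyeVOffset = bRightEye ? 0.5f : 0.0f;
	return { SphereUV.U, SphereUV.V + EyeVOffset };
}
```

Because the material never rescales the UVs, each texel of the source video maps 1:1 onto the sphere, which is what removes the pixelated look.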
I’m still looking for the best parameters for a nice display (20 MBps at 4096x4096 is pretty good for a well-lit environment, but it lags a bit and takes some time to load), but the result is much better now.