Hi, I’m trying to use the Media Framework to play in-app high-resolution videos on a Galaxy Note 4. While the Qualcomm H.264 video decoder on this device can decode 4K video at 30 fps, I’m getting far lower fps in my project.
Is the Media Framework plugin for Android currently using the hardware Qualcomm H.264 decoder on this device, or is it falling back to some sort of cross-device software decoding?
Is there a way to enable the hardware accelerated decoding of videos for this specific device?
The movie support for Android uses android.media.MediaPlayer. There are two movie player systems: one that uses Slate (fullscreen), which the startup movies use, and the other is the Media Framework plugin. The Media Framework plugin uses a shader to swizzle the rgba / bgra channel order before a glReadPixels into a memory buffer, which is queued and copied back to a texture to be used by the material based on the playback rate. This adds a number of steps which, for a 4K video, would be expensive on top of the decoding itself.
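To clarify what the swizzle step actually does: it is just a per-pixel channel reorder. In the engine this runs in a fragment shader on the GPU, but the operation itself can be illustrated on the CPU. This is a minimal sketch, not engine code; the packing assumed here (one pixel per `int`, channels in B,G,R,A order from the high byte down) is an illustrative assumption:

```java
public class SwizzleSketch {
    // Reorder one packed BGRA pixel into RGBA by swapping the R and B
    // channels. The engine does the equivalent per-texel in a shader;
    // this CPU version only illustrates the channel shuffle.
    static int bgraToRgba(int bgra) {
        int b = (bgra >>> 24) & 0xFF;
        int g = (bgra >>> 16) & 0xFF;
        int r = (bgra >>> 8)  & 0xFF;
        int a = bgra & 0xFF;
        return (r << 24) | (g << 16) | (b << 8) | a;
    }

    public static void main(String[] args) {
        int bgra = 0x11223344; // B=0x11, G=0x22, R=0x33, A=0x44
        System.out.printf("0x%08X%n", bgraToRgba(bgra)); // prints 0x33221144
    }
}
```

The reorder itself is cheap; the expensive parts are the readback and re-upload that follow it.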
Thank you for the explanation, Chris. Can we expect any improvement to this in the near future?
Just for my personal understanding: if it weren’t for the rgba / bgra conversion, the video decoding would not impact the GPU workload, would it?
The biggest performance hit is the glReadPixels into a buffer, which is queued for the Media Framework system to then turn around and upload to another texture. In the future I’d like to see this step skipped and the FBO that is created queued for direct use by the material instead, but the framework expects an RGB buffer. You can see the work done by MediaPlayer14.java in copyFrameTexture().
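A back-of-envelope estimate makes the cost of that round trip concrete. Assuming an RGBA8 frame at a 4K resolution of 3840x2160 and a 30 fps target (figures taken from the question, not measured on device), each frame moves tens of megabytes across the bus, and it moves twice: once for the glReadPixels readback and once for the texture re-upload:

```java
public class ReadbackCost {
    public static void main(String[] args) {
        // One RGBA8 frame read back from the GPU: width * height * 4 bytes.
        long bytesPerFrame = 3840L * 2160L * 4L;   // ~31.6 MB per frame
        // The frame is copied twice: glReadPixels out, then upload back in.
        long bytesPerSecond = bytesPerFrame * 2L * 30L; // at 30 fps
        System.out.printf("%.1f MB/frame, %.1f MB/s total traffic%n",
                bytesPerFrame / (1024.0 * 1024.0),
                bytesPerSecond / (1024.0 * 1024.0));
    }
}
```

Roughly 31.6 MB per frame, so close to 1.9 GB/s of memory traffic at 30 fps before the material ever samples the texture, which is why skipping the readback matters far more than the swizzle.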
Thank you, I’ll take a look into it. So, right now I’m out of luck if I need to play high-res videos on a Note 4? I’m trying to find a way to play 360 videos in Gear VR using Unreal Engine. Do you think it is feasible at all, considering I should keep the overall framerate (not the video’s) at 60 fps?
I’d say keeping 60 fps may be difficult if you need 4K resolution; I’d try scaling the videos down and see how they perform. Some changes are being made which will allow bypassing the extra copies, but it would likely be 4.10 at the earliest before this is in for Android.
Currently I have to stay at around 480p to keep 60 fps in a very basic scene (only unlit materials, no translucent materials) on the Gear VR. 720p already causes a big drop in framerate (it goes as low as 15-20 fps). Of course, this resolution is too low for a 360 video; I’d likely need at least 2048x2048. That is totally feasible on this device with native applications: most of the 360 videos in Oculus Home are at this resolution, even at 60 fps (the video itself, not just the app).
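Since every decoded pixel goes through the same copy path, the per-frame cost scales roughly linearly with pixel count, which matches the drop described above. A quick comparison (854x480 is assumed here as the 16:9 "480p" frame size; the other figures come from the thread):

```java
public class ResolutionScale {
    public static void main(String[] args) {
        long p480   = 854L * 480L;    // ~480p, the last resolution that held 60 fps
        long p720   = 1280L * 720L;   // 720p, where the framerate collapsed
        long target = 2048L * 2048L;  // desired 360-video resolution
        System.out.printf("720p is %.1fx the pixels of 480p%n",
                (double) p720 / p480);
        System.out.printf("2048x2048 is %.1fx the pixels of 480p%n",
                (double) target / p480);
    }
}
```

So 720p pushes about 2.2x the pixels of 480p through the readback path, and 2048x2048 about 10x, which suggests the copy overhead (not decoding) is the wall being hit.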
The hooks needed for this optimization made it into 4.9, but the rework of the Android media player code to use them did not. I will be looking at doing so for 4.10.
Any update about this? Can we expect to see this in 4.10?
Also, in 4.9 I’m experiencing a few bugs related to the media player in Gear VR, like this one. Are those known issues?
I haven’t had any issues playing mp4s, with or without Gear VR, in my testing for 4.11 with the performance improvements. I looked at your material setup; I usually have the texture connected to both Base Color and Emissive and set the shading model to Unlit.
I am using 4.11 Preview 4 and seeing similar performance degradation when playing back a 1024x1024 H.264-encoded mp4 with a bit rate varying between 5 and 7 Mbps on my Note 4. Any suggestions on what I might do to improve playback performance? When the video ends, fps goes from 15-20 back to 60.