Hi,
I tried to modify WebMMediaPlayer to support playback of videos with an alpha channel.
I can decode the video frame data successfully and submit it to the sample queue.
But I found that both my modified WebMMediaPlayer and the engine's stock WebMMediaPlayer plugin skip some audio and video frames at the beginning of playback.
For example, if the video file is ten seconds long, WebMMediaPlayer will skip the first second and only play the remaining nine seconds.
After digging deeper into the MediaFramework, I found the problem.
UMediaSoundComponent::UpdatePlayer() registers a receiver with the FMediaPlayerFacade. FMediaPlayerFacade::ProcessAudioSamples() dequeues one frame from IMediaSamples and enqueues it to the receiver; before enqueuing, it checks whether the receiver is valid, but by that point the frame has already been dequeued, so if the check fails the frame is simply lost.
The problem is that UMediaSoundComponent::UpdatePlayer() is called from UMediaSoundComponent::TickComponent(), and FMediaPlayerFacade::ProcessAudioSamples() is also called from a tick function.
Because the tick order of the two is not guaranteed, we can't know when the receiver will be registered, and FMediaPlayerFacade::ProcessAudioSamples() can drop many samples before it exists.
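To make the race easier to see, here is a minimal self-contained sketch of the dequeue-then-check-receiver logic described above. The types and names (SampleQueue, Facade, SimulatePlayback) are illustrative stand-ins I made up, not the engine's classes; the point is only that frames consumed before the receiver is registered are lost:

```cpp
#include <cassert>
#include <deque>
#include <functional>
#include <vector>

// Stand-in for IMediaSamples: frame payloads simplified to ints.
struct SampleQueue {
    std::deque<int> Samples;
};

struct Facade {
    SampleQueue Queue;
    std::function<void(int)> Receiver;  // null until the sound component registers

    // Mirrors the problematic order of operations: dequeue first, check receiver after.
    void ProcessAudioSamples() {
        while (!Queue.Samples.empty()) {
            int Frame = Queue.Samples.front();
            Queue.Samples.pop_front();   // frame is already removed from the queue here
            if (Receiver) {
                Receiver(Frame);         // delivered
            }
            // else: frame is silently dropped -- this is the bug
        }
    }
};

// Simulate NumTicks ticks; the receiver is only registered on tick RegisterOnTick
// (tick order between the two components is not guaranteed in the engine).
// Returns how many frames actually reached the receiver.
inline int SimulatePlayback(int NumTicks, int RegisterOnTick) {
    Facade F;
    std::vector<int> Delivered;
    for (int Tick = 0; Tick < NumTicks; ++Tick) {
        if (Tick == RegisterOnTick) {
            F.Receiver = [&Delivered](int Frame) { Delivered.push_back(Frame); };
        }
        F.Queue.Samples.push_back(Tick); // one new audio frame per tick
        F.ProcessAudioSamples();         // the facade ticks every frame regardless
    }
    return static_cast<int>(Delivered.size());
}
```

If the receiver happens to be registered a few ticks late, exactly that many leading frames vanish, which matches the "first second is skipped" symptom.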
My current workaround is to make IMediaPlayer::GetPlayerFeatureFlag(IMediaPlayer::EFeatureFlag::UsePlaybackTimingV2) return true in my player, so the facade uses the new-style timing logic. That path checks whether the audio receiver already exists first, and only then decides whether the frame data needs to be processed.
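Continuing the same toy model, this sketch shows why the check-receiver-first ordering fixes the drop: frames stay queued until a receiver exists and are only consumed on delivery. Again, FacadeV2 and SimulatePlaybackV2 are hypothetical names for illustration, not the engine's actual V2 implementation:

```cpp
#include <cassert>
#include <deque>
#include <functional>
#include <vector>

// Stand-in for the V2-style behavior: check for a receiver before consuming.
struct FacadeV2 {
    std::deque<int> Samples;
    std::function<void(int)> Receiver;

    void ProcessAudioSamples() {
        if (!Receiver) {
            return;                      // no receiver yet: leave frames queued
        }
        while (!Samples.empty()) {
            Receiver(Samples.front());
            Samples.pop_front();         // consume only after delivery
        }
    }
};

// Same late-registration scenario as before: receiver appears on tick RegisterOnTick.
inline int SimulatePlaybackV2(int NumTicks, int RegisterOnTick) {
    FacadeV2 F;
    std::vector<int> Delivered;
    for (int Tick = 0; Tick < NumTicks; ++Tick) {
        if (Tick == RegisterOnTick) {
            F.Receiver = [&Delivered](int Frame) { Delivered.push_back(Frame); };
        }
        F.Samples.push_back(Tick);       // one new audio frame per tick
        F.ProcessAudioSamples();
    }
    return static_cast<int>(Delivered.size());
}
```

With this ordering, even a receiver registered several ticks late still receives every frame, because the early frames were buffered instead of discarded.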
But this approach has its own problem: FMediaPlayerFacade::GetTime() returns CurrentFrameVideoTimeStamp/CurrentFrameAudioTimeStamp, and these variables are not updated after FMediaPlayerFacade::Seek() is called.
I can't find any way to update them; maybe I missed something?