Media Framework Documentation for 4.5 Preview

Hi all,

We’ve been working on a new feature called the Media Framework for the past few weeks. It will add media playback functionality back into the Engine. A preview version is already in the Master branch, and it will also be available in 4.5. The feature is not quite production ready yet, as we’re still working on a number of problems. Nevertheless, we’d like to provide some preliminary documentation for it, so you can start playing around with it and let us know how it works for you.

Don’t we already have media playback from UE3?

Unfortunately no, UE4 did not have this capability until now. In UE3 we used a third-party solution called Bink, but since we decided to make the entire source code of Unreal Engine 4 available to the public, we could, of course, no longer include Bink out of the box. Some remnants of the old code were still present in the Engine code base, but they were never reimplemented for UE4. Rather than integrating one particular media playback solution into UE4, we decided to provide an API instead, which will allow anyone in the community to create their own player plug-ins for audio and video playback, including exotic formats and proprietary hardware. Of course, we will also implement some plug-ins at Epic.

But wait, isn’t there some movie playback code in the Engine already?

We currently have the MoviePlayer system, but it is very limited and can only be used for playing startup movies while the Engine loads up. It cannot be used to play movies in-game, such as on the UI or on a static mesh in your level. The Media Framework will provide much more general media playback capabilities. Here is a list of the most important additions:

  • Engine & Slate agnostic
  • Localized audio & video tracks
  • Content Browser, Material Editor & Sound system support
  • Blueprint & UMG integration
  • Streaming media
  • Fast forward, reverse, pause & scrubbing
  • Pluggable players

At some point we will use the Media Framework to also re-implement startup movie functionality. In the meantime, both the MoviePlayer system and the Media Framework will exist side by side.

What exactly is the Media Framework?

The Media Framework is largely a collection of C++ interfaces, a couple of helper classes for common use cases, and a media player factory that can be extended with so-called Media Player plug-ins. All the interesting work of playing movies and other media files happens in the plug-ins. Currently, we have a player plug-in for Windows, which uses the Windows Media Foundation API under the hood. It is the most complete implementation, but it still has a few issues, particularly with certain movie formats, such as H.264 encoded .mp4 files. We are working on resolving these. We also have a plug-in for MacOS, which uses Apple’s AV Foundation. Its feature set is currently quite limited, but it should also work on iOS (we haven’t tested it yet). Work has begun on a player plug-in for Android.

The framework itself is both Engine and Slate agnostic, which means that we can use it in pretty much any application, not just the game engine or the Editor. There are additional layers on top of the framework that provide media playback capabilities to other sub-systems, such as the Engine, Blueprints, **Slate**, and UMG. This should cover all expected use cases, such as in-game textures & UI, in-Editor video tutorials, and Marketplace videos.


Programmers can find the code for the Media Framework in the following location: /Engine/Source/Runtime/Media/
The existing player plug-ins are located in the /Engine/Plugins/Media/ directory.

How can I play media in the Engine?

Most users, especially content creators, will interact with media directly in the Content Browser. There are three new asset types that you can create there: MediaPlayer, MediaTexture, and MediaSoundWave.

The MediaPlayer, as the name suggests, represents a player for a media source, such as a movie file on disk or a streaming media URL on the internet. Unlike UTextureMovie in Unreal Engine 3, MediaPlayers do not actually contain any data - they only store the path or URL to the media source. You can create a new MediaPlayer through the right-click menu in the Content Browser, or you can drag and drop a supported media file type into the Content Browser. What file types are supported depends on which player plug-ins you have installed:

Important: As of this writing, local movie files on your computer have to be located inside the /Content/Movies/ directory of your project. This is currently the only location that will be packaged correctly. We will improve this in the future.

MediaPlayers can output various content streams. Currently supported stream types are audio, video, and caption text. If you double-click a MediaPlayer object in the Content Browser, the Media Player Editor will open, and you can inspect the properties of the media source (note to programmers: the media asset editor is also the reference implementation for using the Media Framework with Slate):

Once you have a MediaPlayer, you can create a MediaTexture asset from it, which allows you to extract a video stream. The MediaTexture lets you pick a video stream (if more than one is available), and you can use it just like any other texture in the Engine. If the video doesn’t show on the texture but plays in the media asset editor, then most likely the wrong video track is selected. You can change this by double-clicking the MediaTexture, which opens the texture editor.


**MediaSoundWaves** work exactly the same way for sound streams. They are still at a very early stage of development, however, and need a lot more work, so I’m not going to cover them just yet. Feel free to experiment with them on your own.

Can I control media playback with Blueprints?

Yes, all Blueprint based control of media playback is handled through the MediaPlayer class. It exposes a number of Blueprint functions and events that allow you to manipulate the media source, and to gather various properties and capabilities of the media itself:


Does it work with UMG?

You bet it does. Since MediaPlayer provides a Blueprint API and UMG is able to consume Engine materials, adding media playback to UMG is straightforward. The **ContentExamples** sample project now actually contains a demo level that implements an entire media player using UMG. The name of this level is UnrealMotionGraphics, which is currently located inside the project’s /Content/Maps/TestMaps/ directory:

Double-click the /Game/UMG/MediaPlayerWidget asset to open up the UMG Editor:

There are various Blueprint scripts that implement the buttons, slider, and text labels. You can test the UMG media player UI right in the Editor by using ‘Play in Viewport’ and walking up to the first demo pod.


What about video capture devices, such as webcams?

We are currently looking into the possibility of adding support for video capture hardware, such as webcams and movie production equipment.

What else should I know?

The Media Framework is not quite ready for production yet. Only Windows (and to some extent MacOS, and possibly iOS) is supported right now, and even there it can be a pain to find a working video format. Microsoft’s own formats, such as .wmv, generally work without problems. Other common formats, such as .mp4, are more difficult. So far we have only found the MPEG-4 codec to work reliably – H.264 still has some problems, and .avi sometimes renders upside down. Microsoft is currently helping us resolve these issues and also make everything work on Xbox One. We also still haven’t found a reliable way to embed caption text tracks inside media files, but hopefully that will soon be solved as well. Once everything is more polished, we will release tutorials on how to properly create media content for your game.

Android support is still a few days out; PS4 will follow afterwards. There are also some remaining usability issues with media playback that will be addressed over the next month or so. Audio tracks are not quite usable yet - audio will play, but may sound funny, and definitely won’t work in reverse yet.

In the coming weeks, the Media Framework API may change a bit in order to improve performance and usability, and to possibly accommodate other use cases, such as real-time video capture.


This is very good news! I appreciate all the work that has gone into allowing source code access without an NDA or anything like that, but still providing the kinds of features we need. Keep up the good work!

Hi there,

So I see that packaging should work as long as I put the videos into the Content/Movies folder.

I have the movie playing fine when I run the game as Standalone; however, when I packaged the game all I see is a white screen, so I am not sure what I am doing wrong here.

Here is everything I am doing:
- I got the 4.5 alpha version from early last week.
- Set the Media Player URL to the exact full path on my local machine, inside the Content/Movies folder.
- Set to Auto Play on Playback.
- Packaged a development build.

Thanks for the help! I am still looking into this.

Since you will eventually support video capture devices, will you also allow capturing a still frame from a video source, such as a webcam or the camera on a mobile device? That would be a really useful feature for creating avatars and user profile pics.

Thanks for all the cool additions to the engine! :slight_smile:

The timing on this feature could not be better. I’ve recently started working on a project that would rely heavily on playing video files on a surface. My question, as someone who has yet to commit to an engine for the project, is: when will this be available? I looked for a release date on 4.5, and there does not seem to be one. Understandable of course, there must be a lot going into the release.

Is it available to use in an alpha build? Or is there a rough idea of when 4.5 is slated to release?

Thank you.

Interesting. Are you looking at adding some sort of live video chat via webcam during gameplay as a use case?

If that is the case perhaps you should be looking into using WebRTC since it has a BSD Licensed, C++ API and is maintained by Google.

Even if you are not looking at live video chat, you should still take a close look at the source code:

They have implemented video capture for multiple platforms using the respective native APIs. The code is all BSD licensed so I’m sure it will be of help even just as a reference.

This is what I’ve been waiting for since UE4 went ‘live’. Thanks Epic!

Wow! This is great! Oh man! This can make creating tutorial videos much easier, and this could have some nice useful gameplay uses.

One question: is it possible to show the current scene going on in the game?

Fantastic news!

Do we have support for alpha channels, or does transparency work in a similar way to UE3, where you could run an alpha video into the opacity node to get transparency?

I am having this problem as well. Any ideas?

Would this work for streaming an H.264 .mp4 video hosted on Amazon S3?

Same issue here. It works fine with PIE, but when I launch, my media texture is white.


Will audio capture alone be supported, so that we can plug mic input into an audio actor?

Where do you store your movie files?
Maybe this here is the issue (just guessing) ??

Yeah this is what I am doing. Tried a few different settings as well with no luck :frowning:

Has anyone got it working in a built game? What codec works? I’m using MPEG-4 in a .mp4 with noooo luck :frowning:

For me, WMV works. Tried other formats as well without luck.
But WMV is ok. Once it’s on the screen, you can’t tell the difference :smiley:

Are any of the .wmv clips you use in the examples available?

No luck on my end with wmv. Works in Editor, not while built :frowning:

Thank You !

Thanks for all your hard work !

Awesome tutorial / update!

:heart:


Wow- very excited to see this feature.

Seriously, you guys just need to implement some kind of animated vertex cache (like alembic, not just morph targets) and I doubt the competition will “Ryse” to the challenge. :wink:
Very impressed with how in the loop the UE4 team has kept its subscribers. I was a beta tester for another engine and had no idea what was going on half of the time. Good stuff!