We’ve been working on a new feature called the Media Framework for the past few weeks. It will add media playback functionality back into the Engine. A preview version is already in the Master branch, and it will also be available in 4.5. The feature is not quite production ready yet, as we’re still working on a number of problems. Nevertheless, we’d like to provide some preliminary documentation for it, so you can start playing around with it and let us know how it works for you.
Don’t we already have media playback from UE3?
Unfortunately no, UE4 did not have this capability until now. In UE3 we used a third-party solution called Bink, but since we decided to make the entire source code of Unreal Engine 4 available to the public, we could, of course, no longer include Bink out of the box. Remnants of the old code are still present in the Engine code base, but the functionality was never reimplemented for UE4. Rather than integrating one particular media playback solution into UE4, we decided to provide a generic API instead, which will allow anyone in the community to create their own player plug-ins for audio and video playback, including exotic formats and proprietary hardware. Of course, we will also implement some plug-ins at Epic.
But wait, isn’t there some movie playback code in the Engine already?
We currently have the MoviePlayer system, but it is very limited and can only be used for playing startup movies while the Engine loads up. It cannot be used to play movies in-game, such as on the UI or on a static mesh in your level. The Media Framework will provide much more general media playback capabilities. Here is a list of the most important additions:
- Engine & Slate agnostic
- Localized audio & video tracks
- Content Browser, Material Editor & Sound system support
- Blueprint & UMG integration
- Streaming media
- Fast forward, reverse, pause & scrubbing
- Pluggable players
At some point we will use the Media Framework to also re-implement startup movie functionality. In the meantime, both the MoviePlayer system and the Media Framework will exist side by side.
What exactly is the Media Framework?
The Media Framework is largely a collection of C++ interfaces, a couple of helper classes for common use cases, and a media player factory that can be extended with so-called Media Player plug-ins. All the interesting work of playing movies and other media files happens in the plug-ins. Currently, we have a player plug-in for Windows, which uses the Windows Media Foundation API under the hood. It is the most complete implementation, but it still has a few issues, particularly with certain movie formats, such as H.264 encoded .mp4 files. We are working on resolving these. We also have a plug-in for MacOS, which uses Apple’s AV Foundation. Its feature set is currently quite limited, but it should also work on iOS (we haven’t tested it yet). Work has begun on a player plug-in for Android.
The framework itself is both Engine and Slate agnostic, which means that we can use it in pretty much any application, not just the game engine or the Editor. There are additional layers on top of the framework that provide media playback capabilities to other sub-systems, such as Engine, Blueprints, **Slate** and UMG. This should cover all expected use cases, such as in-game textures & UI, in-Editor video tutorials, and Marketplace videos.
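To illustrate the pluggable-player design described above, here is a minimal sketch in plain C++ of a player interface plus a factory that dispatches to registered plug-ins. All names here (`IMediaPlayer`, `MediaPlayerFactory`, `WmvPlayer`) are hypothetical and chosen for illustration; they are not the actual Media Framework API.

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <memory>
#include <string>

// Abstract player interface: each plug-in provides its own implementation.
class IMediaPlayer {
public:
    virtual ~IMediaPlayer() = default;
    virtual bool Open(const std::string& Url) = 0;
    virtual std::string Name() const = 0;
};

// Factory that picks a registered plug-in based on the media URL's extension.
class MediaPlayerFactory {
public:
    using Creator = std::function<std::unique_ptr<IMediaPlayer>()>;

    // Plug-ins register themselves for the file extensions they support.
    void RegisterPlayer(const std::string& Extension, Creator Create) {
        Creators[Extension] = std::move(Create);
    }

    // Create a player for the given URL, or nullptr if no plug-in matches.
    std::unique_ptr<IMediaPlayer> CreatePlayerFor(const std::string& Url) const {
        const auto Dot = Url.find_last_of('.');
        if (Dot == std::string::npos) {
            return nullptr;
        }
        const auto It = Creators.find(Url.substr(Dot + 1));
        return (It != Creators.end()) ? It->second() : nullptr;
    }

private:
    std::map<std::string, Creator> Creators;
};

// An example plug-in that claims to handle .wmv files.
class WmvPlayer : public IMediaPlayer {
public:
    bool Open(const std::string&) override { return true; }
    std::string Name() const override { return "WmvPlayer"; }
};
```

The key property this models is that the framework itself never needs to know about concrete formats or platforms; support for a new format is added purely by registering another plug-in.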
Programmers can find the code for the Media Framework in the following location: /Engine/Source/Runtime/Media/
The existing player plug-ins are located in the /Engine/Plugins/Media/ directory.
How can I play media in the Engine?
Most users, especially content creators, will interact with media directly in the Content Browser. There are three new asset types that you can create there: MediaPlayer, MediaTexture, and MediaSoundWave.
The MediaPlayer, as the name suggests, represents a player for a media source, such as a movie file on disk or a streaming media URL on the internet. Unlike UTextureMovie in Unreal Engine 3, MediaPlayers do not actually contain any data - they only store the path or URL to the media source. You can create a new MediaPlayer through the right-click menu in the Content Browser, or you can drag and drop a supported media file type into the Content Browser. What file types are supported depends on which player plug-ins you have installed:
Important: As of this writing, local movie files on your computer have to be located inside the /Content/Movies/ directory of your project. This is currently the only location that will be packaged correctly. We will improve this in the future.
MediaPlayers can output various content streams. Currently supported stream types are audio, video and caption texts. If you double-click on a MediaPlayer object in the Content Browser, the Media Player Editor will open, and you can inspect the properties of the media source (note to programmers: the media asset editor is also the reference implementation for using the Media Framework with Slate):
Once you have a MediaPlayer, you can create a MediaTexture asset from it, which will allow you to extract a video stream. The MediaTexture allows you to pick a video stream (if more than one is available), and you can use it just like any other texture in the Engine. If the video doesn’t show on the texture but plays in the media asset editor, then most likely the wrong video track is selected. You can change this by double-clicking the MediaTexture, which opens the texture editor.
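The track-selection behavior described above can be sketched in plain C++ as follows. This is a hypothetical illustration (the type and member names are invented, not the real MediaTexture API): a media source may expose several video tracks, and the texture samples whichever one is currently selected.

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// A video track as the media source might describe it (e.g. a localized stream).
struct VideoTrack {
    std::string Language;  // e.g. "en", "de"
    int Width = 0;
    int Height = 0;
};

// Illustrative stand-in for a MediaTexture: it holds the list of available
// video tracks and an index into that list.
class MediaTextureSketch {
public:
    explicit MediaTextureSketch(std::vector<VideoTrack> InTracks)
        : Tracks(std::move(InTracks)) {}

    std::size_t NumTracks() const { return Tracks.size(); }

    // Select which video track feeds the texture; returns false if the index
    // is out of range (selecting the wrong track is the likely cause of a
    // texture that stays black while the media plays fine elsewhere).
    bool SelectTrack(std::size_t Index) {
        if (Index >= Tracks.size()) {
            return false;
        }
        Selected = Index;
        return true;
    }

    const VideoTrack& SelectedTrack() const { return Tracks[Selected]; }

private:
    std::vector<VideoTrack> Tracks;
    std::size_t Selected = 0;  // defaults to the first track
};
```

In the Editor you make this choice through the texture editor UI rather than in code, but the underlying idea is the same: one source, several tracks, one selected index.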
**MediaSoundWaves** work exactly the same way for sound streams. They are still in a very early stage of development, however, and need a lot more work, so I’m not going to cover them just yet. Feel free to experiment with them on your own.
Can I control media playback with Blueprints?
Yes, all Blueprint based control of media playback is handled through the MediaPlayer class. It exposes a number of Blueprint functions and events that allow you to manipulate the media source, and to gather various properties and capabilities of the media itself:
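As a rough mental model of the controls exposed this way, here is a self-contained C++ sketch of a playback controller supporting play, pause, scrubbing, and a signed playback rate (negative for reverse, magnitude above one for fast forward). This is an illustrative model only, not the actual MediaPlayer Blueprint API.

```cpp
#include <algorithm>
#include <cassert>

// Illustrative playback controller: tracks a current time within a media
// duration and advances it according to a signed playback rate.
class PlaybackController {
public:
    explicit PlaybackController(double DurationSeconds)
        : Duration(DurationSeconds) {}

    void Play()  { Rate = 1.0; }
    void Pause() { Rate = 0.0; }
    bool IsPlaying() const { return Rate != 0.0; }

    // Negative rates play in reverse; |Rate| > 1 fast-forwards.
    void SetRate(double NewRate) { Rate = NewRate; }

    // Scrubbing: jump to a time, clamped to the media's duration.
    void Seek(double TimeSeconds) {
        Time = std::min(std::max(TimeSeconds, 0.0), Duration);
    }

    // Advance by a wall-clock delta, scaled by the current rate.
    void Tick(double DeltaSeconds) { Seek(Time + Rate * DeltaSeconds); }

    double CurrentTime() const { return Time; }

private:
    double Duration;
    double Time = 0.0;
    double Rate = 0.0;
};
```

A UI built in Blueprints or UMG would wire its buttons and slider to operations of exactly this shape: the play/pause buttons toggle the rate, and the scrub bar calls the seek operation.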
Does it work with UMG?
You bet it does. Since MediaPlayer provides a Blueprint API and UMG is able to consume Engine materials, adding media playback to UMG is straightforward. The **ContentExample** sample project now actually contains a demo level that implements an entire media player using UMG. The level is named UnrealMotionGraphics and is currently located inside the project’s /Content/Maps/TestMaps/ directory:
Double-click the /Game/UMG/MediaPlayerWidget asset to open up the UMG Editor:
There are various Blueprint scripts that implement the buttons, slider, and text labels. You can test the UMG media player UI right in the Editor by using ‘Play in Viewport’ and walking up to the first demo pod.
What about video capture devices, such as webcams?
We are currently looking into the possibility of adding support for video capture hardware, such as webcams and movie production equipment.
What else should I know?
The Media Framework is not quite ready for production yet. Only Windows (and to some extent MacOS, and possibly iOS) is supported right now, and even there it can be a pain to find a working video format. Microsoft’s own formats, such as .wmv, generally work without problems. Other common formats, such as .mp4, are more difficult. So far we only found the MPEG-4 codec to work reliably – H.264 is still having some problems, and .avi sometimes renders upside down. Microsoft is currently helping us to get these issues resolved and also to make everything work on Xbox One. We also still haven’t found a reliable way to embed caption text tracks inside media files, but hopefully that will soon be solved as well. Once everything is more polished, we will release tutorials on how to properly create media content for your game.
Android support is still a few days out, and PS4 will follow afterwards. There are also some remaining usability issues with media playback that will be addressed over the next month or so. Audio tracks are not quite usable yet: audio will play, but it may sound distorted, and it definitely won’t work in reverse for now.
In the coming weeks, the Media Framework API may change a bit in order to improve performance and usability, and to possibly accommodate other use cases, such as real-time video capture.