My apologies: it is this page. On the right, in “related pages”, is the broken link to the page.
Thanks, I forwarded it to our Docs team.
Hi, @gmpreussner,
Not trying to bother, but any news on this one? I'd love to know whether we can expect it in 4.15 or whether it's coming later down the line, for project planning reasons.
Thank you!
It will not be in 4.15. I am currently working on it, and I hope to have it all done for 4.16. We have a prototype of this already working in a separate stream, but it uses a lot of bubblegum and shoestring.
So how about the video capture card?
Hi, I’m using the Media Framework to play video files in my system - I’ve found it all runs better if I use Native Audio Out. Is there a way to control the audio level this way, though? I can’t seem to get a handle on any kind of reference to a volume multiplier.
Dan
****, haven’t had time yet, but there’s a community project that integrates the DeckLink SDK.
Dannington, there’s currently no way to control native audio out volume, sorry. I’m adding it for 4.16. I also rewrote the audio resampler for 4.16, and I’m adding audio and video frame-sync, so playing audio through the Engine should then work great, too.
Hi Gerke - Thanks for the quick reply! Your reply to @ might help immeasurably too, so that’s good serendipity. With regards to your re-work of the system for 4.16, let me know if it becomes available on the Git. I’m using the engine for a broadcast TV project (Live broadcasting in April - Yikes!) and having finer control over the video subsystem will make a huge difference to me.
Wow, great, thank u!
Glad to see this is on the radar! Media Framework definitely opens UE4 to the live broadcast world. Live capture support would streamline the workflow, integrating multiple source devices/formats into a single input, and thus remove the constraining need to convert/reconfigure.
Unfortunately, I’m a few weeks too late for the community project - the link is a 404.
I think you probably just have to log in to GitHub. This plugin is great - it’s obviously how they got the high-quality video into the engine for the recent GDC car sequence. I’m really hoping for video out support - it’ll be a huge benefit for me in integrating my project into a studio. I’m using a Teranex at the moment. Frame-accurate Engine-to-SDI output would be amazing, and if I could send a GBuffer to a secondary output, you’d have a full broadcast effects system.
Yeah, Eric made the repository private until he can figure out what the copyright situation is for this code. I am currently doing another API refactor on the Media Framework, and I will likely rewrite this plug-in myself by the end of the month.
Can you envisage video output for this? I’ve had a tinker with the Spout plugin, but I get memory leaks, and it’s frustrating to have an extra layer in the way of pure output. There’s also the issue of the engine running perfectly at given standardised frame rates/formats. Do you think the engine will ever be truly broadcast compatible? How did you get around frame sync in your GDC car demo? Could you lock the engine frame rate to the incoming video stream? (I’m assuming you streamed in some of this data - it was mentioned in the presentation - or is that only for on-site pre-vis for the camera and environment map?)
I imagine with all of these non-game uses for the engine you must have broadcast quality output on the roadmap somewhere.
Cheers!
Edit - just realised I sound like a bit of a broken record -
So, for 4.17 I’m trying to consolidate all our output stuff into the Media Framework API as well. Currently, the API only handles input, but yeah, we need to be handling output as well.
For the GDC demo, we locked the Engine to 24 FPS and enabled vsync (projector was running at 24 FPS as well). The video frames (coming in via a new EXR player I wrote) were locked to the Engine frame rate. Since we didn’t output back to AV equipment during the demo, the problem of syncing output did not yet arise. Time code support for output is on the roadmap for 4.17 as well. We certainly want to be 100% broadcast compatible soon.
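For anyone wanting to reproduce that frame-lock setup, the stock `t.MaxFPS` and `r.VSync` console variables cover it; the values below are illustrative (matching the 24 FPS projector described above), not a confirmed dump of the demo's config:

```ini
; ConsoleVariables.ini - illustrative values for a 24 FPS locked output
t.MaxFPS=24   ; cap the Engine tick rate at 24 FPS
r.VSync=1     ; enable vsync so presentation locks to the display refresh
```

The same variables can also be set at runtime from the in-editor console, e.g. `t.MaxFPS 24`.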
I’ve just been testing it a bit again in 4.15 and found a crash after going fullscreen while it’s playing in editor.
Does YouTube work yet, and if so, how do you go about setting that up?
Once the new audio engine is released, will it be possible to do anything with video audio, like effects and visualisers?
Well, YouTube support should be regarded as production-ready only when both MP4 (H.264) and WebM (VP9 with Vorbis/Opus) are supported (plus maybe VP8).
I wasn’t able to use WebM VP9, but it plays MP4 fine.
Secondly, there should later be support for very high bitrates and one of the more quality-oriented codecs - this is a modern engine we’re talking about, and not whole movies, just short clips such as in-game computer screens, etc.
I’m not sure if that’s already dealt with by other things like RadGameTools, but if the engine is striving to keep colors and all the GFX as good as possible, it should be able to handle importing extra-high-quality video content as well.
The point is that someone might build a really good scene but then be unable to import a short 1-minute video that’s over 20 megabits, for example, with no chroma subsampling (4:4:4), 10-bit colors, etc. If the materials/textures support HDR and the ACES gamut, then the Media Framework should strive to support a similar level - wider gamut and HDR, obviously - so you don’t end up running SDR-sourced video clips for that part while the rest of the scene is in HDR, 10-bit, Rec. 2020, for example.
It would probably make the video stand out a lot, and the difference would be easily noticeable if both WCG and HDR are missing.
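To put some numbers on the "20 megabit, 4:4:4, 10-bit" example: here is a rough back-of-the-envelope calculation for a hypothetical 1080p60 clip (the resolution and frame rate are assumptions for illustration):

```python
# Rough arithmetic behind the high-quality short-clip argument above.
# Assumed clip: 1920x1080 at 60 fps, 4:4:4 (no chroma subsampling), 10-bit.

width, height, fps = 1920, 1080, 60
channels = 3           # 4:4:4 keeps full-resolution Y, Cb, and Cr planes
bits_per_channel = 10

raw_bps = width * height * channels * bits_per_channel * fps
print(f"uncompressed: {raw_bps / 1e9:.2f} Gbit/s")        # ~3.73 Gbit/s

target_bps = 20e6      # the 20 Mbit/s figure from the post
ratio = raw_bps / target_bps
print(f"required compression ratio: {ratio:.0f}:1")       # ~187:1

one_minute_mb = target_bps * 60 / 8 / 1e6
print(f"1-minute clip at 20 Mbit/s: {one_minute_mb:.0f} MB")  # 150 MB
```

So even a generous 20 Mbit/s target implies nearly 200:1 compression on such a clip, and a one-minute asset stays around 150 MB - small compared to typical texture budgets, which is why supporting such clips seems reasonable.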
Yeah, this is a known issue. Resizing the Editor viewport while playing media may cause a crash, because the Engine incorrectly reports IsInRenderingThread() is true during render thread suspension. It will be fixed in Media Framework 3.0.
YouTube is not supported out of the box yet. The closest to supporting it is the VlcMedia plug-in. However, it does not currently have a parser for YouTube video descriptors, so you cannot play YouTube URLs directly. You’d have to extract the actual stream URIs by hand - those can be played. LibVLC does not support YouTube video descriptors directly. Instead, they implemented a Lua-based parser in the VLC Player application. We’d have to re-implement this parser in C++ for UE4. If someone wants to work on that, please feel free to send me a pull request on GitHub. I probably won’t get to it myself until later this year, as it is fairly low priority right now.
Yes. Media Framework 3.0 will be integrated with the new Audio Mixer sub-system. Unfortunately, this means that existing content will have to be updated as the MediaSoundWave assets go away and will be replaced with a new sound component instead. But that shouldn’t be too much of a hassle. Similarly, MediaTexture assets will likely be removed as well (I’m still working on that right now), and users will get a special kind of sampler node for regular Materials instead. This will allow combining multiple videos in a single material, i.e. for green screen compositing or effects.
10-bit pixel formats / HDR support are on my to-do list. I’m also working on support for 360 video, stereoscopic video, and performance improvements for 4K and 8K.
It’s so great to see how the media framework is evolving.
Are you planning to expose the frame rate and dimensions of the current video stream in Blueprint?
Hey, I’m really appreciating all the efforts that are going into the improvements to the media framework. I saw in your original post on this thread that you mentioned that you were going to look into supporting Google’s VP9 codec. Have you had any success in this regard? Am very interested in how you see support for 4K and greater evolving.
Thanks. It’s good to see the direction Unreal is going with all the media, audio, and sequencer improvements. It’s going to be quite useful for creating all sorts of content other than just games.