Looking for alternative to video textures and feature regression in new External Texture API


According to this AnswerHub entry, the new External Texture API removed support for using video textures in vertex shaders. This has already been entered into the issue tracker and will hopefully be fixed for 4.19 (though the tone of Epic's reply suggests it might not happen soon). In the meantime, I need to use depth and color data recorded in video files (Zed camera captures) as textures to drive the vertex world position offset in a material. This works beautifully in 4.17, but I need it for a 4.18 project that I can't downgrade. In 4.18, connecting the video texture (which now gets marked as Sampler Type "External") to any pin that feeds the vertex shader produces the error "[SM5] (Node TextureSample) Invalid node used in vertex/hull/domain shader input!".

As a workaround I tested using a SceneCapture2D camera, as suggested on AnswerHub, but that introduces pixel location displacement (it's hard to align the video plane precisely to the capture camera) and reduces precision (there seems to be detail degradation from re-rendering the texture). Until this gets fixed (if it does), does anyone know of another workaround that doesn't involve camera captures? To be honest, this is very frustrating: I've wanted to use streams of depth data in UE4 for some time, and now that I finally have the hardware, this feature regression makes it harder to accomplish.


Still broken, unfortunately.

Instead of using a SceneCapture2D, you could work around it with a CanvasRenderTarget: render a simple material (in the User Interface domain) that takes your video texture as an input, drawing it into the render target with the DrawMaterial function. That way you can ensure pixel-perfect alignment. I don't think you can write into the alpha channel of a CanvasRenderTarget, but you could use a separate render target and UI material just for the alpha.
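A minimal C++ sketch of that idea, in case it helps. Assumptions: a hypothetical UCanvasRenderTarget2D subclass named UVideoCopyTarget, and a VideoMaterial property you'd point at a User Interface domain material that samples the external video texture. This is a sketch, not tested against 4.18:

```cpp
// VideoCopyTarget.h -- hypothetical helper that blits a UI-domain material
// (which samples the external video texture) into a plain render target,
// so the result can be used in vertex shaders like any Texture2D.
#include "Engine/CanvasRenderTarget2D.h"
#include "Engine/Canvas.h"
#include "Materials/MaterialInterface.h"
#include "VideoCopyTarget.generated.h"

UCLASS()
class UVideoCopyTarget : public UCanvasRenderTarget2D
{
    GENERATED_BODY()

public:
    // User Interface domain material that samples the video texture.
    UPROPERTY(EditAnywhere)
    UMaterialInterface* VideoMaterial = nullptr;

    UVideoCopyTarget()
    {
        // Redraw whenever UpdateResource() is called on this target.
        OnCanvasRenderTargetUpdate.AddDynamic(this, &UVideoCopyTarget::Draw);
    }

    UFUNCTION()
    void Draw(UCanvas* Canvas, int32 Width, int32 Height)
    {
        if (VideoMaterial)
        {
            // Fill the whole target 1:1 -- no capture camera involved,
            // so there is no alignment error to fight.
            Canvas->K2_DrawMaterial(VideoMaterial,
                                    FVector2D(0.0f, 0.0f),        // screen position
                                    FVector2D(Width, Height),     // screen size
                                    FVector2D(0.0f, 0.0f),        // UV origin
                                    FVector2D(1.0f, 1.0f));       // UV size
        }
    }
};
```

Then call UpdateResource() on the target each tick (or whenever a new video frame arrives) and feed the render target into the world position offset as a regular texture sample.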