How to create texture with D3D11_USAGE_DYNAMIC

I thought UTexture2DDynamic served this purpose, but it doesn’t. So, how can I do it in Unreal without having to change the engine source?

My goal is to use the texture in a material, setting it through Blueprints, so it must be the resource of a UTexture class descendant. Even if I had to create my own UTexture and FRHIResource descendants, is it possible?

Yes! It is possible! \o/

Thank you, Epic, for such well-commented code and for letting us see it! And, most of all, for it being C++!


My intent was to use these textures with Kinect (on Windows) buffers, updating them each frame, so I focused on dynamic textures for D3D11 only. I’m not trying to add new or portable features to the engine, as my C++ level is not good enough for that. Also, I wasn’t very eager to alter the source code, since every new engine version would mean a lot of work redoing my changes.

I’m using this texture with pixel formats PF_B8G8R8A8, PF_G8 and PF_G16 without problems, just like it was with UTexture2D.

How I did it

First, I copied the UTexture2DDynamic class, everything in it, as I didn’t know how to create a working UTexture child class from scratch and that class is really simple compared to UTexture2D.

Then, I saw it used Texture2DDynamicResource to issue the actual texture creation on GPU from RHI. So, I also made my own TextureResource child class.

The Texture2DDynamicResource::InitRHI method was calling the global method RHICreateTexture2D and, looking at the source, that was where the ID3D11Texture2D got created with D3D11_USAGE_DEFAULT. So, that was the method I had to recreate in order to do what I wanted without altering the engine’s source.
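For reference, the difference boils down to how the texture description is filled in before ID3D11Device::CreateTexture2D is called. This is a Windows-only illustrative fragment of the relevant flags, not the engine’s actual code:

```cpp
// Illustration only (requires <d3d11.h>); Width/Height assumed declared.
D3D11_TEXTURE2D_DESC Desc = {};
Desc.Width = Width;
Desc.Height = Height;
Desc.MipLevels = 1;
Desc.ArraySize = 1;
Desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
Desc.SampleDesc.Count = 1;
Desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

// What the engine does by default:
// Desc.Usage = D3D11_USAGE_DEFAULT;          // updated via UpdateSubresource
// What we want instead:
Desc.Usage = D3D11_USAGE_DYNAMIC;             // CPU-writable every frame
Desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; // required for dynamic usage
```

Dynamic textures are then updated with ID3D11DeviceContext::Map using D3D11_MAP_WRITE_DISCARD, not with UpdateSubresource.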

Most of that method’s inner calls were static methods easily found in public headers, no big deal there. However, some other methods called were either from D3D11RHIPrivate or private members of the original method’s class. Those I copied as globals into my own headers. Compiling was not a problem, but packaging the project gave me linker errors, as the D3D11RHI module already exports globals with the same names. The solution was simple: put my modified D3D11RHI methods inside a namespace. Also, for insurance, I added _mod to their names.
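The linker clash and its fix can be sketched like this (illustrative names, not the actual engine symbols):

```cpp
#include <cassert>

// Suppose the engine module already exports a global helper with this name.
// Copying it verbatim into your own module produces a duplicate-symbol
// linker error when packaging.
int GetPoolSizeBucket(int Size) { return Size / 64; }

// The fix: wrap your copied versions in your own namespace, and add a
// _mod suffix for good measure, so the linker never sees a name collision.
namespace MyD3D11Mods
{
	int GetPoolSizeBucket_mod(int Size) { return Size / 64; }
}
```

Callers in your code then refer to MyD3D11Mods::GetPoolSizeBucket_mod explicitly, leaving the engine’s exported symbol untouched.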

How to use it

I’ve attached to this answer the .h and .cpp files necessary to use my Frankenstein’d DynamicTexture2D class. You’ll only have to adapt the first #include in the .cpp to your project/plugin PrivatePCH.

The code to create a texture from it is very similar to that for creating a UTexture2D. The updating part is totally new, as Unreal does it using ID3D11DeviceContext::UpdateSubresource, which seems not to work with D3D11_USAGE_DYNAMIC (for Unreal’s standard ways of updating textures in real time, read this). For ease of use, I left a method in the class just for updating the texture.

#include "DynamicTexture2D.h"

void UYourClassName::CreateTexture(int32 Width, int32 Height, EPixelFormat PixelFormat)
{
	// Texture is a UDynamicTexture2D*
	Texture = UDynamicTexture2D::Create(Width, Height, PixelFormat);
	Texture->MipGenSettings = TMGS_NoMipmaps;
	Texture->CompressionSettings = TextureCompressionSettings::TC_VectorDisplacementmap;
}

void UYourClassName::UpdateTexture()
{
	// Must do it on the render thread!
	uint8* Buffer = /* get your CPU buffer address */;
	/* Your buffer size in BYTES.
	   E.g., if your format is PF_B8G8R8A8, it will be Width * Height * 4 */
	SIZE_T BufferSize = /* buffer size in BYTES */;

	if (Buffer)
	{
		Texture->UpdateContents(Buffer, BufferSize);
	}
}
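As a sanity check for BufferSize, here is a minimal standalone sketch of a bytes-per-pixel lookup for the three formats mentioned above. The byte sizes are standard for these formats; the helper functions themselves are hypothetical and not part of the attached files:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Bytes per pixel for the formats used in this article:
// PF_B8G8R8A8 = 4 bytes, PF_G8 = 1 byte, PF_G16 = 2 bytes.
enum class EFormat { B8G8R8A8, G8, G16 };

static size_t BytesPerPixel(EFormat Format)
{
	switch (Format)
	{
	case EFormat::B8G8R8A8: return 4;
	case EFormat::G8:       return 1;
	case EFormat::G16:      return 2;
	}
	return 0;
}

// The buffer passed to the update method must be exactly this many bytes.
static size_t RequiredBufferSize(int32_t Width, int32_t Height, EFormat Format)
{
	return static_cast<size_t>(Width) * Height * BytesPerPixel(Format);
}
```

For example, a 1920×1080 PF_B8G8R8A8 buffer is 1920 * 1080 * 4 bytes; the Kinect v2 depth frame (512×424, PF_G16) would be 512 * 424 * 2 bytes.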

Don’t forget!

If you use the attached files, you must add the RenderCore, RHI and D3D11RHI modules to your project/plugin Build.cs file in order for all the included headers to work. Below is that part of my Build.cs:

PublicDependencyModuleNames.AddRange(new string[] {
	"Core", "CoreUObject", "Engine", "RHI", "RenderCore", "D3D11RHI"
});

I’m not really certain RenderCore and RHI are both needed, but I did use some globals from RHI.h (without the need to include it in any .h).

Updated for 4.11

As Epic has changed the way their game and render threads work (for the better), there’s a small change to make in the RHICreateDynamicTexture2D method, inside D3D11Modifications.h. Just replace the block starting with

if (GRHIThread)

with

FScopedRHIThreadStaller StallRHIThread(FRHICommandListExecutor::GetImmediateCommandList());

And, at the end of the same method, you’ll add a few lines, so it ends up like this:

Texture2D->ResourceInfo.VRamAllocation = VRamAllocation;

if (Flags & TexCreate_RenderTargetable)

return Texture2D;

Great coverage of how you solved it. Thanks for providing such thorough detail.

I am loading a texture at runtime from the hard drive using the Victory plugin. It creates a moiré effect during rendering, and the Victory plugin is unable to generate mips (mipmaps). Is there any way to generate mips at runtime?

I don’t know, as for real-time updating textures we don’t want mips. But there’s a parameter when creating a texture that says whether you want mips or not. I don’t know if setting it to true would automatically create them. You could try.

I had already tried it with the Set Mip Gen Settings blueprint node. It works only for imported textures, not for runtime ones. Is it possible with your solution? Or any idea about how to get mipmaps with runtime texture loading?

I think it won’t be possible in Blueprints, as they don’t have any function for manipulating textures at runtime. You’ll have to mess with the plugin’s C++ code: copy the function that creates a texture from a file and alter it to generate mips on texture creation.
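For context on what “generating mips” entails, a full mip chain for a Width × Height texture keeps halving the dimensions until it reaches 1×1. A hypothetical standalone helper to count the levels (not part of the plugin or the attached files):

```cpp
#include <algorithm>
#include <cstdint>

// Number of mip levels in a full chain for a Width x Height texture:
// each level halves the larger dimension until it reaches 1.
static uint32_t NumMipLevels(uint32_t Width, uint32_t Height)
{
	uint32_t Levels = 1;
	uint32_t Size = std::max(Width, Height);
	while (Size > 1)
	{
		Size >>= 1;
		++Levels;
	}
	return Levels;
}
```

For example, a 1024×512 texture has a full chain of 11 levels, which is what a runtime loader would need to allocate and fill to avoid the moiré effect from minification.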

Here is the function that I’m using without any success.

UTexture2D* UVictoryBPFunctionLibrary::Victory_LoadTexture2D_FromFile(const FString& FullFilePath, EJoyImageFormats ImageFormat, bool& IsValid, int32& Width, int32& Height)
{
	IsValid = false;
	UTexture2D* LoadedT2D = NULL;

	IImageWrapperModule& ImageWrapperModule = FModuleManager::LoadModuleChecked<IImageWrapperModule>(FName("ImageWrapper"));
	IImageWrapperPtr ImageWrapper = ImageWrapperModule.CreateImageWrapper(GetJoyImageFormat(ImageFormat));

	// Load From File
	TArray<uint8> RawFileData;
	if (!FFileHelper::LoadFileToArray(RawFileData, *FullFilePath)) return NULL;

	// Create T2D!
	if (ImageWrapper.IsValid() && ImageWrapper->SetCompressed(RawFileData.GetData(), RawFileData.Num()))
	{
		const TArray<uint8>* UncompressedBGRA = NULL;
		if (ImageWrapper->GetRaw(ERGBFormat::BGRA, 8, UncompressedBGRA))
		{
			LoadedT2D = UTexture2D::CreateTransient(ImageWrapper->GetWidth(), ImageWrapper->GetHeight(), PF_B8G8R8A8);
			if (!LoadedT2D) return NULL;

			Width = ImageWrapper->GetWidth();
			Height = ImageWrapper->GetHeight();

			// Copy the decoded pixels into the texture's first mip,
			// then unlock and push the data to the GPU
			void* TextureData = LoadedT2D->PlatformData->Mips[0].BulkData.Lock(LOCK_READ_WRITE);
			FMemory::Memcpy(TextureData, UncompressedBGRA->GetData(), UncompressedBGRA->Num());
			LoadedT2D->PlatformData->Mips[0].BulkData.Unlock();
			LoadedT2D->UpdateResource();
		}
	}

	// Success!
	IsValid = true;
	return LoadedT2D;
}


The mip generation setting (MipGenSettings) is what you wanna play with. TMGS_FromTextureGroup says that mip map generation will depend on what group is set in the texture. I didn’t find a way to change the group, so try changing TMGS_FromTextureGroup to other TMGS types and see if one works. There are ones more blurry and ones more sharpened; the extremes would be TMGS_Sharpen10 and TMGS_Blur5.

Try both and see what works best for you. There’s also TMGS_SimpleAverage, which I believe to be the default one for textures imported in the editor. You could try it too. If none makes a difference, then only Epic can help you… sorry.

I followed your steps. While playing the game, when the event occurs, the editor closes without any prompt or crash report.

Try this to debug: launch the game from Visual Studio. That way, when the error occurs, it will break at some line of code and show the full call stack that led to that point.

Open the project solution in Visual Studio, press F5 then Yes.