Want to draw a NormalMap to a RenderTarget, but not in watery yellow!

Hello everyone :slight_smile:
After a lot of trial and error, I was able to draw the base color of the TextureSampleParameter2D used by the material to a RenderTarget in its original colors.
After copying the pixel values from the RenderTarget into a Texture, I exported the Texture to a .jpg using the ‘Export to Disk’ node in Blueprint, and the base color came out in its original colors as expected.
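(In case it’s useful to anyone, the same export can also be done from C++. This is only a rough sketch of how I understand the ‘Export to Disk’ node maps to code; it assumes the ImageWriteQueue module is listed in the Build.cs dependencies, and the function name is just a placeholder.)

#include "ImageWriteBlueprintLibrary.h" //ImageWriteQueue module

void ExportTextureToJpg(UTexture2D* Texture, const FString& AbsoluteFilePath)
{
	FImageWriteOptions Options;
	Options.Format = EDesiredImageFormat::JPG; //same as choosing JPG on the node
	Options.bOverwriteFile = true;
	Options.bAsync = true;

	//Rough equivalent of the Blueprint 'Export to Disk' node
	UImageWriteBlueprintLibrary::ExportToDisk(Texture, AbsoluteFilePath, Options);
}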

But… the problem is the normal map.
When I draw the normal map to the RenderTarget, I get a watery yellow instead of the original bluish color.

If I fill a Texture with the pixel values from the RenderTarget and then use the ‘Export to Disk’ node, I get the same watery yellow .jpg file.
I tried all sorts of things in the code, including changing the CompressionSettings and SRGB values, but the normal map was still drawn to the RenderTarget in watery yellow. Below is my code.

void ADynamicTextureAtlas::CheckNormalColor(UTexture2D* InTexture)
{
	// Cache the original settings, then temporarily switch to uncompressed,
	// no-mip, linear (non-sRGB) settings so mip 0 holds the raw pixel data.
	TextureCompressionSettings OldCompressionSettings = InTexture->CompressionSettings;
	TextureMipGenSettings OldMipGenSettings = InTexture->MipGenSettings;
	bool OldSRGB = InTexture->SRGB;

	InTexture->CompressionSettings = TextureCompressionSettings::TC_VectorDisplacementmap;
	InTexture->MipGenSettings = TextureMipGenSettings::TMGS_NoMipmaps;
	InTexture->SRGB = false;
	InTexture->UpdateResource();

	// Lock mip 0 read-only and log the first texel's raw color.
	const FColor* NormalTexture = reinterpret_cast<const FColor*>(InTexture->GetPlatformData()->Mips[0].BulkData.LockReadOnly());

	UE_LOG(LogTemp, Log, TEXT("R: %d, G: %d, B: %d, A: %d"), NormalTexture[0].R, NormalTexture[0].G, NormalTexture[0].B, NormalTexture[0].A);

	InTexture->GetPlatformData()->Mips[0].BulkData.Unlock();

	InTexture->CompressionSettings = OldCompressionSettings;
	InTexture->MipGenSettings = OldMipGenSettings;
	InTexture->SRGB = OldSRGB;
	InTexture->UpdateResource();
}

bool ADynamicTextureAtlas::TestFunction(FString BaseOrNormal, UTextureRenderTarget2D* InputRenderTarget, UTextureRenderTarget2D*& OutRenderTerget)
{
	if (!Avatar || !BaseOfDynamicMaterial)
		return false;

	UCanvas* BaseColorCanvas;

	FVector2D SizeOfAtlas(AtlasSize, AtlasSize);
	UKismetRenderingLibrary::BeginDrawCanvasToRenderTarget(GetWorld(), InputRenderTarget, BaseColorCanvas, SizeOfAtlas, BaseOrNormalContext);
	{
		TArray<USceneComponent*> AvatarCompList;
		Avatar->GetRootComponent()->GetChildrenComponents(true, AvatarCompList);

		for (USceneComponent* Component : AvatarCompList)
		{
			if (Component->GetName() == TEXT("Torso")) //Torso, Legs, Feet
			{
				USkeletalMeshComponent* SMComponent = Cast<USkeletalMeshComponent>(Component);
				UMaterialInterface* SMCMaterialInterface = SMComponent->GetMaterial(0);

				TArray<FMaterialParameterInfo> OutParameterInfo;
				TArray<FGuid> OutParameterIds;
				SMCMaterialInterface->GetAllTextureParameterInfo(OutParameterInfo, OutParameterIds);
				
				for (const FMaterialParameterInfo& ParaInfo : OutParameterInfo)
				{
					if (ParaInfo.Name.ToString().Contains(BaseOrNormal)) //base, normal
					{
						UTexture* TextureForMID;
						SMCMaterialInterface->GetTextureParameterValue(ParaInfo, TextureForMID);

						//UTexture2D* InTexture = Cast<UTexture2D>(TextureForMID);
						//if (InTexture)
						//    CheckNormalColor(InTexture);

						FVector2D ScreenPosition(Row * TextureSize, Column * TextureSize);
						FVector2D ScreenSize(TextureSize, TextureSize);
						FVector2D CoordinatePosition(0.f, 0.f);
						
						BaseColorCanvas->K2_DrawTexture(TextureForMID, ScreenPosition, ScreenSize, CoordinatePosition, FVector2D::UnitVector, FLinearColor::White, EBlendMode::BLEND_Opaque); //FLinearColor::Transparent, BLEND_Translucent, EBlendMode::BLEND_Opaque
					}
				}
			}
		}		
	}
	UKismetRenderingLibrary::EndDrawCanvasToRenderTarget(GetWorld(), BaseOrNormalContext);
	OutRenderTerget = InputRenderTarget;

	UTexture2D* RTToTexture = UTexture2D::CreateTransient(InputRenderTarget->SizeX, InputRenderTarget->SizeY, PF_B8G8R8A8, FName(*BaseOrNormal));

	RTToTexture->CompressionSettings = TextureCompressionSettings::TC_VectorDisplacementmap;
	RTToTexture->SRGB = false; //InputRenderTarget->SRGB
#if WITH_EDITORONLY_DATA
	RTToTexture->MipGenSettings = TMGS_NoMipmaps;
#endif
	RTToTexture->UpdateResource();

	TArray<FColor> OutRenderTargetPixel; //FFloat16Color
	FRenderTarget* RTResource = InputRenderTarget->GameThread_GetRenderTargetResource();
	RTResource->ReadPixels(OutRenderTargetPixel); //RTResource->ReadFloat16Pixels(OutRenderTargetPixel);

	uint8* TextureData = reinterpret_cast<uint8*>(RTToTexture->GetPlatformData()->Mips[0].BulkData.Lock(LOCK_READ_WRITE));
	{
		//Copy the render-target pixels into the texture's mip 0 as B8G8R8A8.
		//FMemory::Memcpy(TextureData, OutRenderTargetPixel.GetData(), OutRenderTargetPixel.Num() * sizeof(FColor));
		for (int32 i = 0; i < OutRenderTargetPixel.Num(); ++i)
		{
			const FColor& Color = OutRenderTargetPixel[i];

			TextureData[i * 4 + 0] = Color.B;
			TextureData[i * 4 + 1] = Color.G;
			TextureData[i * 4 + 2] = Color.R;
			TextureData[i * 4 + 3] = Color.A;
		}
	}
	RTToTexture->GetPlatformData()->Mips[0].BulkData.Unlock();

	RTToTexture->CompressionSettings = BaseOrNormal == FString("base") ? TextureCompressionSettings::TC_Default : TextureCompressionSettings::TC_Normalmap; //TC_Default, TC_Normalmap, TC_HDR
	RTToTexture->SRGB = true; //base is true, normal is false
	//RTToTexture->MipGenSettings = TMGS_LeaveExistingMips; //TMGS_FromTextureGroup
	RTToTexture->UpdateResource();

	if (BaseOrNormal == FString("base"))
		AvatarTextureAtlas.BaseColorTex = RTToTexture;
	else
		AvatarTextureAtlas.NormalTex = RTToTexture;

	return true;
}

Out of frustration I checked the pixel colors of the normal map before it was drawn to the RenderTarget, and of course they were the same as the original colors.
But when I checked the pixel values of the RenderTarget via ‘RTResource->ReadPixels(OutRenderTargetPixel)’, the B value of the RGB was 0 and the R and G values were also different from the original.
I’m wondering how on earth I can get the normal map colors drawn to the RenderTarget as they originally are…!
I’m anxiously waiting for your help ;_;

Normal maps, like all color textures, store RGB values in the [0,1] range. You cannot have a pixel color value of -1, for example.

We use normal maps to store three-component vector data, a direction, and this direction sometimes needs a negative value (for example, an (r) value of -1 might mean the normal vector points to the left, whereas an (r) value of 1 would mean it’s pointing to the right).

So if normal data needs negative values, but a texture can’t store a negative value, what do we do? We encode the data! (This is also called packing and unpacking normals.)

Since the texture can only store values in the [0,1] range, when we know it’s a normal map, we read the pixel and then remap it into the [-1,1] range before we use it. Conversely, when we want to store a [-1,1] vector in a texture, we remap it back to [0,1].

This conversion is simple: to go from [0,1] → [-1,1], we take our value, multiply it by 2 (remapping it to [0,2]) and then subtract 1 (remapping it to [-1,1]). To go from [-1,1] → [0,1], we add 1 and then divide by 2.
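In code, that remap is just a pair of one-liners. Here’s a minimal sketch using plain UE types (nothing specific to your project; the helper names are made up):

//[0,1] texel -> [-1,1] direction (unpacking)
FVector UnpackNormal(const FLinearColor& Texel)
{
	return FVector(Texel.R * 2.f - 1.f, Texel.G * 2.f - 1.f, Texel.B * 2.f - 1.f);
}

//[-1,1] direction -> [0,1] texel (packing)
FLinearColor PackNormal(const FVector& Normal)
{
	return FLinearColor(Normal.X * 0.5f + 0.5f, Normal.Y * 0.5f + 0.5f, Normal.Z * 0.5f + 0.5f, 1.f);
}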

This also means that a vector (in the [-1,1] range) with a .z (blue channel) value of 0 gets remapped into the [0,1] range when it’s encoded for storage in a texture: we take that 0, add 1, and divide by 2, which gives 0.5. This is what gives normal textures that “blue” look, because there is usually a non-zero value in the blue channel.

So it just sounds to me like you’re missing this encoding: you’re reading the raw values in, and when they’re written to the texture you’re writing the raw vector values, when maybe you should be writing the encoded values.

Note: some texture types can, I think, store negative values (some render texture formats, for example), but for most textures it’s a good idea to make sure values are in the correct [0,1] range before writing texture data to them, packing/unpacking the data as needed.
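For example, if the render target uses a float format such as RTF_RGBA16f, the raw values can be read back with ReadFloat16Pixels instead of ReadPixels, and then packed/clamped into [0,1] before being written into an 8-bit texture. This is only a rough sketch (I’m reusing the InputRenderTarget name from your code, and whether a channel really holds a [-1,1] component depends on what was drawn):

//Rough sketch: read raw float pixels from a float-format render target,
//then encode a [-1,1] component into [0,1] before quantizing it to a byte.
TArray<FFloat16Color> FloatPixels;
FRenderTarget* RTResource = InputRenderTarget->GameThread_GetRenderTargetResource();
RTResource->ReadFloat16Pixels(FloatPixels);

for (const FFloat16Color& Pixel : FloatPixels)
{
	//If this channel holds a [-1,1] vector component, encode it first...
	const float Encoded01 = FMath::Clamp(Pixel.R.GetFloat() * 0.5f + 0.5f, 0.f, 1.f);
	//...then quantize to a byte before writing it into a B8G8R8A8 texture.
	const uint8 ByteValue = static_cast<uint8>(FMath::RoundToInt(Encoded01 * 255.f));
	//(write ByteValue and the other channels into the destination mip data here)
}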

Thank you very much for your answer.
After experimenting with my code, which I had modified once more before trying your advice, I found something interesting.

First, I used the Blueprint nodes provided by Unreal Engine, not my own code, to see how the NormalMap would look on the RenderTarget and what would happen if I exported it as a .jpg file.
The result was the same as the result of my code.

In the Blueprint logic above, “T_Test” contains the original bluish NormalMap texture, and “TRT_Test” is a RenderTarget of type UTextureRenderTarget2D.
When I ran that logic, the RenderTarget was drawn in a watery yellow, just like the RenderTarget in my code.
When I checked the pixel values of the RenderTarget, the B value of the RGB was 0, and the R and G values were also different from the original NormalMap values.
The original NormalMap had pixel 0 at (R:139, G:134, B:254, A:255), while the RenderTarget had pixel 0 at (R:190, G:191, B:0, A:0).

Interestingly, my modified code, shown below, results in very similar pixel values.

bool ADynamicTextureAtlas::NormalTestFunction(UTextureRenderTarget2D*& OutRenderTerget)
{
	if (!Avatar || !BaseOfDynamicMaterial)
		return false;

	UCanvas* BaseOrNormalCanvas;
	BaseOrNormalRenderTarget = UKismetRenderingLibrary::CreateRenderTarget2D(GetWorld(), AtlasSize, AtlasSize, RTF_RGBA16f, FLinearColor::Transparent);
	BaseOrNormalRenderTarget->CompressionSettings = TextureCompressionSettings::TC_Normalmap; //TC_VectorDisplacementmap, TC_Normalmap, TC_Default
	BaseOrNormalRenderTarget->SRGB = false; //false
	BaseOrNormalRenderTarget->MipGenSettings = TextureMipGenSettings::TMGS_NoMipmaps;
	BaseOrNormalRenderTarget->UpdateResource();

	FVector2D SizeOfAtlas(AtlasSize, AtlasSize);
	UKismetRenderingLibrary::BeginDrawCanvasToRenderTarget(GetWorld(), BaseOrNormalRenderTarget, BaseOrNormalCanvas, SizeOfAtlas, BaseOrNormalContext);
	//Inside this function, the UWorld member CanvasForRenderingToTarget is assigned to BaseOrNormalCanvas.
	{
		TArray<USceneComponent*> AvatarCompList;
		Avatar->GetRootComponent()->GetChildrenComponents(true, AvatarCompList);

		for (USceneComponent* Component : AvatarCompList)
		{
			if (Component->GetName() == TEXT("Torso")) //Torso, Legs, Feet
			{
				USkeletalMeshComponent* SMComponent = Cast<USkeletalMeshComponent>(Component);
				UMaterialInterface* SMCMaterialInterface = SMComponent->GetMaterial(0);

				TArray<FMaterialParameterInfo> OutParameterInfo;
				TArray<FGuid> OutParameterIds;
				SMCMaterialInterface->GetAllTextureParameterInfo(OutParameterInfo, OutParameterIds);

				for (const FMaterialParameterInfo& ParaInfo : OutParameterInfo)
				{
					if (ParaInfo.Name.ToString().Contains("normal")) //base, normal
					{
						UTexture* TextureToDraw;
						SMCMaterialInterface->GetTextureParameterValue(ParaInfo, TextureToDraw);

						FVector2D ScreenPosition(Row * TextureSize, Column * TextureSize);
						FVector2D ScreenSize(AtlasSize, AtlasSize);
						FVector2D CoordinatePosition(0.f, 0.f);
						
						//UMaterialInstanceDynamic* MaterialToDrawing = UMaterialInstanceDynamic::Create(BaseOfDynamicMaterial, GetWorld());
						//MaterialToDrawing->SetTextureParameterValue(TEXT("Brush"), TextureToDraw);
						//BaseOrNormalCanvas->K2_DrawMaterial(MaterialToDrawing, ScreenPosition, ScreenSize, CoordinatePosition);

						BaseOrNormalCanvas->K2_DrawTexture(TextureToDraw, ScreenPosition, ScreenSize, CoordinatePosition, FVector2D::UnitVector, FLinearColor::White, EBlendMode::BLEND_Opaque); //Transparent, BLEND_Translucent, BLEND_Opaque
						goto Escape;
					}
				}
			}
		}
	}
	Escape:
	UKismetRenderingLibrary::EndDrawCanvasToRenderTarget(GetWorld(), BaseOrNormalContext);
	OutRenderTerget = BaseOrNormalRenderTarget;

	ConvertRenderTargetToTexture("normal", BaseOrNormalRenderTarget);

	return true;
}

Only the A value of the RGBA in this RenderTarget was 255; the rest of the RGB values were the same as the RGB values of the RenderTarget drawn through the Blueprint logic.

In the end, my conclusion is that it’s no wonder I get a watery yellow color when I draw a NormalMap to a RenderTarget.

I also noticed that any RenderTarget, whether drawn from the Blueprint logic or created and drawn from within my code, produces a bluish texture once it goes through the “RenderTarget Create Static Texture Editor Only” node (with the compression setting set to Normalmap). In other words, a texture identical to the original NormalMap is extracted.
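(For reference, the same thing can be done from C++ while running in the editor. This is just a minimal sketch assuming the editor-only UKismetRenderingLibrary helper that backs the node; the asset name is a placeholder.)

#if WITH_EDITOR
	//Editor-only: roughly what the "RenderTarget Create Static Texture Editor Only" node does,
	//with the compression setting forced to Normalmap. "T_AtlasNormal" is just a placeholder name.
	UTexture2D* StaticTexture = UKismetRenderingLibrary::RenderTargetCreateStaticTexture2DEditorOnly(
		BaseOrNormalRenderTarget, TEXT("T_AtlasNormal"),
		TextureCompressionSettings::TC_Normalmap, TextureMipGenSettings::TMGS_NoMipmaps);
#endif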

Now I’m trying to figure out how to make the texture that is created from the RenderTarget in my own code come out blue!
Thanks again for your attention, help and answers :slight_smile:
