Converting hex colors to FColor using built-in capabilities gives erroneous results.

I’m of the mind that 5.3’s C++ systems for converting hex strings to FColor are all broken. Can the community please check my sanity and tell me where I’ve gone wrong, as I’m trying not to fight the engine.

Using various online converters as well as Photoshop, "FED80DFF" converts to R=254, G=216, B=13, A=255.

However, in UE every conversion I tried produces erroneous values.

//Examples both with and without the leading "#"
FString strHexWith = TEXT("#FED80DFF");
FString strHexWithout = TEXT("FED80DFF");
FString str;

FColor colorA;
colorA.FromHex(strHexWith);
str = colorA.ToString();
UE_LOG(LogTemp, Log, TEXT("%s"), *str);
//LogTemp: (R=137,G=125,B=0,A=241)		<- INCORRECT

FColor colorB;
colorB.FromHex(*strHexWithout);
str = colorB.ToString();
UE_LOG(LogTemp, Log, TEXT("%s"), *str);
//LogTemp: (R=0,G=8,B=197,A=0)			<- INCORRECT

int32 intValA = FParse::HexDigit(**strHexWith);
FColor colorC = FColor(intValA);
str = colorC.ToString();
UE_LOG(LogTemp, Log, TEXT("%s"), *str);
//LogTemp: (R=0,G=0,B=0,A=0)				<- INCORRECT

int32 intValB = FParse::HexDigit(**strHexWithout);
FColor colorD = FColor(intValB);
str = colorD.ToString();
UE_LOG(LogTemp, Log, TEXT("%s"), *str);
//LogTemp: (R=0,G=0,B=15,A=0)				<- INCORRECT

uint32 uintValA = FParse::HexNumber(*strHexWith);
FColor colorE = FColor(uintValA);
str = colorE.ToString();
UE_LOG(LogTemp, Log, TEXT("%s"), *str);
//LogTemp: (R=216,G=13,B=255,A=254)		<- Incorrect Order

uint32 uintValB = FParse::HexNumber(*strHexWithout);
FColor colorF = FColor(uintValB);
str = colorF.ToString();
UE_LOG(LogTemp, Log, TEXT("%s"), *str);
//LogTemp: (R=216,G=13,B=255,A=254)		<- Incorrect Order

I have no idea what you’re doing wrong, sorry. I had a quick look at FColor::FromHex() and it contains nothing weird. I tested your two strings with FromHex() and they worked as expected:

FString strHexWith = TEXT("#FED80DFF");
FString strHexWithout = TEXT("FED80DFF");

FColor Colour1 = FColor::FromHex(strHexWith);
FColor Colour2 = FColor::FromHex(strHexWithout);

UE_LOG(LogTraversabilityGraph, Log, TEXT("Colour1 = %s"), *Colour1.ToString());
UE_LOG(LogTraversabilityGraph, Log, TEXT("Colour2 = %s"), *Colour2.ToString());

/*	
	LogTraversabilityGraph: Colour1 = (R=254,G=216,B=13,A=255)
	LogTraversabilityGraph: Colour2 = (R=254,G=216,B=13,A=255)	
*/

You are calling static functions on an instance, which I would consider strange (and undesirable). The call itself is legal, but note that FromHex() returns the converted colour rather than writing into the instance you call it on — in your first two examples the return value is discarded, so colorA and colorB are never assigned and you are logging uninitialized memory.
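
A minimal sketch of the difference, using your variable names:

FColor colorA;
colorA.FromHex(strHexWith);                   // legal, but the returned colour is discarded; colorA stays uninitialized
FColor colorB = FColor::FromHex(strHexWith);  // the returned colour is actually stored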

🤷


Edit/Update:
Just got to the work PC and logged in this morning, and suddenly there was no issue. Checked the log and all looked good; re-loaded the editor and it stopped working again, with two of the entries reporting different values from yesterday.

I added FColor::FromHex (good catch) to the test and only it produces correct results. The other six methods all produce erroneous results.

Thank you, I was losing my mind trying to figure out why the engine hated me.


Original Reply:

It confuses the hell out of me, as I wouldn’t expect it to fail across the board.

The actual use is within a custom Slate Rich Text Decorator, where the hex values are extracted from the rich text; the hard-coded strHexWith and strHexWithout strings are merely there to remove another potential source of error.

At first I assumed it was the debugger incorrectly reporting watch variables (something UE and VS apparently do), so I threw together the posted test method. I even rebooted the editor, engine, and computer several times, as that’s apparently a thing too.

Oh, OK, I can see some problems with your latter original examples. In the first two the parsing itself is fine — the second is weird, as you are ‘dereferencing’ the string to pass a TCHAR array, but FromHex() only takes an FString, so it is just implicitly converted back. As noted above, though, the returned colour is discarded in both, so colorA and colorB are never actually assigned.

The third is behaving correctly: the double dereference means you are effectively passing only the first character into HexDigit(), in this case ‘#’, which is not a hexadecimal digit, so 0 is returned.
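
A quick standalone sketch of that behaviour (untested):

int32 NotADigit  = FParse::HexDigit(TEXT('#'));  // '#' is not a hex digit -> 0
int32 ValidDigit = FParse::HexDigit(TEXT('F'));  // 'F' parses as expected -> 15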

The fourth is also behaving correctly: this time you pass ‘F’ into HexDigit(), which gives back 15, and then construct a colour with it (constructing with an integer just sets the 32 bits raw, so the format is actually ARGB).

The 5th and 6th are likewise behaving correctly, for the same reason (the integer construction, and therefore ARGB format).
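
That raw interpretation lines up with the output you logged:

// 0xFED80DFF read as raw ARGB bits: A=0xFE, R=0xD8, G=0x0D, B=0xFF
FColor Raw(0xFED80DFFu);
// Raw.ToString() -> (R=216,G=13,B=255,A=254)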


If I’m understanding this correctly:

I left the * in on test #2; I didn’t notice that FParse::HexDigit takes a single character, not an array of them; and on top of that FParse::HexDigit returns an int, not the uint that FColor initializes from, so regardless it would always be off from what I expected.

As for the last two being out of the expected order, it looks like my mental model and little-endianness are what did me in. I just don’t think in endianness anymore, as I’ve been insulated from it for so long.

/* FColor.h */
#if PLATFORM_LITTLE_ENDIAN
	union { struct{ uint8 B,G,R,A; }; uint32 Bits; }; // <- What I was seeing
#else
	union { struct{ uint8 A,R,G,B; }; uint32 Bits; };
#endif

/* The visual order of a hex color: R='FE' G='D8' B='0D' A='FF' */
union { struct{ uint8 R,G,B,A; }; uint32 Bits; }; // <- My mental model

Stupid oversights on my part explain why the conversion wasn’t as expected. Thank you for your time & assistance.


Close, but a couple of clarifications:
The ‘digit’ thing is all that really got you on the HexDigit() ones; the int/uint thing doesn’t really matter, as the int you pass is just treated as a uint and otherwise works as expected.

The last two are not an endianness thing but a pixel-format thing. Standards are strange, and of course there isn’t really a single one anyway, but in file formats and image-editing programs we are accustomed to RGBA, while in hardware land the pixels on your video card are stored as ARGB. That is what FColor uses internally, and it is what the ctor taking an int sets ‘raw’.
To correct your original last two examples, you would rotate the bits right 8 places before constructing the color.
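
Something like this (a quick untested sketch) should give the values you expected:

uint32 Rgba = FParse::HexNumber(TEXT("FED80DFF")); // 0xFED80DFF, RGBA order
uint32 Argb = (Rgba >> 8) | (Rgba << 24);          // rotate right 8 -> 0xFFFED80D, ARGB order
FColor Colour(Argb);
// Colour.ToString() -> (R=254,G=216,B=13,A=255)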