Hey Rama, I'm hoping you can shed some light on an issue I'm having while attempting to use your plugin. I have a 4096x4096 texture imported from my image editor of choice, Krita, and have used your plugin to generate an array of pixels from the image. I then convert the hit location of a trace to a coordinate within that array and determine the color of the pixel at that coordinate. This works for pixels whose channels are fully on or off, such as (0, 1, 1) / (0, 255, 255) or (1, 0, 1) / (255, 0, 255). However, whenever I attempt to read the color of a pixel whose channel values fall anywhere between 0 and 1, the result differs significantly from what both UE4's color picker and my image editing software report. sRGB is turned off, and all other requirements for the node to work are in effect.
As a test, I used (0, .5, 0), which in my image editing software is (0, 128, 0), and imported the image. The pixel color returned by your node comes out to roughly (0, .29608, 0), yet when I use UE4's color picker on the in-game texture I still get roughly .5. I assume this is caused by UE4's color profile conflicting with the color profile of my image editing software, but I don't know exactly what UE4's color profile is or how to accurately convert RGB values from my software to the engine. Relatedly, trying to use .bmp files with the node crashes UE4.
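For what it's worth, a mismatch like this is usually a linear-vs-gamma-encoding issue: image editors store channel values gamma-encoded (sRGB), while the engine may hand back linear values. A middle-gray 128/255 sRGB value decodes to roughly 0.216 linear, which is in the same ballpark as the ~0.296 I'm seeing (the exact number may depend on the gamma curve the plugin or engine applies, which I don't know). A minimal sketch of the standard sRGB conversion, assuming the plugin is returning linear values:

```python
def srgb_to_linear(c):
    """Decode an sRGB-encoded channel value in [0, 1] to linear (IEC 61966-2-1 piecewise curve)."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode a linear channel value in [0, 1] back to sRGB."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * (c ** (1.0 / 2.4)) - 0.055

# 128/255 as authored in an sRGB image editor decodes to ~0.216 linear
print(round(srgb_to_linear(128 / 255), 4))  # ~0.2159
```

If the node is returning linear values, running them through `linear_to_srgb` should recover numbers that match the image editor; if the numbers still don't line up, the plugin may be applying a different gamma curve internally.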