Hi,
I’m using a compute shader in a postprocessing pass. When sampling the scene color texture, I’ve noticed that I always end up with a few black pixels, even when the brightness is at its maximum.
I made a histogram CS and this scene to test it:
[Image Removed]
In the black box, the expected result is correct:
[Image Removed]
but in the white one, there are still pixels counted as black:
[Image Removed]
The weird part is that the number of black pixels increases as I make the viewport bigger or smaller (I’m dragging the Scene Outliner side tab).
[Image Removed]
In the CS code, I’m taking the texture size into account:
if (DTid.x >= TextureSize.x || DTid.y >= TextureSize.y)
{
return;
}
and I retrieve the texture size with:
int32 Width = TextureSRV->GetParent()->Desc.Extent.X;
int32 Height = TextureSRV->GetParent()->Desc.Extent.Y;
[...]
PassParameters->TextureSize = FUintVector2(Width, Height);
Is this normal behavior? Am I missing something? Perhaps I’m getting the texture size wrong, or overflowing in the CS? I don’t know if I’ve forgotten something obvious.
Thank you in advance,
Best regards,
Jérémy
[Attachment Removed]
Steps to Reproduce
Minimal project to observe the issue:
https://github.com/jealvcim/ViewExt_Compute
[Attachment Removed]
Hi there,
As you described, the black pixels appear and the count changes when the viewport size changes. This behavior can occur when sampling outside the valid region of the texture or when reading undefined pixels.
In the Unreal Engine rendering pipeline, SceneColor textures are often larger than the actual viewport. The texture size represents the allocated render target size, not the valid rendered region. Only ViewRect contains valid pixels. The remaining areas of the texture may contain black or undefined (garbage) data.
I made the following changes in the repro project, which appear to resolve the issue.
[Image Removed]
I hope this clarifies the case. Please let me know if you have any further questions.
Best regards,
Henry Liu
[Attachment Removed]
Thank you.
I tried with View.UnconstrainedViewRect. It works perfectly in a standalone game, but in PIE the width/height of the rectangle is even greater than before, causing me to read even more bad pixels.
(I pushed the change to the repository)
Best regards,
Jérémy
[Attachment Removed]
Hi,
From my testing, I wasn’t able to identify any differences between the standalone and PIE. Could you please provide more details about the issues you’re encountering in PIE? Also, when you mention ‘bad pixels,’ are you referring to black pixels? And is the rectangle you’re describing the view rectangle?
Cheers,
[Attachment Removed]
Hi,
Yes, when I was talking about “bad pixels”, I was referring to pixels that I think were sampled outside the view rect and counted as black pixels.
When the viewport is small, it works:
[Image Removed]
No pixels are counted as 0 (black) by the CS:
[Image Removed]
Here are the values sent to the CS:
[Image Removed]
But if I start to slightly increase the size:
[Image Removed]
Many pixels start being counted as black:
[Image Removed]
All this from simply increasing the X size of the display window by 17 pixels in this example:
[Image Removed]
The more I increase the size of the viewport, the more Histogram[0] (the black-pixel count) increases, despite a completely white screen.
I am indeed using the view rect:
[Image Removed]
Then, in the function, the width/height sent to the CS’s TextureSize parameter:
[Image Removed]
Best regards,
Jérémy
[Attachment Removed]
Thank you for the detailed information.
It looks like UnconstrainedViewRect isn’t ideal for this case. I tested using SceneColor.ViewRect instead and did not encounter the issue.
Cheers,
[Attachment Removed]
SceneColor.ViewRect fixed the issue! It perfectly matches the texture size.
Thank you for your help!
Best regards,
Jérémy
[Attachment Removed]
Thank you for the update. I’m glad I could be of help.
I will close the case now, but don’t hesitate to reach out if anything else comes up.
Cheers,
[Attachment Removed]