Apparently I can’t create textures larger than 128x128? That seems quite small for a game engine. In theory a 4096x4096 texture should be more than enough for this task.
But even though I’m only loading floats into the red channel, shouldn’t a 4096x4096 texture still have enough room? That’s about 16.7 million pixels, and if each pixel holds one float in the red channel there should be more than enough pixels to hold the data.
I guess I may be misunderstanding the difference between RHI resolution and texture resolution. Aren’t those the same?
The texture and RHI resolutions are the same, but from the log you showed in the other post it was trying to allocate a 32433x32433 texture, and the maximum is 16384x16384.
I just tried creating one here at 4096x4096xR32f and one at 16384x16384xR32f and they seem to work ok…
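For reference, a minimal sketch of creating and filling an R32F texture from a flat float array (assumes UE5’s GetPlatformData(); older engine versions access the PlatformData member directly):

```cpp
#include "Engine/Texture2D.h"

// Minimal sketch: create a transient R32F texture and upload a flat float
// array into its red channel.
UTexture2D* CreateFloatTexture(const TArray<float>& Data, int32 Width, int32 Height)
{
	UTexture2D* Texture = UTexture2D::CreateTransient(Width, Height, PF_R32_FLOAT);
	if (!Texture)
	{
		return nullptr;
	}
	Texture->SRGB = false;
	Texture->Filter = TF_Nearest; // exact texel reads, no filtering between data points

	// Copy the floats into mip 0: one 32-bit float per texel, red channel only.
	// Copy only as many floats as we actually have, in case Data is shorter
	// than Width * Height.
	FTexture2DMipMap& Mip = Texture->GetPlatformData()->Mips[0];
	void* Dest = Mip.BulkData.Lock(LOCK_READ_WRITE);
	FMemory::Memcpy(Dest, Data.GetData(), FMath::Min(Data.Num(), Width * Height) * sizeof(float));
	Mip.BulkData.Unlock();

	Texture->UpdateResource();
	return Texture;
}
```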
Okay, I found the problem, and it was quite an obvious mistake on my part. The error says the texture exceeds the maximum allowed dimension, not size. In my code I was treating the texture like a flat float array, so I set its dimensions to (data.Num(), 1), and since the data is typically millions of floats long, the width alone blows well past the dimension limit.
It’s nice to have the texture structured like a flat array because it keeps index-based sampling in the shader simple, but it seems I’ll just have to pack the data into a 2D layout instead. I tried making the texture (data.Num() / 1080, 1080) and it was created successfully. Thanks for everyone’s help!
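For anyone hitting the same issue, a rough sketch of the packing and index math (hypothetical helper names; the 16384 limit is the one from the error above). Note that data.Num() / 1080 truncates when the count isn’t an exact multiple of 1080, so a ceiling division plus padding is a bit safer:

```cpp
// Sketch: choose 2D dimensions for a flat float array so neither side
// exceeds the RHI limit, then pad the array to fill the last row.
static constexpr int32 MaxTextureDim = 16384; // the limit mentioned above

void ComputePackedDims(int32 NumFloats, int32& OutWidth, int32& OutHeight)
{
	// Works for up to 16384 * 16384 (~268 million) floats.
	OutWidth  = FMath::Min(NumFloats, MaxTextureDim);
	OutHeight = FMath::DivideAndRoundUp(NumFloats, OutWidth); // ceil, so a partial last row still fits
}

// Before uploading, pad the source data to Width * Height entries:
//   TArray<float> Padded = Data;
//   Padded.SetNumZeroed(Width * Height);
//
// In the shader, a linear index maps back to a texel as:
//   int X = Index % Width;
//   int Y = Index / Width;
// and the value can be read with Texture.Load(int3(X, Y, 0)) so no filtering is applied.
```

Keeping the width at the maximum and growing the height keeps the shader-side index math a simple modulo/divide while staying inside the dimension limit.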