Texture generation from buffers

Hi,

I have some gameplay C++ code that computes and stores information in two uint32 buffers every frame (or every couple of frames).
The information stored in each uint is packed at the bit level (not the byte level); for example, buffer 1 is encoded like this:


//  0000 0000000 0000000 0000000 0000000
// | A  |   B   |   C   |   D   |   E   |

and buffer 2 is encoded like this:


// 0000 0000 0000 0000 0000 0000 0000 0000
//  A    B    C    D    E    F    G    H
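
For reference, here’s a minimal sketch of how I pack and unpack buffer 1 (the field names A–E are placeholders, and UnpackField is just a helper written for this post):


#include <cstdint>

// Buffer 1 layout, MSB first: A:4 | B:7 | C:7 | D:7 | E:7 (4 + 4*7 = 32 bits).
inline uint32_t PackBuffer1(uint32_t A, uint32_t B, uint32_t C, uint32_t D, uint32_t E)
{
    return ((A & 0xFu)  << 28) |  // A: 4 bits
           ((B & 0x7Fu) << 21) |  // B: 7 bits
           ((C & 0x7Fu) << 14) |  // C: 7 bits
           ((D & 0x7Fu) <<  7) |  // D: 7 bits
            (E & 0x7Fu);          // E: 7 bits
}

// Extract NumBits bits (NumBits < 32) starting at bit Shift, counting from the LSB.
inline uint32_t UnpackField(uint32_t Word, uint32_t Shift, uint32_t NumBits)
{
    return (Word >> Shift) & ((1u << NumBits) - 1u);
}

// Example: uint32_t B = UnpackField(Packed, 21, 7);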

I would like to use this computed data in a material. My initial approach was to use “import buffer as Texture2D” (https://docs.unrealengine.com/en-US/…e2D/index.html) and then do some bit shifting in the material to get the structured data back for each pixel.
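
To make that concrete, this is roughly the C++ side I was picturing. I’m assuming UTexture2D::CreateTransient with PF_R32_UINT is the right way to expose a raw uint32 buffer to a material, and PackedBuffer1 stands in for my TArray<uint32>:


#include "Engine/Texture2D.h"

// One-time creation: wrap the uint32 buffer in a transient texture.
UTexture2D* Tex = UTexture2D::CreateTransient(4096, 2048, PF_R32_UINT);
Tex->SRGB = false;
Tex->Filter = TF_Nearest; // bit patterns must never be filtered/interpolated

// Copy the CPU buffer into mip 0.
void* MipData = Tex->PlatformData->Mips[0].BulkData.Lock(LOCK_READ_WRITE);
FMemory::Memcpy(MipData, PackedBuffer1.GetData(), PackedBuffer1.Num() * sizeof(uint32));
Tex->PlatformData->Mips[0].BulkData.Unlock();
Tex->UpdateResource();


The material would then sample that texture and do the masking/shifting per pixel (e.g. in a Custom node).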

I have some questions, though: my goal is to save memory, because I will have multiple gameplay actors that each compute these two buffers.

  • Is this an efficient approach? The data buffers are 4096x2048 and there is quite a lot of information stored per pixel; that’s why I pack the data into a uint instead of using more than two textures (which would let me encode the data at the byte level). But I suppose bit shifting is fast and won’t be a problem in the material.

  • How can I handle updates of the buffers efficiently? I suppose calling a BufferToTexture conversion every time my input buffers change is costly. Isn’t there a way to share the texture buffer between the material and the C++ actor that computes the data? (See the sketch below for what I’m currently picturing.)
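
The closest thing I’ve found for in-place updates is UTexture2D::UpdateTextureRegions, which enqueues the copy on the render thread without recreating the texture resource. A rough sketch, where AMyActor, PackedTexture and PackedBuffer1 are placeholders from my code:


void AMyActor::PushBufferToTexture()
{
    // One full-texture region, reused for every update.
    static FUpdateTextureRegion2D Region(0, 0, 0, 0, 4096, 2048);

    PackedTexture->UpdateTextureRegions(
        0,                          // mip index
        1,                          // number of regions
        &Region,
        4096 * sizeof(uint32),      // source pitch (bytes per row)
        sizeof(uint32),             // source bytes per pixel
        reinterpret_cast<uint8*>(PackedBuffer1.GetData()));
    // Note: the source data must stay valid until the render thread has
    // consumed it (UpdateTextureRegions takes an optional cleanup callback).
}


The material would reference the texture once through a dynamic material instance (SetTextureParameterValue), so only the pixel data changes on each update.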

Thank you for your advice,