How to create an image field or array to recreate a captured 3D volume


With NeRF I have created 1000 2D image slices of a forest (more or less like a CAT scan would scan slices of a head). I would like to create 1000 planes next to each other, each getting a consecutive texture from the slices sequence, recreating the 3D volume in Unreal.

I am a total noob in Unreal, but in Notch or Fusion I can do this pretty easily.

In Notch I’d use a thing they call Image Fields; in Fusion I just duplicate a plane and offset the texture animation one frame per duplicate. But I can’t find anything that sounds like a starting point for this in Unreal, besides grinding through more tutorials and maybe running into it that way :thinking:

Anybody care to point me in the right direction to get me started on this experiment?


You could spawn a plane in a for loop with an offset.
Then set its texture with Create Dynamic Material Instance → Set Texture Parameter Value (you would need to expose the texture as a parameter in the material).

Depending on how the images are stored you would have to take different approaches as to how to get the image (is it one file with offsets like a sprite sheet or many indexed files?)
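The loop described above can be modeled in plain Python (this is just a sketch of the logic, not Unreal API; `height_offset` and the index mapping are assumptions): each plane gets a position offset along one axis and the matching texture from the indexed sequence.

```python
# Sketch of the Blueprint spawn loop: one plane per slice,
# offset along Z, with the texture index matching the slice index.

def layout_slices(num_slices, height_offset):
    """Return (z_position, texture_index) for each plane in the stack."""
    return [(i * height_offset, i) for i in range(num_slices)]

# 1000 slices, 2.0 units apart
placements = layout_slices(1000, 2.0)
```

In the Blueprint version, each iteration would spawn the plane at that Z offset and feed `Picture Array[i]` into the dynamic material instance.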


Hi and thanks for your reply. It’s an image sequence of 1000 png files.

Creating the setup itself isn’t that complex. I just hope you can load an image sequence in as an array, so I don’t have to set up 1000 files by hand.

This script should pretty much generate what you need. Put it in a Blueprint. Make a variable of type Texture2D and make it an array (the icon on the right needs to be the 3×3 grid) called Picture Array, plus a float variable called Height Offset (the distance between slices).

And here is the material for the slice

Make sure to convert the texture to a parameter for it

Oh, and to fill in the images, just drag them into the Picture Array (in its Details panel once you click it).


Dude! Thx will try this right away!


I managed to get it working!

Though I need to play the project in order to see the spawned planes (obviously, since it’s a script). Is there a way to ‘bake’ this into a single entity that I can see in the viewport, so I can build stuff around it while also seeing it? Also, DOF doesn’t seem to work on it. Forgive me if these are obvious things and I sound like a noob again =D



The only way for it to spawn “in level” would be to have the script run in the Construction Script instead of at Begin Play. Not sure if all of the nodes will work in the Construction Script, though.

Just did a test and it seems to work OK. Here is a revised version of the function

Just call the custom event from the Construction Script and it should work. It also now deletes the slices on rebuild, so they don’t get duplicated.
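The rebuild behavior described above can be sketched in plain Python (hypothetical names, not Unreal API): clear any previously spawned slices before spawning a new set, so running the construction logic twice doesn’t duplicate them.

```python
class SliceStack:
    """Toy model of the construction-script behavior:
    rebuild() clears old slices before spawning new ones."""

    def __init__(self):
        self.slices = []

    def rebuild(self, num_slices, height_offset):
        self.slices.clear()          # delete previously spawned planes
        for i in range(num_slices):  # respawn one plane per texture
            self.slices.append({"z": i * height_offset, "texture_index": i})

stack = SliceStack()
stack.rebuild(10, 1.0)
stack.rebuild(10, 1.0)  # running again does not duplicate the planes
```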

Regarding depth of field, you may need to change the material settings.
Switching the translucency pass to “Before DOF” may fix it.

Also check whether you have “Mobile Separate Translucency” checked; it will skip DOF.


How did you recreate the 3D volume in Unreal, after loading the consecutive textures? Did you rewrite a volume rendering algorithm or something?

It’s just poly planes with texture slices (like an MRI scan)


I tried the method above, but the stack seems to be invisible when viewed perpendicular to the planes. Do I need many more texture slices to achieve the effect, or what step am I missing?

Planes have no depth, so they are only visible at an angle. To make the volume visible from a perpendicular view, you would need to make a volumetric material and project the slices onto a 3D-scaled cube.
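One common way to feed slices into a volumetric material is to pack them into a single tiled “pseudo-volume” texture (an atlas) and have the material look up the right tile for a given Z. A plain-Python sketch of the slice-index-to-tile-UV math (the atlas layout and function name are assumptions, not Unreal API):

```python
def slice_uv_offset(slice_index, tiles_x, tiles_y):
    """UV offset of a slice inside a tiles_x * tiles_y atlas,
    with slices packed in row-major order."""
    row, col = divmod(slice_index, tiles_x)
    assert row < tiles_y, "slice index outside the atlas"
    return (col / tiles_x, row / tiles_y)

# e.g. 64 slices packed into an 8x8 atlas:
# slice 9 sits in the second row, second column
print(slice_uv_offset(9, 8, 8))  # -> (0.125, 0.125)
```

In a material, this same arithmetic (done per-sample from the ray-march Z) selects which tile of the atlas to read, which is how pseudo-volume textures are typically sampled.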

How do you project 2D texture slices into a volumetric material? Do you have a specific example? I’m really stuck here.