Your thoughts on and comments about Volume Rendering in Unreal Engine 4.

I made a temporal volumetric data set and had no issues rendering it. The problem was that VRAM filled up very quickly: 17 frames was about what I could fit at an acceptable resolution before I started seeing issues. This is in Unreal, naturally, and will not apply to Houdini. I was unable to figure out a method for swapping textures in and out of VRAM, so UE4 complained about massive memory usage.

I was wondering if it would be possible to make a custom node similar to this one (from the material M_VolumeRayMarch_Lit_LinearSteps_Preview):

… so that I can use an RGB texture like this:

In a similar way to this:

Maybe that is not much, but since a 4D texture is complicated and the current setup seems to treat the input texture as grayscale, I thought we could at least add a little time variation with this hue-shift-based approach.

This is easy enough to do. If you open the custom node you will see that it only uses the R channel, but you can return the entire float4 texture value instead if you modify the shader code in the engine folder. This is taken from my Common.usf file:


//** Tex       **// Input Texture Object storing Volume Data
//** inPos     **// Input float3 for Position, 0-1
//** xsize     **// Input float for num frames in x,y directions
//** numframes **// Input float for total num frames
float4 PseudoVolumeTextureColour(Texture2D Tex, SamplerState TexSampler, float3 inPos, float xsize, float numframes)
{
	float zframe = ceil( inPos.z * numframes );
	float zphase = frac( inPos.z * numframes );

	float2 uv = frac(inPos.xy) / xsize;

	float2 curframe = Convert1dto2d(xsize, zframe) / xsize;
	float2 nextframe = Convert1dto2d(xsize, zframe + 1) / xsize;

	float4 sampleA = Tex.SampleLevel(TexSampler, uv + curframe, 0);
	float4 sampleB = Tex.SampleLevel(TexSampler, uv + nextframe, 0);
	// Treat the sample as empty (zero alpha) when either neighbouring slice is near black.
	float sampleLen = length(sampleA.xyz) * length(sampleB.xyz);
	float4 result = lerp( sampleA, sampleB, zphase );
	if (sampleLen < 0.01)
	{
		result.a = 0;
	}
	return result;
}

Then you have to update the logic of the custom node in your picture to use all 4 channels. I am using it differently, so I can't just give you mine. My project is open source, so you can check it out to get an idea of how I am using it, but it will not be compatible with that material.

GL

That plugin actually has a few shaders that do that already. Check the ray march shader with IBL in the name. There is also a velocity preview shader that displays the RGB of a velocity texture you could use.

Thanks guys!
I’ll try that.

Have you disabled depth testing? Is it a post process material?

I usually only see that when I change the number of steps without recompiling the material, or when I use DrawScale3D on the mesh. DrawScale3D isn't handled by the box intersection math at the moment. Other than those two things, I have not seen that.

Looks like you may be applying a non-uniform scale to the box.

On the topic of the 4D flipbook: it is possible. I did it a while ago.

[video]Volumetric Fire WIP test in Unreal Engine - YouTube

You only get a certain number of frames and pretty much have to use a 4k/8k texture. I was able to get 128^3, IIRC.
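That frame budget follows directly from the atlas size: a 128^3 volume needs 128 slices of 128x128, a 4096^2 texture holds 32 x 32 = 1024 such tiles (8 time frames), and an 8192^2 texture holds 4096 tiles (32 time frames). A quick sanity check of that arithmetic (my own back-of-the-envelope math, not from the post):

```c
#include <assert.h>

/* How many full time frames of an n^3 volume fit in a square atlas
   of side `atlas` pixels? Each frame needs n slices of n x n. */
static int time_frames(int atlas, int n)
{
    int tiles_per_row = atlas / n;             /* slices along one atlas axis */
    int tiles = tiles_per_row * tiles_per_row; /* total 2D slices in the atlas */
    return tiles / n;                          /* n slices per volume frame */
}
```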

One trick is to use the RGB channels as different time steps if you only care about grayscale. The basic idea is that you sample the incoming time (passed from a Blueprint as a parameter) and use it to index into the texture in the same way as the Coord2dto1d function, but now you have three extra indexes to handle.
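A minimal C sketch of how that split could work (my own reading of the trick; the packing order is an assumption and `split_frame` is a hypothetical helper, not from the post):

```c
#include <assert.h>

/* Split a global time-frame index into which RGB-packed atlas to sample
   and which of its channels holds that frame.
   Assumes grayscale frames are packed R, G, B in order within each atlas. */
static void split_frame(int frame, int *atlas_index, int *channel)
{
    *atlas_index = frame / 3; /* which packed atlas */
    *channel     = frame % 3; /* 0 = R, 1 = G, 2 = B */
}
```

So frame 7 would come from the G channel of the third atlas, tripling the time range for grayscale data at no extra memory cost.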

That is great! I was also thinking of trying video textures to do that as well; it would probably give you much more range.

You can either voxelize empty space around the character to make the volume a cube, or you can un-squash in the rendering material by dividing the position by the 3D scale at every lookup.

It looks like you normalized this along Z. Option 1 means you should see a lot of black frames in there if you encode a cube. Option 2 (un-squash) means you divide the position of the 3D lookup by the non-uniform scale inside the ray march shader. You also have to clamp.
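Option 2 can be sketched like this (names are my own; the shader-side version would do the same per-axis division at every lookup, and the clamp keeps samples from reading outside the 0-1 volume after rescaling):

```c
#include <assert.h>

static float clamp01(float v)
{
    return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);
}

/* Undo a non-uniform scale before the volume lookup:
   divide each axis of the local position by the mesh scale, then clamp. */
static void unsquash(const float pos[3], const float scale[3], float out[3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = clamp01(pos[i] / scale[i]);
}
```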

I think I found a small problem with the PseudoVolumeTexture code (in Common.ush), but I'm not sure:

I'm building on @ 's volume render plugin. When I tested this texture

I got this result (the colors are from a transfer function):

At the top of the number 8 there is what looks like a wrapping number 1. While looking for the possible cause, I ended up in the function PseudoVolumeTexture, where the first line caught my attention:

float zframe = ceil(inPos.z * numframes);

When I changed ceil to floor, I got this:

There is still a 1 on top of the 8, but it is thinner, which makes me think it may be an interpolation between the last slice and the first one.

Is this right?

Cheers

@lucastakejame Just in case you haven't seen it yet, there is Ryan's blog on the subject at this location: here

@NilsonLima I actually found that post before this thread; I just posted here because the function I mentioned is in the engine folder, not in the code from the post.

I am not able to see what comes after the ":" in your post… are there any pictures? If so, they are not showing properly, not even as a link. Upload them by clicking on the underlined A above and selecting the Picture icon.

Trying to upload again:

I edited 2 lines in the PseudoVolumeTexture function and it seems to have resolved the problem:


float4 PseudoVolumeTexture(Texture2D Tex, SamplerState TexSampler, float3 inPos, float2 xysize, float numframes,
    uint mipmode = 0, float miplevel = 0, float2 InDDX = 0, float2 InDDY = 0)
{
    // float zframe = ceil(inPos.z * numframes);
    // [EDIT] the first z frames were appearing after the last ones
    float zframe = floor(inPos.z * numframes);
    float zphase = frac(inPos.z * numframes);

    float2 uv = frac(inPos.xy) / xysize;

    float2 curframe = Tile1Dto2D(xysize.x, zframe) / xysize;
    float2 nextframe = Tile1Dto2D(xysize.x, min(zframe + 1, numframes) )/ xysize;

    float4 sampleA = 0, sampleB = 0;
    switch (mipmode)
    {
    case 0: // Mip level
        sampleA = Tex.SampleLevel(TexSampler, uv + curframe, miplevel);
        //sampleB = Tex.SampleLevel(TexSampler, uv + nextframe, miplevel);
        // [EDIT] Without this it seems there is an interpolation between the last and first frame
        sampleB = Tex.SampleLevel(TexSampler, float2(saturate(uv.x + nextframe.x), saturate(uv.y + nextframe.y)), miplevel);
        break;
    case 1: // Gradients automatic from UV
        sampleA = Texture2DSample(Tex, TexSampler, uv + curframe);
        sampleB = Texture2DSample(Tex, TexSampler, uv + nextframe);
        break;
    case 2: // Derivatives provided
        sampleA = Tex.SampleGrad(TexSampler, uv + curframe,  InDDX, InDDY);
        sampleB = Tex.SampleGrad(TexSampler, uv + nextframe, InDDX, InDDY);
        break;
    default:
        break;
    }

    return lerp(sampleA, sampleB, zphase);
}


[EDIT] Following @NilsonLima 's comment, here is the result:



Since I'm working with other people, how can I embed this code in my material (instead of directly changing the engine shader) without just copying it into every place the function is called?

One way to do it is with a custom node, as Ryan explains in this video: Custom Material Node: How to use and create Metaballs | Live Training | Unreal Engine - YouTube, where you nest two custom nodes.

Currently this trick only works for the DirectX shader compiling path, while OpenGL and Vulkan are still not working; this has been reported.

Another way is to create a global shader plugin in C++ where you can expose that function, which is the best approach since it is portable to other platforms. You can follow this post as a guide: https://rcaloca.blogspot.com.br/2017…rs-to-ue4.html

Also, could you post the result of your test with the change, so others can use the info as reference?

Done, I just edited the previous post =)!

Thanks for the reply, Nilson!

:smiley: Thanks! There is always a Brazilian sneaking around :))