Possible to use my own pure HLSL and GLSL shader code?

I’ve written a few shaders, one of them being a really nice HLSL and GLSL refraction and dispersion shader. I was wondering whether it is possible to just drop my shader code in somewhere?

Redoing this in the material editor would be impossible without faking it, and I don’t want to fake it if I have the means not to.


You won’t be able to use your GLSL code directly; all GLSL is automatically cross-compiled from the HLSL source. You have two options for using HLSL.

  1. A Custom node in the material editor. This has the obvious limitation of having to work within the material framework, so it would not be a direct copy and paste; you will have to modify your code to work with the material editor.

  2. Create a custom USF (Unreal Shader File) and paste your code in there. You will then need to access it from the renderer using either an FGlobalShader or an FLocalVertexFactory (or one of its flavours), and you will need to set up and pass in all the required variables and such. A minimal sketch of this approach follows.
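For option 2, the C++ side of a global shader wrapping your USF code might look like the sketch below. FMyRefractionPS, MyRefraction.usf and MainPS are placeholder names, and the exact macro arguments vary between engine versions (newer versions expect a virtual shader path rather than a bare file name):


// Minimal global pixel shader wrapping code pasted into a custom USF file.
// All names here are placeholders, not engine identifiers.
class FMyRefractionPS : public FGlobalShader
{
	DECLARE_SHADER_TYPE(FMyRefractionPS, Global);
public:
	FMyRefractionPS() {}
	FMyRefractionPS(const ShaderMetaType::CompiledShaderInitializerType& Initializer)
		: FGlobalShader(Initializer)
	{}

	// Compile this shader for every platform; restrict here if needed.
	static bool ShouldCache(EShaderPlatform Platform) { return true; }
};

// Binds the class to MyRefraction.usf and its MainPS entry point.
IMPLEMENT_SHADER_TYPE(, FMyRefractionPS, TEXT("MyRefraction"), TEXT("MainPS"), SF_Pixel);
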

Below is a link to a custom Vertex Factory I wrote for the metaballs, for you to peruse and hopefully get some ideas from. The code is missing the geometry shader code, but that is not required for learning (vertex factories don’t allow geometry shaders by default). If required, I can add the remaining files.

Thank you for the quick reply! After a quick glance I’m quite certain these are exactly the droids I’m looking for. Cheers!

Thanks for this resource!

However, I’m still having difficulty working out how to use the shader:

  • How do I use the vertex factory class?
  • When is it compiled, and do I need to do anything for that?
  • How does it integrate with the deferred renderer?
  • How do I add geometry and pixel shaders?

Hi, it’s probably easier if you start off with FSimpleElementVS and FSimpleElementPS in SimpleElementShaders.h/.cpp as a reference. Geometry shaders are handled the same way; you can see in Texture2DPreview.cpp how they’re used. Vertex factories are a whole different beast :wink: there’s probably no real reason to make a new one.
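To illustrate the pattern those classes follow (as a hedged sketch only; FMyVS and Transform are placeholder names, not actual engine identifiers): a global vertex shader typically declares an FShaderParameter, binds it to the matching uniform in the USF file, and exposes a SetParameters helper:


// Sketch of the FSimpleElementVS-style parameter pattern.
class FMyVS : public FGlobalShader
{
	DECLARE_SHADER_TYPE(FMyVS, Global);
public:
	FMyVS() {}
	FMyVS(const ShaderMetaType::CompiledShaderInitializerType& Initializer)
		: FGlobalShader(Initializer)
	{
		// Bind to the uniform named "Transform" in the USF file.
		Transform.Bind(Initializer.ParameterMap, TEXT("Transform"));
	}

	void SetParameters(FRHICommandList& RHICmdList, const FMatrix& InTransform)
	{
		SetShaderValue(RHICmdList, GetVertexShader(), Transform, InTransform);
	}

	// Keep the parameter binding when the shader is serialized.
	virtual bool Serialize(FArchive& Ar) override
	{
		bool bShaderHasOutdatedParameters = FGlobalShader::Serialize(Ar);
		Ar << Transform;
		return bShaderHasOutdatedParameters;
	}

	static bool ShouldCache(EShaderPlatform Platform) { return true; }

private:
	FShaderParameter Transform;
};
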

Thanks for the answer, this is definitely useful! However, I am a bit confused about the different shader ‘types’: what is the difference between using an FGlobalShader and an FVertexFactory? Is a global shader applied to the whole render target (post-processing), or can it be applied to a mesh (i.e. setting the shader as the “material”, in the general sense, of a specific mesh to be rendered)?

My intuition is that a vertex factory is only needed if the format of the vertices is not standard; is that right?

In my case I want to generate camera-facing billboards in a geometry shader (though before that I’m going to try a variant of my algorithm with a vertex shader), but the generation of the billboards is a bit non-standard (there is a lot more to do when generating them). Then I have a special lighting model in a pixel shader.

Up.

I tried to find some documentation about the workflow but found nothing. I am able to get UE to compile a shader and to create a shader class (with IMPLEMENT_SHADER_TYPE & co), but I don’t know how to link everything together:

  • I have some shader classes (vertex & pixel shaders to start; I’ll add a geometry shader once the rest is working)
  • I have a VertexDeclaration render resource (sketched below)
  • I have a mesh following that declaration
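For reference, a vertex declaration resource along these lines might look like the sketch below. FCustomVertex and GCustomVertexDeclaration match the names used later in this post, but the single float3 position element is an assumption about the vertex layout, and the FVertexElement constructor arguments vary slightly between engine versions:


// Hedged sketch of a vertex declaration render resource; assumes
// FCustomVertex begins with a float3 position and has no other attributes.
class FCustomVertexDeclaration : public FRenderResource
{
public:
	FVertexDeclarationRHIRef Declaration;

	virtual void InitRHI() override
	{
		FVertexDeclarationElementList Elements;
		// Stream 0, offset 0, float3, attribute index 0, one-vertex stride.
		Elements.Add(FVertexElement(0, 0, VET_Float3, 0, sizeof(FCustomVertex)));
		Declaration = RHICreateVertexDeclaration(Elements);
	}

	virtual void ReleaseRHI() override
	{
		Declaration.SafeRelease();
	}
};

// Global instance, initialized on the render thread at startup.
TGlobalResource<FCustomVertexDeclaration> GCustomVertexDeclaration;
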

I tried without a VertexFactory… to no avail.

So, how do I render the mesh using my shaders?

I tried to do the following:



TShaderMapRef<FCustomVS> CustomVS(globalShaderMap);
TShaderMapRef<FCustomPS> CustomPS(globalShaderMap);

static FGlobalBoundShaderState shaderState;
SetGlobalBoundShaderState(RHICmdList, ERHIFeatureLevel::SM5, shaderState, GCustomVertexDeclaration.Declaration, *CustomVS, *CustomPS);

FMatrix localToWorld = GetLocalToWorld();
FMatrix worldToScreen = View->ViewMatrices.ViewMatrix * View->ViewMatrices.ProjMatrix;

const uint32 nbTriangles = meshBuffer.Indices.Num() / 3;
const uint32 nbVertices = meshBuffer.Vertices.Num();
const uint32 vSize = sizeof(FCustomVertex);
const uint32 iSize = sizeof(int32);
// Index data comes from Indices, vertex data from Vertices.
const void* iBuffer = meshBuffer.Indices.GetData();
const void* vBuffer = meshBuffer.Vertices.GetData();

CustomVS->SetParameters(RHICmdList, localToWorld * worldToScreen);
// Argument order is (MinVertexIndex, NumVertices, NumPrimitives).
DrawIndexedPrimitiveUP(RHICmdList, PT_TriangleList, 0, nbVertices, nbTriangles, iBuffer, iSize, vBuffer, vSize);


It renders nothing; I don’t see the mesh at all. The pixel shader simply outputs yellow, and the vertex shader just transforms the position from world to screen:



// Structs assumed by the entry point; position comes in as float3.
struct VS_INPUT  { float3 position : ATTRIBUTE0; };
struct VS_OUTPUT { float4 position : SV_POSITION; };

float4x4 worldToScreen;

VS_OUTPUT main(VS_INPUT input)
{
	VS_OUTPUT o;
	o.position = mul(float4(input.position, 1), worldToScreen);
	return o;
}


I’ve tested that in a render command enqueued in the TickComponent of the component containing the geometry, and in the DrawDynamicElements of its associated PrimitiveSceneProxy (but not both at the same time). No difference; nothing is rendered.
At this point I don’t understand. Any help from a dev?

Thanks!

@UnrealDevTeam, we need your help!

I’ve just found this very convenient project on GitHub, check it out :wink:
GitHub - Temaran/UE4ShaderPluginDemo: A tutorial project that shows how to implement HLSL Pixel and Compute shaders in UE4


Updated Temaran’s project for UE 4.17+: GitHub - ValentinKraft/UE4ShaderPluginDemo: A tutorial project that shows how to implement HLSL Pixel and Compute shaders in UE4
