I’ve written a few shaders, one of them being a really nice HLSL and GLSL refraction and dispersion shader. I was wondering if it is possible to just drop my shader code in somewhere?
Redoing this in the material editor would be impossible without faking it, and I don’t want to fake it if I have the means not to.
You won’t be able to use your GLSL code directly; in Unreal, the GLSL is auto-generated by cross-compiling the HLSL. You have two options for using HLSL:
Custom node in the material editor. This has the obvious limitation of having to work within the material framework, so it would not be a direct copy-and-paste; you will have to modify your code to work with the material editor.
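For illustration, a Custom node body is just raw HLSL that the editor wraps in a generated function; something along these lines (a sketch only — `Normal`, `ViewDir`, and `IOR` are assumed input-pin names you would declare on the node):

```hlsl
// Inputs declared on the Custom node (assumed names): Normal, ViewDir, IOR.
// The node body is pasted as-is; 'return' yields the node's output value.
float3 R = refract(normalize(ViewDir), normalize(Normal), 1.0 / IOR);
// Return the refracted direction; wire it into e.g. a cubemap sample
// or a screen-UV offset to build the refraction effect.
return R;
```

The limitation mentioned above is visible here: you only get a single function body per node, so a multi-pass or multi-entry-point shader has to be broken up across nodes or restructured.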
Create a custom USF (Unreal Shader File) and paste your code in there. You will then need to access it from the renderer, using either an FGlobalShader or an FLocalVertexFactory (or one of its flavours), and you would need to set up and pass in all the required variables yourself.
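As a rough sketch of the second option, a minimal FGlobalShader wrapper for a USF file looks something like the following (UE4-era API as used in this thread; `FMyRefractionPS`, the file name, and the entry point are placeholders):

```cpp
// Sketch only: minimal pixel-shader wrapper for a custom USF file.
class FMyRefractionPS : public FGlobalShader
{
	DECLARE_SHADER_TYPE(FMyRefractionPS, Global);
public:
	FMyRefractionPS() {}
	FMyRefractionPS(const ShaderMetaType::CompiledShaderInitializerType& Initializer)
		: FGlobalShader(Initializer)
	{}
	// Old-style UE4 hook deciding which platforms compile this shader.
	static bool ShouldCache(EShaderPlatform Platform) { return true; }
};
// Binds the class to the MainPS() entry point in MyRefraction.usf
// (both names are placeholders for your own file and function).
IMPLEMENT_SHADER_TYPE(, FMyRefractionPS, TEXT("MyRefraction"), TEXT("MainPS"), SF_Pixel);
```

You would add FShaderParameter members for anything your USF code reads from constants, serialize them, and set them when you bind the shader for drawing.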
Below is a link to a custom Vertex Factory I wrote for the metaballs for you to peruse and hopefully get some ideas. The code is missing the geometry shader code, but it is not required for learning (Vertex Factories by default don’t allow geometry shaders). If required I can add the remaining files.
Hi, it’s probably easier if you start off from FSimpleElementVS and FSimpleElementPS in SimpleElementShaders.h/.cpp as a reference. Geometry shaders are handled the same way; you can see in Texture2DPreview.cpp how they’re used. Vertex factories are a whole different beast, and there’s probably no real reason to make a new one.
Thanks for the answer, this is definitely useful! However, I am a bit confused about the different shader ‘types’: what is the difference between using an FGlobalShader and an FVertexFactory? Is a global shader applied to the whole render target (post-processing), or can it be applied to a mesh (i.e., setting the shader as the “material”, in the general sense, of a specific mesh to be rendered)?
My intuition is that a vertex factory is only needed if the format of the vertices is not standard — is that right?
In my case I want to generate camera-facing billboards in a geometry shader (though first I’m going to try a variant of my algorithm with just a vertex shader). The billboard generation is a bit non-standard (there is a lot more to do when generating each one), and then I have a special lighting model in a pixel shader.
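For reference, the standard part of what I’m describing — expanding a point into a camera-facing quad in a geometry shader — looks roughly like this in plain HLSL (a sketch only: `VSOut`, `GSOut`, `CameraRight`, `CameraUp`, `HalfSize`, and `ViewProj` are assumed types and constants, not engine names):

```hlsl
// Expand each input point into a camera-facing quad (triangle strip).
[maxvertexcount(4)]
void MainGS(point VSOut In[1], inout TriangleStream<GSOut> Out)
{
	float3 Center = In[0].WorldPos;
	// Corner order chosen so the strip forms two valid triangles.
	float2 Corners[4] = { float2(-1,-1), float2(1,-1), float2(-1,1), float2(1,1) };
	for (int i = 0; i < 4; ++i)
	{
		GSOut V;
		// CameraRight/CameraUp/HalfSize/ViewProj would come from a constant buffer.
		float3 P = Center + (CameraRight * Corners[i].x + CameraUp * Corners[i].y) * HalfSize;
		V.Position = mul(float4(P, 1), ViewProj);
		V.UV = Corners[i] * 0.5 + 0.5;
		Out.Append(V);
	}
	Out.RestartStrip();
}
```

The non-standard part in my case is the extra per-billboard work before emitting the corners, plus the custom lighting in the pixel shader.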
I tried to find some docs about the workflow but found nothing. I am able to get UE to compile a shader and to create a shader class (with IMPLEMENT_SHADER_TYPE & co.), but I don’t know how to link everything together:
I have some shader classes (vertex & pixel shaders for a start; I’ll be adding a geometry shader once the rest is working).
It renders nothing — I don’t see the mesh at all. The pixel shader simply outputs yellow, and the vertex shader just transforms positions from world to screen space:
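Roughly this (my reconstruction of what I described, not my exact code — the parameter names and semantics are placeholders):

```hlsl
float4x4 WorldToClip; // would be bound from C++ as a shader parameter

void MainVS(in float4 InPosition : ATTRIBUTE0, out float4 OutPosition : SV_POSITION)
{
	// Plain world-to-clip transform, nothing else.
	OutPosition = mul(InPosition, WorldToClip);
}

void MainPS(out float4 OutColor : SV_Target0)
{
	OutColor = float4(1, 1, 0, 1); // solid yellow
}
```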
I’ve tested this in a render command enqueued from the TickComponent of the component containing the geometry, and in the DrawDynamicElements of its associated PrimitiveSceneProxy (but not both at the same time). No difference — nothing is rendered.
At this point I’m stuck. Any help from a dev?