I haven’t tried it on an AMD GPU, but I’m not surprised. You could theoretically run VXGI faster than 4–5 fps on an AMD GPU, but VXGI is an NVIDIA library, so I’m sure it’s designed to leverage GPU features where NVIDIA has a big advantage over AMD. It’s unlikely to be an easy thing to fix, even if you have access to the VXGI GameWorks code. However, if you use the “profilegpu” console command, it should give you an idea of what is so slow. My guess is the voxelization.
This happens in the non-VXGI build too. It’s some floating point problem in the cloud material that I was too lazy to fix.
Hi. So impressive. I just started with Unreal 4 and I’m interested in making random voxel world generation. I found this page and tried to open the project in Unreal 4.10 preview 2, but it won’t open. What should I do? Thanks.
I’m procedurally generating some meshes using techniques from your brick game and from the native procedural mesh component in Unreal, and I’m not getting normal maps to show up either.
Do you mind saying what you did to fix your issue, so I can see whether it applies to my problem? Thanks in advance. Your brick game has been so helpful!
Not completely dead, I just haven’t had time to work on it lately. I did just merge a pull request somebody made to add support for translucent materials!
This is the commit that fixed the bug in BrickGame: /commit/aafcaa1d9607cb589aa7dafe788afc4ffe97ab18
So, I checked it out. It doesn’t seem to be my problem, I think… I’m generating normals, it seems (if I wind my triangles counter-clockwise, the normals flip, so the raw normal generation isn’t the problem, right?). I’m applying an instanced material, and the diffuse/roughness/metallic qualities all seem to be applying, just not the normal map. If you have any random thoughts on why a normal map wouldn’t apply to a procedurally generated mesh, I’d be all ears. Otherwise, thanks anyway.
You also need to make sure the tangent vector is correct. The normal defines which direction tangent-space Z maps to, but not how to interpret the X and Y components.
Also check that the W component of the normal is correct (that’s what my change fixes). Unreal doesn’t store a full 3D tangent basis for every vertex; it only stores the X and Z vectors and a sign. The Y vector is derived from a cross product of the X and Z vectors, multiplied by the sign stored in the 4th component of the “normal”. The bug in BrickGame was that FPackedNormal(FVector4(…)) was setting the 4th component to 0, so the derived Y vector was being multiplied by zero.
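For illustration, a minimal sketch of setting that sign when filling a vertex, using FDynamicMeshVertex rather than BrickGame’s actual vertex type (the variable names are placeholders):

    // Sketch only: illustrative vertex setup, not BrickGame's actual code.
    FDynamicMeshVertex Vertex;
    Vertex.Position = Position;
    Vertex.TextureCoordinate = UV;
    Vertex.TangentX = FPackedNormal(TangentX); // tangent-space X axis
    // The 4th component of TangentZ is the bitangent sign; leaving it at 0 zeroes the
    // derived Y axis, which is exactly the bug described above.
    Vertex.TangentZ = FPackedNormal(FVector4(Normal, 1.0f));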
You’re a lifesaver. I was already computing the tangent vector of my triangles, but I had no idea that the W component of the FPackedNormal factored in this way. I arbitrarily set W to 1 to see if this was the issue, and as soon as I did, I started seeing my normals.
Now, one last question. In your situation you were using cubes that were always aligned the same way, so your math worked out in a particular way. What is the correct way to derive the W value for an arbitrary triangle in the wild?
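One common approach (a rough sketch; the helper name here is illustrative): derive the tangent and bitangent from the triangle’s UVs, then check the handedness of the resulting basis against the geometric normal:

    // Hedged sketch: compute the bitangent sign for an arbitrary triangle.
    // The exact convention (which cross product, which sign) depends on the engine's
    // tangent-space handedness, so verify against a known-good mesh.
    float ComputeBasisSign(const FVector& Normal, const FVector& Tangent, const FVector& Bitangent)
    {
        // A right-handed basis gives +1, a mirrored (left-handed) basis gives -1.
        return (FVector::DotProduct(FVector::CrossProduct(Normal, Tangent), Bitangent) < 0.0f) ? -1.0f : 1.0f;
    }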
FYI, I updated BrickGame (and its fork of Unreal) to 4.11. Performance seems better, but I think that’s at least partly because the LPV quality is lower. I’m still thinking about a suitable alternative, since LPVs are very expensive for what they give (I’m only using them for emissive bricks).
How did you get such beautiful, straight shadows? I have the same setup as yours, but I use UProceduralMeshComponent, and my shadows flicker a bit and are very dark, with almost no light bounces.
How did I forget to look for this thread! I will keep an eye on it from now on.
The project is alive and well, it’s just that most people were not engaging. Right now the project is getting tons of Issues with new ideas and whatnot.
(I wish GitHub provided a tab called Questions or Suggestions instead of Issues. It would sound a lot more positive, haha.)
I’m the translucency guy, BTW. I laughed a little when I saw Scheidecker say it would be a simple feature to add. Not that he’s wrong, because he’s not: to an experienced programmer, adding a feature like that should seem simple. It’s just that I remember saying how hard it had been, so I guess I was showing what a beginner I was. But hey, that’s exactly the appeal of Scheidecker’s project: a Minecraft in UE4. What’s not to like? For a beginner like me, it really was a fantastic piece of code to look at and learn from.
I am obsessed with fluid simulation, so rest assured that will be a feature at some point.
This is really impressive work you have done here, Andrew. I have a background in C++, but I just started with UE4 about two months ago. I’m working on an open-world survival game with destructible, procedurally generated terrain.
The problem my game has is that I have dozens of separate mesh components, each about 32 meters wide. If I make them any bigger, the update time becomes too high.
It looks like BrickGame only has a single component for the entire visible terrain, which is awesome. From what I can tell from the source code, you’re using some kind of direct access to modify the mesh instead of rebuilding the entire thing every time someone changes a block. This is done through the “Rendering Hardware Interface”, I think?
I’m hoping you might give me some advice about where I can learn how to do this. I’d really like to have 500 meters of visibility like your game does. Anyway, here’s what I have so far, if you want to take a look.
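For what it’s worth, a rough sketch of the kind of in-place update described above, i.e. locking and rewriting just a sub-range of an existing vertex buffer through the RHI (assumed usage, not necessarily how BrickGame does it); this has to run on the render thread:

    // Hedged sketch: update only the vertices that changed instead of rebuilding the buffer.
    void UpdateVertexRange(FVertexBufferRHIRef VertexBufferRHI, uint32 Offset, const void* NewData, uint32 Size)
    {
        // Lock just the modified byte range, copy the new vertex data in, and unlock.
        void* Dest = RHILockVertexBuffer(VertexBufferRHI, Offset, Size, RLM_WriteOnly);
        FMemory::Memcpy(Dest, NewData, Size);
        RHIUnlockVertexBuffer(VertexBufferRHI);
    }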