How to place single GPU particles at specified locations?

Hello everyone,
I would like to place single GPU particles (from the same emitter) at specified locations.
I've built a data table containing coordinates for single points (from a CSV file) where I would like to place particles, and I'm able to extract the split x, y and z values from this table.
I'm even able to create all the particles needed in a single emitter burst, and to set them to stay still and never die (disappear), but I can't figure out the right way to place individual particles at the desired positions.
I've made several tests with sprites (and everything works fine!) but I get very poor performance even with 2,000-3,000 sprites.
GPU particles are very optimized and I can get 60 fps even with 70,000 particles.

My final project is to render a point cloud using GPU particles.

Can anyone help me?

Thank you.

(Sorry for my English)


The easiest way to go about this is probably to convert your CSV file into a bitmap image. Then you can use the bitmap image as a lookup table in the particle material and use it to offset particle position. Dunno what tools or programming experience you have, but I used Mathematica to convert a CSV to a bitmap. If you don't want to get that, here are some suggestions for how to do it with Python. Make sure all your x,y,z values are scaled to between 0 and 1.
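For example, here is a rough Python sketch of that conversion (just a sketch: it assumes numpy and Pillow, a hypothetical points.csv with 16384 x,y,z rows, and that the pixel ordering matches however you index the texture in the material):

import numpy as np
from PIL import Image

# Load the x,y,z rows from the CSV (placeholder file name).
pts = np.loadtxt("points.csv", delimiter=",")        # shape (16384, 3)

# Scale each axis independently into the 0-1 range.
pts = pts - pts.min(axis=0)
pts = pts / pts.max(axis=0)

# Pack one point per pixel into a 128x128 RGB image and save it.
img = np.round(pts.reshape(128, 128, 3) * 255).astype(np.uint8)
Image.fromarray(img, "RGB").save("buncloud.bmp")
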
buncloud.bmp (48.1 KB)
Here’s a 128x128 image made from 16384 vertices from the Stanford bunny.

Import your bitmap and open the asset. Under Level Of Detail, set the Texture Group to ColorLookupTable; under Compression, set Compression Settings to VectorDisplacementmap(RGBA8).

In your particle system, under Initial Location, set Min X to 0 and Max X to the number of points. Set Distribute over NPoints to the number of points as well, and set Distribute Threshold to 1. This way, you can use each particle's x position as an index into your lookup table. Here's what the material setup looks like.
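In words, the graph turns the particle's x position into a UV, samples the lookup texture there, scales the 0-1 color up to world units, and feeds the difference from the particle's own position into World Position Offset. Here is that math as a rough Python stand-in (not the actual nodes; the UV unwrap and half-texel offset are just one reasonable way to index a 128x128 texture, sample stands in for the Texture Sample node, and scale/base are whatever Multiply and Constant3Vector values you choose):

def world_position_offset(particle_x, particle_pos, sample, scale=1000.0, base=(0.0, 0.0, 0.0)):
    index = int(particle_x)           # the particle's x doubles as the point index
    u = (index % 128 + 0.5) / 128.0   # column of the pixel holding this point
    v = (index // 128 + 0.5) / 128.0  # row of the pixel holding this point
    r, g, b = sample(u, v)            # 0-1 coordinates read from the lookup texture
    target = (r * scale + base[0], g * scale + base[1], b * scale + base[2])
    # World Position Offset is relative to the particle, so subtract its own position.
    return tuple(t - p for t, p in zip(target, particle_pos))
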

Here’s the point cloud it creates.


Let me know if you have any questions.

WOW! This suggestion opens a whole new path for the project!
Thank you so much! I’m testing right now and I will post an image of the result as soon as possible!

So I found a flaw in my method. Distributing N particles over N points does not guarantee that all points are occupied. While the particles will be found at integer points, many are occupied multiple times and many are not occupied at all. You can up the spawn to increase the probability that a spot will be occupied, but it takes many times the number of points to be confident they will all be filled.

If it is OK that the cloud does not draw instantly, you can guarantee that all spots are filled by using velocity. Remove the Initial Location module and set Initial Velocity X to the number of points divided by the emitter duration. Set the spawn rate to the same value as Initial Velocity X, and set the lifetime to the emitter duration. (For example, with 16384 points and a 1-second emitter, that's Initial Velocity X = 16384, spawn rate = 16384, lifetime = 1.) It will take the full duration of your emitter to completely draw the point cloud. I've found I can decrease the emitter duration down to 0.25 seconds, but below that, with a large number of points, the large velocities cause position inaccuracies due to floating point errors.

I just found a method with guaranteed placement that can be drawn within two ticks. Set spawn rate, lifetime and initial velocity to 0. Add a spawn per unit module. Set it up like this.


Then make a blueprint based on your particle system. Wait one tick, then move your particle system 1 unit per point like this:

Here’s the resulting cloud:

SPUbuncloud.jpg

Nice, very interesting method right there! I was doing some messing around before, spawning star systems from data on the Milky Way, and this would be quite an interesting way to test that out too.

Trying to get it working

@: I attempted to implement the updated method from your most recent post. But I’m not getting anything displayed.

I imagine (I hope!) there must be something obvious that I missed. I wonder if you or anyone would like to take a look at my project and advise?

Here is the work in progress. To try it out you can either clone it with Git or just download the zip file.

Thanks!

@Michael_Geary Looking at your project, you have BunnyParticles in your level, but you need BunnyPrints in the level instead. BunnyParticles only emits when it's moved, which is what BunnyPrints does. Second, see the warning in your particle system: you need a fixed relative bounding box when using GPU particles, and it will need to be a big one or you'll only be able to see the particles from limited angles. In the Cascade editor toolbar, to the right of the Bounds button there is a downward-facing triangle; click it and select Set Fixed Bounds. Then in the main particle system Details tab, scroll down to Bounds and set them all to something like Min: -100000, Max: 100000. Also, BunnyPrints should be at 0,0,0 in your level, because the material uses the x location of each particle. In the material, use the Constant3Vector added right before World Position Offset to control the location of the cloud. You can change it to a collection or dynamic parameter if you want to move it around with blueprints.

@ - thanks for the helpful tips! That definitely got me farther along.

I took BunnyParticles out of the level and put BunnyPrints there instead, and I added the fixed bounds as you mentioned.

I was a little uncertain about the 0,0,0 location for BunnyPrints. I had a Make Vector in the blueprint following your screenshot, with X=16384 and Y=Z=0. I changed the X to 0 as well.

I don’t see a World Position Offset in the blueprint. The Make Vector feeds into the SetWorldLocation. Is there a World Position Offset here somewhere too?

I don't care about moving the point cloud around at this point (pun intended) - for starters I would just like to get it to display a bunny at all. :)

In any case I do see something new after these changes. It’s not a bunny but it’s something at least. If I go down below the floor, I see this:

Thanks for helping with my dumb questions and errors. My goal is to get a working demo of your technique that people can download and try out without having to make the same mistakes as me. :) The new version of the project is on GitHub at the same URL.

@Michael_Geary When you place BunnyPrints in your level, look at its transform in the Details tab and set that to 0,0,0. Then, after the first tick, you want the blueprint to move it to 16384,0,0, so change that back. It will then place 16384 particles, one at each integer value of x.

The world position offset is in materials, not blueprints. You need to make a material based on the one in my first post in this thread and set your particle material to that.

@ - D’oh! So I forgot to create the material. How silly.

I set up a BunnyMaterial and made it the particle material, along with the other changes you mentioned. The material screenshot doesn’t show all the settings for the individual nodes, of course, but most of them were fairly obvious. A few I had trouble with:

Your Texture Sample has two input pins, UVs and Level. Mine ended up with three: UVs, Tex, and Level. This would seem to tell me I have one of the Texture Sample settings wrong, but after trying all of them I can’t get rid of the Tex pin. The settings I have for Texture Sample are:

MipValueMode: MipLevel
Sampler Source: From texture asset
Const Coordinate: 0 (grayed)
Const Mip Value: -1
Texture: thumbnail shows the BunnyCloud texture
Sampler Type: Linear Color
Is Default Meshpaint Texture: unchecked

Also, there is a Multiply(,-1) node near the right that takes its A input from the Particle Position. It overlaps the Add node to its right, so it’s hard to tell where its output goes. Since the Multiply(,1000) node below them goes to the Add node’s B input, I assumed that the Multiply(,-1) goes to that Add node’s A input.

Finally, the BunnyMaterial itself has a number of settings. I set them rather unscientifically: I noticed that your DefaultParticle2 had four of the input pins enabled and the others grayed out, so I tweaked settings until mine looked like that. (Cargo cult programming at its finest!) The settings I have are:

Phys material: None
Material Domain: Surface
Blend Mode: Masked
Decal Blend Mode: Translucent (grayed out)
Shading Model: Unlit
Two Sided: unchecked
Use Material Attributes: unchecked
Subsurface Profile: None (grayed out)

The remaining settings after the Material group are all on their defaults.

With these changes, I still have that colorful bar artifact instead of a bunny. The main thing that's changed is that the bar is now horizontal instead of sticking down at an angle.

Of course another option would be if you have a working project and would like to share it. Then you wouldn’t be hearing my dumb questions. :slight_smile: But no problem either way, I will continue to investigate and try to see what I missed, and of course welcome any further tips. The GitHub repo is updated with this latest code.

Thanks!

@: Thank you in BIG LETTERS for your kind and patient assistance with this. Your last couple of tips in a PM got it working!

As you mentioned, the two remaining problems were simply:

• That Multiply(,-1) node I was asking about? I had a silly typo and it was doing Multiply(,1). It's funny, if this were code in any programming language, I have to think I would have noticed an expression like (foo*1) and wondered why I was multiplying by 1! :) But in the "blueprint-style" material editor I didn't notice it.

• I had the BunnyMaterial hooked into the BunnyParticles the wrong way. Instead of being attached to the particle system as a whole, it needs to be connected to the emitter’s Required tab.

With those two changes, it works!

I don't know how to thank you enough, both for creating this very clever technique and for the assistance here and in PMs.

I just realized this technique has a limitation that makes it unsuitable for my use case. Because the X/Y/Z particle coordinates come from a texture sample lookup, they are limited to an integer range of 0-255 on each axis, so you only have 256 discrete positions available in each direction. (With the material's 1000-unit scale, that's a step of roughly 4 units between adjacent possible positions.) This results in some pretty severe quantization/aliasing errors.

You can see this effect if you load and run the project and press the up arrow to move “inside” the bunny, then look around. You’ll find a lot of places where the particles line up in neat horizontal/vertical/diagonal lines, like this:

I experimented using a .png file with 16-bit depth instead of 8-bit as in the original, but when I import it into UE4 and try to set the compression settings, it still only offers me “VectorDisplacementmap(RGBA8)”. I was hoping to see something like an RGBA16 option there. (I must confess that I didn’t actually test this further to see if I was getting 16-bit precision.)

Even 16 bit resolution wouldn’t be perfect. That would give 65,536 discrete positions in each axis. Maybe good enough, but I bet there would still be some aliasing. What I really want to do is take an array of Float3’s (or Vector3’s or whatever Unreal likes to call them) and use that as the lookup table instead of the TextureSample that our material uses now.

One roundabout way to accomplish this would be to generate not just one lookup texture, but two or even three or four, where one texture has the low-order 8 bits for each axis, the next texture has the next-higher-order 8 bits, and if we want more than 16 bits of precision we keep going from there. Then instead of just the single texture lookup that the material uses now, we also do lookups for each additional 8 bits of precision, doing some bit shifting after the lookups to combine each 8-bit slice into the final coordinate.

As I think about it, I’m pretty sure that could be made to work. I’m just wondering if there might be a much simpler approach using a single linear array of Float3 coordinates. If I were writing this from scratch I know that’s how I would want to do it, but I’m trying to find a way that works in Unreal’s material system.

Any ideas are most welcome! In the meantime I may experiment with this multiple-texture idea.
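For reference, the encode side of that idea would look roughly like this in Python (just a sketch; it assumes numpy and Pillow, placeholder file names, and that both textures get imported uncompressed like the original lookup table):

import numpy as np
from PIL import Image

pts = np.loadtxt("points.csv", delimiter=",")
pts = (pts - pts.min(axis=0)) / (pts.max(axis=0) - pts.min(axis=0))

# Quantize each axis to 16 bits, then slice into low and high bytes.
q = np.round(pts * 65535).astype(np.uint16)
low = (q & 0xFF).astype(np.uint8)   # low-order 8 bits per axis
high = (q >> 8).astype(np.uint8)    # high-order 8 bits per axis

Image.fromarray(low.reshape(128, 128, 3), "RGB").save("cloud_low.png")
Image.fromarray(high.reshape(128, 128, 3), "RGB").save("cloud_high.png")

# In the material, sample both textures at the same UV and recombine the
# 0-1 reads as: value = (high * 255 * 256 + low * 255) / 65535
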

@Michael_Geary You can use up to 4096 float3s in an array within a custom code node. Set output as CMOT float3, input as x.

const float3 pointcloud[4096]={{1234.5678,1234.5678,1234.5678},…{1234.5678,1234.5678,1234.5678}};
return pointcloud[x];

Just make sure you have plenty of newlines in the data, or it will fail with a "source line too long" error.
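If it helps, here is a quick Python sketch for generating that text from a CSV (placeholder file names), writing one entry per line so no line gets too long:

rows = []
with open("points.csv") as f:
    for line in f:
        if line.strip():
            x, y, z = line.strip().split(",")[:3]
            rows.append("{%s,%s,%s}" % (x, y, z))

# Write the array declaration plus the return, one point per line.
with open("customnode.txt", "w") as out:
    out.write("const float3 pointcloud[%d]={\n" % len(rows))
    out.write(",\n".join(rows))
    out.write("\n};\nreturn pointcloud[x];")
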

I have been looking into a way of automating the image creation from data using CanvasRenderTarget2Ds. Shouldn’t be too much of a stretch to extend it to multiple render targets to increase precision. I’ll post something soon.

@ - Neat, that custom node data could come in handy for something. 4096 isn't enough for my project, so I will probably stick with textures for the point data itself.

For the increased precision, instead of the multiple textures I’m experimenting with an EXR texture, which has 32-bit floats for the RGBA data. As a quick and dirty test, I converted the bunny cloud bitmap to an EXR file, imported it and set the texture’s Compression Settings to HDR instead of VectorDisplacementmap, and it worked. Of course there wasn’t any visible difference yet, because the point coordinates had already been quantized to 0-255, but it looks promising so far.

Interesting.
So let me get this straight for the other philistines of the forum:

  1. You take your point cloud (pcl)
  2. You sample it to fit the pixel count of a square texture (in this case 128x128, so 16384 points as you pointed out)
  3. You convert each x,y,z position to a 0-255 scale
  4. You convert your "xyz" CSV into a texture in which each row corresponds to a pixel, with its RGB values
  5. You "extract" those RGB values inside UE4 so that it reads them as coordinates within a 256x256x256 cube
  6. You can "beef up" this cube's density with a greater image bit depth, which allows you to store the RGB values with extended precision

That seems quite a complicated system for importing a point cloud, and it comes with a lot of limitations.
How many points can you expect to load before it critically impacts performance?

Maybe pursuing the integration of the PCL library inside UE would be a more productive way to toy around with pcl inside this engine (https://forums.unrealengine.com/showthread.php?90408-Point-Cloud-Library-in-UE4)?
To be honest I wanted to do it, but my C++ skills are not that good and I was not able to move forward on this project.

Also, I was impressed by the gpu particle system you can see in https://www.youtube.com/watch?v=nfhEULONgs8.

@as3ef2th1 - Yes, you have understood and explained this technique very well!

I suspect that even @ would admit that it is a bit of a kludge, but as kludges go I have to say it is a fairly magnificent one. :)

How it scales up to a large number of points remains to be seen. For my use case I may need something on the order of a million points.

I mentioned experimenting with an EXR texture to increase the precision. I looked through the engine code, and unfortunately when you use an EXR texture it uses 16-bit floats internally, even if the source EXR had 32-bit floats. (I’d initially misinterpreted the information the Editor displayed about the texture - wishful thinking on my part!) I’m not sure if a 16-bit float is enough precision to render a good-looking point cloud - my guess is it isn’t, but I could be mistaken.

Regarding my idea of using multiple textures to increase precision, now I think I would probably use a single texture instead, either with adjacent pixels containing the bit slices for each point, or by having the first block be the low-order 8 bits, a second block with the next-higher-order 8 bits, etc. This would end up being a non-square texture, but that’s OK on desktop platforms like I’m targeting. Mobile platforms would be another story - I read that at least some iOS devices only accept square textures, but that doesn’t matter for my project which will only run on a very high-end desktop.

Integrating PCL into Unreal is a very interesting idea - thanks for mentioning it. I will have to look into that a bit. I was planning on using it for some preprocessing of the incoming point cloud data (I need to load in point cloud data from a LIDAR scanner in real-time) - but hadn’t thought of using it for the rendering.

Well, I am also working with LIDAR data, both terrestrial and aerial.
I have also been hoping at some point to be able to load point cloud data into the engine so that my fellow researchers can look around the dataset with the Oculus/Vive. This way, people could have a better understanding of the data's scale and quality, and hopefully be more effective with classification.

Not only would the PCL library allow us to grab LIDAR data more easily inside the engine, it also comes with specific perks related to point cloud visualization, such as real-time octree decimation (not the right term, but you get the idea). I have a Potree example here: http://vritage.nazg.org/Besancon/Besancon.html You will never have more than 1,000,000 points at a time, depending on your "location" inside or outside the cloud, which comes in pretty handy to spare processing power.

It seems that it’s possible to have this kind of behavior directly inside the engine as MrBushido pointed out on reddit at some point : https://www.reddit.com/r/unrealengine/comments/3bhak3/help_trying_to_implement_lidar_point_cloud/csm9rl2

Right now I'm still writing a MATLAB script to transform my LIDAR data into a 16-bit PNG!

Ok, I finally got it to work with some of my .las files.
For some reason (metadata about scale or something like that) I have to convert the .las into a .ply and then into a .csv before processing it with MATLAB. If I go straight from .las to .csv it does not work.

The MATLAB script is pretty straightforward, anyway:

filename = 'yourfile.dat'; % I'm used to renaming file.csv to file.dat before using MATLAB, but it's still CSV data
A = csvread(filename);

A = A - min(A(:));  % shift so the minimum value is 0
A = A / max(A(:));  % scale everything into the 0-1 range
A = round(A, 6);    % not really useful but...

A = reshape(A, 128, 128, 3); % reshape the data into a 128x128 image with 3 [0-1] color channels
image(A);                    % preview
imwrite(uint16(A*65535), 'D:\PROJETS\UnrealCloud\sample.png', 'png'); % convert to uint16 so imwrite writes an actual 16-bit .png


In kludging together a way to position GPU particles as OP requested, I forgot to ask, why use particles at all? They have lots of unnecessary overhead if the particles are gonna sit still most of the time. It’s possible to treat a static mesh composed of a stack of disjointed polygons the same way. I haven’t fully tested the new system, but with 1 million particles, I was getting a dozen or so misplaced, and noticeable flickering. For some reason, moving the system further than 48576 units was causing my particles to disappear, so I had to reduce the spacing between them. I believe using a static mesh solves that problem.
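One way to build such a mesh offline would be to write an OBJ with one tiny triangle per point, each sitting at the integer x index the lookup uses, then apply a material adapted from the lookup-table one (reading the vertex's world x instead of Particle Position) to push each triangle to its point's position. A rough Python sketch (point count and triangle size are placeholders):

# Write an OBJ with one small triangle per point index.
num_points = 16384
with open("pointmesh.obj", "w") as f:
    for i in range(num_points):
        # Three vertices clustered at x = i, the index the lookup material reads.
        f.write("v %d 0 0\n" % i)
        f.write("v %d 0.01 0\n" % i)
        f.write("v %d 0 0.01\n" % i)
    for i in range(num_points):
        # OBJ vertex indices are 1-based; one face per point.
        a = 3 * i + 1
        f.write("f %d %d %d\n" % (a, a + 1, a + 2))
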

Tonight, I tested constructing images in Mathematica, and loading them into UE4 with DownloadImageFromURL to update a texture. I’ll hook it up to the static point cloud tomorrow and put up an example project.