Hi There,
Is there any UE4 class that can render a point cloud? For example, one whose input is an array of point locations?
Thanks.
-Hui
I helped develop a method to render point clouds. It involves using a texture as a lookup table to place geometry. Initially particles were used, but I switched to a static mesh to reduce overhead. Read this thread to see how it’s done. Here’s a link to the project found on that thread. If you have an NVIDIA card, I found the artifacts can be eliminated by using the masked blend mode and a sphere gradient 2D; as long as the edges of the quads aren’t rendered, the artifacts go away.
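If it helps to see the lookup-table idea expressed in code rather than in the material editor, here is a rough sketch of my own (not code from the linked project; the function and variable names are made up) that bakes an array of point locations into a float texture, one texel per point, which the cloud material can then sample to place each quad:

```cpp
// Hedged sketch only, not code from the linked project. Bakes an array of
// point locations into a float texture so a material can sample it (e.g. in
// World Position Offset) to place each quad. "BakePointLookupTexture" and
// "PointLocations" are made-up names.
#include "Engine/Texture2D.h"

UTexture2D* BakePointLookupTexture(const TArray<FVector>& PointLocations)
{
    if (PointLocations.Num() == 0)
    {
        return nullptr;
    }

    // One texel per point; a square texture keeps the UV lookup math simple.
    const int32 Size = FMath::CeilToInt(FMath::Sqrt((float)PointLocations.Num()));

    UTexture2D* Texture = UTexture2D::CreateTransient(Size, Size, PF_A32B32G32R32F);
    Texture->SRGB = false;
    Texture->Filter = TF_Nearest; // don't interpolate between neighboring points

    float* Data = static_cast<float*>(
        Texture->PlatformData->Mips[0].BulkData.Lock(LOCK_READ_WRITE));

    for (int32 i = 0; i < Size * Size; ++i)
    {
        const FVector P = PointLocations.IsValidIndex(i) ? PointLocations[i] : FVector::ZeroVector;
        Data[i * 4 + 0] = P.X; // store XYZ in the texel; the material reads
        Data[i * 4 + 1] = P.Y; // the same channels back to offset its vertices
        Data[i * 4 + 2] = P.Z;
        Data[i * 4 + 3] = 1.0f;
    }

    Texture->PlatformData->Mips[0].BulkData.Unlock();
    Texture->UpdateResource();
    return Texture;
}
```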
If you want to do it all in the engine, rather than import an image, you can draw an array to a custom CanvasRenderTarget2D blueprint and use it as a texture parameter in a dynamic material instance. If you need any help with that, let me know. The modified technique in this thread is more in that spirit. Here’s the project from that thread.
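Roughly, a C++ version of that in-engine setup might look like the sketch below. Again this is only an illustration under my own naming assumptions, not the project’s code:

```cpp
// Hedged sketch only. "AMyPointCloudActor", "Points", "PointCloudMaterial",
// "CloudMesh", "LookupRT" and the "PointLookup" parameter are placeholder
// names declared in the (omitted) header; DrawPoints must be a UFUNCTION()
// there so the dynamic delegate can bind to it.
#include "Engine/CanvasRenderTarget2D.h"
#include "Engine/Canvas.h"
#include "CanvasItem.h"
#include "TextureResource.h"
#include "Materials/MaterialInstanceDynamic.h"

void AMyPointCloudActor::BuildPointLookup()
{
    LookupRT = UCanvasRenderTarget2D::CreateCanvasRenderTarget2D(
        this, UCanvasRenderTarget2D::StaticClass(), 256, 256);

    // A float format keeps position precision (available on newer UE4 versions);
    // with the default 8-bit target you'd want to normalize positions to 0-1
    // and rescale them in the material.
    LookupRT->RenderTargetFormat = RTF_RGBA32f;

    LookupRT->OnCanvasRenderTargetUpdate.AddDynamic(this, &AMyPointCloudActor::DrawPoints);
    LookupRT->UpdateResource(); // triggers DrawPoints below

    UMaterialInstanceDynamic* MID = UMaterialInstanceDynamic::Create(PointCloudMaterial, this);
    MID->SetTextureParameterValue(TEXT("PointLookup"), LookupRT);
    CloudMesh->SetMaterial(0, MID);
}

void AMyPointCloudActor::DrawPoints(UCanvas* Canvas, int32 Width, int32 Height)
{
    for (int32 i = 0; i < Points.Num(); ++i)
    {
        // One texel per point, with the location packed into the color channels.
        FCanvasTileItem Texel(FVector2D(i % Width, i / Width), GWhiteTexture,
            FVector2D(1.0f, 1.0f),
            FLinearColor(Points[i].X, Points[i].Y, Points[i].Z, 1.0f));
        Texel.BlendMode = SE_BLEND_Opaque;
        Canvas->DrawItem(Texel);
    }
}
```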
Hi there,
I just replied to you in the 100,000 points thread, but you may have missed my message, so I hope you see this one. I have a question about the lidar effect project and about point clouds as well. The first question is how to get colors other than plain white points for custom depth objects. For custom depth I saw the custom stencil feature, which can assign any color to custom depth objects. But in your project the custom depth post process is applied through the capture component’s blendables, not to the whole map, so I don’t see a way to use the custom stencil from the capture component’s post process blendables. Can you please explain this part to me?
If I’d had an elegant solution I would have let you know. Basically the only way to do it is:
Duplicate the capture component.
Apply a post process material to capture component 2 that outputs custom stencil value.
Have capture component 2 use a separate render target as a texture target.
Use the same UVs that sample the first render target texture in LidarCloudMat to sample the second one.
Anything else involves a loss of precision. Using final color (like when a post process is applied) as a capture source only allows for 3 channels; if a fourth were an option, the stencil value could be put in that channel.
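A sketch of what those steps might look like in C++ follows; the same wiring can be done in Blueprint, and the member names here are placeholders of mine, not names from the project:

```cpp
// Hedged sketch of the duplicated-capture wiring. "ALidarPawn", "DepthCapture",
// "StencilRT" and "StencilPPMaterial" are placeholder members, not names from
// the project.
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"

void ALidarPawn::SetupStencilCapture()
{
    USceneCaptureComponent2D* StencilCapture = NewObject<USceneCaptureComponent2D>(this);
    StencilCapture->SetupAttachment(DepthCapture); // same view as capture component 1
    StencilCapture->RegisterComponent();

    // Separate render target so the stencil capture doesn't overwrite the depth one.
    StencilCapture->TextureTarget = StencilRT;

    // Final color as the capture source so the blendable runs; as noted above,
    // that limits you to three channels.
    StencilCapture->CaptureSource = ESceneCaptureSource::SCS_FinalColorLDR;

    // Post process material whose output is just the CustomStencil scene texture,
    // so the render target ends up holding the stencil values.
    StencilCapture->PostProcessSettings.AddBlendable(StencilPPMaterial, 1.0f);
}
```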
Thank you for your answer!
After a short break, I now have some time to implement your idea.
“Use the same UVs that sample the first render target texture in LidarCloudMat to sample the second one.”
I don’t quite understand this part. Do you mean CustomDepthViewerMat? I don’t see any render targets in LidarCloudMat. Or do you mean that PCDynRT is the texture target? I’m a little confused here.
Hi,
I’m trying to render a point cloud using a depth stream from a Kinect camera. I’m pretty new to Unreal and Blueprints and came across what you were able to do here. Do you think it is possible to use a similar technique with a depth stream? I have it coming in right now as a 2D texture parameter.
Thanks