The plugin performs right-handed to left-handed conversion automatically on import (which is why the Y values appear negative). This keeps it consistent with other content brought into UE (from, say, 3DS, which also performs this conversion automatically).
In other words, if the content matches in the right-handed system, it will also match once brought to UE.
@ Yes, but the number of actors changes constantly. I have one point cloud actor of 100x100 m, and as you move through the environment, many actors get loaded (and unloaded) every 100 m.
@ That would be good. Are the original coords currently lost forever, or once that feature is implemented, could the original coords be recovered even from clouds that were edited/aligned today?
The original coords are not lost at the moment. They are retained (in a way not exposed to Blueprints), as we are using double precision.
You can shift the asset back to its original location by:
- calling RestoreOriginalCoordinates on the asset (available in BP)
- opening the asset editor (double-click the asset), then toggling the Center option from the toolbar twice (the first toggle will center the pivot, the second will reset the pivot to its original coordinate)
If you reset multiple assets like this, they should still align correctly in UE; they just won't be around the 0,0,0 location, but rather at the location from which they were exported.
Hi. I'm still struggling with the right/left-hand conversion. My lidar data comes from a right-handed system, so importing it into Unreal means all Y values become negative (which is correct when converting from right- to left-handed).
The thing is, I made a blueprint which casts a trace on the point cloud, executes some calculations, and returns data with X/Y/Z values, all saved in a JSON file. Of course all the Y values are negative because of the conversion, but how do I convert the Y values back to the right-handed system?
What light types are now officially supported? I'm noticing an issue with spot/point lights that cuts out any rays pointing upward, away from the source. Using 4.27.1.
I have a question about line tracing the point cloud. I want to perform a line trace on a point cloud at specific points and get the Z value of the hit location. It worked, kind of, but the hit location isn't correct. The screenshot below shows the gap between the hit location (end of the red line) and the points. Is this a matter of accuracy of the point cloud? Or did it hit something invisible?
I have a problem with light in my scene. For some reason, the point cloud is black.
The OBJ model of the same building, in the same place, is partially lit, so I think it is not a matter of "lack of light".
Hi @frost-xyz
All lights are generally supported, but since we don't have orientation info for the points, we assume they are flat and facing up - hence only geometry lit from above reacts correctly. If you calculate normals and switch the point orientation, you should get a fully correct response to light.
Hi @Lionhouse_NL
This would be a matter of collider accuracy. If you want very accurate results without having to build heavy colliders, you can use the lidar-dedicated trace functions (available in BP and C++).
Hi @ziutek27
We don't currently support ray-traced lighting with point clouds.
Which BP do I use for lidar trace? Is it LineTraceForLidarPointCloud?
What do you mean by "without having to build heavy colliders"? No need to generate collision?
edit: There is still a big difference between the hit Z value and the actual Z value of the point (it should be around 423, but the hit is at 262). I've used the LineTraceForLidarPointCloud BP node.