The plugin does right-hand to left-hand conversion automatically on import (hence why the Y appears negative). This is to make it consistent with other content brought to UE (from, say, 3DS Max, which also auto-performs such a conversion).
In other words, if the content matches in the right-handed system, it will also match once brought to UE.
What is it you are trying to do?
@phoboz GetAllActors would work, but it would need to be called every tick, which would probably be too expensive.
If the number of actors doesn’t change, you can run GetAllActors on start, cache the results into a local array, and iterate over that instead.
@phoboz Yes, but the number of actors changes constantly. I have one point cloud actor covering 100x100 m, and as you move through the environment, many actors get loaded (and unloaded) every 100 m.
Then you either have to query them whenever the change occurs, or, if you want a global parameter, you’d need to adjust the plugin code.
@phoboz OK. And that custom version - you said it would take a while. When can I expect it?
Hi! Do you have any plans to add the possibility to use the plugin on Android?
I’ll try to get something done next week
Not at this time
@phoboz Is it intentional that when you import a point cloud into UE, align it, and then export it, it ends up with different coordinates?
For example original Y is 965340 and after export from UE it is 504.
That will break any further edits outside of UE.
Yes, it is.
It will export the asset the way you have it set - if you center it or align it, it will use those coords instead of the original ones.
I’ll note a feature request to be able to select how to export the data.
@phoboz That would be good. And are the original coords currently lost forever, or once that feature is implemented, could they be recovered even from clouds that were edited/aligned today?
The original coords are not lost at the moment. They are retained in a non-BP-exposed way, since we use double precision internally.
You can shift the asset to its original location by:
- calling RestoreOriginalCoordinates on the asset (available in BP)
- opening the asset editor (double-click the asset), then toggling the Center option from the toolbar twice (the first will center the pivot, the second will reset the pivot to its original coordinate)
If you reset multiple assets like this, they should still align correctly in UE; they just won’t be around the 0,0,0 location, but rather at the locations at which they were exported.
Hi. I’m still struggling with the right/left-hand conversion. My LiDAR data comes from a right-handed system, so importing it into Unreal means all Y-values become negative (which is correct when converting from right- to left-handed).
The thing is, I made a blueprint which casts a trace on the point cloud, executes some calculations, and returns data with X/Y/Z-values, all saved in a JSON file. Of course, all Y-values are negative because of the conversion, but how do I convert the Y-values back to a right-handed system?
Simply negating the Y value again should work, I imagine.
What light types are officially supported now? I’m noticing an issue with spot/point lights that cuts out any rays pointing upwards, away from the source. Using 4.27.1.
I have a question about line tracing the point cloud. I want to perform a line trace on a point cloud at specific points and get the Z-value of the hit location. It worked, kind of, but the hit location isn’t correct. The screenshot below shows the gap between the hit location (end of the red line) and points. Is this a matter of accuracy of the point cloud? Or did it hit something invisible?
I have a problem with light in my scene. For some reason, the point cloud is black.
The OBJ model of the same building, in the same place, is partially lit, so I don’t think it’s a matter of a “lack of light”.
Another point cloud in the same scene:
I thought that dragging the light source would help but I see no difference.
I would be grateful for help or any advice on where to look for a solution.
I am using UE 5 if that matters.
All lights are generally supported, but since we don’t have orientation info for the points, we treat them as if they were flat and facing up - hence only geometry lit from above reacts correctly. If you calculate normals and switch the point orientation, you should get a fully correct response to light.
This would be a matter of collider accuracy. If you want very accurate results without having to build heavy colliders, you can use the lidar-dedicated trace functions (available in BP and C++).
We don’t currently support ray-traced lighting with point clouds.
Which BP do I use for lidar trace? Is it LineTraceForLidarPointCloud?
What do you mean by ‘without having to build heavy colliders’? Is there no need to generate collision?
Edit: There is still a big difference between the hit Z-value and the actual Z-value of the point (it should be around 423, but the hit is at 262). I’ve used the LineTraceForLidarPointCloud BP.