So, all my images seem to be placed at the right location in space - except one set… This set seems to have the images oriented the right way, and they are correct with respect to the other images in the same series - however, this row of images is placed as if it had been taken many meters underground, whereas the photos were taken 1.6 meters above ground. The result is that the whole model seems to implode in the area where these images are… I took this series of facade imagery to get more detail on the building facade, so I don't want to just take it out… I will try to realign the images - also, this series is already tied well to the other image series through lots of control points, so I would expect them to be correct…
Additional info… even though all sets agree on where the control points are inside the images, in the 3D world all the control points linked to this image set are still displayed completely wrong on all axes… Not sure how this can happen, but I will try to delete all those control points, redo them, and add even more points from the drone imagery to hopefully help RealityCapture nail the actual location of these images. Should I turn off 'prior pose / absolute pose' for the iPhone photos taken from the ground, and just let the drone images be the base? That seems to give me a correctly rotated model - the iPhone photos often seem to mess up the rotation of the whole model.
Hm, I also get a warning that the points I place are too far off (imprecise) - but I believe this is measured against the currently loaded model, which is already wrong… And when the program finds 5-10 new automatic points to place, they are all very wrong… it even shows points on images that can't even see that side of the building… I hope all I need to do is place more points and remake the model - if anyone has had the same issues, please respond. I imagine this is a quite common issue, but I'm not sure how the program places images in the coordinate system - will these placements change when I remake the model, or do I have to somehow move the images manually? The latter would be impossible to get right.
I found that when I first added my control points, I ended up deleting a lot of them, because the program stated they were too far off (50+ px). I deleted all of those, which in turn left the model unable to correct the location of the images… So I added the control points again, even though they were flagged as too far off (they were too far off in the current model because the current model was wrong, of course). When I then realigned, the model was fixed, and the camera locations are now in the correct place.
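For what it's worth, the "too far off (50+ px)" figure is just a reprojection error measured against whatever camera poses are currently loaded, so it is only as trustworthy as the current alignment. A minimal sketch of how such an error is computed, using a plain pinhole model and made-up illustration numbers (not RealityCapture internals):

```python
import numpy as np

# Reprojection error of a control point against one camera, pinhole model.
# K (intrinsics), R/t (pose) and the 3D point are made-up illustration values;
# in a real project they come from the alignment itself, which is why the
# "too far off" warning is misleading while the alignment is still wrong.
K = np.array([[3000.0, 0.0, 2000.0],
              [0.0, 3000.0, 1500.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                      # camera rotation (world -> camera)
t = np.array([0.0, 0.0, 10.0])     # camera translation
X = np.array([1.0, -0.5, 0.0])     # triangulated 3D control point (world)
measured_px = np.array([2310.0, 1345.0])   # pixel where the point was clicked

x_cam = R @ X + t                  # point in camera coordinates
proj = K @ x_cam
projected_px = proj[:2] / proj[2]  # projected pixel position

error_px = np.linalg.norm(projected_px - measured_px)
print(f"reprojection error: {error_px:.1f} px")
```

If the poses (R, t) are wrong, the projected position moves and the error blows up even when the clicked pixel is perfectly correct, which matches what happened here.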
I also tend to first align with all image locations active; then, if the model does not end up straight (e.g. rotated/tilted), I remove the location priors on my phone images and set them to unknown, and let only the drone images define the placement. It tends to get corrected when I realign.
So this week we scanned a location, outdoors and indoors.
Currently I'm doing some tests with the different datasets to check whether everything is fine.
Unfortunately the drone datasets have the same issue as above.
So all images were taken at a height of around 50 meters with an Inspire 2.
When I align the images (JPGs with GPS data) with default settings, some images end up something like 80 meters high. This is not correct, because the images were taken at around 50 meters.
The same thing happens if I align the images with the location data turned off.
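Before blaming the alignment itself, it may be worth checking what altitude the JPGs actually record: DJI drones typically write an absolute (sea-level-referenced) altitude into the standard EXIF GPS tags rather than height above the takeoff point, which can easily account for a few tens of meters of offset. A minimal sketch to print the stored GPS altitude, assuming Pillow is available (the folder name is a placeholder):

```python
from PIL import Image
from PIL.ExifTags import GPSTAGS

def gps_altitude(path):
    # Read the GPS IFD from the JPG EXIF and return the stored altitude in metres.
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853, {})          # 34853 = GPSInfo tag
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}
    alt = gps.get("GPSAltitude")
    if alt is None:
        return None
    ref = gps.get("GPSAltitudeRef", 0)     # 0 = above reference, 1 = below
    if isinstance(ref, bytes):
        ref = ref[0] if ref else 0
    value = float(alt)
    return -value if ref == 1 else value

if __name__ == "__main__":
    import glob
    for jpg in sorted(glob.glob("drone_flight/*.JPG")):   # placeholder folder
        print(jpg, gps_altitude(jpg))
```

If those values hover around 80 m rather than 50 m, the alignment is just reproducing what the EXIF says, not misplacing the cameras.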
I think this is just a difficult dataset because I don't use any GCPs etc., but I don't understand it, because this has never happened before.
I took the images with the DroneDeploy app, with 70% overlap, nadir and oblique, and the combined overlap is very dense.
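For reference, a quick back-of-the-envelope sketch of what 70% overlap at 50 m works out to on the ground. The camera figures below are assumptions for an Inspire 2 with an X5S and a 15 mm lens, so substitute the actual setup:

```python
# Rough ground footprint, GSD and trigger spacing for the flight described above.
SENSOR_W_MM = 17.3      # assumed sensor width (X5S, Micro 4/3)
SENSOR_H_MM = 13.0      # assumed sensor height
FOCAL_MM    = 15.0      # assumed lens focal length
IMG_W_PX    = 5280      # assumed image width in pixels
ALTITUDE_M  = 50.0      # flight height from the post
OVERLAP     = 0.70      # 70% overlap from the post

footprint_w = ALTITUDE_M * SENSOR_W_MM / FOCAL_MM   # ground width covered, m
footprint_h = ALTITUDE_M * SENSOR_H_MM / FOCAL_MM   # ground height covered, m
gsd_cm      = footprint_w / IMG_W_PX * 100          # ground sampling distance
spacing_m   = footprint_h * (1 - OVERLAP)           # spacing along flight line

print(f"footprint ~{footprint_w:.0f} x {footprint_h:.0f} m, "
      f"GSD ~{gsd_cm:.1f} cm/px, trigger spacing ~{spacing_m:.0f} m")
```

With these assumed numbers the footprint is roughly 58 x 43 m with photos about 13 m apart, so the coverage should indeed be dense enough that overlap is unlikely to be the problem.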
I'll be home in a couple of hours and will post some screenshots. Maybe someone can explain what is happening - or maybe I'm just doing something wrong!