Any ideas on how to fix this?

So, this is the fourth or fifth time I have tried to process this model. The source images look good and all were aligned properly. I’ve also seen the missing cubes/holes many times before, and usually the suggested ‘fix’ of changing the reconstruction region (as per the related post in this forum) gets rid of the issue. However, even though I have processed this same model several times now, tweaking the region each time, I continue to get missing chunks of data, though they have been in different places with each reconstruction attempt. I have only tried to process the model at Normal detail, as it would take too long on High, and I do not need that level of detail.

The workstation has 128GB RAM, super-fast SSDs with tons of free space, 32 cores (dual Xeons), and a GTX 1070, so I don’t think the machine itself is the issue. This is the first time the problem has persisted even after repeated tweaking and attempts.

I am hitting the same problem on a model I’m processing. It consistently fails on random chunks when doing High Detail reconstructions. Reducing the size of my reconstruction region works fine, but I would very much like to process the entire model without sections failing.

Since a reduced reconstruction region builds successfully, I assumed it was a RAM issue. My PC only had 16GB, and watching Task Manager I could see that memory was maxed out while RC was reconstructing geometry. So I doubled the RAM to 32GB, but haven’t tested it yet. Now, seeing that you’re getting the same issues with 128GB, I’m doubting that more RAM will fix the problem.
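In case it helps anyone else keep an eye on this over a long reconstruction instead of babysitting Task Manager, here is a minimal monitoring sketch (not an official RealityCapture tool). It assumes the process is named RealityCapture.exe and relies on the third-party psutil Python package; adjust both if your setup differs.

import time
import psutil  # third-party: pip install psutil

PROCESS_NAME = "RealityCapture.exe"  # assumption: default RC process name on Windows
INTERVAL_SECONDS = 10

def find_rc_process():
    # Return the RealityCapture process if it is running, otherwise None.
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == PROCESS_NAME:
            return proc
    return None

def main():
    peak_gb = 0.0
    while True:
        proc = find_rc_process()
        system = psutil.virtual_memory()
        if proc is not None:
            rc_gb = proc.memory_info().rss / 1024 ** 3
            peak_gb = max(peak_gb, rc_gb)
            print(f"RC using {rc_gb:.1f} GB (peak {peak_gb:.1f} GB), "
                  f"system memory {system.percent:.0f}% used")
        else:
            print(f"RC not running, system memory {system.percent:.0f}% used")
        time.sleep(INTERVAL_SECONDS)

if __name__ == "__main__":
    main()

If the peak sits right at your physical RAM when chunks start failing, that points at memory; if there is plenty of headroom, the cause is probably something else.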

It doesn’t help with troubleshooting when RC gives such a vague error. I can see that sections failed; what I need to know is why. Curious to hear what the devs have to say about it.

Nice model, by the way! You’ve got some real sharp detail there.

Quick update: On the 7th attempt, after doing nothing more than continuing to tweak the reconstruction region, the model came out perfectly. I wish I hadn’t had to waste so many 8+ hour reconstruction attempts, though. 

Update from me: after upgrading my RAM to 32GB, I was able to process without issue, so that confirms my problem was RAM-related. However, with 128GB, your problem is more of a mystery. I’m curious how many cameras you have in your scene.

I wonder if there’s a correlation between RAM requirements and the number of cameras. My scene has about 600 4K images, and 16GB of RAM was evidently below the threshold needed to avoid failures.

Beats me. My model was created from 287 RAW files of ~27MB each. That said, I don’t think the problem for either of us is RAM. I’ve processed far more photos than this before without hitting the same issue, and I understand people often process large models with 8GB or 16GB of RAM without any problems as well. I’m not sure why the developers have never given a really good answer on this (or maybe they have and I just haven’t seen it, though I have scoured the web). All I have seen anyone recommend is to adjust the reconstruction region and try again, but it would be good to know why that fixes the issue, so we could tailor our workflows accordingly and not have to re-process models multiple times to get a good result.
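As a rough back-of-envelope comparison (a sketch with guessed numbers: I’m assuming my ~27MB RAWs come from roughly a 24 MP sensor, and treating a 4K frame as about 8.3 MP), our two scenes are in the same ballpark in terms of total pixels:

# Back-of-envelope comparison of the two datasets in this thread.
# Per-image megapixel figures are assumptions, not measured values:
# ~24 MP is a guess for a camera producing ~27MB RAW files, and a
# 4K frame (3840 x 2160) is about 8.3 MP.
scenes = {
    "287 RAW files (~27MB each)": (287, 24.0),
    "600 4K images": (600, 8.3),
}

for label, (image_count, megapixels) in scenes.items():
    total_gigapixels = image_count * megapixels / 1000
    print(f"{label}: ~{total_gigapixels:.1f} gigapixels in total")

# Prints roughly 6.9 vs 5.0 gigapixels, the same order of magnitude,
# which is part of why I doubt input size alone explains a failure at
# 16GB on one machine and holes at 128GB on another.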

I had this same issue and never knew what it was.

According to support, this was because I was mixing very close and distant views of the same parts…

What I found is that when it starts doing this, you’re better off starting a new project from scratch.