After exporting an (unwrapped) FBX from RC, editing it in a 3rd-party app, and importing it back into RC, I unwrapped and ran Texture, and got these patches of color (from the gutter?) around the model:
The model seems to have the correct coordinate system and orientation, and most of the color is registered correctly, so what explains it? Note that I did not retopologize; I did smooth surfaces and decimate, and also deleted stray polys and surfaces. I don’t see how the placement of the patches would relate to any of that. Any thoughts?
I hadn’t exported textures, hadn’t even unwrapped. The model from the external editor also wasn’t unwrapped; I simply modified the geometry. I did do some cleaning, so possibly RC has a problem unwrapping properly because of that. I tried a second mesh and it was much less of an issue, but there were still some patches. About quads: I had retopologized a while back in ZBrush and preserved the original UVs, and that model reprojected textures fine. Maybe I’ll try that again. Hmmm…
I found the problem. Viewing the location of the patches, I noticed they occurred in larger, planar sections, which gave me the thought that these might be large triangles generated after smoothing > decimation. I made selections of those areas back in Recap Photo, subdivided, exported, imported, ran Unwrap, and viewed the UVs: problem gone. So that’s at least one culprit, too few subdivisions. I wonder if that Large triangle removal setting pertains.
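For anyone wanting to pre-check a decimated export for this, here is a minimal sketch, not an RC feature, that flags suspiciously large triangles so you know which regions to subdivide before unwrapping. It assumes the mesh is exported as OBJ and that the trimesh library is available; the file name and the 25× threshold are just placeholders to tune per scan.

```python
# Sketch: flag oversized triangles in a decimated scan (assumed workflow, not an RC tool).
import numpy as np
import trimesh

mesh = trimesh.load("decimated_scan.obj", force="mesh")  # hypothetical file name

areas = mesh.area_faces                  # per-triangle area
median_area = np.median(areas)
oversized = areas > 25.0 * median_area   # threshold is a guess; tune per scan

print(f"{oversized.sum()} of {len(areas)} triangles are suspiciously large")

# Centroids of the flagged faces, handy for locating the problem regions in a viewer
centroids = mesh.triangles_center[oversized]
np.savetxt("large_triangle_centroids.txt", centroids)
```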
That figures, thanks for clarifying; a better idea than adding subdivisions to planar shapes. Now onto a tougher issue: errors during tessellation, where straight edges are rendered squiggly. You’d think there were an algorithm or ten to spot and adjust points that lie along straight, curved, or otherwise continuous lines, which are so prevalent in man-made environments. This is perhaps the biggest barrier between viable high-end photogrammetry feeding a pipeline with assets as-is and the status quo, where assets only serve environment artists as a template against which to remodel most everything, which currently applies to both LiDAR- and photogrammetry-derived models.
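To make the request concrete, here is a rough illustration of the kind of straight-edge cleanup being asked for, not an existing RC feature: fit a 3D line to an ordered run of edge vertices and snap them onto it when they are already nearly collinear. The function name, tolerance, and sample points are all made up for the example.

```python
# Sketch: snap nearly-collinear edge vertices onto a best-fit line (illustrative only).
import numpy as np

def straighten_if_nearly_collinear(points, tol=0.01):
    """points: (N, 3) array of ordered vertices from a 'squiggly' edge.
    Returns the points projected onto their best-fit line if the maximum
    deviation is below tol (mesh units), otherwise returns them unchanged."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Principal direction via SVD = least-squares 3D line fit
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    # Project each point onto the fitted line
    t = centered @ direction
    projected = centroid + np.outer(t, direction)
    deviation = np.linalg.norm(points - projected, axis=1).max()
    return projected if deviation < tol else points

# Example: a slightly wavy edge that should be straight
edge = np.array([[0, 0, 0], [1, 0.004, 0], [2, -0.003, 0.002], [3, 0.002, 0]])
print(straighten_if_nearly_collinear(edge, tol=0.01))
```

A production version would of course need to decide which vertex runs belong to intended straight edges in the first place, which is where the classification problem mentioned below comes in.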
I agree, getting these hard-surface models to work without a lot of manual cleanup or re-modelling would be ideal. I would guess a smarter simplify that did that would be nice, though it would probably require a pretty decent classification system as well for that to work.
Simplify (decimation) by itself, I believe, won’t alleviate the issue, as the damage is done between the point cloud and the polygonal mesh, i.e. during tessellation. I’m going to move this discussion over to Feature Requests. If you happen onto that, do pipe up with your thoughts; it may help produce some volume so we're better heard.