Hi there,
I am working on the reconstruction of places with huge lighting changes: not only does the color temperature change (natural, cloudy, tungsten, fluo…), but also the quantity of light, with a range of sometimes more than x10,000 (from a very dim room lit with electric candles, where I need a 30 s exposure on a tripod, to a bright, naturally lit corridor with large windows, shot handheld at 1/500 s). Even a color-accurate workflow (ColorChecker, RAW…) does not solve it, because we do want to capture the lighting as part of the model; and even trying to get the exact object color fails when there is some overlap, because several lighting types can appear in the same picture and then you cannot find one processing setting for that picture.
Could someone from CR explain how the texturing algorithm works? What I feel from working with RC is that RC tries to normalize all the pictures (edit: I should rather say: it uses the pictures as they are, without taking the exposure info into account) and works from there, blending (on multiple frequencies, which works really well when the lighting is almost homogeneous) and texturing onto the model.
And of course, when the light changes suddenly and to a great extent from one close area to the next, this gives color shifts and the texturing is then not so homogeneous.
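Just to illustrate what I mean by blending on multiple frequencies, here is a rough, generic Burt-Adelson-style sketch in Python (certainly not RC's actual code; the function and its parameters are made up). It hides seams by blending each frequency band with a differently softened mask, but it cannot make up for a 10+ stop brightness gap between the inputs:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multiband_blend(img_a, img_b, mask, levels=4):
    """Blend two aligned single-channel images through a soft mask, one
    frequency band at a time (Burt-Adelson style): low frequencies get wide,
    soft transitions, high frequencies get sharp ones, which hides seams
    but cannot compensate a real brightness difference between the inputs."""
    low_a = img_a.astype(np.float64)
    low_b = img_b.astype(np.float64)
    m = mask.astype(np.float64)          # 1.0 where img_a should win, 0.0 where img_b should
    out = np.zeros_like(low_a)
    for level in range(levels):
        sigma = 2.0 ** level
        next_a = gaussian_filter(low_a, sigma)
        next_b = gaussian_filter(low_b, sigma)
        m = gaussian_filter(m, sigma)    # mask gets softer for lower frequencies
        out += m * (low_a - next_a) + (1.0 - m) * (low_b - next_b)
        low_a, low_b = next_a, next_b
    out += m * low_a + (1.0 - m) * low_b  # remaining low-pass residual
    return out
```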
Could the EXIF information be used (maybe it already is?) to improve the process? We know the exposure from the EXIF data (as well as the white balance and a lot of other settings), so would it be possible to improve the texturing from there: for example, let the user choose to texture for a given exposure, either for the shadows (and overexpose the bright parts) or for the highlights (and lose the darkest parts)? Or maybe texture in high dynamic range?
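To make the suggestion concrete, here is a minimal Python sketch of what I mean (the function names and the reference EV are just mine, and it assumes linear pixel values, e.g. developed from the RAWs): compute an ISO-adjusted exposure value from the EXIF shutter/aperture/ISO and rescale every picture to one common reference exposure before blending. The normalized data then spans something like 14 stops, so it really needs a float (HDR) texture:

```python
import numpy as np

def exposure_value(f_number, shutter_s, iso):
    """ISO-adjusted exposure value: EV100 = log2(N^2 / t) - log2(ISO / 100)."""
    return np.log2(f_number ** 2 / shutter_s) - np.log2(iso / 100.0)

def normalize_to_reference(img_linear, f_number, shutter_s, iso, ev_ref):
    """Scale linear pixel values so every photo refers to the same exposure.

    img_linear : float array in linear light (developed RAW, not a gamma-encoded JPEG).
    ev_ref     : the exposure value chosen as the common reference.
    """
    ev_img = exposure_value(f_number, shutter_s, iso)
    # A photo taken with less exposure (higher EV) gets boosted, and vice versa.
    return img_linear * 2.0 ** (ev_img - ev_ref)

# Example: normalize everything to the bright corridor shot (f/4, 1/500 s, ISO 100);
# the 30 s candle-lit shot then gets scaled down by roughly 2^14 (about 15,000x).
ev_ref = exposure_value(4.0, 1 / 500, 100)
dim = normalize_to_reference(np.full((4, 4), 0.5), 4.0, 30.0, 100, ev_ref)
print(dim.mean())   # ~3e-5
```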
I had a look at the exported TIFFs, and they are not really 16 bit, right?
Thanks for any hints, thoughts, …
Jonathan.
Maybe my post was a bit confusing…
Any workflow hints for scenes with a huge dynamic range in the lighting?
Letting the camera set the exposure on such scenes guarantees we get usable pictures, which is good, but using that information (from the EXIFs) later during texturing could be great too. Or having real high-dynamic-range texturing would be great as well.
Thanks !
Hi Jonathan,
I think the problem is inherent in digital color representation - it can only do so many steps between black and white. It cannot mimic the ability of the eye to smoothly adjust to the amount of light. So if you have one scene where you have glaring sunlight outside and candlelight inside, I think it's just not possible to squeeze all of that into one texture map with any decent quality.
That is why we are supposed to have HDR: not the tone-compressed kind that we see everywhere, but the real high-bit-depth one. We could have the data and we could use it. Then it is a matter of rendering and visualization. I am doing VR (I am a Unity3D developer), so we can definitely build shaders with eye adaptation (or tone compression, depending on what we want), but we need to have the data.
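Roughly, the eye-adaptation part would look like this (a minimal Python sketch of the math, not an actual Unity shader; the constants and the smoothing factor are just placeholders): measure the average luminance of what the user currently sees, drift the exposure towards it over time, then compress with a simple Reinhard curve for display:

```python
import numpy as np

def eye_adapted_tonemap(hdr_rgb, prev_avg=None, adaptation_speed=0.05):
    """Very rough sketch of eye adaptation on an HDR (linear float) view.

    hdr_rgb  : float array of scene-referred radiance (what the HDR texture would hold).
    prev_avg : adapted luminance from the previous frame, so the exposure
               drifts gradually instead of jumping (the eye-adaptation part).
    """
    lum = 0.2126 * hdr_rgb[..., 0] + 0.7152 * hdr_rgb[..., 1] + 0.0722 * hdr_rgb[..., 2]
    avg = np.exp(np.mean(np.log(lum + 1e-6)))            # log-average luminance of the view
    if prev_avg is not None:
        avg = prev_avg + (avg - prev_avg) * adaptation_speed
    exposed = hdr_rgb * (0.18 / avg)                     # expose for middle grey
    ldr = exposed / (1.0 + exposed)                      # simple Reinhard compression
    return ldr ** (1.0 / 2.2), avg                       # display gamma, plus state for next frame
```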
Ah, I see. I haven't really looked into HDR, but from looking at the images I gather that the difference is not limitless. And to be honest they look unnatural to me, which of course is just a matter of perception.
Yes, because you are talking about those tone-compressed pictures we see everywhere, which look so unnatural, I agree!
The real HDR data cannot be visualized like that.
Ah, interesting. Thanks for clearing that up!
So with the right software it would be possible to adjust the rendering according to the circumstances? As in pull it up when in a dark room and tune it down in an exterior, all with the same texture map?
Yes, that is the idea - with eye adaptation and everything.
I wonder if a 16-bit TIFF would retain enough fidelity (gradations between steps in color and luminosity) to take exposures changing only the shutter speed, then adjust the exposure in Lightroom or whatever RAW interpreter and export a 16-bit TIFF? I'll try to make a sample set to test, and then use it to explain this further.
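Something like this quick Python sketch is the arithmetic I want to verify against real files (it assumes integer 16-bit TIFFs and a roughly linear encoding; the function name is just mine): count how many distinct code values survive when a well-exposed frame is pulled down by N stops in the RAW converter before export. A 4-stop shift holds up fine; a 13-14 stop shift like Jonathan's candle-to-corridor case is the part I want to check on actual exports:

```python
import numpy as np

def levels_after_shift(ev_shift, bits=16, tones=4096):
    """How many distinct integer code values survive when a well-exposed frame
    is shifted by `ev_shift` stops in post and then exported as an integer TIFF.
    `tones` is the number of distinct tones in the source frame."""
    frame = np.linspace(0.0, 1.0, tones)                   # well-exposed, full-range frame
    shifted = frame * 2.0 ** ev_shift                      # exposure slider to match the brighter shots
    codes = np.round(np.clip(shifted, 0.0, 1.0) * (2 ** bits - 1))
    return len(np.unique(codes))

print(levels_after_shift(-4))    # a 4-stop pull keeps most of the 4096 tones
print(levels_after_shift(-14))   # a 14-stop pull leaves only a handful of code values
```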
Thanks Steven for pointing that out !!! That is really good news.