Algorithm behind the Exposure Bias parameter for HDR textures

Hello guys,

I’m trying to find a way to get better lighting results for my scene using a Scene Capture Cube.
I followed this thread, where a user (a moderator) wrote that emissive material effects not being rendered may be a bug (the message saying that may have been modified or deleted?).
[Screenshot of the forum reply about the emissive material bug]

As suggested in this thread, I tried using a tonemapping algorithm, which gives good results, but not the ones I want. (I’m using the same algorithm as Photoshop, but in OpenCV 3.0.) I’m able to get these results:
[Screenshots of the tonemapped captures]
As you can see, the lights aren’t bad, but the amount of light coming from the window in the second picture pulls down the overall brightness of the image. When I build a 360 video from these frames, the lighting change is very abrupt.
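
For reference, here is a minimal sketch of the kind of OpenCV 3.x call I mean. I’m using the built-in Reinhard operator and a placeholder filename here as a stand-in, not my exact Photoshop-style settings:

```python
import cv2
import numpy as np

# Load the 32-bit float HDR capture ("capture.hdr" is just a placeholder name).
hdr = cv2.imread("capture.hdr", cv2.IMREAD_ANYDEPTH | cv2.IMREAD_ANYCOLOR)

# Reinhard is one of the global operators shipped with OpenCV 3.x.
# It compresses the whole dynamic range at once, which is why a very
# bright window pulls the rest of the frame down.
# Arguments: (gamma, intensity, light_adapt, color_adapt)
tonemap = cv2.createTonemapReinhard(2.2, 0.0, 1.0, 0.0)
ldr = tonemap.process(hdr)  # float result, roughly in [0, 1]

cv2.imwrite("tonemapped.png", np.clip(ldr * 255, 0, 255).astype(np.uint8))
```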

My goal is to reproduce the lighting effect of the “Exposure Bias” parameter in the Unreal Engine 4 texture preview.
[Screenshot of the Exposure Bias setting in the UE4 texture preview]

I know the HDR image format stores the full dynamic range of the scene, from which several exposures can be derived, and that this is what gets used to output a tonemapped LDR image. My goal is to choose the exposure from the .HDR picture written by UE4, just like the Exposure Bias parameter does (that parameter is not taken into account when the texture itself is rendered).
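
As far as I understand, a Radiance .hdr file stores a single floating-point radiance map, and the different “exposures” are obtained by scaling it before display, so a bias in stops should be roughly equivalent to multiplying the pixels by 2^bias before gamma correction. A small sketch of that idea (the filename and bias values are placeholders, and this only mimics what the preview slider appears to do, not UE4’s actual code):

```python
import cv2
import numpy as np

hdr = cv2.imread("capture.hdr", cv2.IMREAD_ANYDEPTH | cv2.IMREAD_ANYCOLOR)

def apply_exposure_bias(radiance, bias_stops, gamma=2.2):
    """Scale linear radiance by 2^bias (one stop per unit), then gamma-correct and quantize."""
    exposed = radiance * (2.0 ** bias_stops)
    exposed = np.clip(exposed, 0.0, 1.0) ** (1.0 / gamma)
    return (exposed * 255).astype(np.uint8)

cv2.imwrite("bias_plus1.png", apply_exposure_bias(hdr, 1.0))    # one stop brighter
cv2.imwrite("bias_minus1.png", apply_exposure_bias(hdr, -1.0))  # one stop darker
```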

How can I do that? Do you have any idea how to select the best exposure from an HDR picture?

Thanks a lot in advance,

Working with OpenCV, I was able to get a better result.
See my StackOverflow thread for more information and a code sample.
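
For anyone searching later, one simple way to pick the bias automatically (not necessarily what the linked answer does) is to map the scene’s log-average luminance to mid-grey, classic auto-exposure style; a rough sketch with placeholder values:

```python
import cv2
import numpy as np

hdr = cv2.imread("capture.hdr", cv2.IMREAD_ANYDEPTH | cv2.IMREAD_ANYCOLOR)

# Rec. 709 luminance from linear values (OpenCV loads channels as B, G, R).
lum = 0.0722 * hdr[..., 0] + 0.7152 * hdr[..., 1] + 0.2126 * hdr[..., 2]
log_avg = np.exp(np.mean(np.log(lum + 1e-6)))  # geometric mean ("log-average") luminance

# Bias (in stops) that maps the log-average luminance onto mid-grey (0.18).
bias_stops = np.log2(0.18 / log_avg)

ldr = np.clip(hdr * (2.0 ** bias_stops), 0.0, 1.0) ** (1.0 / 2.2)
cv2.imwrite("auto_exposed.png", (ldr * 255).astype(np.uint8))
```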