I think I managed to make it work (using Unreal with ACEScg as the rendering space, and viewing it with the correct OCIO config right in the viewport).
I think the reason the image on the right looks oversaturated to you, Sinekraft, is that you're telling OCIO an sRGB-gamut picture is an AP1 one (which it isn't), so you're performing an ACES-to-sRGB conversion on an already-sRGB render, resulting in a super-saturated image. I believe Unreal's final image is in a linearized sRGB space by default. When you export it from Render Queue as an EXR, for example, Unreal thankfully leaves the sRGB display curve off, guessing an EXR is an intermediate working file; for every 8-bit format it (thankfully) bakes that curve in, guessing that's a final image.
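Here's a quick numpy sketch of why that mislabeling oversaturates. The AP1-to-sRGB matrix values below are the commonly published ones, rounded (assuming a Bradford chromatic adaptation), so treat them as approximate:

```python
import numpy as np

# Approximate ACEScg (AP1, D60) -> linear sRGB (D65) matrix,
# as commonly published (rounded; Bradford-adapted).
AP1_TO_SRGB = np.array([
    [ 1.70505, -0.62179, -0.08326],
    [-0.13026,  1.14080, -0.01055],
    [-0.02400, -0.12897,  1.15297],
])

# A pure red that is ALREADY linear sRGB...
srgb_red = np.array([1.0, 0.0, 0.0])

# ...but mislabeled as AP1, so an AP1 -> sRGB conversion gets applied on top:
double_converted = AP1_TO_SRGB @ srgb_red
print(double_converted)  # red pushed past 1.0, green/blue driven negative
```

The red channel overshoots 1.0 while green and blue go negative, i.e. the color is pushed outside the sRGB gamut - which is exactly what over-saturation looks like after clamping.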
Btw, I think **ACES** (at least for a CG artist) isn't really a "magic switch" that makes your renders instantly look prettier - unless you have a scenario with extremely saturated material and lighting colors like the guys had on The Lego Movie (although they did not render in ACEScg; according to this video they used LegoP3 primaries (also quite a bit wider than sRGB), and applied an ACES workflow after that, I guess during the composite + grade). ACES has a much nobler goal than just improving CG renders. However, if Unreal calculates lighting & shading correctly in this wider ACES gamut (like AP1), it can give you render results that are more accurate to the real world, because it can work with many colors that are visible to the human eye but simply don't exist in the traditional linear workflow (which is unfortunately limited to the sRGB gamut) used by default by most render engines, including Unreal. Not because render engines are bad - AFAIK render engines are color-space-agnostic, they just calculate whatever you throw at them; it's because of what input artists give them.
I tried to recreate Chris Brejon's simple-but-spectacular example in Unreal, with success I guess; he achieved it in Guerilla Render and shared his knowledge in this really great and helpful read on the topic:
The **Left** image is the **"vanilla" Unreal render** with its own tonemapper (no OCIO or ACES was used here; it's basically the final image! If it's an EXR, an sRGB display transform (not ACES sRGB!) is needed for it to look like a final image. If it's exported as a jpg or png, Unreal automatically bakes the sRGB gamma correction into the image). This is essentially Unreal's 'traditional linear workflow' result.
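For reference, the display encoding that gets baked into those 8-bit exports (and left off of EXRs) is just the standard sRGB transfer function; a minimal sketch:

```python
def srgb_encode(v: float) -> float:
    """Standard linear -> sRGB display encoding (IEC 61966-2-1)."""
    if v <= 0.0031308:
        return 12.92 * v                      # linear toe for very dark values
    return 1.055 * v ** (1 / 2.4) - 0.055     # gamma segment for the rest

print(srgb_encode(0.5))  # ~0.735: linear mid-values are brightened for display
```

This is why a raw linear EXR looks "too dark" when viewed without any display transform.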
The **Middle** one is with the Unreal tonemapper turned off (so I handled it with an **IDT** of 'Utility - Linear - sRGB' in OCIO). The tonemapping is done via ACES. The **ODT** is ALWAYS dependent on your output device; 90%+ of the time it's a regular monitor, so -> Output - sRGB.
With the correct ODT added, there is automatically an S-curve-like tonemapper in the ODT that gives a pretty close result to Unreal's - after tweaking Unreal a bit: removing the blue correction and the fake wide gamut, and pre-gaining your final image by 1.45, per this thread by Thomas Mansencal on ACES Central. You may want to add the blue correction fix back as an LMT to the final render if your bright, saturated blues are turning violet; that's an ACES-specific problem that can be fixed with an LMT written by the ACES guys - Unreal actually uses this exact same code when tonemapping the final image. The middle image hasn't got the richer lighting details because (sadly) lighting & shading is still calculated within the sRGB gamut.
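That Blue Light Artifact Fix LMT is, at its core, just a 3×3 matrix applied to scene-linear ACES values. The numbers below are my recollection of the Academy's `LMT.Academy.BlueLightArtifactFix` CTL, so treat them as approximate:

```python
import numpy as np

# Approximate Blue Light Artifact Fix LMT matrix (applied in scene-linear
# ACES space). Rows sum to ~1, so neutral colors are left untouched.
BLUE_FIX = np.array([
    [0.9404372683, -0.0183068787, 0.0778696104],
    [0.0083786969,  0.8286599939, 0.1629613092],
    [0.0005471261, -0.0008833746, 1.0003362486],
])

saturated_blue = np.array([0.0, 0.0, 1.0])
fixed = BLUE_FIX @ saturated_blue
print(fixed)  # some blue energy redistributed into red/green, killing the violet shift
```

The key property is that it desaturates only extreme blues, nudging them toward cyan instead of letting the RRT push them to violet.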
AFAIK, by default Unreal operates in a linearized sRGB gamut, although (only for tonemapping) the image is transformed into ACES space, gets tonemapped (plus some other juicy magic), then comes back to sRGB when done - resulting in a linear sRGB image again at the end, but with a tonemap that was applied in ACES space. You can check it in this code, around line ~330.
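That roundtrip (linear sRGB → ACES-like space → filmic curve → back to linear sRGB) is often approximated in shader code by Stephen Hill's fitted version of the RRT + sRGB ODT. A hedged numpy sketch of that popular fit - not Unreal's exact code, and the constants are approximate:

```python
import numpy as np

# Stephen Hill's widely reproduced fit of the ACES RRT + sRGB ODT (approximate).
ACES_INPUT = np.array([    # linear sRGB -> the fit's working space
    [0.59719, 0.35458, 0.04823],
    [0.07600, 0.90834, 0.01566],
    [0.02840, 0.13383, 0.83777],
])
ACES_OUTPUT = np.array([   # back to linear sRGB
    [ 1.60475, -0.53108, -0.07367],
    [-0.10208,  1.10813, -0.00605],
    [-0.00327, -0.07276,  1.07602],
])

def rrt_odt_fit(v):
    # Rational-polynomial approximation of the RRT + ODT S-curve
    a = v * (v + 0.0245786) - 0.000090537
    b = v * (0.983729 * v + 0.4329510) + 0.238081
    return a / b

def aces_tonemap(rgb):
    """Tonemap a linear-sRGB color through the fitted ACES curve."""
    v = ACES_INPUT @ np.asarray(rgb, dtype=float)
    v = rrt_odt_fit(v)
    return np.clip(ACES_OUTPUT @ v, 0.0, 1.0)

print(aces_tonemap([10.0, 10.0, 10.0]))  # bright white rolls off just under 1.0
```

Note the S-curve rolls highlights off smoothly instead of clipping them, which is where the "filmic" look comes from.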
The image on the **Right**, however, was rendered in an **ACEScg** working space, previewed in the **viewport OCIO** with an **IDT** of 'ACES - ACEScg' & an **ODT** of 'Output - sRGB'. (If you render it as a linear, untonemapped EXR with no OCIO baked in, that's considered an ACEScg AP1 EXR - and THAT's how you should deliver to the comp / colorist department - and you can preview or work on the render in ANY application that has OCIO / ACES support: RV, Nuke, Fusion, Resolve, After Effects, Blender, Maya, Houdini, Unreal 4.26…). The right image, for instance, **has richer GI details** (the indirect **lighting** happily **works with** 4.26 GPU Lightmass & RTX GI as well, and with photometric camera + light values)!
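If you ever need to bring ordinary linear-sRGB colors (tints, constants) into that ACEScg working space by hand rather than through OCIO, it's a single 3×3 matrix. The values below are the commonly published sRGB-to-AP1 ones, rounded, so treat them as approximate:

```python
import numpy as np

# Approximate linear sRGB (D65) -> ACEScg (AP1, D60) matrix (rounded,
# Bradford-adapted); the inverse of the usual AP1 -> sRGB matrix.
SRGB_TO_AP1 = np.array([
    [0.613097, 0.339523, 0.047368],
    [0.070194, 0.916354, 0.013452],
    [0.020616, 0.109570, 0.869815],
])

# Neutrals stay neutral (each row sums to ~1)...
print(SRGB_TO_AP1 @ np.array([1.0, 1.0, 1.0]))
# ...while an sRGB primary lands comfortably inside the wider AP1 gamut:
print(SRGB_TO_AP1 @ np.array([1.0, 0.0, 0.0]))
```

All three channels of the converted primary stay positive, which is the point: the whole sRGB gamut fits inside AP1, while the reverse is not true.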
If you guys are interested, I'm planning to make a tutorial on how to achieve this. (Please note that I'm not an official ACES teacher, I'm just a regular CG artist excited about the topic; I'm probably doing some stuff wrong, but I'm pleased to show you what I know so far - dear ACES gods, please do correct me if I'm doing anything wrong!)