4.26 OCIO implementation


With the latest update we can now color manage our views through OCIO. If I turn it on, the results are incorrect, so I assume one must pre-convert every texture into ACES, right? Or is there some way to change the color space of a texture within the editor? If not, will such functionality be added in the future? I have found some Blueprint API references, but I don’t think it is available.


Define ‘incorrect’. What OCIO config are you using? What result are you expecting to see?
I’m also exploring the new OCIO plugin that is so far only designed as a viewer manager. I was also interested in ACES workflows.
I commented on this video asking them to cover this topic some more.

From what I understood after some testing, you should probably use Utility - Linear - sRGB as the IDT, because rendering is linear but limited to sRGB, and so are the materials.
The ODT should be Output - sRGB.
This ‘look’ is slightly darker (not sure why) but mostly similar to the built-in Filmic Tonemapper, which is essentially what ACES is, just without the IDT/ODT management.
Once you enable OCIO the Unreal Tonemapper is disabled.
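For intuition, the ‘Utility - Linear - sRGB’ input transform is (as far as I can tell) just a 3x3 gamut matrix with no transfer curve, which is exactly why it fits Unreal’s linear-but-sRGB-primaries rendering. A rough Python sketch, using the commonly published Bradford-adapted sRGB-to-ACES2065-1 matrix (values approximate and from memory, not pulled from the actual config file):

```python
# Sketch: the 'Utility - Linear - sRGB' IDT is just a 3x3 gamut matrix,
# no gamma, mapping linear sRGB into the config's ACES2065-1 reference space.
# Matrix values are the commonly published Bradford-adapted ones (approximate).

SRGB_TO_AP0 = [
    [0.4397010, 0.3829780, 0.1773350],
    [0.0897923, 0.8134230, 0.0967616],
    [0.0175440, 0.1115440, 0.8707040],
]

def linear_srgb_to_aces2065(rgb):
    """Apply the IDT matrix to one linear-sRGB pixel (list of 3 floats)."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_AP0]

# Each row sums to ~1.0, so a neutral grey stays neutral:
print(linear_srgb_to_aces2065([0.18, 0.18, 0.18]))
```

No transfer curve anywhere, which is the whole point: Unreal’s frame buffer is already linear, so only the primaries need remapping.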

One of the issues is that the mini camera viewer on top of the main viewer doesn’t get the OCIO transforms applied. You have to view through the camera in the main view.
The thing I’m still curious about is whether we’d be able to use ACEScg as the working color space for calculations instead of sRGB. Maybe it works if you convert your albedo and single-color materials to ACEScg and use that IDT instead, but for HDRI textures you can only set linear sRGB in the properties.
Maybe we’ll get more integrated support in the future. It’s still 0.1 BETA now.

This is all the documentation there is for now: …nce/index.html
There’s even a tip there referring to a page that doesn’t exist yet. Might be up soon.

Hey, I am using the latest ACES 1.2 config files. What I mean by incorrect is setting the color space to ACEScg without pre-converting all the color assets, since there is no way in UE to set the textures to anything other than linear/sRGB, as you mentioned. I am comparing everything to an offline renderer with the same config files and textures; pre-converting everything and using ACEScg as the IDT gives similar results (taking RT/offline differences into account). I’m yet to test whether using just Utility - Linear - sRGB would look the same. At first glance it does, but the eye is easily tricked, so I’d rather use other ways of comparing images :smiley:

Yes, it is impossible to do your rendering with ACEScg textures as of now. If you convert your diffuse to ACEScg, the texture in the material is still read as standard sRGB, and the rendering will still be linear sRGB, so the colors/gamma will be incorrect. Maybe compare the offline render with UE using non-converted textures and Utility - Linear - sRGB as I mentioned, and see if it yields similar results.
You will also not have the benefit of rendering in a wider gamut, so intense colors won’t map as nicely back to display space.
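For anyone wondering what “pre-converting” a texture actually involves, a sketch: undo the sRGB transfer curve, then apply the linear-sRGB-to-ACEScg gamut matrix. The matrix values below are the commonly published Bradford-adapted ones (approximate); in a real pipeline you would let OCIO or Nuke do this:

```python
# Sketch of pre-converting an 8-bit sRGB texel to ACEScg:
# 1) undo the piecewise sRGB transfer curve, 2) apply the
# linear-sRGB -> ACEScg (AP1) gamut matrix (values approximate).

SRGB_TO_ACESCG = [
    [0.6130973, 0.3395229, 0.0473793],
    [0.0701942, 0.9163556, 0.0134526],
    [0.0206156, 0.1095698, 0.8698151],
]

def srgb_decode(c):
    """Piecewise sRGB EOTF: encoded [0, 1] -> linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb8_to_acescg(texel):
    """Convert one 8-bit sRGB texel (r, g, b in 0..255) to ACEScg floats."""
    lin = [srgb_decode(c / 255.0) for c in texel]
    return [sum(m * c for m, c in zip(row, lin)) for row in SRGB_TO_ACESCG]

print(srgb8_to_acescg((128, 128, 128)))  # mid grey stays roughly neutral
```

This is also why skipping the conversion and tagging the texture as ACEScg looks wrong: the matrix step never happened, but the viewer assumes it did.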

There are, however, some workflows related to the use of HDR in UE. I don’t know whether you could tweak some things with that, or whether HDR rendering is only about mapping sRGB to a wider gamut and making your UE lights work in a higher output range.

Probably give it a little time. HDR is getting pretty common. We might eventually get wide-gamut materials and rendering by default, with the SDR deliverable as the secondary, “downscaled” option.

I have tested it, and pre-converting textures to ACEScg along with using ACEScg in UE yields the same results as no pre-converting and Utility - Linear - sRGB. After all, ACEScg is encoded linearly. In the latest release notes, in the section dedicated to the OCIO implementation, there was a video of Epic using ACEScg, so I think it should be viable. I wish Epic chipped in on this or made the documentation clearer; right now it does not say much.

Getting the exact same results is expected. But that doesn’t mean the colors are rendered correctly. Anything managed through OCIO will always yield identical results, since everything is handled within what ACES 1.2 is doing. Even if the entire color management is set up incorrectly, telling OCIO to convert from “space x” to “space y” in one viewer, and from “space w” to the same “space y” in another, will always look exactly the same if the source material was also converted from “space x” to “space w”, if that makes sense :P
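A tiny numeric illustration of that point: color space conversions compose, so pre-converting to “space w” and then viewing from “space w” cancels out against viewing the original “space x” directly. Pure-Python sketch, with x = linear sRGB and w = ACEScg (the matrix is the commonly published one, values approximate; the round trip is exact regardless, because the inverse is computed from the same matrix):

```python
# Why two consistently managed paths always match: conversions compose,
# so (w -> y) applied after (x -> w) equals (x -> y). Here we show the
# composition cancelling on a round trip x -> w -> x.

SRGB_TO_ACESCG = [
    [0.6130973, 0.3395229, 0.0473793],
    [0.0701942, 0.9163556, 0.0134526],
    [0.0206156, 0.1095698, 0.8698151],
]

def invert3(m):
    """Inverse of a 3x3 matrix via the adjugate (cofactor) formula."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [
        [e * i - f * h, c * h - b * i, b * f - c * e],
        [f * g - d * i, a * i - c * g, c * d - a * f],
        [d * h - e * g, b * g - a * h, a * e - b * d],
    ]
    return [[v / det for v in row] for row in adj]

def apply(m, rgb):
    return [sum(x * c for x, c in zip(row, rgb)) for row in m]

ACESCG_TO_SRGB = invert3(SRGB_TO_ACESCG)

pixel = [0.4, 0.2, 0.7]                        # some linear-sRGB color
converted = apply(SRGB_TO_ACESCG, pixel)       # texture pre-converted to "space w"
round_trip = apply(ACESCG_TO_SRGB, converted)  # viewer converts back out of "space w"

print(round_trip)  # identical to the un-converted path, up to float noise
```

So matching pictures only prove the bookkeeping is consistent, not that the scene was actually rendered in the wider gamut.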

Not scientific (both are placed in a single scene at different locations, so the light might differ slightly), but arguably the same result.

Here you can clearly see the big difference in gamma (or exposure?) compared to the default tonemapper.

I think we have to keep in mind that even though we are able to convert textures to ACEScg, the texture is always loaded as linear sRGB.
It says HDR RGB, not sRGB, but that literally only means it’s interpreted with linear gamma. The color space remains sRGB.


I’m not technical enough to understand whether it’s a problem or not. I’m fairly new to Unreal, and it has a lot of layers that alter the behavior of light and color, because it’s not a renderer but a game engine, with a lot of compensating and simplifying behavior for realtime purposes.

You start running into issues when you add elements that are entirely based on the default pipeline, like the Sun and Sky Blueprint. This element isn’t based on ACEScg and renders oversaturated. The color checker still renders more or less correctly, but the sky doesn’t.

If you stick to lighting with only an HDRI in ACEScg plus physical lights, I think it’s doable.
As you said in your initial post, I also hope that now that the OCIO implementation has started, we’ll see more management features for texture handling and lighting :smiley:

I think I managed to make it work (using Unreal as an ACEScg rendering space, and viewing it with the correct OCIO config right in the viewport).

I think the reason the image on the right looks oversaturated for you, Sinekraft, is that you’re telling OCIO that an sRGB-gamut picture is an AP1 one (which it is not; you’re performing an ACES-to-sRGB conversion on an already-sRGB render, resulting in a super-saturated image). I believe Unreal’s final image by default is in a linearized sRGB space (when you export it from Render Queue as an EXR, for example, Unreal thankfully takes off the sRGB display curve, guessing an EXR is an intermediate working file; for every other 8-bit format it (thankfully) bakes that in, guessing it’s a final image).


Btw, I think **ACES** (at least for a CG artist) isn’t really a “magic switch” that makes your renders instantly look prettier, unless you have a scenario with extremely saturated material and lighting colors like the guys had on The Lego Movie (although they did not render in ACEScg; according to this video they used LegoP3 primaries, which are also quite a bit wider than sRGB, and used an ACES workflow after that, I guess during the composite + grade). ACES has a much nobler goal than just improving CG renders. However, if Unreal calculates lighting & shading correctly in this wider ACES gamut (like AP1), it can give you render results that are more accurate to the real world, because it can work with many colors that are visible to the human eye but don’t even exist in the traditional linear workflow (which is unfortunately limited to the sRGB gamut) used by most render engines, including Unreal, by default. Not because render engines are bad; AFAIK render engines are color-space-agnostic and just calculate whatever you throw at them. It’s because of the input artists give them.

I tried to recreate Chris Brejon’s simple-but-spectacular example in Unreal, with success I guess. He achieved it in Guerilla Render and shared his knowledge in this really great and helpful read on the topic: …g-system-aces/


The **left** image is the **“vanilla” Unreal render** with its own tonemapper (no OCIO or ACES was used here; it’s basically the final image! If it’s an EXR, an sRGB display transform (not ACES sRGB!) is needed to make it look like a final image. If it’s exported as jpg or png, Unreal automatically bakes the sRGB gamma correction into the image). This is actually Unreal’s “traditional linear workflow” result.

The **middle** one is with the Unreal tonemapper turned off (so I handled it with an **IDT** of ‘Utility - Linear - sRGB’ in OCIO). The tonemapping is done via ACES. The **ODT** is ALWAYS dependent on your output device; 90%+ of the time it’s a regular monitor, so Output - sRGB.

With the correct ODT added, there is automatically an S-curve-like tonemapper in the ODT that gives a pretty close result to Unreal’s, after tweaking Unreal a bit: removing the blue correction and the fake wide gamut, and pre-gaining your final image by 1.45, according to this thread by Thomas Mansencal on ACES Central. You may want to add the blue correction fix as an LMT to the final render if your bright, saturated blues are turning violet; that’s an ACES-specific problem that can be fixed by an LMT written by the ACES guys, and Unreal actually uses this same exact code when tonemapping the final image. The middle image hasn’t got the richer lighting details because (sadly) lighting & shading is still calculated within the sRGB gamut.

AFAIK by default Unreal operates in a linearized sRGB gamut, although (only for tonemapping) the image is transformed into ACES space, gets tonemapped (plus some other magic juicy stuff), then comes back to sRGB, resulting in a linear sRGB image again at the end, but with a tonemap that was applied in ACES space. You can check it in this code, around line ~330.
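A heavily simplified sketch of that round trip, NOT Unreal’s actual shader code: I’m substituting Narkowicz’s well-known fitted approximation for the real RRT+ODT, and collapsing the gamut steps into one matrix each way (values approximate):

```python
# Simplified sketch of the tonemapping round trip: linear sRGB -> AP1,
# ACES-ish S-curve applied per channel, then back to linear sRGB.
# The curve is Narkowicz's public fitted approximation of the ACES
# RRT+ODT, not Unreal's exact code (which does a more elaborate
# AP0/AP1 dance); matrix values are the commonly published ones.

SRGB_TO_AP1 = [
    [0.6130973, 0.3395229, 0.0473793],
    [0.0701942, 0.9163556, 0.0134526],
    [0.0206156, 0.1095698, 0.8698151],
]
AP1_TO_SRGB = [
    [ 1.7050510, -0.6217921, -0.0832589],
    [-0.1302564,  1.1408048, -0.0105484],
    [-0.0240033, -0.1289690,  1.1529723],
]

def mat(m, rgb):
    return [sum(x * c for x, c in zip(row, rgb)) for row in m]

def aces_fitted(x):
    """Narkowicz's fitted approximation of the ACES filmic S-curve."""
    y = (x * (2.51 * x + 0.03)) / (x * (2.43 * x + 0.59) + 0.14)
    return max(0.0, min(1.0, y))

def tonemap(rgb_linear_srgb):
    ap1 = mat(SRGB_TO_AP1, rgb_linear_srgb)  # into the ACES working gamut
    curved = [aces_fitted(c) for c in ap1]   # S-curve applied in AP1
    return mat(AP1_TO_SRGB, curved)          # back to linear sRGB

print(tonemap([0.18, 0.18, 0.18]))  # mid grey, gently lifted by the curve
```

The key takeaway is that ACES primaries only exist for the duration of the curve; the image enters and leaves as linear sRGB, which matches what the code around line ~330 shows.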

The image on the **right**, however, was rendered in an **ACEScg** working space and previewed in the **viewport OCIO** with an IDT of ‘ACES - ACEScg’ and an **ODT** of ‘Output - sRGB’ (if you render it out as a linear, untonemapped EXR with no OCIO baked in, it’s considered an ACEScg AP1 EXR, and that’s how you should deliver to the comp/colorist department; you can then preview or work on the render in ANY application that has OCIO/ACES support: RV, Nuke, Fusion, Resolve, After Effects, Blender, Maya, Houdini, Unreal 4.26…). The right image, for instance, **has richer GI details** (the indirect **lighting** happily **works with** 4.26 GPU Lightmass and RTX GI as well, and with photometric camera + light values)!

If you guys are interested, I’m planning to make a tutorial on how to achieve this (please note that I’m not an official ACES teacher, just a regular CG artist excited about the topic; I’m probably doing some stuff wrong, but I’m pleased to show you what I know so far. Dear ACES gods: please do correct me if I’m doing anything wrong!).


Correct, and that includes every color value created in Unreal as well (the color picker is linear with an sRGB gamma-preview toggle). I didn’t want to pre-convert every texture outside the engine, e.g. in Nuke, and especially didn’t want to generate tons of EXRs (unfortunately ALL ACEScg textures MUST be 16-bit half EXRs, this is the way…), so instead I wrote my own IDT converters in Unreal to make it flexible and easily applicable to any project at any point, non-destructively, without worrying about massive EXR conversions and pain-in-the-*** artist texture iteration times due to conversions (I made a global “ACEScg” switch that’s toggleable in-editor at any time). If you’d like to get it, I’m planning to upload it to Gumroad, until Epic fully implements the ACEScg workflow in Unreal.