Rendering with sequencer is too bright

You need to add it via the + Setting menu in the Movie Render Queue window.

Though, in retrospect (as I posted that ages ago), you should only be doing this if you know what you’re doing with the color management.

Watch this video if you want to know more.


As this is still confusing people, here’s a quick checklist. Basically, you want to avoid auto exposure and make sure that the render settings match the viewport settings:

  • On your camera, search for ‘exp’ and change the metering mode to Manual, then adjust the exposure using Exposure Compensation.

  • If it doesn’t work, try doing it in the post process volume, if you have one. (The settings are exactly the same there.)

  • If that doesn’t work, check the viewport exposure settings:

Make sure Game Settings is ticked; the EV100 slider will give you a different exposure just for the viewport, but I don’t think it affects the render.

  • There’s also another setting in the project settings: make sure ‘Auto Exposure’ is unticked.

  • If there’s still an issue, make sure you’ve not hidden any lighting in the scene by disabling it in the Outliner (the light will be re-activated in the render). To properly disable a light, search for ‘Actor Hidden in Game’ and ‘Visible’ in the light’s Details panel and adjust those settings.

  • If it’s still occurring, maybe you’ve disabled the tone curve in the Color Output tab of the render settings (if you’ve added it), or you’re using an OCIO config for the viewport that doesn’t match the one in the Color Output tab of the render settings (if you’ve added it). The viewport OCIO is set in the Lit tab, shown at the bottom of the window as ‘OCIO Display’, as seen in one of the images above.

  • Another possible cause is the use of more than one directional light in the scene. If forward shading isn’t set correctly, the engine will choose one over the other for lighting volumetric fog etc., and this choice could differ when the engine loads for rendering.

  • Also check in your render settings that you’re actually rendering the correct level. This can happen if you have duplicate levels with different lighting and you haven’t set up the sequencer to match the level you want to render. I made a more detailed post on this here - Transfering sequencer and cameras to another project - #2 by Origenic

These are all the possible causes I can think of at the moment, but there may be more.
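Not from the thread itself, but if you want to sanity-check the manual exposure maths behind the checklist above, here is a small sketch of the standard EV100 relationship. The 1.2 saturation-based calibration constant is the one used in Frostbite-style physically based exposure, which UE’s exposure model follows closely; treat the exact constant as an assumption, not an engine guarantee.

```python
def exposure_scale(ev100: float) -> float:
    """Scale factor applied to scene luminance for a given EV100.

    Standard saturation-based calibration: the maximum representable
    luminance is 1.2 * 2**EV100, so the applied scale is its reciprocal.
    """
    return 1.0 / (1.2 * 2.0 ** ev100)

def apply_compensation(ev100: float, compensation: float) -> float:
    """+1 stop of Exposure Compensation doubles image brightness,
    i.e. it is equivalent to lowering EV100 by 1."""
    return exposure_scale(ev100 - compensation)

# Each +1 EV of compensation doubles the final scale:
base = exposure_scale(10.0)
plus_one = apply_compensation(10.0, 1.0)
print(plus_one / base)  # 2.0
```

This is why a render that ignores a viewport-only EV100 override looks a stop (or several) off: every unit of EV100 halves or doubles the image.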


Hi there :slight_smile:

This is driving me nuts… I’ve tried all of the above, still no luck. It seems to me there’s a gamma applied to the rendered images.
You can find below a comparison between legacy Sequencer output (same as viewport) and MRQ output.

Thanks in advance for your help!

Best regards.


Which one matches the editor viewport in play mode? In MRQ, is there anything in the Color Output? Is this Path Tracer or Lumen/Deferred? Any local or auto exposure?

Hello,

  • The Sequencer (Legacy) is the one matching the editor viewport.
  • No Color Output in MRQ
  • It’s Lumen
  • No local exposure

If you hit Play in the viewport does it look bright? If you disable Game Overrides? No auto exposure? Are you at a low shadow scalability in the editor?

Are you able to give me any repro steps? Like, if you create a new level with a particular camera setting or something, does it still happen?

Hello Shaun,

If I hit Play in the viewport, it’s exactly like the Editor viewport, totally fine.

Since I don’t have any Game Overrides in my MRQ settings, I think you mean Game Settings (in the viewport settings → Exposure section → Game Settings unchecked)? If so, I unchecked it: the Editor’s viewport is darker, and everything looks fine when playing in the viewport; it becomes as balanced as the Editor viewport with Game Settings checked.

No autoexposure at all.

About scalability settings, please check below a screenshot of the current settings.
[screenshot: scalability_groups]

Hmm, I can’t think of any repro steps, tbh… the only “fancy” thing I can think of is that I use Ultra Dynamic Sky - not sure if that helps, though.

Thanks again.

MRQ Game Overrides have an effect on your render whether you have them in the UI or not. You have to add them to the UI and disable them. Game Overrides will put the render into Cinematic scalability and change a bunch of things. I have not tested Ultra Dynamic Sky with MRQ that much, so I can’t speak to that.
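For reference, ‘Cinematic scalability’ means every scalability group bumped to its top level (4). As a rough illustration only (group names recalled from stock UE; your engine version may differ, and UE5 adds groups such as sg.GlobalIlluminationQuality and sg.ReflectionQuality), it corresponds to something like this in console/ini terms:

```ini
; Rough equivalent of Cinematic scalability: every group at level 4
; (0=Low, 1=Medium, 2=High, 3=Epic, 4=Cinematic)
sg.ViewDistanceQuality=4
sg.AntiAliasingQuality=4
sg.ShadowQuality=4
sg.PostProcessQuality=4
sg.TextureQuality=4
sg.EffectsQuality=4
sg.FoliageQuality=4
sg.ShadingQuality=4
```

So a render can legitimately look different from an editor viewport running at a lower scalability level, which is why comparing the two at matching settings matters.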


Hello Shaun,

I’ve tested adding Game Overrides to the MRQ settings, and no matter whether it’s on or off, I still get the same washed-out output.

Idk then. If you have a small project you can send over I can have a look. I’m genuinely curious now, ha

Looking at the images, it’s possible that it’s the fog rendering differently, i.e. thicker fog or more scattering. Try removing it and doing a render. I’ve had issues with Ultra Dynamic Sky not rendering the global volumetric fog material correctly when UDS was set to cinematic/offline mode. (I fixed it by either disabling the ‘use global volumetric fog’ tick box or changing the UDS project mode to ‘Game/real time’.)

Unfortunately, it’s impossible for me to send this project, heavy NDA on this one.
I don’t have much time these days, but I’ll try to investigate a bit more next week.

Thanks again.

Hi :slight_smile:

Thanks for your suggestion, I’ll try that ASAP.

That being said, the problem reminds me of the old days when it was sometimes tricky to use a linear workflow and we’d get double gamma correction applied to the output renders (3ds Max, V-Ray…).
With the images I rendered through MRQ, applying a gamma compensation curve in post brings back the original look, the one seen in the editor viewport… anyway, rest assured I’ll try your fog suggestion next week.
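A toy sketch of that double-gamma scenario (using a plain 2.2 power curve; real sRGB is a piecewise function, and this only illustrates the symptom described, not a claim about what MRQ actually does internally):

```python
def gamma_encode(x: float, g: float = 2.2) -> float:
    """Power-law 'gamma' encode; real sRGB is piecewise but close to 2.2."""
    return x ** (1.0 / g)

linear = 0.18                        # mid-grey in linear light
viewport = gamma_encode(linear)      # encoded once: what the viewport shows
washed_out = gamma_encode(viewport)  # encoded twice: brighter, washed out
recovered = washed_out ** 2.2        # one 2.2 decode in post undoes the extra encode

assert washed_out > viewport         # double-encoded values drift toward white
assert abs(recovered - viewport) < 1e-9
```

That “one decode in post restores the viewport look” behaviour matches the compensation-curve observation above, which is the classic fingerprint of a doubled transfer function.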

Thank you for your help.

Regards.

So, recently I’ve been working with blueprints a lot, and it’s taught me a few things about how they work and how things get rendered. Anyone coming from a game dev perspective will likely already know some, if not all, of this.

When you render out of UE using Movie Render Queue, it renders at runtime, which is basically what gameplay uses, rather than what you see in the editor. When you create a blueprint, you have two node graphs: the construction script and the event graph. The construction script runs in the editor and is basically used for level design; the event graph runs at runtime. Some blueprint assets work in one, or in both. If a blueprint only has nodes set up in the construction script and you try to keyframe values, you’ll find that those keyframes are not read when you do your render, and vice versa; i.e. a blueprint only set up in the event graph will not read keyframes when you play back in the editor.

So, if anyone’s finding that their blueprint assets are not rendering correctly, this could be a reason why. If you only have nodes in the construction script, you may be able to simply copy them over to the event graph and connect them to the event tick. (This depends on what they do; the event tick can be pretty heavy, so it’s best to avoid doing complex things like spawning hundreds of meshes. I think it tries to cycle the event tick at 120fps; I may be wrong, but when spawning meshes it led to insane render times!)
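To put rough numbers on why per-tick spawning hurts (taking the ~120 ticks per second guessed at above as an assumption; in practice the tick rate follows the frame rate, and the workload figures are hypothetical):

```python
ticks_per_second = 120   # assumed, per the guess above; really tied to frame rate
meshes_per_spawn = 100   # hypothetical workload
seconds = 10             # length of the sequence

construction_script = meshes_per_spawn                       # runs once in the editor
event_tick = meshes_per_spawn * ticks_per_second * seconds   # runs every tick

print(construction_script, event_tick)  # 100 vs 120000 spawn calls
```

Three orders of magnitude more spawn calls for the same visual result, which is why gating heavy logic behind a one-shot event (or a do-once check) beats hanging it straight off the tick.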

If your keyframes aren’t being read from the blueprint in the editor and you are using the construction script, go to the Class Settings in the blueprint (tab at the top), enable ‘Run Construction Script in Sequencer’ if it’s not enabled, and see if that works.

Another BP tip: if you ever need to keyframe a parameter from a blueprint and it isn’t available to keyframe, open up the blueprint, look for the variable name in the list in the lower-left panel, select it, then in the far-right panel tick ‘Expose to Cinematics’. For example, there are some Ultra Dynamic Sky parameters that have this unticked. You can also set the max values and ranges for parameters here if you need more than the blueprint details panel offers.

I’ve found that it’s good practice to check that your sequence works at runtime before you render it. You can do this by making sure your Camera Cuts track is active, then hitting the play button at the top of the screen to enter runtime, then hitting play in your sequencer. It will then play the sequence as you’d see it in the movie render (aside from additional MRQ modifications like Game Overrides), using event graph logic for all your blueprint actors. This is great for quickly seeing if an issue is still occurring without the need to set up a render and wait for it.

I’m not sure if this is the best thread to post this in, but it’s related and it was the most apt one I could find for this brain fart. It’s just stuff I wish someone had told me early on, so I hope it helps someone.