I’ve been working on various tests trying to get the most out of UE4’s ability to be part of an offline render solution, to minimize turnaround time on shots.
Sequence rendered out of UE4, then taken to After Effects for Post work.
I’ve noticed that there are issues with the blue lights behind the wall grating going from having length to having none depending on the angle of the camera.
When I first set up this scene, this was not happening, and I would like to know why it’s happening now. The lights are set to stationary, and I’ve tried turning texture streaming off and on, and just about everything else I can think of inside the editor. The only difference I can think of is that when I first set up the scene, I had a fraction of the number of lights I now have, yet these lights, which are the only ones that have length, are the only ones having this problem. Also, all quality settings are set to “Epic” or otherwise as high as I can set them in the editor.
I took some time to play around with animation in Matinee. The flying transport thing is just a really quick model with really rudimentary UVs, so please forgive its lack of polish; I just needed something quick to test with.
I have more control in AE than in UE4, and the ability to iterate and come up with new stuff is easier in AE, even though it takes many times longer to render out of AE. I suspect that in the pipeline we will probably migrate some of this down to UE, and I HOPE that the new motion blur in 4.8 will replace the need to give it some love in AE, but we will likely have an entire post aspect of the pipeline so individuals don’t have to wait on each other to iterate. And those skill sets might not overlap. For example, Editorial will likely be handling all of the post processing and color grading, because that is standard practice, and sending those kinds of changes back to the person running the engine might actually take more time, because it would have to be scheduled. So for now the idea is that whoever ends up working in engine should be focused on making sure that all the animation and lighting are correct.
Right now, these are just tests, meant to pinpoint issues that need to be solved.
For example, I have lights popping in the scene, and I have no idea why; any shot with that happening in it would be unusable. The same goes for the AA issues in the HDR areas of the shot. I am hoping to have someone point out how to fix that, be it by changing something in an ini or a setting in engine. I want to know.
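For the AA shimmer in bright HDR areas, one place to start experimenting (assuming temporal AA is in use; these are standard UE4 console variables, not a confirmed fix for this exact scene, and the values here are just starting points to tune) would be cvars set in the console or in Engine/Config/ConsoleVariables.ini:

```
; Starting points only — tune per scene.
r.PostProcessAAQuality=6   ; highest AA quality level (temporal AA)
r.TemporalAASamples=8      ; more jitter samples, smoother edges on bright geometry
```

These can also be typed directly into the in-editor console (~ key) to compare before/after without restarting.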
Another thing to realize is that I hope to develop the ability to render out multiple passes, for artistic as well as purely functional reasons. For example, we might want to add elements that are only possible in an offline renderer, and we would then have to blend them into a shot with minimal effort: have them occluded by elements rendered in UE, have them cast reflections, etc., which would require full raytracing. The idea is that UE would be just another renderer in the pipeline, though hopefully the primary one. So at the end of the day the pipeline needs to support all that and not create special cases that would cause two shots sitting next to each other to look quite different.
Let me ask you something about your color grading (aka correction). What options do you use? For example, I color grade in Photoshop on a LUT and import it to UE. But anyway, my question is: how many options, and which options, do you adjust?
Thanks!
For the moment I just use curves and hue/saturation; the glows are at normal saturation, but the base footage is at about half saturation in AE. I am unsure how to do that in UE (I know how to grade in UE, just not with the method I am currently using to get my desired look). Eventually I’d switch over to something like SpeedGrade, or something else, to generate proper LUTs. I would not use the UE method unless I needed the scene in real time, or the turnaround time required that we forgo the separate post process part of the pipeline. The UE method would limit our color output if we wanted to change the color at a later date, and could cause issues.
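For what it’s worth, the half-saturation base look could be baked into a LUT rather than done in AE. A minimal sketch in Python of the math involved (the 50% mix toward Rec.709 luma is my assumption about the look, and this only computes the table values; UE4’s color grading LUT is a 16×16×16 cube unwrapped into a 256×16 texture, which you would still have to write out as an image):

```python
def desaturate(r, g, b, amount=0.5):
    """Mix each channel toward its Rec.709 luma.
    amount=0 leaves the color alone, amount=1 is full grayscale."""
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    mix = lambda c: c + (luma - c) * amount
    return mix(r), mix(g), mix(b)

def build_lut(size=16, amount=0.5):
    """Return size**3 desaturated RGB triples, ordered like UE4's unwrapped
    LUT strip: red varies fastest, blue selects the horizontal tile."""
    step = 1.0 / (size - 1)
    return [desaturate(r * step, g * step, b * step, amount)
            for b in range(size) for g in range(size) for r in range(size)]

lut = build_lut()
```

The same `desaturate` function is also a sanity check on the grade: neutral grays pass through unchanged, so the LUT only pulls chroma, not exposure.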
The thing to realize is that in TV production change happens constantly and unpredictably, so you need to leave the pipeline open to allow for that. Depending on the production, you might not be anywhere near the group that does the editorial process, where most color grading happens. So the people doing the in-engine work would have to produce a render in a LOG format, where colors are washed out and everything looks really flat. Like any higher-bit-depth format, it actually has all the color information there; it’s just unfiltered, so you see it as washed out. Then the colorist comes in later and grades the image into something more appealing.
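The “washed out” look comes from the log transfer curve itself: shadows get lifted and highlights compressed so the full range survives into the grade. As a toy illustration only (a generic log2 curve, not the spec of any real camera or engine format):

```python
import math

def lin_to_log(x, stops_range=10.0):
    """Toy log transfer curve (illustrative, not a real camera/engine spec).
    Scene-linear light in, flat 'log' value out: each halving of light drops
    the output by only 1/stops_range, so mid-gray sits high and the raw
    render looks flat until a grade is applied on top."""
    x = max(x, 1e-6)                    # avoid log of zero
    stops = math.log2(x)                # 0 at full white, negative below
    return min(max(1.0 + stops / stops_range, 0.0), 1.0)
```

Feeding mid-gray (~0.18 linear) through this lands around 0.75, which is exactly why an ungraded log frame reads as washed out on a normal monitor.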
Though I personally find it more organic if I grade when I comp: I have more control that way over the various layers, and I can do more than someone further down the line could.
Here is a version of the same shot that is fully dynamically lit.
Not quite as nice, but it still works. I think it’s still using the lighting environment from the baked scene, so I am currently setting up a new scene from scratch that does not inherit any lighting information from the baked scene.
What light settings did you use, set with the HRSS tool? Can you share them, please? Do any of the maps change anything about the light, and is that an option you used? I would also like to use dynamic lighting in my scene, but for now I don’t know where to start; could you explain it to me? Thank you in advance, and greetings.