Physically Based Camera + Advanced Post Processing

Hey guys,

I was just wondering and thought, why not ask^^ Are there any plans on your side to rework the post-processing features to be more "next-gen" and tie them to a more physically based camera?

For example, I have never really liked the way DOF works in Unreal. It's unintuitive to set up, it regularly looks weird, and it's quite... difficult to really tweak^^

When I read the document on Frostbite going PBR, I really liked the fact that they were going for a physically based camera with shutter settings etc., so you get DOF automatically based on the camera settings. FoxEngine does it like that as well. I would really prefer those systems working together.
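
Just to sketch what I mean (my own thin-lens math, not anything from the engine): once the camera has a focal length, an f-stop and a focus distance, the blur size falls out automatically, something like:

```cpp
#include <cmath>
#include <cstdio>

// Thin-lens circle of confusion: c = A * |S2 - S1| / S2 * f / (S1 - f),
// where A = f / N is the aperture diameter, S1 the focus distance and
// S2 the subject distance. All values in millimetres.
float CircleOfConfusionMM(float FocalLengthMM, float FStop,
                          float FocusDistMM, float SubjectDistMM)
{
    const float ApertureMM = FocalLengthMM / FStop;
    return ApertureMM
         * std::fabs(SubjectDistMM - FocusDistMM) / SubjectDistMM
         * FocalLengthMM / (FocusDistMM - FocalLengthMM);
}

int main()
{
    // A 50mm lens at f/1.8 focused at 2m, with the subject at 5m:
    std::printf("CoC: %.3f mm\n",
                CircleOfConfusionMM(50.0f, 1.8f, 2000.0f, 5000.0f));
    return 0;
}
```

With something like that, artists would tweak the same settings a photographer would, and the DOF just follows.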

Also, about the other post-process stuff: I think it lacks features in general, but also the options to tweak it properly. You can't choose glare types or edit the lens flares, you can't blur the bloom in different directions to get anamorphic effects, there is no tilt shifting or camera lens offset, and the fringe also doesn't look that good. Besides that, you can't really customize the fringe (basically how the color channels are split and offset).
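
For the fringe in particular, here is the kind of control I mean, as a hypothetical sketch (nothing from UE4): each colour channel gets its own radial offset, so you decide how the channels are split:

```cpp
struct Vec2  { float x, y; };
struct Color { float r, g, b; };

// Stand-in for the post-process input: a real version would sample the
// scene-colour texture; a procedural gradient keeps the sketch self-contained.
Color SampleScene(Vec2 UV) { return { UV.x, UV.y, 1.0f - UV.x }; }

// Each colour channel is sampled at its own radial offset from the image
// centre, so the split direction and per-channel strength are exposed
// instead of a single intensity slider.
Color Fringe(Vec2 UV, float RedShift, float GreenShift, float BlueShift)
{
    const Vec2 FromCentre = { UV.x - 0.5f, UV.y - 0.5f };
    auto Shifted = [&](float S) {
        return Vec2{ UV.x + FromCentre.x * S, UV.y + FromCentre.y * S };
    };
    return { SampleScene(Shifted(RedShift)).r,
             SampleScene(Shifted(GreenShift)).g,
             SampleScene(Shifted(BlueShift)).b };
}

int main()
{
    // Near the corner the channels spread apart: red pushed out, blue pulled in.
    Color C = Fringe({ 0.9f, 0.9f }, 0.02f, 0.0f, -0.02f);
    (void)C;
    return 0;
}
```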

I would love to know what you guys have in mind for things like that!

PS: Really like the effects that, for example, YEBIS3 does: http://www.siliconstudio.co.jp/middleware/yebis/en/ (sadly it doesn't show so much about the features... but here is a link that explains a bit more: http://www.siliconstudio.co.jp/middl...eatures/3_new/). This is actually a very good example of what I would like to see.

#2
Yeah, I think it would be nice if we had more physically based camera options.



#3
I can only second the shortcomings you listed here, Daedalus! It would indeed be interesting to hear what Epic has planned in terms of improving the actual post-process quality, since the two Trello roadmap cards only mention performance optimization (March) and settings usability (backlog).



#4
+1, a crane camera like Cinema4D for kinematic moves would be awesome.



#5
+1, I kinda don't like that depth of field.



#6
As long as it didn't result in permanent camera-related artifacts, improvements to the system could only be good for people.



#7
Mainly, it would be nice if we had some real-world settings like f-stop, ISO, and shutter speed, and if those affected things like DOF and motion blur.
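
For exposure, the mapping is simple enough. Here is a sketch based on the Frostbite course notes mentioned above (my own code, not Epic's): the three settings collapse into an EV100 value that drives exposure, while the shutter time alone would set the motion blur length:

```cpp
#include <cmath>
#include <cstdio>

// EV100 = log2(N^2 / t) - log2(S / 100): f-stop, shutter time and ISO
// collapse into one exposure value, normalized to ISO 100.
float ComputeEV100(float FStop, float ShutterSeconds, float ISO)
{
    return std::log2(FStop * FStop / ShutterSeconds * 100.0f / ISO);
}

// Maximum luminance the "sensor" can capture without clipping, and the
// exposure scale derived from it.
float EV100ToExposure(float EV100)
{
    const float MaxLuminance = 1.2f * std::pow(2.0f, EV100);
    return 1.0f / MaxLuminance;
}

int main()
{
    // Sunny-16-style settings: f/16, 1/100 s, ISO 100 gives an EV100 of about 14.6.
    const float EV100 = ComputeEV100(16.0f, 1.0f / 100.0f, 100.0f);
    std::printf("EV100 = %.2f, exposure = %g\n", EV100, EV100ToExposure(EV100));
    return 0;
}
```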



#8
I just watched the Twitch stream about creating the open world demo, and you announced a physical camera there (awesome!!!). How about the other things I mentioned? And most importantly... when will we see these updates? Will the stuff shown on Twitch be in 4.8?

Hope you guys have a good time at GDC, and I'm looking forward to hearing from you!



#9
I don't know the difference, but wouldn't a physically accurate camera for real-time content be better modelled on the human eye rather than on a camera? Would be cool if any technical people could comment. I think simulating biological HDR and pupil dilation would make it easier to get photorealistic visuals, or at least immersion. If directors/photographers could virtualise their cameras, there would be little need for post processing.



#10
Last year those guys said that Yebis would be on Unity. Looks like it didn't happen.



#11
Originally posted by RAMDAC:
I don't know the difference, but wouldn't a physically accurate camera for real-time content be better modelled on the human eye rather than on a camera? Would be cool if any technical people could comment. I think simulating biological HDR and pupil dilation would make it easier to get photorealistic visuals, or at least immersion. If directors/photographers could virtualise their cameras, there would be little need for post processing.

That is actually an interesting problem, but I think what we really need in the game industry is to imitate optical lenses rather than human eyes. I was wondering about this some time ago, and if you analyze what kinds of images people perceive as realistic on a screen, it looks like we prefer images with quite strong "optical" effects. The human eye has a very, very flexible adaptation mechanism, so it reacts very quickly, and on top of that the image is processed by our brain in the background. That means that what we see is already "post-processed"; however, the goal is to remove all possible effects, not to add them. And this is how we see the world: with as little DoF as possible, no lens flares, etc., and we are not even fully aware of what was processed.
But looking at an image on a screen, we kind of expect it to be recognizable as something created with an optical device. That's why we like images with strong DoF and other optical artifacts, slower eye adaptation for example. So we judge an on-screen image as a real one when it pretends to have been created with real-life optical lenses. Add discreet vignetting to an image and more people will judge it as plausible than a plain one; strange, but that is how it works.
Let's be honest: from a post-processing point of view, the images we see with our eyes are quite "boring", or at least they try to be as boring as possible. Eye adaptation and the brain do such a great job that we don't even know what's really going on. Post-processing is simply the opposite of what human perception is trying to do, not to mention that everyone's brain does it a bit differently. Playing with "human eye settings" would be one hell of a challenge.
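
That said, if someone did want to play with "human eye settings", the usual starting point is temporal adaptation. An illustrative sketch (not engine code), assuming a simple exponential model with asymmetric speeds:

```cpp
#include <cmath>
#include <cstdio>

// The adapted luminance chases the scene luminance exponentially, faster
// when brightening than when darkening, mimicking the asymmetry between
// human light and dark adaptation.
float AdaptLuminance(float Adapted, float Scene, float DeltaSeconds,
                     float SpeedUp = 3.0f, float SpeedDown = 1.0f)
{
    const float Speed = (Scene > Adapted) ? SpeedUp : SpeedDown;
    return Adapted + (Scene - Adapted) * (1.0f - std::exp(-DeltaSeconds * Speed));
}

int main()
{
    float Adapted = 0.1f; // stepping out of a dark room into daylight
    for (int Step = 0; Step < 5; ++Step)
    {
        Adapted = AdaptLuminance(Adapted, /*Scene=*/10.0f, /*DeltaSeconds=*/0.5f);
        std::printf("step %d: adapted luminance = %.3f\n", Step, Adapted);
    }
    return 0;
}
```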



#12
Forgot to say I was thinking more of VR, as in photorealism for the purpose of immersion, almost like inducing synesthesia. In terms of what people say they like, I think "aesthetically pleasing" and "photorealistic" often get mixed up, and traditional post processing is often used to compensate for short attention spans. Again, with VR, suspension of disbelief can make the most trivial things pretty engaging, to the point where maybe 2D-media post processing would be pretty distracting and more biologically accurate processing would be more relevant; I'm thinking mostly about dynamic contrast/color accuracy, which would benefit most from a calibrated OLED screen. One interesting idea I had while thinking about photorealism is UV damage: I think procedural sun exposure and material degradation could be interesting.



#13

Hello Daedalus51,

Thanks for the post. I have entered a feature request for the following features:

1. Kinematic move
2. F-stop
3. ISO
4. Shutter speed
5. Aberration/correction simulations
6. Open/close simulation of diaphragm blades

The JIRA ticket number is UE-11667 for tracking. I tried to grab all of the feasible features being requested in this thread. If I missed anything, just let me know and I will get it in!



#14
Good list, Benjamin! I'm glad to see you looking into this. I would also like to add lens distortion to that list, if possible.



#15
Hi Benjamin, I cannot find UE-11667 on https://issues.unrealengine.com/ - can you provide an update on the status of this request?

