How to Improve Frame Rate Through Video Settings


    #31
    The PS4 runs games well because it uses GNM and GNMX, with PSSL as the shader language; these APIs are much lower level than DirectX 11 and even DX12. Additionally, developers only need to target one fixed set of hardware, so if a game runs at a set level on one machine it's pretty much guaranteed to run the same on every other. Unreal Engine 4 uses modern rendering techniques that are more akin to a realistic physically-based rendering pipeline.

    You can't really find comparable hardware for the PS3 because it has a massively powerful custom IBM CPU that does a lot of the processing for the system, even taking over work from the GPU. Think of it as the reverse of GPGPU: the CPU handling work the GPU normally would. Compression and careful coding allow developers to stretch a lot of potential out of the PS3's hardware, which is how it can run very good-looking games with only 256 MB of RAM allocated for games and the rest for the OS. The PS4 relies on GPGPU for CPU-style processing, and its GPU is roughly equivalent to an R7 260X; however, thanks to its heterogeneous architecture it can dedicate about 5 GB of memory to game assets, with the rest going to the OS. Additionally, the PS4 shares the same sort of compute tech as the R9 290X (asynchronous compute units), which helps with GPU rendering and GPGPU.
    Last edited by Icaraeus; 03-24-2015, 02:10 AM.



      #32
      There are some i7 CPUs with more threads and a comparable clock rate, as well as significantly more flexibility between cores and shared cache memory. Remember, each SPE only has access to 256 KiB of local store, roughly 262 KB. The Wii U by comparison has 32 MB of eDRAM shared between the CPU and GPU: that's 128 times the memory available to the vast majority of CPU processes, and it's shared. So the PS3 can process very quickly, but not a lot of data (or very accurate data), while most other consoles and computers have more flexible shared caches between cores. Only the PPE on Cell is capable of accessing system memory (RAM). The seven other SPEs can only access and process 256 KiB of data each, and from my understanding it's not shared.

      The PS3's Cell was incredibly powerful for the time; no PC was capable of simulations like Flower, Journey, or LittleBigPlanet. But nowadays a lot more physics and particle processing is moving to the GPU, which is much faster and much more capable of handling simulations like this, assuming a boost in cache is possible. CPUs in consumer machines have also exceeded the Cell in raw processing power for at least the past three years. On top of that, the memory and graphics limitations are the PS3's most debilitating factors. So, if you want comparable graphics, an integrated card will work; if you want comparable simulations, a newer, lower-end dedicated graphics card will work; and if you want comparable CPU power with something much more flexible, a mid-grade i7 will completely supersede any benefit the PS3 could've ever given you, with more cache to spare. And if you just don't care about price and want to put the Cell to shame, try a Xeon E5-2699 v3 processor with 18 cores, 36 threads, and 45 MB of cache. Sorry, but Cell is no longer the supercomputer Sony touted it as 9 years ago. Technology marches on!



        #33
        I'm only saying that what surprised me is that a game running on a PS3 managed, at least in my eyes, to outshine my untweaked "out of the box" test running on what I think is a reasonably capable laptop.

        But Unreal 4 scales from phones all the way up to movies; you just need to know which settings will give the right look for a given hardware configuration.

        So my question earlier was perhaps a bit ambiguous, but I wondered what settings would make it look like a PS3 or PS4 game. Since UE4 uses a different lighting model than the games on the PS3, though, this might not be easily answered.
        Last edited by Michael3DX; 03-26-2015, 02:04 AM.



          #34
          • Don't use dynamic or stationary lights: switch them all to static. This disables all GGX specular rendering, and it will significantly improve performance. You can use a static skylight and the lightmass environment color to provide a generic GI fill in the shadows.
          • Under Screen Space Reflections in the post process settings, set the max roughness and quality to 0. This should disable screen space reflections entirely.
          • Plug in an ambient cubemap to provide generic lighting. Reflections and lighting won't be appropriate, but there are many games that do this and still get praised for their graphics.
          • Set Lens Flare intensity and size to 0. Set Ambient Occlusion intensity and radius to 0. Set the anti-aliasing method to FXAA. This should eliminate some of the heavier post processing effects.
          • Set Bloom intensity to 0. Set Auto-Exposure min brightness and max brightness to be the same value (1 is good for default). These effects are not very heavy, but you can disable them if you want.
          • Use a lower game resolution. 1280x720 doesn't look bad. For lowering the resolution while working in-engine, use the resolution scale in the engine scalability settings, located to the left of the blueprints icon. To lower the resolution in the middle of a particularly demanding portion of the game, use the screen percentage post process setting.


          If you start a project with these settings, you will have disabled most of the default effects in UE4. If this doesn't run well above 60 FPS on your computer, then you seriously need to get yourself a real graphics card! From here, you can slowly bring in some of the more advanced effects and start to use better lighting/reflection rendering while increasing the resolution until you find a balance that works well for your intended hardware.
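          If you'd rather apply most of this in one place, the post-process and resolution items above also map onto console variables. Below is a minimal, illustrative C++ sketch (the actor name is made up, and the exact cvar names and values should be checked against your engine version) that runs the equivalent console commands when the level starts.

```cpp
// LowSpecStartup.h -- illustrative sketch only, not official Epic code.
// Drop this actor into a level to push several of the settings above
// through console variables when the game starts. Verify the cvar names
// and values against your UE4 version; defaults shift between releases.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Engine/Engine.h"
#include "Engine/World.h"
#include "LowSpecStartup.generated.h"

UCLASS()
class ALowSpecStartup : public AActor
{
    GENERATED_BODY()

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        UWorld* World = GetWorld();
        if (!GEngine || !World)
        {
            return;
        }

        GEngine->Exec(World, TEXT("r.SSR.Quality 0"));             // disable screen space reflections
        GEngine->Exec(World, TEXT("r.AmbientOcclusionLevels 0"));  // disable SSAO
        GEngine->Exec(World, TEXT("r.BloomQuality 0"));            // disable bloom
        GEngine->Exec(World, TEXT("r.LensFlareQuality 0"));        // disable lens flares
        GEngine->Exec(World, TEXT("r.ScreenPercentage 70"));       // render at 70% resolution and upscale
    }
};
```

          The light type and lightmap advice still has to be applied to the assets and lights in the editor; only the post-process and resolution items translate directly to console variables like this.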



            #35
            Can you guys tell me if the specs of my graphics card are really crappy or not?
            I get pretty good speed in editor play too, but standalone is very sluggish.

            Name Intel(R) HD Graphics 3000
            PNP Device ID PCI\VEN_8086&DEV_0116&SUBSYS_15821043&REV_09\3&11583659&0&10
            Adapter Type Intel(R) HD Graphics Family, Intel Corporation compatible
            Adapter Description Intel(R) HD Graphics 3000
            Adapter RAM (2,084,569,088) bytes
            Installed Drivers igdumd64.dll,igd10umd64.dll,igd10umd64.dll,igdumd32,igd10umd32,igd10umd32
            Driver Version 9.17.10.3347
            INF File oem3.inf (iSNBM0 section)
            Color Planes Not Available
            Color Table Entries 4294967296
            Resolution 1366 x 768 x 60 hertz
            Bits/Pixel 32



              #36
              Any integrated graphics chip is going to be closer to mobile hardware than to a higher-end PC. The chips are very small, they run on almost no power straight off the motherboard, and they are always passively cooled. All serious disadvantages if serious performance is what you're after.

              Intel's HD Graphics 3000 is not even good enough to handle 8-year-old games at HD resolutions. A GTX 750, even passively cooled, will run circles around it, and only requires 55 watts to do so. Heck, it'll even render the engine better than my GT 640, which can run medium settings in 720p at 60 FPS! NVIDIA seems averse to releasing lower-powered graphics cards now that they can get amazing performance out of extremely efficient cards like the GTX 750 (and, in the mid-to-high end, the 960), so there's very little reason not to spend the few extra dollars and get something that is many times more powerful and runs on practically any power supply you give it.

              Oh, and 2 GB of RAM has been considered a laughable amount for the last 6-8 years. You need at least 8 GB of RAM, with an i7 as the recommended minimum, to run Lightmass.
              Last edited by mariomguy; 07-15-2015, 12:07 PM.



                #37
                Hi sir, I have read your article and downloaded an fps booster, which you can see here: http://www.videoconverterfactory.com...o-quality.html. But here is the question: my original frame rate is 29, and I increased it to 60. I did not find any difference between the two videos. I wonder how those frame rate boosters can be real, since the number of pictures flashing by per second never changes. How can it insert an extra 31 pictures per second into my 29 fps video and make it 60?



                  #38
                  Originally posted by Giranceso View Post
                  Hi sir, I have read your article and downloaded an fps booster, which you can see here: http://www.videoconverterfactory.com...o-quality.html. But here is the question: my original frame rate is 29, and I increased it to 60. I did not find any difference between the two videos. I wonder how those frame rate boosters can be real, since the number of pictures flashing by per second never changes. How can it insert an extra 31 pictures per second into my 29 fps video and make it 60?
                  Probably the wrong thread (you're talking about videos?), but if it works anything like SVP, it's frame interpolation done with some fancy algorithm. The difference between 30 fps and 60 fps should be pretty easy to see, assuming your method works. Here's an example: https://www.svp-team.com/wiki/Frame_...ation_overview
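                  To make the interpolation idea concrete, here's a toy sketch (plain C++, nothing to do with SVP's actual implementation) that creates in-between frames by averaging neighbouring frames, roughly doubling the frame count. Real interpolators estimate motion between frames instead of blending, and they retime to arbitrary targets like 29 to 60 fps rather than only inserting midpoints.

```cpp
// Toy frame interpolation: average neighbouring frames to synthesise
// in-between frames, turning N input frames into 2N-1 output frames.
// Real tools (e.g. SVP) use motion estimation, not plain blending.
#include <cstdint>
#include <cstddef>
#include <vector>

// A frame is a flat buffer of 8-bit samples (grayscale or interleaved RGB).
using Frame = std::vector<uint8_t>;

// Pixel-wise midpoint of two frames of equal size.
Frame BlendFrames(const Frame& a, const Frame& b)
{
    Frame out(a.size());
    for (std::size_t i = 0; i < a.size(); ++i)
    {
        out[i] = static_cast<uint8_t>((a[i] + b[i]) / 2);
    }
    return out;
}

// Insert one blended frame between every pair of original frames.
std::vector<Frame> DoubleFrameRate(const std::vector<Frame>& input)
{
    std::vector<Frame> output;
    for (std::size_t i = 0; i + 1 < input.size(); ++i)
    {
        output.push_back(input[i]);                             // original frame
        output.push_back(BlendFrames(input[i], input[i + 1]));  // synthesised frame
    }
    if (!input.empty())
    {
        output.push_back(input.back());                         // keep the last frame
    }
    return output;
}
```

                  If the extra frames really were written out, you'd mostly notice them during fast pans and motion; a static scene looks identical at 29 and 60 fps, which may be why you're not seeing a difference.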



                    #39
                    Awesome, thanks for this.

