Hi, I work primarily in TV so I know a thing or two about interlaced vs progressive modes.
At the moment I'm producing a realtime system which is running in a 1080i graphics mode (on the card).
My problem is that I don’t see any real performance gains in running in this mode over progressive. Surely when games are running on consoles in this mode there must be an engine mode which halves the vertical image resolution.
1080p is 1920x1080 at 25 (PAL) images per second - this is what I usually run in while I'm developing.
1080i is 1920x540 at 50 (PAL) fields per second, where each 'field' of the interlaced frame is half as high as the full frame.
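To put numbers on it, here's a quick back-of-envelope throughput sketch (plain C++, nothing engine-specific; these are just raw pixel counts, ignoring overdraw and the like):

```cpp
#include <cstdio>

int main() {
    const long width = 1920;

    // 1080p at 25 fps (PAL progressive): full 1080-line frames.
    long p25 = width * 1080 * 25;

    // 1080i at 50 fields/s (PAL interlaced): each field is only 540 lines.
    long i50 = width * 540 * 50;

    // Rendering full 1080-line frames at 50 Hz and letting scan-out
    // throw away half the lines every field.
    long full50 = width * 1080 * 50;

    std::printf("1080p25      : %ld pixels/s\n", p25);    // 51,840,000
    std::printf("1080i50      : %ld pixels/s\n", i50);    // 51,840,000
    std::printf("1080 @ 50 Hz : %ld pixels/s\n", full50); // 103,680,000
    return 0;
}
```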
Am I right in thinking that the engine just renders in full progressive mode regardless of whether the output is interlaced, and just rams the image into the output buffer 50 (PAL) times per second? If so, isn't this massively wasteful in terms of GPU time?
I'm sure rendering 50 1920x540 fields per second is a hell of a lot cheaper than rendering 50 full 1920x1080 frames per second.
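To make concrete what I mean by an "engine mode which halves the vertical resolution", here's a rough CPU-side sketch of field rendering. Everything in it (renderField, interleaveField, the Pixel struct) is made up for illustration, not any real engine or console API; the idea is just to render 540 lines of work per field and slot them into alternate scanlines of the output buffer:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

struct Pixel { uint8_t r, g, b, a; };

constexpr int kWidth  = 1920;
constexpr int kHeight = 1080;
constexpr int kFieldH = kHeight / 2;   // 540 lines per field

// Stand-in for the engine's scene render: in a real engine this would draw
// the scene into a 1920x540 target, with the projection nudged down by half
// a scanline when rendering the odd field so the two fields register.
void renderField(std::vector<Pixel>& field, bool oddField)
{
    uint8_t shade = oddField ? 255 : 128;            // dummy test pattern
    for (Pixel& p : field) p = {shade, shade, shade, 255};
}

// Copy a half-height field into every other scanline of the full-height
// output buffer: the even field fills lines 0,2,4,..., the odd field 1,3,5,...
void interleaveField(const std::vector<Pixel>& field,
                     std::vector<Pixel>& frame, bool oddField)
{
    for (int y = 0; y < kFieldH; ++y) {
        const Pixel* src = &field[y * kWidth];
        Pixel*       dst = &frame[(2 * y + (oddField ? 1 : 0)) * kWidth];
        for (int x = 0; x < kWidth; ++x) dst[x] = src[x];
    }
}

int main()
{
    std::vector<Pixel> field(kWidth * kFieldH);
    std::vector<Pixel> frame(kWidth * kHeight);

    for (int tick = 0; tick < 4; ++tick) {           // 4 fields = 2 frames
        bool odd = (tick & 1) != 0;
        renderField(field, odd);                     // only 540 lines of work
        interleaveField(field, frame, odd);          // slot into output buffer
        std::printf("field %d (%s) done\n", tick, odd ? "odd" : "even");
    }
    return 0;
}
```

The half-scanline vertical offset on the odd field does matter in practice (otherwise the image bobs between fields), but even with that, each field should cost roughly half of a full progressive frame on the GPU.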