10-10-2006, 08:51 PM
480i - The screen is divided into 480 horizontal "scan lines" that are interlaced; that is, the screen refreshes half the scan lines at a time (odd lines on one pass, even lines on the next). This is a fairly low-bandwidth signal; by that I mean that (relatively) little information is being transmitted.
480p - Same number of scan lines, but here we have moved to a progressive refresh, meaning the entire screen, all the scan lines, is redrawn on every pass. This leads to a smooth, higher-quality picture. "Progressive-scan" DVD players output 480p. Also of note: 480p signals often have a different aspect ratio than 480i; whereas many/most 480i signals are 4:3, 480p can go with a (roughly) 16:9 aspect ratio. The bandwidth needed in comparison to 480i is roughly double.
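To put some quick numbers behind that "roughly double" claim, here's my own back-of-the-envelope math (raw pixel rates only; I'm assuming a 640-wide 4:3 picture, 60 fields/frames per second, and ignoring blanking intervals and compression entirely):

```python
# Rough pixel-rate comparison of 480i vs 480p (raw, uncompressed;
# 640x480 assumed for illustration, blanking intervals ignored).
WIDTH, LINES = 640, 480

# 480i: 60 fields/sec, but each field carries only half the scan lines.
rate_480i = WIDTH * (LINES // 2) * 60   # pixels per second

# 480p: 60 full frames/sec, every scan line refreshed on every pass.
rate_480p = WIDTH * LINES * 60

print(rate_480i)              # 9216000 pixels/sec
print(rate_480p)              # 18432000 pixels/sec
print(rate_480p / rate_480i)  # 2.0 -- roughly double, as noted above
```

Obviously real broadcast signals are compressed, so the actual ratio on the wire varies, but the raw math shows where that factor of two comes from.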
720p - Welcome to the wonderful world of high-definition signals. Here, we have upped the vertical resolution (the number of horizontal scan lines) to 720, and a 16:9 (widescreen) aspect ratio is understood. Again, our signal is progressive at 60 frames per second, so the end result is a sharper, smoother picture than what we had before.
1080i - Another "HD" standard: here we have 1080 horizontal scan lines for an even higher resolution, but now we are back down to interlaced, "painting" the screen half at a time.
---An Interlude: 720p v 1080i ------------
So these are two seemingly competing HD formats; which one is the best? Might as well ring the bell and get Michael Buffer to scream "let's get ready to rumble," because the argument here is fierce. I make no judgements; I will just pass along the facts as best I can:
While 1080i has a higher overall resolution than 720p, it is interlaced, and thus the screen is actually divided into two 540-line "fields" that are refreshed alternately. Thus the human eye (perhaps the greatest video/image-processing chip ever made; wish I'd gotten the patent) or a chip has to de-interlace the signal for display. The upshot is that with a static image you will have a seemingly sharper picture with 1080i, but when a lot of motion is introduced, you'll begin to notice blurring and artifacts in the image. 720p thus provides a smoother moving image. This makes 720p the choice for many sports broadcasts (ESPN, ABC, and Fox all broadcast in 720p; subjectively, I think NBC is broadcasting the NFL in 720p as well based upon my own viewing, but I haven't checked to back this up).
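To make the interlacing idea concrete, here's a toy sketch of my own (a "frame" is just a list of rows; names like split_fields and weave are mine, not anything from a real video pipeline). It shows how an interlaced signal splits a frame into two half-height fields, and how the simplest "weave" de-interlacer puts them back together, which works perfectly for a static image but mixes two moments in time whenever something moved between the fields, giving you the combing/blurring artifacts mentioned above:

```python
# Toy model of interlacing: a "frame" is a list of rows. The signal
# sends even rows and odd rows as two separate fields; a naive
# "weave" de-interlacer just interleaves them back together.
def split_fields(frame):
    """Split a full frame into (even_field, odd_field)."""
    return frame[0::2], frame[1::2]

def weave(even_field, odd_field):
    """Interleave two fields back into a full frame.
    Perfect only if nothing moved between the two fields."""
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.append(even_row)
        frame.append(odd_row)
    return frame

# A static 6-row "image" survives the round trip untouched...
static = ["row%d" % i for i in range(6)]
even, odd = split_fields(static)
assert weave(even, odd) == static

# ...but if the scene changes between fields (the odd rows were
# captured ~1/60s later), the woven frame mixes two different
# moments in time -- that's the motion artifact 720p avoids.
```

Real de-interlacing chips use smarter methods (bob, motion-adaptive, etc.), but this is the core problem they're all working around.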
That's all fine and dandy, but what about gaming? Well, first things first... what is native for your display? An LCD, plasma, or DLP display is natively progressive; the pixels that constitute the image on the screen are either on or off. Thus, they display at 720p (or whatever their native resolution is) regardless of the input given them, and must de-interlace and scale the incoming signal to get it there... so obviously, if your game device of choice offers 720p output, that's the way to go.
Likewise, rear-projection CRT televisions are happiest with a 1080i signal and will have to interlace and upscale a 720p signal for display, so feeding them a native 1080i signal is the optimal solution.
If you are one of those lucky people looking to buy an HD television and aren't worried about 1080p, then what would you rather have for gaming? It's a personal choice, really. The only system that really takes full advantage of HD is the Xbox 360; the PS2 and Xbox are just feeding you a 480p signal that you'll be upscaling regardless of what you get. When the PS3 rolls out, it will support HD resolutions... so which way to go?
Well, as we said before, it is fact that 1080i offers a higher resolution and thus a sharper static image. It is also fact that 720p offers a "smoother" moving image because of its progressive nature. From these facts one can infer that if you are into games with a lot of action on screen at once, such as sports games, shoot-em-ups, or fighting games, it makes more sense to go 720p. If you are into more static games, such as action/adventure, RPGs, or oldschool/arcadish stuff (Frogger in HD ftw), perhaps 1080i is the better direction to head.
1080p - This is the newest "high-def" standard, featuring 1080 scan lines, now progressive for that silky-smooth picture we all love so much. 1080p-capable consumer displays are just starting to hit the market at "affordable" prices; theaters have been using 1080p projectors for hi-def films for a while now. There are currently no broadcast 1080p signals. Reasons? Well, the highest standardized ATSC format for 1080p is 1080p30, which is 30 frames per second. Even then, the amount of bandwidth needed to broadcast the signal is significant, and standard MPEG-2 compression won't get the job done; newer codecs like H.264/MPEG-4 AVC must be used. The vast majority of consumer sets in use, now and in the near future, don't understand these new codecs, so it would be necessary to broadcast multiple streams, and there simply isn't the bandwidth available for that in the broad consumer market. Currently, the PS3 is the only videogame console that will feature 1080p output, although there is a good bit of rumor that the Xbox 360 is capable of the trick.
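To give a sense of the scale of that bandwidth problem, here's some more of my own rough math (raw, uncompressed RGB at 24 bits per pixel, ignoring blanking and chroma subsampling; the ~19.39 Mbps figure is the standard ATSC channel rate):

```python
# Raw (uncompressed) video data rates, just to show why heavy
# compression is mandatory for broadcast. Assumes 24 bits/pixel RGB
# and ignores blanking intervals and chroma subsampling.
def raw_mbps(width, height, fps, bits_per_pixel=24):
    """Raw data rate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

ATSC_CHANNEL_MBPS = 19.39  # one ATSC broadcast channel

rate_1080p30 = raw_mbps(1920, 1080, 30)
print(rate_1080p30)                       # ~1493 Mbps raw
print(rate_1080p30 / ATSC_CHANNEL_MBPS)   # ~77:1 compression needed
```

So even 1080p30 needs something like 77:1 compression just to squeeze into a single broadcast channel, which is why MPEG-2 falls short and the more efficient H.264-class codecs come into the picture.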