
Legend · 136 Posts · Discussion Starter · #1
I've noticed that when people talk about resolution on TV broadcasts, they only seem to mention the vertical resolution. What is the horizontal resolution of DTV programming in SD and HD?

Keith
 

DBSTalk Club Member · 460 Posts
keithw1975 said:
I've noticed that when people talk about resolution on TV broadcasts, they only seem to mention the vertical resolution. What is the horizontal resolution of DTV programming in SD and HD?
I'll take a stab at this, though keep in mind that I may be slightly off. If so, I hope someone will correct me.

For HD, you always hear about 720p or 1080i, and that refers to the number of "lines" drawn from top to bottom. Technically, each line is 1 pixel tall. For 720p, the number of pixels in the horizontal direction is 1280; for 1080i, it is 1920 (assuming you are not watching something that has been "down-rezzed").

For SD, the horizontal pixel count is 720. That doesn't guarantee anything, though, since an SD picture, especially one from a DBS provider, has often been subjected to heavy filtering, so its frequency response (and thus its effective resolution) has been rolled off.
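
To put numbers on all of that, here's a quick summary in Python (illustrative only; these are the full-resolution dimensions discussed in this thread):

```python
# Full-resolution dimensions (width x height) for the formats discussed here
FORMATS = {
    "SD (480i)":  (720, 480),
    "HD (720p)":  (1280, 720),
    "HD (1080i)": (1920, 1080),
}

for name, (w, h) in FORMATS.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")
```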

I think that about covers the basics.
 

Banned · 11,498 Posts
I always thought the 'typical' SD resolution for NTSC was 640 x 480i. How does one find out the resolution of a channel on Dish or DirecTV, anyway? On my cable box there is a set of hidden diagnostic screens. Regular SD channels are 328 x 480 and all the premium movie channels are 528 x 480. All HD content is actually 1920 x 1080i or 1280 x 720p.
 

Beware the Attack Basset · 24,965 Posts
Steve Mehs said:
I always thought the 'typical' SD resolution for NTSC was 640 x 480i. How does one find out the resolution of a channel on Dish or DirecTV, anyway? On my cable box there is a set of hidden diagnostic screens. Regular SD channels are 328 x 480 and all the premium movie channels are 528 x 480. All HD content is actually 1920 x 1080i or 1280 x 720p.
Here's a link that will be of interest: http://www.geocities.com/cplarosa/video/vidres.htm

Most video boards (as opposed to display adapters) for computers are 700+ x 480+. In fact, the wily Japanese have always used 640x400 (as did the Commodore Amiga) because it translates to TV better.

There are also some links in there that, if you read them carefully, might help you figure out what the fascination is with the generally lower-resolution plasma televisions. I liken it to the difference between inkjet and dye-sublimation printers.
 

Hall Of Fame · 2,353 Posts
keithw1975 said:
I've noticed that when people talk about resolution on TV broadcasts, they only seem to mention the vertical resolution. What is the horizontal resolution of DTV programming in SD and HD?

Keith
The main reason 'horizontal' resolution isn't usually cited is that it is implied by the aspect ratio of the image being displayed.

SD (digital) programming uses the standard TV's 4:3 aspect ratio; in simple terms, there are 4 horizontal pixels for every 3 vertical.
(480 / 3 = 160; 160 * 4 = 640 horizontal pixels)
HD programming uses the widescreen 16:9 format, or 16 horizontal pixels for every 9 vertical.
(720 / 9 = 80; 80 * 16 = 1280 horizontal pixels)
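
That arithmetic generalizes. A minimal sketch in Python (assuming square pixels, which is the catch discussed a couple of posts down):

```python
from fractions import Fraction

def implied_width(height: int, aspect: Fraction) -> int:
    # Horizontal pixel count implied by a vertical resolution and a
    # display aspect ratio, assuming square pixels.
    return int(height * aspect)

print(implied_width(480, Fraction(4, 3)))    # 640
print(implied_width(720, Fraction(16, 9)))   # 1280
print(implied_width(1080, Fraction(16, 9)))  # 1920
```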

Of course, these are only true for 'untreated' signals. Once the providers work their bandwidth black magic, or your hardware converts them, they may change. You will notice some degree of image degradation when they do.

Why TV is referenced this way while computer displays are referenced as horizontal x vertical is anyone's guess. Maybe since everyone was used to thinking of analogue TV images as continuous scan lines, nobody wanted to deviate from the norm...
 

Cool Member · 23 Posts
BattleScott said:
The main reason 'horizontal' resolution isn't usually referenced is because it is implied by the aspect ratio of the image being displayed.
Actually, this isn't true. MPEG video can and does use "non-square" pixels. SD "source" resolution is the same as DVD resolution: 720x480. The headers in the MPEG stream indicate the display aspect ratio (4:3 or 16:9), and the video processor correctly "stretches" the image for the display.
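
A sketch of that idea in Python (not actual MPEG header parsing; the function name is mine): the displayed width is just the stored height times the display aspect ratio.

```python
from fractions import Fraction

def displayed_size(stored_w: int, stored_h: int, dar: Fraction) -> tuple[int, int]:
    # MPEG carries the display aspect ratio (DAR) in its headers; the decoder
    # stretches the stored, possibly non-square, pixels to match it.
    return int(stored_h * dar), stored_h

print(displayed_size(720, 480, Fraction(4, 3)))   # (640, 480) on a square-pixel display
print(displayed_size(720, 480, Fraction(16, 9)))  # (853, 480) anamorphic widescreen
```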

Unfortunately, D* does not pass along the "source" resolution. For SD video, D* takes 720x480 source material and converts it to 480x480 (most common), or even 352x480 (relatively rare). D* throws out a full one third of the horizontal resolution (and therefore a full one third of the source pixels) before it beams the signal to your dish.

The story is similar in the HD world. As mentioned above, the two HD formats are 1080i (1920x1080) and 720p (1280x720). Because 720p doesn't use as much bandwidth as 1080i, D* does not alter the resolution* of 720p source signals. Every 720p channel (ESPN, FOX, ABC) is delivered to your dish at 1280x720. However, 1080i (CBS, NBC, Discovery, HDNet, TNT, UHD, etc.) requires more bandwidth, and in order for D* to squeeze as many channels on the sats as they can, they reduce 1080i HD much like they do SD video. The 1080i signals that you receive from D* are reduced from 1920x1080 to 1280x1080. Again, D* throws out a full one third of the horizontal resolution (and therefore a full one third of the original pixels).
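
Those one-third figures are easy to check (illustrative Python, using the numbers from this post):

```python
def fraction_discarded(source_w: int, delivered_w: int) -> float:
    # Share of horizontal pixels thrown away in the downrez
    return 1 - delivered_w / source_w

print(fraction_discarded(720, 480))    # ~0.33 for SD:    720x480   -> 480x480
print(fraction_discarded(1920, 1280))  # ~0.33 for 1080i: 1920x1080 -> 1280x1080
```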

Why they do this is simple - they decided long ago that QUANTITY was more important than QUALITY to the majority of their subscribers. And frankly, if you're watching D* on a screen smaller than 40", you might not be able to tell the difference.

* NOTE: I completely ignored bitrate in this discussion - it's equally (if not more) important in the SD/HD picture quality equation.
 

Icon · 914 Posts
I soooo need to get my HD OTA Antenna up...
 

Éminence grise · 8,473 Posts
Another factor is that the television has to convert from the resolution output by the receiver to its native pixel resolution. More sets capable of actually displaying 1920 x 1080 are becoming available, but there are still a lot of sets with fewer pixels, such as 1024 x 768 or even less. From normal viewing distances it is really hard to tell the difference in most cases.
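
As a crude illustration (real scalers are far more sophisticated; this function is mine), a panel caps the detail it can show on each axis at its native pixel count:

```python
def resolvable_detail(source: tuple[int, int], panel: tuple[int, int]) -> tuple[int, int]:
    # A panel cannot resolve finer detail than it has physical pixels per axis
    return min(source[0], panel[0]), min(source[1], panel[1])

print(resolvable_detail((1920, 1080), (1024, 768)))  # (1024, 768)
print(resolvable_detail((1280, 720), (1920, 1080)))  # (1280, 720): upscaling adds no detail
```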
 

Godfather · 259 Posts
texmex said:
Actually, this isn't true. MPEG video can and does use "non-square" pixels. SD "source" resolution is the same as DVD resolution: 720x480. The headers in the MPEG stream indicate the display aspect ratio (4:3 or 16:9), and the video processor correctly "stretches" the image for the display.

Unfortunately, D* does not pass along the "source" resolution. For SD video, D* takes 720x480 source material and converts it to 480x480 (most common), or even 352x480 (relatively rare). D* throws out a full one third of the horizontal resolution (and therefore a full one third of the source pixels) before it beams the signal to your dish.

The story is similar in the HD world. As mentioned above, the two HD formats are 1080i (1920x1080) and 720p (1280x720). Because 720p doesn't use as much bandwidth as 1080i, D* does not alter the resolution* of 720p source signals. Every 720p channel (ESPN, FOX, ABC) is delivered to your dish at 1280x720. However, 1080i (CBS, NBC, Discovery, HDNet, TNT, UHD, etc.) requires more bandwidth, and in order for D* to squeeze as many channels on the sats as they can, they reduce 1080i HD much like they do SD video. The 1080i signals that you receive from D* are reduced from 1920x1080 to 1280x1080. Again, D* throws out a full one third of the horizontal resolution (and therefore a full one third of the original pixels).

Why they do this is simple - they decided long ago that QUANTITY was more important than QUALITY to the majority of their subscribers. And frankly, if you're watching D* on a screen smaller than 40", you might not be able to tell the difference.

* NOTE: I completely ignored bitrate in this discussion - it's equally (if not more) important in the SD/HD picture quality equation.
You are exactly right.

D* takes a 1920x1080 source = about 2.07 million pixels
and down-rezzes it to 1280x1080 = about 1.38 million pixels.

They just threw 1/3 of their pixels away.

Now, on each screen refresh:
720p: 1280x720 = 921,600 pixels per frame
1080i: 1920x1080 / 2 (interlaced) = 1,036,800 pixels per field

So both formats are updating roughly a million pixels at 60 Hz.

DirecTV is interlacing their down-rezzed HD: 1280x1080 / 2 = only 691,200 pixels per field.

So the pixel count per refresh is down about 1/3 compared to either of the standard HD formats.

Then, to make things worse, my cable company uses a bit rate of a full 19.4 Mbps (the ATSC MPEG-2 maximum), while D* typically runs HD channels at between 12 and 14 Mbps.

So what this means is that, compared to my cable HD, DirecTV is sending roughly 62-72% of the bits in any given period of time.
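
Spelled out in Python (numbers from this post):

```python
# Pixels updated per refresh: interlaced formats send half the lines per field
p720    = 1280 * 720         # 921,600 pixels per progressive frame
p1080i  = 1920 * 1080 // 2   # 1,036,800 pixels per interlaced field
hd_lite = 1280 * 1080 // 2   # 691,200 pixels per field after the downrez

print(f"HD-Lite vs 1080i: {hd_lite / p1080i:.0%}")  # 67%, i.e. down about 1/3

# Bit rates in Mbps
cable, dtv_low, dtv_high = 19.4, 12, 14
print(f"DirecTV vs cable: {dtv_low / cable:.0%} to {dtv_high / cable:.0%}")  # 62% to 72%
```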

So what does this mean?
It saves D* a lot of bandwidth, but scenes with lots of motion or lots of changing pixels look horrible compared to cable: blurry, with lots of macroblocking.

And then, to make it slightly worse, the extra compression, the possibility of recompression along the way, and the down-rezzing then up-rezzing back to your screen size will undoubtedly have some negative effects on the picture and on how vivid the received colors are.

That's a quick summary of HD-Lite.

Here are examples of this "bit starving" causing poor motion and macroblocking, respectively:

http://img175.imageshack.us/my.php?image=wmc907067ql9.jpg
http://img175.imageshack.us/my.php?image=wmc907068kc4.jpg

Of course, if you're not watching an action movie or sporting event, and the scene has little motion (news, sitcoms, etc.), then even down-rezzed it's still over a million pixels per frame (interlaced) and looks quite sharp.

http://img175.imageshack.us/my.php?image=wmc907063xh5.jpg

It's motion that ruins the viewing experience.
 

Beware the Attack Basset · 24,965 Posts
bobnielsen said:
Another factor is that the television has to convert from the resolution output by the receiver to its native pixel resolution. More sets capable of actually displaying 1920 x 1080 are becoming available, but there are still a lot of sets with fewer pixels, such as 1024 x 768 or even less. From normal viewing distances it is really hard to tell the difference in most cases.
I think that if you look closely, there are only a handful of truly "native" 720p TVs (Sony's RP 3LCDs are one example). The top-line Pioneer plasmas are 1365x768 (their hearts are in the right place with the correct aspect ratio). The Panasonic 720p plasmas are in the same boat (their top-line units are 1080p). The LCD panels seem to be running 1366x768 ("that silly little millimeter" for those of us over 40).
 