Dish messing with Voom, All are now HD-Lite 1280x1080i


  • This topic is locked
285 replies to this topic

#41 OFFLINE   Stewart Vernon

Stewart Vernon

    Excellent Adventurer

  • Moderators
  • 19,785 posts
  • Location: Kittrell, NC
Joined: Jan 07, 2005

Posted 04 December 2005 - 05:34 AM

Not arguing that Dish isn't downrezzing the Voom HD right now....

but... is it fair to call it HD-Lite?

As I understand it, HD can either be 1280x720p or 1920x1080i... Anything less than 1280x720p would be HD-Lite in my book... but the 1280x1080i is somewhere in between the lowest and the highest HD resolutions...

So, while I prefer 1920x1080i... It is hard for me to say 1280x1080 isn't HD, when it is higher than the 720p resolution.


#42 OFFLINE   LtMunst

LtMunst

    Hall Of Fame

  • Registered
  • 1,261 posts
Joined: Aug 24, 2005

Posted 04 December 2005 - 08:06 AM

Maybe if I had a $20,000 system that could display near infinite line resolution, I would be upset also. With my 1280x720p set, however, it is very unlikely there will be any difference (unless bitrates start to drop).

#43 OFFLINE   bavaria72

bavaria72

    I am one too!

  • Gold Members
  • 1,201 posts
Joined: Jun 10, 2004

Posted 04 December 2005 - 08:38 AM

Bavaria you are incorrect, HD-Lite is nowhere close to the picture quality of True 1920x1080i HD, it isn't even in the same ballpark...
-Gary


Gary, technically speaking you are correct when you say 1920x1080 is better than 1280x720, but quite frankly I can't tell that much of a difference. Perhaps my eyes are not as good as they used to be, but IMHO I really don't think E* is "ripping off" those of us who are paying a whole $5 a month for the old Voom channels. Heck, I would hate to see their current business case for keeping the Voom channels. I certainly don't have a $20K system (the $2295 return special is what I have). If you're pissed over the $5 for Voom, then I would assume you are livid over the $10 for the 5 other HD channels. I'm just thankful E* decided to keep the old Voom programming, 1920x1080 or 1280x720. - Art
I'm back! Needed more than 1 HD feed.

#44 OFFLINE   Ghostwriter

Ghostwriter

    Legend

  • Registered
  • 204 posts
Joined: Oct 10, 2005

Posted 04 December 2005 - 09:27 AM

My biggest problem here is that you almost need to let them know that you are aware of what is going on with the quality of your picture. Although I cannot say it is unbearable to sit through, if you take it all in stride you can almost bank on them lowering the resolution again. Now, if they feel that enough people are aware of what they are doing, then maybe they will at least stop here.

Bavaria, all in all I am happy with Dish, and considering they are the only ones that carry the international channels I want, I am not leaving any time soon. As for the idea that if you are unhappy you should just leave: I always thought you should voice your opinion, not just take your ball and go home. JMHO

#45 OFFLINE   LtMunst

LtMunst

    Hall Of Fame

  • Registered
  • 1,261 posts
Joined: Aug 24, 2005

Posted 04 December 2005 - 09:42 AM

With the current crop of HDTVs, most people would not see any difference with 1280x1080i. Long term, though, we will see many more 1920X1080i or even 1920x1080p sets hitting the market. Hopefully Dish is planning to keep up.

#46 OFFLINE   Ghostwriter

Ghostwriter

    Legend

  • Registered
  • 204 posts
Joined: Oct 10, 2005

Posted 04 December 2005 - 09:58 AM

I have a 1080i set, so I already see a difference. I am just worried the downrezzing will continue. For now, although some Voom channels are not as good (not all, since a few already came online at 1280x1080i), I can almost hold out and hope that things will become better with the Turbo or MPEG4 channels when they start being used.

#47 OFFLINE   James Long

James Long

    Ready for Uplink!

  • Super Moderators
  • 40,098 posts
Joined: Apr 17, 2003

Posted 04 December 2005 - 10:29 AM

Dish can take this HD-Lite Voom and stick it you know where. I will not pay for this garbage, nor bear and strain myself to sit through it and try to enjoy its fuzzy qualities

Fine. Go. Bye. Nobody is stopping you. Don't let the door hit you where the doctor slapped you. If you don't like the way you perceive E* is operating you can always try someone else.

There are only two reasons why you wouldn't just make the switch:
1) E* is still the best HD option available
2) You would rather scream your head off than lose (not loose) Voom

By your definition, millions of satellite customers have been watching SD-Lite channels for years. It is part of the business. And at this point you don't know where E* will leave the channels. Calm down and react intelligently.

JL

#48 OFFLINE   Bill R

Bill R

    Hall Of Fame

  • Registered
  • 2,498 posts
Joined: Dec 20, 2002

Posted 04 December 2005 - 10:30 AM

Maybe we should just refer to it as "lower quality HD".

The vendors that are doing it (both DirecTV and DISH and some cable companies) need to know that this is NOT what we want and is not something we are willing to pay for.
Bill R

#49 OFFLINE   Gary Murrell

Gary Murrell

    AllStar

  • Topic Starter
  • Registered
  • 88 posts
Joined: Jan 10, 2005

Posted 04 December 2005 - 03:46 PM

I will gladly be saying bye to Dish and their HDTV offerings if this sticks. I am working on installing a C-Band 4DTV setup right now; I would rather have 4/5 true HDTV channels than 30 HD-Lites.

I have spent thousands of dollars on Dish and their HDTV offerings, and have been with them since the day Discovery HD Theater was lit up. This is downright insulting.

1280x1080i is actually much worse than 1280x720p, because if you factor in the interlaced vs progressive stuff, you are comparing:

1280x720

vs

1280x540

Dish would have been better off having Voom or themselves convert to 1280x720p

I do not think that Dish would suddenly switch all Voom to this garbage if that is not where they were going to leave them. I won't have to wait long to make my decision to dump Dish's HD; CES is where the Voom 21 will be announced, and things will be settled by then.

-Gary
Dish Setup:

129/110/119/61.5/148 -Sats
3 X DP34/3 X DP21 -Switching
942/6000/522/322/510 -Receivers
Everything Pak/NHL CI/Voom/HD Pak/CBS-HD/Locals -Programming

#50 OFFLINE   sgiwiz

sgiwiz

    Cool Member

  • Registered
  • 17 posts
Joined: Aug 07, 2005

Posted 04 December 2005 - 04:30 PM

Not arguing that Dish isn't downrezzing the Voom HD right now....

but... is it fair to call it HD-Lite?

...snip...

So, while I prefer 1920x1080i... It is hard for me to say 1280x1080 isn't HD, when it is higher than the 720p resolution.


I'll say it then. 1280 x 1080 ISN'T HD! It's not even the same aspect ratio. It is not one of the many white-paper-defined HD standards for broadcast. IT IS NOT HD - and if it's being sold as HD then there is a strong case for a charge of false advertising. 720p, on the other hand, IS one of the HD standards, and if your TV only displays 1366x768 natively you are probably better off trying to watch 720 than 1080 anyway.

Let me take you on a journey in the life of a pixel in this resize scenario that we've all been moaning about.

In a true HD 1080 image you have 1920 pixels of clean, crisp horizontal resolution. For argument's sake we'll say that this image is TRULY a 1920x1080 image, such as a frame from a Pixar film or a nice bit of branding on HDNet, NOT from a film telecine transfer or from a Sony F900 HD camera (I'll get to that in a minute).

These crisp 1,920 pixels are the best looking thing you're ever likely to see, and the only way you will be able to see them is if you have the chance to run uncompressed dual-link HD-SDI 4:4:4 into a monitor like this one from DataCheck: http://www.datacheck...cts/21245a.html playing the media directly from hard drives as uncompressed data. Try to hit an HD trade show if you want to see this; it's pretty.

So, anyway, you take those 1,920 pixels and you squeeze them to fit into 1,280 pixels. That's a 1/3 reduction per line, so you're dropping one pixel for every two that you keep on any given line. But you can't just drop out every 3rd pixel and keep the nice clean ones you have left, because you would see the damage clearly when fine detail moves horizontally, or as jagged edges on curves and diagonals. So you have to throw out ALL the clean pixels and blend the color values that they represented. Your new pixel A gets some of the color value of pixel B, and your new pixel C gets some of the value of pixel B, to make two new pixels that are a "blur" of the three that used to define that part of the image. Of course it doesn't stop there, because you have to spread the old pixels' data more evenly than that, but I'm not going to do that kind of math in my head and you get the point anyway.

The new 1,280 pixels are a blurred version of the old ones. They look okay from a distance, but they are a "mushed" version of what used to be there. It's called interpolation, and it's the reason a lot of things look blurry on TV, on film, in magazines... all over the place; it's done all the time. Everyone knows that when you take your 2.3 megapixel photos and reduce them to e-mail them to your mom, they aren't as good as the originals. But you never think about it, because you aren't going to blow up that lower-res version of the photo to the size of the original to look at it, are you? You'll look at it at the new smaller size, and it looks fine, maybe even better.

But that's not what Dish is doing. They ARE blowing it back up to the original size.

And this is where it gets very ugly: you have to stretch those 1,280 pixels back out to 1,920 pixels to return to an HD standard. So (you guessed it) you take your new blurred hybrid A and B pixels, move them apart, and make a new pixel to go in the middle that's a blend of the two colors, and you have 3 pixels again where there only used to be 2. 1,280 becomes 1,920 again. But you have now blurred the image twice. Twice the damage, the second compounding on the first.
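
The squeeze-and-stretch round trip described above can be sketched in a few lines of Python. This is a toy 1-D illustration with made-up numbers and a simple linear resampler, not Dish's actual algorithm:

```python
def resample(line, new_len):
    """Linearly interpolate a 1-D list of pixel values to new_len samples."""
    old_len = len(line)
    out = []
    for i in range(new_len):
        # Map each output position back into the source line.
        pos = i * (old_len - 1) / (new_len - 1)
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        out.append(line[lo] * (1 - frac) + line[hi] * frac)
    return out

# A sharp black/white edge pattern on a tiny 12-pixel "scanline".
sharp = [0, 255, 0, 255, 0, 255, 0, 255, 0, 255, 0, 255]

# Squeeze to 2/3 width (1920 -> 1280 in the real case), then stretch back.
squeezed = resample(sharp, 8)
restored = resample(squeezed, 12)

# The restored line no longer swings fully between 0 and 255: the fine
# detail has been blurred away by two rounds of interpolation.
print(max(abs(a - b) for a, b in zip(sharp, restored)))
```

Run it and the worst-case pixel error is large; the alternating fine detail simply cannot survive the down-and-up trip.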

I would say, though, that almost none of the source material out there is truly native 1920x1080. The vast majority of film transfer machines used for HD (the "Spirit" from Thomson/Grass Valley) are actually not scanning film at resolutions any higher than about 1440 pixels of horizontal resolution, and then only on the luminance channel. There are very few (if any) cameras in the field that have imaging sensors that come close to 1920x1080 native, and until very recently they only recorded to HD-CAM (see below). So almost everything has already been interpolated once BEFORE it gets to tape. This is SLOWLY changing as more movies use digital post-production techniques that allow HD masters to be created directly from true high-res scans rather than telecine, but they are few and far between right now.

Now we get to compression.

To deliver HD masters to a broadcaster you have few choices: HD-CAM, D5, or HD-CAM SR. They are all compressed, some much more than others, HD-CAM being the worst. HD-CAM is also the cheapest... guess which one is most popular.

So by the time the master sources get to the broadcaster, they have for the most part already been interpolated from a lower resolution, and they have all already been compressed.

So then they squeeze it to 2/3 of the original horizontal resolution, and compress it again to send it to us.

In actual fact, the reduced resolution helps a lot in cutting the amount of data needed to represent the image, not just because there are fewer pixels to store, but because after the interpolation those pixels are less crisp, with less contrast between them... they are blurry. So the compression algorithm can more often look at a small area of the picture and say "Hmmm... this bit of contrast falls below my threshold to try and maintain the detail; I'm just going to blur it into a solid area of one color" than it would if the image were sharp. Sharp, detailed images are much more difficult to compress and take up much more bandwidth. So the saving in bandwidth from doing this reduction and mushing is twofold. And so is the damage it does to the image.
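
The point that blurry pictures compress better can be demonstrated with a toy experiment. This is my illustration, not the actual MPEG-2 encoder (real codecs use DCT quantization), but smoothness helps them for the same underlying reason:

```python
import random
import zlib

random.seed(7)

# A noise-like "sharp" scanline: 1,920 pixels of high pixel-to-pixel contrast.
sharp = [random.randrange(256) for _ in range(1920)]

# The same line after a simple 5-tap moving-average blur.
blurred = []
for i in range(len(sharp)):
    window = sharp[max(0, i - 2):i + 3]
    blurred.append(sum(window) // len(window))

def deflated_size(line):
    """Deflate the pixel-to-pixel differences, the way image formats
    exploit smoothness (PNG-style filtering, not MPEG-2's DCT)."""
    deltas = bytes((line[i] - line[i - 1]) % 256 for i in range(1, len(line)))
    return len(zlib.compress(deltas))

# The blurred line takes noticeably fewer bytes to represent.
print(deflated_size(sharp), deflated_size(blurred))
```

The sharp line's differences are essentially random and barely compress; the blurred line's differences are small and skewed, so the compressor squeezes them much harder.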

Now, don't get me wrong... the guys who worked all this stuff out are geniuses. This hardware/software all works very very well and is probably doing the best it can possibly do to keep the image as clean and crisp as possible. But that doesn't mean it's going to come out the other side looking as good as it did on the way in. It's just not possible.

All you users who are saying "it looks fine to me" are probably watching the image on a lower resolution screen, through hardware that was doing a bad job of displaying the detail of an HD frame to begin with. I wouldn't expect you to notice the difference. But I also wouldn't expect you to get upset at the complaints of people who CAN tell the difference and are unhappy about it.

I mentioned in an earlier post that I haven't bothered to buy an HD set for home for EXACTLY this reason. If I don't display the image on a system that can show me every pixel, then I will never be able to tell that the source image is garbage. And in the meantime, everything that's broadcast in HD looks like uncompressed, clean, sharp SD. And I'm happier with that than I would be with garbage HD.

It just burns me up that just as the technology that can display HD at its native resolution is gaining popularity, the sources for that content are disappearing.

#51 OFFLINE   Stewart Vernon

Stewart Vernon

    Excellent Adventurer

  • Moderators
  • 19,785 posts
  • Location: Kittrell, NC
Joined: Jan 07, 2005

Posted 04 December 2005 - 04:54 PM

1280x1080i is actually much worse than 1280x720p, because if you factor in the interlaced vs progressive stuff, you are comparing:

1280x720

vs

1280x540


No it isn't... 720p has 720 scan lines (actually a few more, but I won't get into that)... and 1080i has 1080 scan lines.

720p is progressive scan, which means all 720 lines are displayed in one contiguous scan of the screen. 1080i is interlaced, which means every-other-line (540) is displayed on the first scan, then the other 540 lines are displayed on the next scan.

There used to be more of a problem with "flicker" in the old days of interlaced... but that is mostly gone now unless your eyes are particularly sensitive.

In no way is 1080i anything like 540p... because 540p would only be 540 scan lines, barely more than what SD currently uses.

I hate to see people who are confused or being fed misinformation spread it further, so that others think 1080i somehow has less vertical information (fewer scan lines) than 720p.
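
A minimal sketch of that every-other-line delivery (my illustration, not from the thread): split a 1080-line frame into its two 540-line fields, weave them back together, and nothing is lost.

```python
# An interlaced frame is all 1080 lines, delivered as two 540-line
# fields (even-numbered lines on one sweep, odd on the next); it is
# not a 540-line image.
frame = ["line %d" % n for n in range(1080)]

field_a = frame[0::2]   # first sweep: lines 0, 2, 4, ...
field_b = frame[1::2]   # second sweep: lines 1, 3, 5, ...
assert len(field_a) == len(field_b) == 540

# Weaving the two fields back together recovers every one of the 1080 lines.
woven = [None] * 1080
woven[0::2] = field_a
woven[1::2] = field_b
assert woven == frame
```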

#52 OFFLINE   Stewart Vernon

Stewart Vernon

    Excellent Adventurer

  • Moderators
  • 19,785 posts
  • Location: Kittrell, NC
Joined: Jan 07, 2005

Posted 04 December 2005 - 04:57 PM

All you users who are saying "it looks fine to me" are probably watching the image on a lower resolution screen, through hardware that was doing a bad job of displaying the detail of an HD frame to begin with. I wouldn't expect you to notice the difference. But I also wouldn't expect you to get upset at the complaints of people who CAN tell the difference and are unhappy about it.


That might apply to some users... but I have a 1080i true native display... and I can tell a difference between 1920x1080 and 1280x1080. In the past I've definitely seen problems on the Equator channel, for instance, with downgraded pictures.

However, lately when Monsters is showing a 1950s Godzilla movie that wasn't cleanly transferred, and is in black & white... I can't tell the difference between 1920x1080 and 1280x1080 for that movie.

Some channels and programming show the flaws better/more than others... which is why when I've flipped around lately since viewing the threads about the downrezzing... I haven't noticed a big difference. If they were showing live or better transferred programming, then I would probably notice.

#53 OFFLINE   hammerdown

hammerdown

    Cool Member

  • Registered
  • 28 posts
Joined: Jan 20, 2004

Posted 04 December 2005 - 05:16 PM

I'll say it then. 1280 x 1080 ISN'T HD! It's not even the same aspect ratio. It is not one of the many white paper defined HD standards for broadcast. IT IS NOT HD...

So, anyway, you take those 1,920 pixels and you squeeze them to fit into 1,280 pixels. That's a 1/3 reduction per line...

...They ARE blowing it back up to the original size...you have to stretch those 1,280 pixels back out to 1,920 pixels to return it to an HD standard.


I'm trying to wrap my brain around this. If they blow it back up to 1920 then how come the captured frames show up as 1280x1080? Wouldn't it fool the capture device into seeing 1920? If they are blowing it back out to 1920, how are we able to know it's 1280? And if they don't blow it back to 1920, why doesn't the aspect ratio come out all funky?

Hammer

#54 OFFLINE   sgiwiz

sgiwiz

    Cool Member

  • Registered
  • 17 posts
Joined: Aug 07, 2005

Posted 04 December 2005 - 05:42 PM

I'm trying to wrap my brain around this. If they blow it back up to 1920 then how come the captured frames show up as 1280x1080? Wouldn't it fool the capture device into seeing 1920? If they are blowing it back out to 1920, how are we able to know it's 1280? And if they don't blow it back to 1920, why doesn't the aspect ratio come out all funky?

Hammer


I'm actually not sure HOW they transmit the lower resolution stream; probably just an MPEG data stream. I can guess that the transmission is encoded and decoded as 1280x1080, and that is what people are getting a reading from. I don't think my 924 unit actually tells me that info, but maybe I just haven't found that menu yet.

I would be very interested in knowing that info like I do on my EyeTV500, if anyone knows where it is stored on the 924?

But the fact remains that the image has to be stretched back out to 1920x1080 to get to your screen if you are watching in 1080 (not 720 or SD). Regardless of whether "they" do it at the broadcast station, the uplink, or inside your receiver it's still being done.

sgiwiz

#55 OFFLINE   sgiwiz

sgiwiz

    Cool Member

  • Registered
  • 17 posts
Joined: Aug 07, 2005

Posted 04 December 2005 - 05:57 PM

That might apply to some users... but I have a 1080i true native display... and I can tell a difference between 1920x1080 and 1280x1080. In the past I've definitely seen problems on the Equator channel, for instance, with downgraded pictures.

However, lately when Monsters is showing a 1950s Godzilla movie that wasn't cleanly transferred, and is in black & white... I can't tell the difference between 1920x1080 and 1280x1080 for that movie.

Some channels and programming show the flaws better/more than others... which is why when I've flipped around lately since viewing the threads about the downrezzing... I haven't noticed a big difference. If they were showing live or better transferred programming, then I would probably notice.


Exactly right HDMe. There is lots of content that the compromise would be less noticeable on. That doesn't make it less of a compromise because they don't switch it off and on when a good quality show comes on, everything suffers the same fate.

You can't say "Oh, well, Steven Soderbergh made the movie 'Traffic' really grainy so we can just downres it and compress it more because most people won't see 'much' difference," and then "Hero" comes on and you set the resolution back to the "high" setting.

If it doesn't matter if the content is high resolution or not, why bother with high definition at all? I'm sure a really high quality DVD of 1950s Godzilla would look just as good as bad HD does but that's not what we are paying for is it?

The people arguing on the side of lower resolution are saying "I don't see MUCH difference," not "I don't see ANY difference." It seems like people are having to choose between quality and quantity. And I know which side of that argument I fall on.

My honest and humble question is this:

Why is anyone sticking up for the idea of reduced quality, even if they don't see "much" difference?

Has someone issued an ultimatum? "Either you accept this or we're taking the channels off the air!" I haven't heard that from Dish, have you?

sgiwiz

:confused:

#56 OFFLINE   Stewart Vernon

Stewart Vernon

    Excellent Adventurer

  • Moderators
  • 19,785 posts
  • Location: Kittrell, NC
Joined: Jan 07, 2005

Posted 04 December 2005 - 06:08 PM

I'm actually not sure HOW they transmit the lower resolution stream; probably just an MPEG data stream. I can guess that the transmission is encoded and decoded as 1280x1080, and that is what people are getting a reading from. I don't think my 924 unit actually tells me that info, but maybe I just haven't found that menu yet.

I would be very interested in knowing that info like I do on my EyeTV500, if anyone knows where it is stored on the 924?

But the fact remains that the image has to be stretched back out to 1920x1080 to get to your screen if you are watching in 1080 (not 720 or SD). Regardless of whether "they" do it at the broadcast station, the uplink, or inside your receiver it's still being done.

sgiwiz


I don't have any technical knowledge of what they are doing.... but I have a theory.

They take a 1920x1080 image and use some kind of semi-advanced algorithm by which they toss pixels here and there until they end up with a 1280x1080 image. Then they transmit this, encoded so that the receivers see it as a 16:9 image that is supposed to fill the screen. The receiver automatically stretches it horizontally to fill the width of the screen.

I *think* this technically is happening all the time... it's just that when a 1920x1080 or 1280x720 image is sent, which is 16:9 already, there is no stretch because it is already the right width... but when the image is less than that width it is horizontally stretched.

I don't know what the stretching algorithm is... or if my theory is correct, but I'm sticking with it until someone comes up with what is really happening.
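
If that theory is right, the arithmetic of the stretch is simple. A sketch (mine, assuming a 16:9 display-aspect flag on the stream):

```python
# If a stored pixel grid is flagged as a 16:9 picture, the receiver
# must widen each stored pixel by the ratio between the display
# aspect and the grid's own width:height ratio.
DISPLAY_ASPECT = 16 / 9

def pixel_stretch(width, height):
    """Horizontal stretch factor needed to show a stored grid at 16:9."""
    return DISPLAY_ASPECT / (width / height)

for w, h in [(1920, 1080), (1280, 720), (1280, 1080)]:
    print(f"{w}x{h}: stretch {pixel_stretch(w, h):.2f}x")
```

For 1920x1080 and 1280x720 the factor is 1.0 (no stretch); for 1280x1080 each pixel has to be stretched 1.5x wide to fill the 16:9 frame.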

#57 OFFLINE   Stewart Vernon

Stewart Vernon

    Excellent Adventurer

  • Moderators
  • 19,785 posts
  • Location: Kittrell, NC
Joined: Jan 07, 2005

Posted 04 December 2005 - 06:14 PM

Exactly right HDMe. There is lots of content that the compromise would be less noticeable on. That doesn't make it less of a compromise.


Agreed. A street sign with fonts (letter sizes) too small for most people to read is still an improperly designed street sign even if I have better than 20/20 vision and can still see it.

You can't say "Oh, well, Steven Soderbergh made the movie 'Traffic' really grainy so we can just downres it and compress it more because most people won't see 'much' difference."


Also agreed... it's just that I can't see the difference when they do it on that movie, so I wouldn't know to complain. If I hadn't been reading these discussions, I wouldn't know anything was happening.

If it doesn't matter if the content is high resolution or not, why bother with high definition at all? I'm sure a really high quality DVD of 1950s Godzilla would look just as good as bad HD does but that's not what we are paying for is it?


True. If it is an HD channel it should be HD. However, the reason why I can't get as up in arms about it is... since HD is either 1280x720 or 1920x1080, it's hard for me to say 1280x1080 isn't HD when it is technically better than 1280x720... so even though I know they are technically cheating, how can I complain?

They could reduce it to 1280x720, an actually lower-quality image, to save even more bandwidth... and then no one would complain that it isn't HD... but it would still look less detailed than the 1920x1080 image and, to me at least, more obviously so.

My honest and humble question is this:

Why is anyone sticking up for the idea of reduced quality, even if they don't see "much" difference?


I compare FOX to CBS regularly and CBS to ESPN as well... for various sporting events... and I always like CBS 1080 much better... but when I see ESPN or FOX without directly comparing it still looks nice.

Given the choice, I'd love it all to be 1080, but if it is 720 or better, I can't complain like I could if they were showing, say 1024x720 or something less than the 720p standard.

#58 OFFLINE   Gary Murrell

Gary Murrell

    AllStar

  • Topic Starter
  • Registered
  • 88 posts
Joined: Jan 10, 2005

Posted 04 December 2005 - 06:52 PM

like I said before, Dish or Voom would have been better off providing 720p; that would have given them the bandwidth room they need and kept picture quality top notch, as 1280x720p picture quality is superb

now, considering the whole interlaced vs progressive issue, on these 1280x1080i channels we have:

1280x540 now

vs

1920x540 before

interlaced upon motion is only 540 lines; still shots are where 1080i looks best

with 720p we would have 1280x720 instead of 1280x540

1920x1080i gets its pristine, crisp look from the 1920 horizontal pixels of info

this is a huge difference, and anyone who thinks this 1280x1080i HD is not much different from TRUE 1920x1080i HD is dead wrong

the whole point of TRUE 1080i HD is that it has the high 1920 horizontal rez to offset the interlaced 1080 lines of vertical, which is most of the time only 540 (when there is motion); you get the benefit of 1080 on still shots

HDMe, 540p and 1080i are 99% identical because of the motion factors of 1080i (with motion, 1080i only has 540 lines active)

many, many HD set top boxes from DirecTV and upconverting DVD players even output 540p instead of 1080i

1280x1080i is NOT technically better than 1280x720p because of the above-stated interlaced motion concerns,
and it is certainly not better than 1920x1080i, we all know that

1280x1080i is not an HD resolution at all, as sgiwiz stated; it isn't even 16:9 (1.78:1) and must be stretched somewhere in the chain to make that aspect ratio

sgiwiz, the Voom MPEG-2 streams recorded directly from Dish on my PC (no set top box involved) are 1280x1080i; I use TSReader to give me that info. They also originate from Dish in 16:9 aspect ratio.

-Gary
Dish Setup:

129/110/119/61.5/148 -Sats
3 X DP34/3 X DP21 -Switching
942/6000/522/322/510 -Receivers
Everything Pak/NHL CI/Voom/HD Pak/CBS-HD/Locals -Programming

#59 OFFLINE   Stewart Vernon

Stewart Vernon

    Excellent Adventurer

  • Moderators
  • 19,785 posts
  • Location: Kittrell, NC
Joined: Jan 07, 2005

Posted 04 December 2005 - 08:12 PM

now, considering the whole interlaced vs progressive issue, on these 1280x1080i channels we have:

1280x540 now

vs

1920x540 before

interlaced upon motion is only 540 lines; still shots are where 1080i looks best

with 720p we would have 1280x720 instead of 1280x540

1920x1080i gets its pristine, crisp look from the 1920 horizontal pixels of info

this is a huge difference, and anyone who thinks this 1280x1080i HD is not much different from TRUE 1920x1080i HD is dead wrong

the whole point of TRUE 1080i HD is that it has the high 1920 horizontal rez to offset the interlaced 1080 lines of vertical, which is most of the time only 540 (when there is motion); you get the benefit of 1080 on still shots

HDMe, 540p and 1080i are 99% identical because of the motion factors of 1080i (with motion, 1080i only has 540 lines active)

many, many HD set top boxes from DirecTV and upconverting DVD players even output 540p instead of 1080i


I thought perhaps you just didn't know better, or had been given incorrect information... but repeating it makes it seem like you really believe what you are saying.

I don't mean to be insulting, but what you are saying about "540p" simply isn't true. I'm obviously not a good "teacher" either, because I couldn't communicate it well enough, so maybe someone else can.

Interlaced is a different way of presenting the information on the screen than progressive. Progressive is still a relatively new technology in terms of TVs or monitors. The difference between interlaced and progressive is NOT the amount of information presented, but rather HOW it is presented.

If I am dealing cards... I could either lay 6 cards one beside the other (progressive) OR I could lay 3 cards, skipping spaces between them and then go back and put the other 3 cards in those spaces I left. You still get 6 cards either way!

A 720 scanline image has 720 scanlines. A 1080 scanline image has 1080 scanlines. A 720p picture puts all 720 scanlines one right after the other on one pass of the screen. A 1080i picture puts all 1080 scanlines in two sweeps, with each sweep laying down 540 of the lines... but there are 1080 scanlines that form the entire image.

The same argument that would say "but there are only 540 lines on a pass" can be used to say there are only 360 lines in a 720p image when it is halfway done too!

I don't know how else to say it... 1080i has 1080 lines... not 540... and it isn't "like" a 540p image. A 540p image would only have 540 scanlines.

1280x1080i still has 1080 scanlines, so it is higher resolution than a 720p 1280x720 image. It is not a full 1920x1080 1080i picture, but it is still more than a 720p one. No matter how you slice it, that is the fact of the matter.

I don't mind people complaining about the downrezzing or saying it is "ripping us off" by saying they are sacrificing picture quality... but I don't like the misinformation saying that 1080i is 540p, which is simply not anywhere close to being right.

#60 OFFLINE   Gary Murrell

Gary Murrell

    AllStar

  • Topic Starter
  • Registered
  • 88 posts
Joined: Jan 10, 2005

Posted 04 December 2005 - 08:56 PM

Never did I say that 1080i was 540p; I said it was close

the fact is that 1080i is interlaced, and when there is motion, interlaced video is only perceived as having half the vertical resolution (its Temporal Resolution)

for still shots 1080i cannot be beat (Spatial Resolution), but for motion 1080i is basically 540p. True 1080i has 1920 dots of info per scan line, which is a massive amount of horizontal detail; reducing that to 1280 cuts it down by a lot, resulting in a horrid image

1280x720p has more vertical resolution in motion; that is why it fares so well against 1920x1080i

1280x1080i is not a higher-resolution image than 1280x720p, because with anything other than a still image you are comparing these resolutions by their Temporal Resolution (which is what most of the video we watch is)

1280x540 (HD-Lite 1080i) = 691,200 pixels of Temporal Resolution

vs

1280x720 (720p) = 921,600 pixels of Temporal Resolution

vs

1920x540 (True 1080i) = 1,036,800 pixels of Temporal Resolution


here are the numbers for each second of actual viewed video:

1280x1080i: 1280x1080x30 = 41,472,000 pixels
1920x1080i: 1920x1080x30 = 62,208,000 pixels
1280x720p: 1280x720x60 = 55,296,000 pixels
720x480p: 720x480x60 = 20,736,000 pixels

the numbers at top would also come into play if you were to pause a program of each resolution above and look at the image

720p vs 1080i is a touchy subject, and as you can see, 1280x720p is a higher resolution than 1280x1080i HD-Lite

this new HD-Lite is only 2X DVD quality
while True 1080i HDTV is 3X DVD quality
720p is 2.7X DVD quality

big difference between these
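
The figures in this post check out arithmetically. A quick sketch reproducing them, using the poster's own "X times DVD quality" framing with a 720x480 DVD frame as the baseline (this is his framing of temporal resolution, not an official standard's definition):

```python
# Pixels delivered per second of viewed video, per the post above.
def pixels_per_second(width, height, images_per_second):
    return width * height * images_per_second

assert pixels_per_second(1280, 1080, 30) == 41_472_000  # HD-Lite 1080i
assert pixels_per_second(1920, 1080, 30) == 62_208_000  # true 1080i
assert pixels_per_second(1280, 720, 60) == 55_296_000   # 720p
assert pixels_per_second(720, 480, 60) == 20_736_000    # 480p

# "X times DVD quality": moving-detail pixels vs a 720x480 DVD frame.
dvd = 720 * 480
print(1280 * 540 / dvd, 1920 * 540 / dvd, round(1280 * 720 / dvd, 1))
# -> 2.0 3.0 2.7
```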

-Gary
Dish Setup:

129/110/119/61.5/148 -Sats
3 X DP34/3 X DP21 -Switching
942/6000/522/322/510 -Receivers
Everything Pak/NHL CI/Voom/HD Pak/CBS-HD/Locals -Programming
