Discussion in 'DIRECTV General Discussion' started by saleen351, Jul 27, 2007.
I'm pretty sure it would take a spectrum analyzer.
I would have a major problem measuring the bandwidth of a complex waveform in the time domain with an o-scope.
Let's consider a special case where a single static frame that is not part of a moving picture (e.g. a test pattern) is sent and displayed. It needs to be sent only once at full bandwidth, because the compression encoder detects that successive frames are identical. So if you look at the bandwidth at the analog connector you will see full bandwidth, while the actual transmitted bandwidth is near zero.
Next, somewhere in the transmission process, every third pixel must be reinserted (probably in the set-top box) to format the signal back to 1080i so your monitor can recognize it as a valid ATSC over-the-air picture format. I'm thinking that process will increase the bandwidth you measure at the connector, because there is more data there than was transmitted.
Certainly, those 'missing' pixels cannot be reconstructed with 100% accuracy, so even though there is more data there, it's 'bogus' data that the HD-Lite people hope you can't see.
"You measure the bandwidth coming off the YPbPr output." Sounds like:
1) An O-scope might work.
2) Since the signal going out of the HR-20 would be 1920 x 1080 [after it is scaled], I'm not sure how it would show the reduced bandwidth.
As I understand the process: D* drops every third pixel [1920 down to 1280], sends this through the SAT feed, and then averages the missing pixels back in. There is no way I can see my TV showing a 1080i image at anything less than 1920 x 1080, given that when it is in letterbox [a movie] or 4:3 format, I see it as such, and a 1280 x 1080 image would both fail to fill my display and look distorted at the same time.
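The drop-and-average idea described above can be sketched in a few lines. This is a hypothetical illustration of the theory being discussed, not D*'s actual algorithm (a real scaler would use proper resampling filters); it just shows why the reinserted pixels carry no new information:

```python
# Hypothetical sketch of "drop every third pixel, then average it back".
# NOT D*'s actual method -- just the scheme speculated about in this thread.

def drop_every_third(row):
    """Keep pixels 0, 1, skip 2, keep 3, 4, skip 5, ... (1920 -> 1280)."""
    return [p for i, p in enumerate(row) if i % 3 != 2]

def reinsert_by_averaging(row):
    """Rebuild a 1920-wide row; each dropped pixel is guessed from neighbors."""
    out = []
    for i in range(0, len(row), 2):
        out.append(row[i])
        out.append(row[i + 1])
        # The dropped pixel is 'bogus' data: the mean of its neighbors.
        nxt = row[i + 2] if i + 2 < len(row) else row[i + 1]
        out.append((row[i + 1] + nxt) / 2)
    return out

full = list(range(1920))        # one row of a 1080-line frame
lite = drop_every_third(full)   # 1280 pixels actually transmitted
rebuilt = reinsert_by_averaging(lite)
print(len(full), len(lite), len(rebuilt))   # 1920 1280 1920
```

Note that `rebuilt` is full width again, which is why measuring at the analog connector can't distinguish it from a true 1920-pixel source.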
Would it not be a source issue? Is there anyone broadcasting 1080p?
Yes, a source issue, and no, to my knowledge nobody is even thinking of broadcasting it. I've also been told the Broadcom chip can't de-interlace 1080i.
The bandwidth for 1080p would be tough to pass.
720p is actually smaller than 1080i, so "something" would have to give to send 1080p and fit the current bandwidth [even for OTA].
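Back-of-the-envelope raw pixel rates bear this out (a sketch only; actual transmitted bitrates depend on compression, but the ratios show why 1080p is the odd one out):

```python
# Uncompressed pixel rates for the common ATSC-era formats.
# 1080i delivers 1080 lines as two 540-line fields, i.e. 30 full frames/s.
formats = {
    "720p60":  1280 * 720 * 60,
    "1080i30": 1920 * 1080 * 30,
    "1080p60": 1920 * 1080 * 60,
}
for name, rate in formats.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixels/s")
# 720p60 ~55.3, 1080i30 ~62.2, 1080p60 ~124.4 --
# so raw 1080p60 is roughly double 1080i, before compression.
```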
Your assumptions are incorrect on several levels, but in a nutshell, your comments assume (correctly) that D* is using reduced resolution from 1920x1080 - otherwise there is no reason to throw away every third pixel. The majority still refuse to believe that basic fact, and that is what it shows.
These were done several years ago by Bob Seidel @ CBS in New York, who has the proper equipment to take a screen capture. They ARE NOT the current HD-LIL, but everyone who has checked has confirmed the results are the same for the MPEG4 HD-LIL as they are for the down-rezzed HD-LITE MPEG2 feeds.
This is how WCBS-DT looks OTA via an ATSC Tuner:
This is how WCBS-DT looks via Directv HD-LITE:
I got it, and was thinking the same thing reading the post just above yours.:hurah:
The every-third-pixel theory is just a carry-over from an earlier post that suggested it as a possible method to convert a 1080i picture to HD Lite. In principle, 'throwing away' a third of the data would indeed reduce the bandwidth required to send the picture. That method would be somewhat simplistic, so I hope they employ a more sophisticated algorithm.
It's factual that an ATSC standard exists that justifies HD Lite as a satellite transmission standard with fewer than 1920 horizontal pixels. That means a 1080i picture must somehow be altered to reduce the number of effective horizontal pixels for transmission through the satellite. For a 16:9 picture, that would mean either the pixels aren't square anymore or some are missing. That original 1080 format really does need to be regenerated, because a consumer monitor, which is designed for over-the-air ATSC, cannot directly understand the ATSC DBS format.
There can only be speculation, because they won't tell us how they do it and don't readily admit they do it. You can learn atomic secrets more easily than you can get tech information from the DBS people.
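The "pixels aren't square anymore" case is easy to quantify: if a reduced horizontal count still has to fill a 16:9 screen, each stored pixel must be displayed wider than it is tall. A quick sketch of the implied pixel aspect ratios (simple arithmetic, assuming the standard 16:9 display shape):

```python
from fractions import Fraction

def pixel_aspect_ratio(width, height, display_ar=Fraction(16, 9)):
    """PAR needed for a width x height raster to fill a display_ar screen."""
    return display_ar / Fraction(width, height)

for w in (1920, 1440, 1280):
    par = pixel_aspect_ratio(w, 1080)
    print(f"{w}x1080 -> pixel aspect ratio {par} ({float(par):.3f})")
# 1920x1080 -> 1 (square pixels)
# 1440x1080 -> 4/3
# 1280x1080 -> 3/2
```

So a 1280x1080 or 1440x1080 feed isn't "distorted" on screen; the scaler stretches the non-square pixels back out to 16:9, which is exactly why the reduction is invisible to a resolution counter at the output.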
But, there's no doubt they are playing mind games with us. :lol:
That was one of the test patterns we played around with - a line missing every 3rd line (of course we needed to make 3 versions not knowing which would be discarded).
Ever? Yes, of course. Likely any time soon? No. It isn't on any public radar.
haha... good to know somebody got it...
i understand that no one is broadcasting in 1080p... just from what I've read... i don't think it's possible for 1080p to be carried over a coax... or at least from a cable company perspective... maybe one feed from a dish to a receiver... just not enough bandwidth to feed that much to thousands of houses... so that leaves fiber and satellites... so my question was is it possible (from a bandwidth perspective) to ever expect 1080p from D*
Technically it's possible now, but one or two channels would occupy the whole bandwidth of one transponder.
It's not a question of coax, fiber, or SAT. OTA has the most bandwidth and uses a 6 MHz channel. If they didn't drop some part of the signal [24 frames instead of 30/60], they would need say 10 MHz [an example only; it may not be this much]. A 1080p signal would then take up two 1080i channels. This is the bandwidth problem with any HD channel: quality versus quantity. The supplier [SAT or cable] can get more bang [bucks] with more channels than with a "better" picture.
This goes back to the title of this thread: HD-lite. The more the supplier can offer channel wise, the more revenue. It's a pure business decision.
There ya go VOS - business decision. It's a business, they are in it to make money, not to please videophiles. They will do the least they can get away with, to maximize profit, without losing customers.
So are you saying at first D* was doing HD-Lite with LILs but now they aren't? A lot of people on this board and AVS have wanted to see facts, but other than hearsay nothing has been said.
What I've read was: MPEG-4 [1080i] is 1440 x 1080 and not the 1280 x 1080 used with MPEG-2.
I don't have the equipment to know myself and there is a lot of "smoke" in all of the postings about any of this.
There may be some smoke coming from a number of sources, but I'm pretty confident in stating D* has been reducing the resolution of MPEG4 HD LIL. Whether that continues as we speak is not clear, but I'd be shocked if it isn't.
Well, 1440 sure isn't 1920.
Instead of a third lost it's a quarter [if the numbers are true].
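The fractions quoted above check out (simple arithmetic on the figures mentioned in this thread):

```python
from fractions import Fraction

FULL = 1920  # horizontal pixels in true 1080i
for lite in (1280, 1440):
    lost = Fraction(FULL - lite, FULL)
    print(f"{FULL} -> {lite}: {lost} of the horizontal pixels dropped")
# 1920 -> 1280 drops 1/3 (the MPEG-2 HD-Lite figure)
# 1920 -> 1440 drops 1/4 (the reported MPEG-4 figure)
```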
I believe what I see, hear, touch, or can test. After that I have to take things on faith in whoever is posting it and how well it makes sense to me.