
· Hall Of Fame
Joined
·
6,035 Posts
Discussion Starter · #1 ·
Since there are now TVs available that offer 1080p resolution, has anyone heard of any networks planning to offer 1080p programming? Right now, all we can use 1080p for is watching Blu-ray movies. I can see 1080p being really nice for sports, due to the fast motion, and even for movies. I would expect a move to 1080p to be expensive for a network, since new equipment would be needed.
 

· Hall Of Fame
Joined
·
4,266 Posts
Just because we have 1080p TVs and some 1080p content via Blu-ray and on-demand doesn't mean there is any real reason to expect 1080p broadcasts. Even without 1080p content, a 1080p screen is still worth having. With a 1366x768 native screen, everything has to be rescaled from 720p or 1080i. With a 1080p screen, 1080i broadcasts only need to be deinterlaced, so there is no need for rescaling.
 

· Registered
Joined
·
5,915 Posts
bonscott87 said:
TV stations are just now starting to go HD in the first place. Let's get that done first before thinking about 1080p. ;)
No kidding!

And 1080p is not that much better; the major jump is from SD to HD, period. Let's get that going before worrying about 1080p, which has just been marketed to death since HD DVD/Blu-ray.

Yes, it is better, but it is not THAT much better in a lot of situations, and just having HD instead of SD is much more important IMO.
 

· Broadcast Engineer
Joined
·
4,146 Posts
kturcotte said:
They'd also have to again send out 2 signals wouldn't they? 1080p for those that have the 1080p tuner, and 720p or 1080i for those that don't have a 1080p tuner.
Any ATSC tuner will work, including the very oldest one sold on August 6th, 1998. The tuner doesn't even know if it's 1080p or 1080i or 720p or 480i. All the tuner sees is an RF signal modulated as 8VSB. Only after the signal is demodulated, demuxed, and decoded from MPEG back into full baseband HD is the flag telling the display what format the video is framed in ever read, and by that point the tuner has long since done its job.
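If it helps to see the ordering, here is a minimal sketch of that receive chain in Python. The stage names and stubs are hypothetical placeholders, not any real tuner API; the point is only that the format flag comes into play after demodulation and decode, never in the tuner itself.

```python
from dataclasses import dataclass

@dataclass
class DecodedVideo:
    frames: list
    format_flag: str  # e.g. "1080i30" -- only meaningful after decode

def demodulate_8vsb(rf_signal: bytes) -> bytes:
    """Tuner stage: RF -> transport stream. Knows nothing about video formats."""
    return rf_signal  # stub standing in for the real demodulator

def demux_and_decode(ts: bytes) -> DecodedVideo:
    """Demux the video PID and MPEG-decode it to baseband frames (stubbed)."""
    return DecodedVideo(frames=[], format_flag="1080i30")

def receive(rf_signal: bytes) -> None:
    ts = demodulate_8vsb(rf_signal)          # the tuner's whole job ends here
    video = demux_and_decode(ts)             # demux + decode come next
    print("framed as:", video.format_flag)   # the flag is consulted only now

receive(b"\x00" * 188)  # one transport packet's worth of dummy "RF"
```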
 

· Broadcast Engineer
Joined
·
4,146 Posts
kturcotte said:
I doubt we'll see anything broadcast in 1080p for quite awhile. The cost and the bandwidth required is just too much.
To make such a statement, it is really necessary to differentiate between 1080p60 and 1080p24.

Actually, content is broadcast as virtual 1080p24 all the time. Any encoder that is fed 24 fps content (or 24 fps original content that already has 3:2 pulldown applied) automatically reverts to "film mode," which happens every time a movie is broadcast in HD. An automatic 2:3 pullup is done just before encode (if necessary) and the frames are encoded as progressive frames without pulldown, i.e. true 1080p24. The local decoder after the 8VSB tuner can recognize this mode and re-add 3:2 pulldown after decoding (assuming a broadcast format of 1080i30), which also happens all of the time.
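For anyone who wants the cadence spelled out, here is a toy sketch of 3:2 pulldown and the matching pullup. Real encoders and decoders do this with flags in the stream rather than by copying frames, so treat the function names as illustrative only.

```python
def pulldown_32(frames_24p):
    """Map 24 fps frames to 60 fields/sec: 2 fields, then 3, then 2, then 3..."""
    fields = []
    for i, frame in enumerate(frames_24p):
        repeats = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeats)
    return fields  # 4 frames -> 10 fields, i.e. 24 fps -> 60 fields/s

def pullup_23(fields_60i):
    """Inverse telecine: collapse the 2-3 cadence back to the unique frames."""
    frames, i, repeats = [], 0, 2
    while i < len(fields_60i):
        frames.append(fields_60i[i])
        i += repeats
        repeats = 5 - repeats  # alternate 2 <-> 3
    return frames

movie = ["A", "B", "C", "D"]        # four 24p film frames
fields = pulldown_32(movie)         # ['A','A','B','B','B','C','C','D','D','D']
assert pullup_23(fields) == movie   # the original 24p frames are fully recoverable
print(fields)
```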

While the 1080p24-encoded stream is technically sent with a flag indicating that it is to be decoded as interlaced 1080i30 content, any 1080p display can transparently reconstitute it into true 1080p24 (plus 3:2 pulldown, which may add a tiny smidgen of judder). That means you are essentially receiving the exact same quality as 1080p24, without the interlace error that typical 1080i content normally suffers. A 720p set reconstitutes it into 720p, also without the interlace error that would remain in typical 1080i30 content. So it is sent in progressive format with "progressive" quality (meaning no interlace error), flagged as interlaced (1080i30), and reconstituted as 1080p24 once your display deinterlaces it and displays it progressively.

The re-adding of pulldown is only necessary to maintain consistency with the interstitial material, such as commercial breaks, which are typically 1080i30. Otherwise there would be an ugly glitch as the decoder transitions from 1080p24 to 1080i30 and back. The pulldown also decreases the 24 fps flicker factor, which likely offsets any negative aspects of the added judder and is therefore actually an overall improvement. But essentially, every time a 1080i station broadcasts a movie, if you have a 1080p set you will get the very same quality you would get had you bought a 1080p hard copy on Blu-ray.

What is difficult to do (and so far impossible within a 6 MHz channel) is to transmit 1080p60. That is the format that takes more bandwidth to do properly. It is also the only format that would really improve on 1080p24, 1080i30, or 720p60 as far as motion artifacting goes (which would indeed make it really nice for sports). So far, even all DOD 1080p is really only 1080p24, and I would not expect much 1080p60 content to be considered, at least for now, as the de facto format world-wide for content acquisition is 1080p24, specifically because any other format can easily be extracted from it, making conversion universal.
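To put rough numbers on the bandwidth problem, here is a back-of-the-envelope sketch; my own figures, nothing official beyond the standard 19.39 Mb/s payload of a 6 MHz ATSC channel. At any given compression ratio, 1080p60 needs roughly twice the bits of 1080i30.

```python
# Raw pixel rates only; real codecs compress, but a fixed channel must
# squeeze 1080p60 roughly twice as hard as 1080i30.
ATSC_PAYLOAD = 19.39e6  # bits/s in a 6 MHz channel

def raw_rate(width, height, frames_per_sec, bits_per_pixel=20):
    # 20 bpp approximates 4:2:2 10-bit (10 luma + 10 chroma bits per pixel)
    return width * height * frames_per_sec * bits_per_pixel

r_1080i30 = raw_rate(1920, 1080, 30)  # 30 full frames/s (60 fields)
r_1080p60 = raw_rate(1920, 1080, 60)  # twice the pixel rate

for name, r in [("1080i30", r_1080i30), ("1080p60", r_1080p60)]:
    print(f"{name}: {r/1e9:.2f} Gb/s raw, needs ~{r/ATSC_PAYLOAD:.0f}:1 compression")
```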

So if you are talking specifically about 1080p60, you are absolutely right--I would not expect to see content broadcast or sent by DBS/FIOS/cable either, but 1080p24? We actually get that all the time.
 

· Godfather
Joined
·
450 Posts
n3ntj said:
Has anyone heard of any networks planning on offering 1080p programming?
When there is no motion from field 1 to field 2 of an interlaced image, it doesn't matter whether the display is interlaced or progressive.

Prime-time dramas and sitcoms on CBS and NBC are shot in 24p, converted to 1080i at 30 frames per second, and displayed by your TV as 1080p.
 

· Legend
Joined
·
178 Posts
TomCat said:
But essentially, every time a 1080i station broadcasts a movie, if you have a 1080p set you will get the very same quality you would get had you bought a 1080p hard copy in Blu-Ray.
Dude, you have seriously lost it. There is no compression with Blu Ray, the bitrates are much higher providing much higher quality and detail that blows away MPEG2 broadcast TV. And most, not all, 1080P TVs do not properly implement 3:2 pulldown, which is not necessary with one that will accept 1080P/24. I think you may want to have your glasses checked.
 

· Broadcast Engineer
Joined
·
4,146 Posts
HD AV said:
Dude, you have seriously lost it. There is no compression with Blu Ray, the bitrates are much higher providing much higher quality and detail that blows away MPEG2 broadcast TV. And most, not all, 1080P TVs do not properly implement 3:2 pulldown, which is not necessary with one that will accept 1080P/24. I think you may want to have your glasses checked.
Thanks for the free consultation, but maybe later. Actually, you may want to have your facts checked. And I can help you with that:

1) All HD is compressed. That includes Blu-ray, which uses the very same MPEG-4 compression scheme that DTV uses. Uncompressed HD has a bit rate of about 1.485 Gb/s, which means a 30 GB disc could hold less than 3 minutes of video were Blu-ray really uncompressed. So you are grossly misinformed on that count, and you can look it up. Most HD is compressed in the very act of acquisition.
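The arithmetic is easy to check; a quick sketch (decimal gigabytes assumed):

```python
# How long would a 30 GB disc last if Blu-ray video were truly uncompressed
# at the 1.485 Gb/s baseband HD rate?
disc_bits = 30e9 * 8          # 30 GB -> 240 Gb
rate = 1.485e9                # uncompressed HD, bits per second
seconds = disc_bits / rate
print(f"{seconds:.0f} s = {seconds/60:.1f} minutes")  # ~162 s, about 2.7 min
```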

2) "Detail" refers to how sharp the picture is, which is directly tied to resolution. All else held equal (focus, lens quality, imager quality, etc.), 1080p or 1080i resolution is exactly the same regardless of what medium it might be delivered to you in, which means the detail in 1080i OTA TV is precisely the same as that in Blu-Ray. If you are seeing more "detail", then you are simply fooling yourself. There isn't any more to see. Also, resolution and bit rate are two completely different things that typically do not affect each other. There are some low-bit delivery schemes that effectively lower resolution, but that is not the case in OTA and cable, and thankfully is no longer the case for DBS.

3) While that means higher bit rates do not provide higher detail (which, as established, is a function of resolution), higher bit rates also do not necessarily provide higher quality in any other way. I know that's a hard one to swallow, and an inviting conclusion that would be very easy to jump to. In a case where there are plenty of bits to go around, such as an OTA broadcast at 14.5 Mb/s, increasing the bit rate will not buy you any increase in quality whatsoever. Shocking (to those who are misinformed or who have not been able to resist jumping to that conclusion), but completely true. Quality necessarily suffers only when heavier compression must make trade-offs that compromise quality to hold artifacting equivalent at much lower bit rates. It is a common yet highly flawed misconception that raising bit rates, or having a higher bit rate on one medium compared to another, will automagically yield better quality. As someone who gets paid handsomely to work with compression algorithms on a daily basis, I think I can safely say that it just doesn't work that way.

That said, I will agree that some HD delivered by bit-starved systems will track motion with more artifacting than HD delivered without equivalent bit starving. And true, some OTA stations do that. In that case, "Transformers" from an ABC station with two subchannels may not look quite as good during action sequences as a Blu-ray copy will. But the difference in the action sequences may not be that significant even for the most bit-starved frames; 99% of all frames with motion will look nearly indistinguishable in either case, and all frames with low or no motion will look identical in either case. But that is not the hair-splitting argument I was speaking of, and it is somewhat beyond the scope of the thread and the discussion we all were having.

4) Pulldown is not all that difficult to do, especially with digital circuitry, which makes storing and repeating frames pretty simple, and that is exactly what pulldown requires. If a display manufacturer can't do pulldown properly, they have no business even being in the business.

Not only that, but in the case I posted about, which is 1080p24 delivered in a 1080i30 format (well, 29.97 if you want to split hairs), it is not the display that does the pulldown, it is the decoder. The ability of the decoder to do pulldown is part of the ATSC spec, and part of "film mode," which is the mode invoked when 1080p24 content is sent. Every ATSC decoder conforms to that spec and includes this ability. All your TV has to do is accept the 1080i30 content that is reconstituted within the last stage of the decoder and display it progressively as 1080p, which is not only what the set is designed to do, it is exactly what it does with all content (it displays everything as 1080p60). And that means it does exactly the same thing when fed 1080p24 directly from Blu-ray: it displays it as 1080p60, which implies pulldown done in the display, not the decoder. If, as you say, some sets might not do pulldown properly, then it would actually be the Blu-ray content that is at risk, not content that was already pulled down during decoding via ATSC.

Of course it is important to distinguish 1080p60 as a display format from 1080p60 as an acquisition format, which I mention only because most folks confuse the two and incorrectly impart the benefits of the acquisition format to the display format. For those who could only get into community college, I can break that down: 1080p60 acquisition format = good. 1080p60 display format = possibly not so good, depending on the original content.

For acquisition, it implies that 60 unique 1920x1080 frames are recorded progressively each second, with no interlace error. As a display format, it only implies that the raster displays 60 1920x1080 frames progressively each second. It does not imply that the frames are unique (they won't be with pulldown) or that they are free of interlace error (they won't be if the content was acquired interlaced). Bottom line: 1080p60 as an acquisition format is significantly better than other formats due to more frames and no interlace error (720p60 also avoids interlace error, but with fewer pixels), while a 1080p display is somewhat better than a 768 or 720 display for completely different reasons, and not for the reasons enjoyed by the 1080p60 acquisition format.

But then, regardless of all of that, if you are thick enough to actually buy a set that doesn't do reinterlacing or pulldown properly (after all, it IS 2008), then I would probably have a hard time drumming up sympathy for you.
 

· Broadcast Engineer
Joined
·
4,146 Posts
Tower Guy said:
When there is no motion from field 1 to field 2 of an interlaced image it doesn't matter if the display is interlaced or progressive...
All displays other than CRTs are progressive by nature. They have no choice but to display progressively, which is why you never see any "1080i" displays at Best Buy. A set is not designed as 1080p because that is some wonderful improvement to display technology (although the hype masters would love for us to think that was the intent all along); it is designed as 1080p because a modern flat panel has no capability or choice to display in interlaced mode, which it can't possibly support.

But I get your drift, I think. The result of taking both fields of an interlaced frame and displaying them progressively (weaving them into one frame) is virtually identical to displaying the equivalent originally-progressive frame, well, "progressively." That eliminates any interlace flicker, which leaves only interlace error. And as you say, if there is no motion, there can be no interlace error.
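A toy "weave" sketch of that idea (rows as strings, purely illustrative): static content reconstructs perfectly, and only motion between fields would comb.

```python
def split_fields(frame):
    return frame[0::2], frame[1::2]   # even/odd scan lines -> two fields

def weave(top_field, bottom_field):
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.extend([t, b])          # interleave the fields back together
    return frame

still = ["row0", "row1", "row2", "row3"]
top, bottom = split_fields(still)
assert weave(top, bottom) == still    # no motion: perfect reconstruction
# Had the scene moved between field captures, `bottom` would come from a
# later instant and the woven frame would comb on moving edges.
```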

That said, television is not radio with pictures. By their very nature television images move most of the time, just as things do in the real world. Even if, theoretically speaking, still pictures have no interlace error under 1080i or 480i, interlace error is still very significant, because in most video there is motion.
 

· Premium Member
Joined
·
21,658 Posts
TomCat said:
All displays other than CRTs are progressive by nature. They have no choice but to display progressively, which is why you never see any "1080i" displays at Best Buy. A set is not designed as 1080p because that is some wonderful improvement to display technology (although the hype masters would love for us to think that was the intent all along); it is designed as 1080p because a modern flat panel has no capability or choice to display in interlaced mode, which it can't possibly support.
Just to be technically accurate...

It is all but impossible to display a progressive image natively on an interlaced imaging device, because that device simply cannot scan fast enough or scan consecutive lines accurately. This is why interlacing even exists in the first place.

However, a progressive-native device most certainly can display a native interlaced image. Granted, they might not typically do it, but there is no reason for them not to be technically capable. It is a relatively easy thing to display interlaced on a native-progressive device.

To say that they don't is usually accurate, but to say that they can't is not accurate.
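As a purely hypothetical sketch of one way that could work (not how any particular panel controller is actually built): each refresh, update only the scan lines belonging to the incoming field and hold the rest.

```python
def present_field(panel_rows, field_rows, parity):
    """Overwrite only panel lines of the given parity (0 = top field lines,
    1 = bottom field lines); the other lines are simply held as-is."""
    for i, row in enumerate(field_rows):
        panel_rows[2 * i + parity] = row
    return panel_rows

panel = ["held"] * 4
present_field(panel, ["t0", "t1"], parity=0)   # top field lands on even lines
present_field(panel, ["b0", "b1"], parity=1)   # bottom field on odd lines
print(panel)                                   # ['t0', 'b0', 't1', 'b1']
```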
 

· Old Guys Rule!
Joined
·
5,048 Posts
+1
There's no reason a 1080p device can't display interlaced. If the circuitry exists, interlaced scan can be done. Flat panels are not inherently progressive; the scanning method is determined by the support circuitry.
 

· Godfather
Joined
·
318 Posts
I don't think this will be a possibility for a minimum of 5-7 years, and that's being extremely optimistic. IMO, 1080p is primarily a gimmick created by Sony and a few others, designed to dupe uneducated viewers into believing they are seeing a much better picture than on a 1080i/720p TV. Granted, the picture is a little better, but it is nothing to drop your jaw over.
 

· Hall Of Fame
Joined
·
8,968 Posts
Another issue that we have been living with since the 1930s that is just starting to be addressed for the first time is the "24 frames displayed across 60 frames" issue.

The vast majority of HDTVs, and all TVs before them made for the US market, are 60 Hz refresh-rate TVs. 24 does not go into 60 evenly, so anything shot at 24 frames per second has ALWAYS had to be shown with 3:2 cadence on TV.

Just in the last two years have there been consumer-level TVs capable of switching to a refresh rate that is an even multiple of 24 Hz, which finally allows the TV to display 24 fps sources at 24 fps. In modern TV lingo, we refer to this as a TV that can not only accept a "1080/24p signal" (many fixed 60 Hz TVs can accept these signals), but can render the frames properly at 1/24th of a second each (actually, at a multiple).
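A two-line way to see which common refresh rates dodge the 3:2 cadence (my own illustration, nothing TV-specific):

```python
# Any refresh rate that is an integer multiple of 24 can repeat every film
# frame the same number of times, so there is no uneven cadence and no judder.
for hz in (60, 72, 96, 120, 240):
    if hz % 24 == 0:
        print(f"{hz} Hz: each film frame shown {hz // 24}x -> even cadence")
    else:
        print(f"{hz} Hz: 24 fps needs an uneven 3:2-style cadence -> judder")
```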

It is really only these newer TVs that get the full benefit of a 1080/24p signal. There are many TVs that can correctly display 1080/60p (great for video games, computer use, and a few Blu-Ray titles) that can't do 1080/24p "correctly" (i.e., without the 3:2 cadence that causes judder).

Anyway, to get back to the original question: 1080/60p broadcasting would require scrapping existing ATSC tuners and hardware, and a doubling of broadcast bandwidth, neither of which is likely to happen in the next 20-30 years. Heck, HDTVs have been around for a decade, and we are still in the transition period, which has at least another 5 years to go before most folks have HD in their home, and before most networks are sourcing all NEW content in HD.
 

· Beware the Attack Basset
Joined
·
26,609 Posts
Tower Guy said:
True, but misleading. There is no motion between field one and two when the show is shot on film.
Pulldown may introduce changes from one field to the next.
 