...it was finally measured that digital jitter was an issue in digital. Now digital jitter is accepted worldwide and measured.
Turned out that those 16bit 1s and 0s flying down a digital path worked fine - until the timing burped...
I never claimed that jitter is not a factor, just that it is not one that matters in digital transport of video. When the bits are so smeared time-wise that ones and zeroes can no longer be told apart, we are already way past the threshold of what can be decoded at all. Below that threshold, jitter is not much of an issue. It is also pretty much a non-issue because reclocking, which works at any jitter rate under that threshold, is easy to do and eliminates jitter completely by fully restoring the timing relationships between bits. And the act of decoding includes reclocking. Further, every DTV signal you have ever seen is below this threshold for jitter, because any signal over it simply can't be decoded.
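If you want the threshold idea in something concrete, here is a toy sketch in Python (the bit period and jitter figures are made up for illustration and have nothing to do with any real HDMI or SDI link): as long as every edge stays within half a bit period of where it belongs, a receiver that samples at the center of each bit cell recovers the stream bit-for-bit, which is exactly what reclocking gives you.

```python
# Toy model of sub-threshold jitter plus reclocking. The numbers are
# made up for illustration; they are not taken from any real link.
import bisect
import random

UI = 1.0        # one bit period ("unit interval"), arbitrary units
JITTER = 0.3    # peak edge displacement, comfortably under 0.5 UI

bits = [random.randint(0, 1) for _ in range(10_000)]

# Every transition lands near its ideal time, displaced by jitter.
edges = [i * UI + random.uniform(-JITTER, JITTER) for i in range(len(bits) + 1)]

def wire_value(t):
    """Return the bit the wire is carrying at time t."""
    return bits[bisect.bisect_right(edges, t) - 1]

# Reclocking: sample at the center of each ideal bit cell. Because no edge
# has strayed more than half a cell, every sample lands in the right cell.
recovered = [wire_value((i + 0.5) * UI) for i in range(len(bits))]

print(recovered == bits)   # True -- jittered edges, bit-perfect output
```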
Bottom line, it just does not matter.
...So yes, giving a 10bit path gives some headroom.
I would really love to hear an explanation of how this might be possible. An 8-bit signal, no matter how degraded it might become in transport, can't make use of those other two bits; there is nothing in them except zeroes. If you understood how digital works, or had read my earlier post explaining why it is not possible, you would understand that. There is no "wider pipe" analogy here. Error correction would be the "wider pipe" analogy.
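And if anyone wants to see why, here is the whole "headroom" question in a few lines of Python. The padding shown (two zero LSBs tacked onto each 8-bit value) is just the generic illustration; exact padding rules vary by device, but the outcome is the same: the container got wider, the signal did not.

```python
# An 8-bit signal riding in a 10-bit container: the two extra bits are padding.
eight_bit_pixels = [0, 16, 128, 235, 255]               # sample 8-bit code values

ten_bit_container = [p << 2 for p in eight_bit_pixels]  # append two zero LSBs

# Nothing new has been created: the extra bits are always zero, and shifting
# back down returns exactly the values we started with. No headroom appears.
print(all((v & 0b11) == 0 for v in ten_bit_container))          # True
print([v >> 2 for v in ten_bit_container] == eight_bit_pixels)  # True
```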
Lots I don't know. Some I do know.
...However, I am at least smart enough to learn when I don't know.
Well, brother, you sure could have fooled me. Your epic fail regarding headroom speaks volumes to the contrary.
...However, as it turns out, this is a mute point...
But you, unlike the other 99.99% of the English-speaking population, apparently don't know what "mute" means. Or "moot", for that matter. Not exactly a boost to your credibility. I, on the other hand, do actually have credibility, at least in this field. I have been formally educated in these issues, I get paid well to work with them directly, daily, and have for over 15 years, and I have been a very successful major-network-employed Broadcast Engineer for longer than many on this forum have been out of diapers. By the standard "Outliers" sets for expertise, I have my "10,000 hours" many times over (an IQ north of 160 doesn't hurt, either).
You might have even noticed that I never admit any of this until some random yahoo calls me out on it. What I post should speak for itself; I am never comfortable assuming the mantle of expert, and I don't want people to think I am lording it over them. I just want to post what I know to be the truth and help people with questions get answers, which sometimes I even have. You, who by your own admission may literally be "SomeRandomIdiot", don't have to accept any of that, since on the internet we are all anonymous and uncredentialed, at least officially. But then I am also quite happy to give D-Nice the benefit of the doubt; in calibrating a TV he could very likely run rings around me. You, maybe not so much.
the only way for Digital extraction to be perfect as you state above is with Error Correction. And there is no error correction in the HDMI chain.
Well, I did not state that. If there are no errors, there is no need for error correction, period. (and no need for "Error Correction" [sic], either)
Often, especially in DVB or ATSC delivery, there is such a need. HDMI is very different: it runs over a closed, isolated, interference-free shielded cable that is only a few meters long at most. It doesn't need error correction, which is why it was designed without it. It is also completely beside any point made earlier by anyone in this thread.
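For anyone curious what error correction actually buys on those broadcast paths, here is a toy sketch in Python. A trivial 3x repetition code stands in for the real thing (an actual ATSC or DVB chain uses far stronger Reed-Solomon-style coding, so don't read this as how they actually work); it just shows why a bit flipped on the air can still come out right at the decoder, and what a short, shielded HDMI run was judged not to need.

```python
# Toy forward error correction: a 3x repetition code with majority voting.
# Real broadcast FEC is far more sophisticated; this only illustrates the idea.
import random

def encode(bits):
    """Send every payload bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote over each group of three repeats."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0 for i in range(0, len(coded), 3)]

payload = [random.randint(0, 1) for _ in range(1000)]
coded = encode(payload)

# Flip one bit in each of 20 randomly chosen groups -- isolated hits on the "air" path.
for g in random.sample(range(len(payload)), 20):
    coded[3 * g + random.randint(0, 2)] ^= 1

print(decode(coded) == payload)   # True: every single-bit hit was corrected
```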
Not hating at all.
Now that would be really hard to prove.