Rich said:
...Those 21s got old early.... :lol:
Rich
Got old early? I just got mine last week! :grin:
HDMI cables don't act exactly the way we expect them to. Since they carry a digital signal, we sort of expect them to work perfectly or not at all. That rule seems to hold for compressed signals, such as sat or OTA, which can fail outright if just a small critical amount of the stream is corrupted. But HDMI is not compressed, and it does not need redundant information to decode the signal because it is "already decoded." So, ironically, it can fail gradually, like analog. The PQ will not fail the way analog does, where color fades, levels drop, and sharpness disappears, but there is more and more potential for the white flecks to creep in.
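If you want to see that difference for yourself, here is a rough Python sketch (the frame size, the error counts, and the use of zlib as a stand-in for a real video codec are all just illustrative assumptions): flipped bits in raw pixel data turn into a few isolated wrong samples, the "flecks," while a single flipped bit in a compressed stream usually kills the whole decode.

import random, zlib

random.seed(1)

# A tiny stand-in for a raw, uncompressed raster (all-black frame).
frame = bytearray(192 * 108)

def flip_random_bit(buf):
    i = random.randrange(len(buf))
    buf[i] ^= 1 << random.randrange(8)

# Uncompressed path: each bit error corrupts one sample and nothing else.
raw = bytearray(frame)
for _ in range(20):
    flip_random_bit(raw)
print("uncompressed: %d of %d samples wrong"
      % (sum(a != b for a, b in zip(frame, raw)), len(frame)))

# Compressed path: one bit error usually poisons everything after it.
packed = bytearray(zlib.compress(bytes(frame)))
flip_random_bit(packed)
try:
    zlib.decompress(bytes(packed))
    print("compressed: happened to survive")
except zlib.error:
    print("compressed: decode failed outright")

Twenty bad bits in the raw frame leave nearly all of the picture intact; one bad bit in the compressed stream and you typically get nothing at all.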
I have seen this a couple of times. I used a 30-foot HDMI cable at work, and it gave a beautiful picture, except that it looked like there were "snow flurries" all over everything. A new, fatter cable fixed it. At home I am feeding an HDMI signal from a switch box through a super-heavy-gauge HDMI cable, and I still get white flecks on occasion, this time in patterns. I'm not sure whether it is the cable or not, but I suspect the switch box, since the cable worked OK once before.
This much we know for sure: the thinner the cable, the shorter the distance it can cover before there is trouble, so if you are going more than about 15 feet, buy a heavier-gauge cable.
But for the OP, this may all be beside the point. One other thing we know for sure is that HD does not necessarily mean freedom from video noise. On the contrary, the increased resolution of HD is exactly what is needed to reproduce noise particles faithfully, because the particles are just about the same size as the pixels, whereas SD tended to mask a lot of noise. There is typically a lot more incremental generational noise when the production chain is analog, but digital does not mean there will be no noise. In a way, all digital production chains are really part analog; there is no true 100% digital video, even in consumer HD production as simple as one camera recording to an HDD and playing back from that HDD.
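To put a number on that masking effect, here is a quick Python/numpy sketch (the resolutions and noise level are invented): pure single-pixel grain loses half its amplitude after a 2x2 box downscale, which is roughly what an HD-to-SD resize does to it.

import numpy as np

rng = np.random.default_rng(0)
hd = rng.normal(0.0, 1.0, size=(1080, 1920))   # pure per-pixel grain

# Crude 2x2 box downscale, standing in for an HD -> SD resize.
sd = hd.reshape(540, 2, 960, 2).mean(axis=(1, 3))

print("HD grain std: %.2f" % hd.std())   # ~1.00
print("SD grain std: %.2f" % sd.std())   # ~0.50: averaging masks the grain

Averaging each 2x2 block of independent noise samples cuts the noise amplitude in half, which is one reason the same grain that hides in SD jumps out at you in HD.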
Lots of HD video has noise in it, because the first thing that happens in an HD camera is light reacting on the imager, which is essentially an analog process. Our eyes are analog, and no matter how solid-state or digital the imager is, it has to mimic analog sight in order to create a picture that humans can see.
The imager has a certain amount of inherent thermal noise, so in dark parts of the scene, anything close to black will be modulated by that noise floor, and visibly so with cheaper equipment and poorer technique. Just turn the ISO way up on your point-and-shoot "digital" camera and take a picture in the dark, and you will experience the same thing for yourself.
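Here is a toy Python/numpy sketch of that effect (all the numbers are invented): near black, the signal is only a couple of times the noise floor, so the signal-to-noise ratio is poor no matter what, and cranking the gain (ISO) just scales that poor ratio up into plainly visible territory.

import numpy as np

rng = np.random.default_rng(0)
dark_signal = 4.0                                  # "electrons" per pixel
noise = rng.normal(0.0, 2.0, size=(100, 100))      # thermal/read noise floor

for gain in (1, 8, 32):                            # stand-ins for ISO steps
    out = (dark_signal + noise) * gain             # output video levels
    print("gain %2d: level %6.1f, noise std %5.1f, SNR %.1f:1"
          % (gain, out.mean(), out.std(), out.mean() / out.std()))

The SNR stays stuck around 2:1 at every gain; what changes is that at high gain the noise swings through dozens of output levels instead of a couple, which is exactly the speckle you see in your dark ISO test shot.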
Lots of HD production allows noise into the process, too. And the noise reduction techniques commonly applied before encoding are not really effective at removing noise; instead they beat it down to a dull roar, so that it is acceptable. But as a side effect the picture gets all smeary, and it starts to look like you are viewing everything through a foot of water. If the original was SD, or part of the production passed through the analog domain, that is even more opportunity for the video to be noisy.
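A crude Python/numpy sketch of that trade-off (a simple box blur standing in for a real noise reducer, with invented numbers): the filter knocks the grain down to roughly a third of its amplitude, but a hard one-pixel edge comes out nine samples wide, which is the smeary, underwater look.

import numpy as np

rng = np.random.default_rng(0)
row = np.zeros(200)
row[100:] = 100.0                              # a clean hard edge (real detail)
noisy = row + rng.normal(0.0, 10.0, row.shape)

k = 9                                          # box-blur "noise reducer" width
denoised = np.convolve(noisy, np.ones(k) / k, mode="same")

flat = slice(0, 80)                            # region well away from the edge
print("grain std in flat area: %.1f -> %.1f"
      % (noisy[flat].std(), denoised[flat].std()))
print("worst error at the edge after filtering: %.0f levels"
      % np.abs(denoised - row)[96:104].max())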
But generally we don't notice the noise at normal viewing distances. You can also minimize it with artful settings on the sharpness and other controls.
If the banners are not noisy, the fault is probably noise in the original analog portion of the production chain. But HDMI flecks can appear in just certain areas, and they often do not dance dynamically all over the entire screen the way random analog noise does, so it is important to take a close look to be sure.