Discussion in 'DIRECTV HD DVR/Receiver Discussion' started by KoRn, Nov 15, 2008.
Is there any difference between the 2? Is 1 more optimal than the other?
You will not notice any difference between the two.
Difference as to... what? Obviously, it depends on the digital inputs your decoder (receiver, stand-alone decoder, etc.) has, but if you want to talk about the real 'specifications' of the two, irrespective of course of the ability of the unit (in this case the HR22) to output the signal...
Coaxial is a bit better. Why, you may ask..? You have to think about how the transmission takes place.
First off, both types (coaxial and optical) are single-stream digital, which means that the TIMING of the signal is inherent in the stream itself, not carried on a separate 'line', so the receiver has to recover the TIMING of the signal to properly decode it. I've tried to explain this to folks here on the basics of digital microwave/satellite signals, to very little avail.
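That embedded-clock idea is the essence of the line code S/PDIF uses (biphase-mark coding). Here's a toy Python sketch of the concept only, not the real S/PDIF subframe format: every bit boundary has a transition (which is where the receiver recovers its clock), and a mid-bit transition distinguishes a '1' from a '0'.

```python
# Toy biphase-mark coding sketch: the clock is embedded in the data
# stream itself, so no separate clock line is needed.

def bmc_encode(bits, level=0):
    """Encode each bit as two half-bit line levels (biphase-mark)."""
    out = []
    for b in bits:
        level ^= 1            # transition at EVERY bit boundary
        out.append(level)
        if b:                 # an extra mid-bit transition encodes a '1'
            level ^= 1
        out.append(level)
    return out

def bmc_decode(halves):
    """Recover bits: a mid-bit transition means '1', no transition means '0'."""
    return [1 if halves[i] != halves[i + 1] else 0
            for i in range(0, len(halves), 2)]

bits = [1, 0, 1, 1, 0, 0, 1]
assert bmc_decode(bmc_encode(bits)) == bits   # round-trips exactly
```

The decoder only needs to find the bit boundaries (the guaranteed transitions) to regenerate the clock, which is exactly the "timing recovery" being discussed.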
With optical, the lightwave transmitted through TOSLINK fiber 'bounces' back and forth down the fiber to the receiver. This 'smears' the pulses, making TIMING recovery harder. There are two basic types of fiber, single-mode and multi-mode; guess which type TOSLINK is?
If you guessed multi-mode, you get a prize. Pat yourself on the back. Why not single-mode? Because multi-mode takes far less optical power to 'launch' the wave into the fiber, and therefore the optical detector/receiver at the other end doesn't have to be as high quality. So TOSLINK multi-mode is a cheaper method, which is why the manufacturers chose to build it into everything.
Coaxial has some of the same multi-path problems; after all, we're talking about RF propagation through coaxial wire here, so it's just like any RF signal. But high-grade coaxial is MUCH cheaper than TOSLINK in the transmitters and receivers, and obviously far cheaper than any single-mode fiber would be, if that even were an option. So minimizing the reflections and such is easier. At the receiving end, the 'smear' of the digital bitstream is far lower (we're only talking about a couple feet, right?), and therefore recovering that TIMING is not only easier, but results in far less 'smearing' of the sound bits.
All of this was the object of LOTS of commentary at the dawn of the digital audio age (late '70s/early '80s). There are tons of books on the subject, and tons of links if you point your web browser that direction. But at the end of the day, it may depend on the original source of the audio; most if not all of what comes through DirecTV has gone through many 'hops', 'conversions', you name it, so you're probably not going to hear any difference either way. I guess that's the bottom line, but those of us 'golden ear' types generally won't touch TOSLINK in our audio systems.
It sounds as if what you are saying is important to sound quality. As it turns out, it isn't important at all.
Why? Because while digital paths may suffer jitter, reflections, and other timing errors (and while TOSLINK may have more errors than coax, which is doubtful and unproven), increases in jitter and timing errors have ABSOLUTELY NO EFFECT on the quality of the audio itself. IOW, a "smeared" bit stream does not imply degraded sound quality. As a matter of fact, there is no way it could possibly have any such effect. "Quality" is locked in before encoding, and is absolutely untouchable and unchanged all the way until after decoding. This is one of the big reasons digital was invented. What happens in the middle, while in the digital domain, including transport from your STB to your AVR, can't degrade the sound, assuming all is operating as designed. All jitter and other timing errors are completely removed simply by reclocking at decode.
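The reclocking point can be sketched in a few lines of Python (a conceptual illustration, not any specific chipset): jitter perturbs WHEN each sample arrives, never WHAT its value is, and the decoder clocks buffered values back out on its own stable crystal.

```python
import random

# Sketch: 1000 audio samples leave the STB on a nominal 44.1 kHz clock,
# arrive with +/-5 us of timing jitter, and are reclocked at the decoder.
random.seed(1)
sent = [random.randint(-32768, 32767) for _ in range(1000)]

period_us = 22.68                          # nominal sample period at 44.1 kHz
arrivals = [(i * period_us + random.uniform(-5, 5), s)
            for i, s in enumerate(sent)]   # jittered arrival times

# Reclocking: replay the buffered values in arrival order at a fixed
# rate. The jitter is well under half a period, so order is preserved.
received = [s for _, s in sorted(arrivals)]

assert received == sent    # every sample value survives bit-identically
```

The timing wobble is absorbed by the buffer; nothing about the sample values (the "quality") is touched.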
If slight anomalies in the digital timeline were problematic, then commonplace packetized digital delivery (which is what is transmitted), which greatly disrupts and even rearranges the order of the timeline on purpose by definition, would be exceptionally problematic. Neither is. Slight anomalies are NOT problematic, and neither is packetized delivery, because the time base of the content itself is preserved in the encoding process, and recovered unchanged in the decoding process, completely regardless of how much error may be induced into the transmission itself.
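A minimal sketch of that timeline-recovery argument (the "pts" field here is an illustrative stand-in for the presentation timestamps real transport streams carry, not an actual DirecTV format): even if the network shuffles the packets, the decoder restores the original timeline exactly.

```python
import random

# Ten timestamped audio frames, deliberately reordered "in flight",
# then put back in presentation order by the decoder.
original = [{"pts": i, "payload": f"audio-frame-{i}"} for i in range(10)]

in_flight = original[:]
random.shuffle(in_flight)                  # the network reorders them

decoded = sorted(in_flight, key=lambda p: p["pts"])
assert decoded == original                 # timeline recovered unchanged
```

The content's own time base rides inside the data, so transmission-order chaos is irrelevant to what comes out.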
Bits are either interpreted properly or they aren't. It's a one or it's a zero. "Smeared" bits are also either interpreted properly and in proper temporal relationship to each other or they aren't. There is no such thing as a "smeared" one or a "smeared" zero, meaning there really is no such thing as a "smeared bit". It's an all-or-none situation, not unlike the digital cliff in digital OTA broadcasting. You either end up with enough bits through direct interpretation plus error correction to decode the sound as it was originally encoded, or you end up without enough uncorrupted bits to decode the sound at all, which means you either have perfectly-reproduced sound, or you have no sound at all. If there are too many errors in the packets, audio output is muted, so there is no chance for degraded audio to reach the output, and instead there is no audio output at all. There is no "in the middle" condition, and "smeared" bits will not mean you will have "smeared" sound. It's either all (decoded audio identical to the encoded audio) or none (muted audio, or silence). There is nothing in between.
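That all-or-none behavior is easy to see with a toy decision-slicer in Python (levels and thresholds here are made up for illustration): a smeared pulse still slices to the same bit as long as it stays on the right side of the threshold, and past that point decoding fails outright rather than gradually.

```python
# Toy "digital cliff" sketch: hard-decision slicing of analog levels.

def slice_bits(levels, threshold=0.5):
    """Hard decision: anything above the threshold is a 1, else a 0."""
    return [1 if v > threshold else 0 for v in levels]

bits = [1, 0, 1, 1, 0, 1, 0, 0]

# Mild smearing: ones sag to 0.7, zeros rise to 0.3 -- decodes perfectly.
mild = [0.7 if b else 0.3 for b in bits]
assert slice_bits(mild) == bits

# Severe smearing: levels cross the threshold and bits flip outright.
# A real receiver detects the resulting errors and mutes the output.
severe = [0.45 if b else 0.55 for b in bits]
assert slice_bits(severe) != bits
```

There is no level of smearing that produces a "slightly degraded" one or zero; the bit is either recovered exactly or lost.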
So if optical works, it works. And if coaxial works, it works. They both have equivalent parameters for encoding/decoding, so if both work, both work identically well. There is no difference between the sound of one compared to the other, regardless of what "golden ears" might be falsely trying to tell you. Those $20 green magic markers sold in the '90s to the unsuspecting for absorbing the reflected laser light within your CDs make about as much sense as any theory about reflected light interference in TOSLINK.
Both systems are engineered to acceptable parameters for consumer use, meaning that their level of how problematic they might be is also fairly equivalent.
Bottom line, use what you have, or what is convenient or economic, and don't worry that you may have compromised by choosing one over the other. You haven't. If it works, it works.
Go back to school, after your prof recovers from laughing hard enough to bust a gut (as I almost did, but I'm used to this kind of total nonsense on public boards such as this).
BTW, this type of 'digital is digital' nonsense was very typical back 'in the day' (the aforementioned '70s/'80s), but it was as wrong then as it is today. A LOT of better circuit, component, and systems design has gone into improving digital sound (and now video), so things are somewhat better today, even on the low end of the spectrum. But much better systems are available today, although in a world where 32 Kb/s MP3s are 'standard', quality sound recording and reproduction is relegated to those who can hear the difference.
But if you can't hear the audio problems caused by these inferior components.... One can only hope you don't get a job in broadcasting or recording.
Note that digital error correction and formats for transmission are in place to very much prevent what you are talking about.
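As a minimal illustration of that point: even a simple even-parity check (S/PDIF subframes carry a parity bit; this toy sketch is not the real subframe format) lets a receiver detect a corrupted word and mute it rather than play it.

```python
# Toy even-parity sketch: enough to DETECT a bad word so the receiver
# can mute instead of outputting a wrong sample. Real transmission
# formats layer far more protection on top of this.

def add_parity(bits):
    """Append a parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def is_valid(word):
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
assert is_valid(word)         # clean word passes the check

word[2] ^= 1                  # a single bit flipped in transit
assert not is_valid(word)     # detected: mute, not "slightly wrong" audio
```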
"Digital" interfaces have changed a lot since the 70s/80s and as further understanding has come into play, better transmission standards and protocols have been put in place.
I prefer coax because it doesn't break if you step on it; optical is pretty fragile.
Deforming the coax can impact its impedance. Fiber isn't really that delicate. It typically has to be sheared to do damage to it.
The concept that is important to understand is that at the level of the HR2x, unless you have an oscilloscope, you probably won't notice any difference.
I vote for Mike's answer. Unless you've got superhuman ears you will not notice a difference. I've used both and not noticed a difference and I'm pretty discerning when it comes to my audio. What the textbooks say really isn't gonna make a difference in your house.
It is highly unlikely you will hear a difference in the typical home theater setup. Some may, in rare instances, in exceptionally resolving systems. Even then, I doubt anyone could pick between the two 7 times out of 10.
I have run the digital audio both ways to my audio system and I can tell you the following:
1) The insane variations in volume levels will be the same via coax or optical.
2) The audio sync issues will be exactly as annoying via coax as they are via optical.
In summary, I could tell no discernible difference between the two, but chose to use the optical from the HR22 since my DVD player only has a coax digital output and my A/V receiver only has 1 of each.
The difference will be the length of time you can listen to it at louder volumes, without becoming severely irritated.