Discussion in 'DIRECTV Installation/MDU Discussion' started by berniec, Oct 24, 2009.
Anybody looked at how much more gain you actually get using the Alaska/Hawaii dish vs. a Slimline?
While I'm sure someone at DIRECTV knows, I don't know of anyone here who has that information.
The only time I'd even suggest one in the CONUS is if you had lots of rain fade even with a perfectly aligned slimline.
I wouldn't say LOTS, but I'm easily annoyed, and I had a Gainmaster prior, which made my SD at the time pretty rock solid.
You might get the same results by moving to a SWiM setup.
The SWiM has AGC and can amplify [~ +15 dB] when the signals start to fade.
Already using the SWM, and my signal is in the 90's. Just wondering if there would be any additional fade margin gained.
There would be, because of the area of the dish. I don't know the size difference, but if the Alaska/Hawaii dish reflector had twice the area, then you'd get twice the gain [3 dB].
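To put a number on the area-to-gain relationship, here's a quick back-of-the-envelope sketch (generic antenna math, nothing DIRECTV-specific):

```python
import math

def area_gain_db(area_ratio: float) -> float:
    """Gain change in dB for a reflector with `area_ratio` times the
    collecting area, assuming equal aperture efficiency."""
    return 10 * math.log10(area_ratio)

print(round(area_gain_db(2.0), 2))  # doubling the area -> ~3 dB
```

Doubling the area only buys ~3 dB because gain scales with the logarithm of the power ratio.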
You might look at your splitters [or how you're using them] to see if you can set them up with less loss.
How is your system set up? Do you need to terminate any unused ports?
Using an 8-way splitter when you only need four ports is dropping your levels. Changing to a 4-way could save the same 3 dB the dish would give.
But the bigger dish gets you 3 more dB of S/N, which downstream loss reduction or amplification doesn't get you. When signal lock is lost due to "rain fade", rarely, in a properly designed system, is it because the signal level at the receiver is too low. Usually, it is because the S/N at the LNB has become too low.
You get about 3 dB extra with the Alaska/Hawaii dish. If you are looking at SD signals, this can help. But at the shorter wavelength of the HD signals, the losses from absorption and scattering by the raindrops are very large, and 3 dB won't do you much good.
I bolded the caveat. Sonora gives about a 25 dB system loss from the SWiM output to the receiver input. 8-way splitters take a good bite out of this, leaving ~ 10 dB for cable losses.
I haven't done any DBS system engineering in the last few years, but I remember much of what I used to know about it and will use that to demonstrate why I value antenna gain more than distribution conservation.
In the late 1990s, Sonora published some technical information indicating that the S/N of a CONUS transponder, coming from an 18" dish with an ordinary LNB, was about 13 dB in my region. I think that was from back when DirecTV's satellites had 120 watt transponders. I think their next CONUS-only satellite had 240 watt transponders, which would double the power and make the S/N 16 dB.
A few years later, Channel Master came out with a premium product called the "Gainmaster" that had just over 2 dB more gain than an 18" round or an 18" x 24" oval, meaning the LNB S/N would be about 18 dB. Since the claimed minimum S/N for reliable tuner performance is said to be 8 dB, that left an S/N surplus of 10 dB.
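Stringing those numbers together (these are my recollections from above, not published specs):

```python
# Rough link-margin arithmetic from the figures recalled above.
base_snr_db = 13.0        # CONUS transponder S/N, 18" dish, 120 W era
power_doubling_db = 3.0   # 120 W -> 240 W transponders
gainmaster_db = 2.0       # Gainmaster's extra antenna gain
min_snr_db = 8.0          # claimed minimum S/N for reliable tuning

surplus = base_snr_db + power_doubling_db + gainmaster_db - min_snr_db
print(surplus)  # 10.0 dB of S/N surplus
```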
Adapting that to the present Ka CONUS transponders is difficult, and adapting it to spot beams may be impossible, because the spot beam intensity drops off precipitously at the periphery of the boresight and there is no reliable public information on spot beam EIRP contours, but anyway...
I think the SWM LNBs have a low-noise front end that internally outputs probably around -30 dBm on the Ku CONUS transponders, and maybe a little more than that on the Ka CONUS beams, and then there is an AGC amplifier following that which is set to output around -20 dBm, provided it has enough gain to do so. Using my old Gainmaster and 240 watt transponder figures, if the rain reduces the signal strength hitting the reflector by 10 dB, then the S/N ratio is reduced from 18 dB to 8 dB and the LNB output is reduced from around -28 dBm to -38 dBm.
Sonora sells a "drop in" line extender with AGC that I think can take any input from -40 dBm to -20 dBm and output -20 dBm. If I put one of those on a Gainmaster output, my downlead would not begin to show any decrease in strength until my S/N ratio had already dropped to an unacceptably low level.
I think that is the same architecture as the SWM LNB. I think that by the time the SWM AGC output begins to drop, the S/N is on the verge of dropping below the acceptability level of about 8 dB. If the interstage level of the SWM LNB unit is, say, -25 dBm in clear weather, and if the internal AGC line extender or launch amplifier or whatever we might call it can boost the output to -20 dBm even if its input is down to -40 dBm, then by that time the S/N is already down by 15 dB, yet the receiver at the end of a distribution system designed to develop -55 dBm has not yet started to see a signal level reduction. To drop the signal level at the receiver to -60 dBm, which is still a safe operating level, the rain would have to have knocked the dish input signal down another 5 dB, to 20 dB less than in clear weather, and at that point there is no way the signal would be serviceable unless the initial, fair-weather S/N ratio was above 28 dB.
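That AGC behavior can be sketched roughly like this; every figure here is an assumption taken from the paragraph above, not a measured spec:

```python
def swm_chain(rain_fade_db,
              clear_input_dbm=-25.0,    # interstage level in clear weather (my guess)
              agc_target_dbm=-20.0,     # AGC amplifier output target
              agc_min_input_dbm=-40.0,  # below this the AGC runs out of gain
              dist_loss_db=35.0,        # loss to develop -55 dBm at the receiver
              clear_snr_db=18.0):       # clear-weather S/N at the LNB
    """Sketch of the AGC chain described above.
    Returns (receiver level in dBm, S/N in dB)."""
    faded_input = clear_input_dbm - rain_fade_db
    # Gain the AGC would need but can no longer supply:
    shortfall = max(0.0, agc_min_input_dbm - faded_input)
    launch = agc_target_dbm - shortfall
    return launch - dist_loss_db, clear_snr_db - rain_fade_db

print(swm_chain(15))  # receiver still at -55 dBm, but S/N down to 3 dB
print(swm_chain(20))  # receiver at -60 dBm, S/N hopeless at -2 dB
```

Note how the receiver level holds steady long after the S/N has fallen below the ~8 dB tuning threshold, which is the whole point of the argument above.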
Lodgenet uses Alaska dishes on all its properties even though doing so costs it probably around another $1,500 or more than putting in a residential dish. I expect to be doing so when I start upgrading my hotel headends to HDTV.
Getting back to berniec's situation, have there been reports here of people losing Ka signals to rain fade that are comparable to the reports of losing Ku signals to rain fade? I figure that the Alaska dish might have three to five dB more gain than the residential dishes. I have always figured that rain fade was proportional to rain density, so I would figure that if you had 3 dB more gain, then you would be able to withstand twice the rain intensity. To people in Florida, it is often worth the extra expense to mitigate rain fade, but for most of the rest of the country, I don't think it is much of a problem.
There is a rain attenuation model called the Crane model that incorporates historic rain intensity information; it might be useful to someone who wants to make a career out of estimating rain fade, but I don't think any information that anyone can learn about it on the internet will enable them to develop a more informed opinion on just how much less outage time one's satellite system would incur for any "X" improvement in antenna gain.
There are a few downsides that will have to be taken into consideration before anyone takes it upon themselves to replace a slimline with the AK/HI dish.
1) Don't expect it to be supported by DirecTV or any HSPs. They will not have replacement parts or any training/experience with those dishes.
2) Not covered under OTARD in the lower 48.
Installing one of these puts one clearly in the DIY/SIY territory.
Yes, some will argue that it's approved equipment, blah, blah, blah. True, but not for the lower 48.
^ AntAltMike [sorry but even my eyes started to get crossed].
Before they did though:
Yes, input "gain" is always the best place to start/look. Gain & noise are affected here the most.
"With that said": your dBmV isn't correct. The closest I can guess is you meant dBm.
0 dBm = 49 dBmV, so -20 dBmV is "some 60 dB" off.
The SWM output is -30 dBm or 19 dBmV, or 79 dBuV.
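For anyone who wants to check the conversion themselves (75 ohm impedance assumed, as for this coax):

```python
import math

def dbm_to_dbmv(dbm: float, impedance_ohms: float = 75.0) -> float:
    """Convert a power level in dBm to dBmV for a given impedance."""
    watts = 10 ** (dbm / 10) / 1000
    volts = math.sqrt(watts * impedance_ohms)
    return 20 * math.log10(volts * 1000)  # dB relative to 1 mV

print(round(dbm_to_dbmv(0)))    # ~49 dBmV
print(round(dbm_to_dbmv(-30)))  # ~19 dBmV; add 60 for ~79 dBuV
```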
[Now back to rainfade]
While west coast weather isn't the same as east coast, I live on the west slope of the Sierras and "we get rain big time". I've had both an AT-9 & AU-9, and only had rainfade for "a few mins" and then only during the heaviest rain/hail storms.
My systems losses are ~ 25 dB from dish to the farthest receiver.
If the larger dish would make "a large improvement" to rainfade, then I'd expect to see a lot more being used, other than where they were designed for [Hawaii & Alaska where they're at the edges of the SAT beams].
I just spent half an hour poking around Sonora Design's website, and I find it remarkable that they have spent so much effort concocting hundreds of web pages that contain so little technical information.
I see one working model on page 4 of their residential tutorial in which a discrete SWM8 has a real-world input level (after 100 feet of coax loss) of -41 dBm and a controlled output of -31 dBm, but it doesn't say what the range of the AGC is. I can only determine from that drawing that it is at least 10 dB.
On page 5 of that tutorial, the single output of a dish that they call SLS5 is said to be -31 dBm, but then on page 9, the same dish has an output level of -21 dBm and the downstream signal levels are consistent with a launch level of -21 dBm, which seems to rule out the possibility that the -21 dBm figure was simply a typographical error. The only difference between the systems shown on page 5 and page 9 is that the page 9 system uses a 24 volt power inserter rather than the 21 volt inserter used in the page 5 system. While I sometimes see signal strength go up a little when I switch the voltage source in my spectrum analyser from 13 volts to 18 volts, I just can't imagine a 3 volt increase in power source voltage resulting in a 10 dB increase in amplifier output signal level, even if there were no AGC (which there supposedly is in this LNB).
But regardless of what assumptions we make about raw input signal levels, if the single output dish has AGC in it, then any rain induced reduction in the satellite input signal results in a dB for dB reduction in the S/N ratio but no reduction in the LNB's output level until all of the reserve gain in the AGC is used up. Post #4 in this thread claims that the unit has 15 dB of available gain in it. If that is the case, then anyone could cover half the reflector, causing a 3 dB reduction in satellite signal, but having no measurable effect on the output level. You could cover 90% of the reflector (well, if you used a shield with a pie-shaped cutout), resulting in a 10 dB reduction in satellite signal strength with no measurable effect on the downlead signal strength, but with a 10 dB reduction in S/N ratio. I think you would have to cover 97% of the dish surface to get the signal strength down by 15 dB and to begin to cause a measurable transponder signal strength reduction in the downlead. It would be nice if someone here could actually try that, but doing so would require a field strength meter, which most forum contributors don't have.
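The covered-reflector arithmetic checks out, assuming the remaining aperture stays uniformly illuminated (a simplification; a real partial shield would also distort the illumination pattern):

```python
import math

def blockage_loss_db(covered_fraction: float) -> float:
    """Signal reduction from covering a fraction of the reflector,
    treating captured power as proportional to exposed area."""
    return -10 * math.log10(1 - covered_fraction)

print(round(blockage_loss_db(0.50), 1))  # 3.0 dB
print(round(blockage_loss_db(0.90), 1))  # 10.0 dB
print(round(blockage_loss_db(0.97), 1))  # ~15.2 dB, eating all 15 dB of AGC reserve
```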
Getting any information is difficult.
"For me" this kind of started with the rainfade thread. Why do some have rainfade when it's "cloudy" and others seem to only have "slight rainfade" under very heavy conditions?
One thing I noticed were some that only had "slight" were on a SWM system.
"I know" [from the CTO] that SWM has "gain". How much has taken some time to "hear" about. The range seems to be: an input from -15 dBm to -45 dBm will have the AGC output -30 dBm. Below -45 dBm, the output starts dropping.
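A sketch of that transfer curve as just described (the breakpoints are as reported above, not an official Sonora/DIRECTV spec, and I'm assuming the output falls dB-for-dB once the AGC runs out of gain):

```python
def swm_agc_output_dbm(input_dbm: float) -> float:
    """AGC transfer as described: inputs down to -45 dBm are leveled to
    -30 dBm out; below -45 dBm the output falls dB-for-dB (assumed)."""
    if input_dbm >= -45.0:
        return -30.0
    return -30.0 - (-45.0 - input_dbm)

print(swm_agc_output_dbm(-20))  # -30.0 (leveled)
print(swm_agc_output_dbm(-50))  # -35.0 (5 dB past the AGC floor)
```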
One poster had four receivers and was using two 8-way splitters [with some long cable runs], which would exceed the 25 dB.
Sonora has published some PDFs showing power levels on various layouts. "As we know", Sonora is in the business of selling amps, which may bias some of this.
"Minimum" levels for a receiver "they say" is -54 dBm for "full feature operation". What this really means I don't know, other than it looks like there is a budget of 25 dB of loss for the system.
This seems to "match" the 200' coax "limit".
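Running the quoted numbers (figures from the posts above, not official specs; the 13.5 dB per 8-way splitter is a typical catalog value I'm assuming, not a Zinwell spec):

```python
# Downstream budget from SWM launch to receiver minimum.
launch_dbm = -30.0        # SWM AGC output
receiver_min_dbm = -54.0  # quoted "full feature operation" minimum
budget_db = launch_dbm - receiver_min_dbm
print(budget_db)  # 24.0 dB to spend on splitters and coax

# Two cascaded 8-way splitters blow past it before any cable loss:
cascade_loss_db = 2 * 13.5
print(cascade_loss_db)  # 27.0 dB
```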
Zinwell has yet to reply to my request for specs on their WB68 & WB616, "but" from what we're seeing, they have "some gain" to overcome their insertion losses.
Broadcom specs the chips used in the receivers for -25 to -75 dBm.
From some of the reading in my "other thread", the C/N levels are in the 11-14 dB range. What the minimum requirement is I don't know [yet].
Clearly these systems are: cheap & noisy, and "just good enough" to work.
I'll leave you to your own lab testing.
Sorry I'm late to the party but I did this experiment 2 years ago.
The DTVA-H1.2 dish gives 7 dBm gain over the AU-9 and 10 dBm (which is also a 10X) gain over a Phase III.
Excel spreadsheet of the full data reports.
But that experiment doesn't separate antenna gain from LNB gain. Look at the C/N ratio improvements. The C/N of the AT-9 averages a little under 1 dB more than that of the Phase III, and the DTVA-H1.2 averages no more than 2 dB above that of the AT-9.
What are the dimensions of an AT-9? Aren't they about 24" by 36"? If so, then each LNB will effectively "see" about the same surface as a focused 24" reflector. A 1.2 meter dish has 4 times the surface area of a 24" dish. Thus, it has a theoretical limit of 6 dB increased gain, assuming the efficiency is the same, but larger dishes at Ku frequency tend to have less efficiency.
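The same ~6 dB limit drops out of the standard parabolic gain formula, G = eta * (pi * D / lambda)^2, with an assumed typical aperture efficiency (the efficiency cancels in the comparison anyway):

```python
import math

def dish_gain_dbi(diameter_m: float, freq_ghz: float,
                  efficiency: float = 0.6) -> float:
    """Theoretical parabolic dish gain in dBi.
    The 0.6 efficiency is an assumed, typical figure."""
    wavelength_m = 0.3 / freq_ghz  # c ~ 3e8 m/s
    return 10 * math.log10(
        efficiency * (math.pi * diameter_m / wavelength_m) ** 2)

# 1.2 m dish vs a focused 24" (0.61 m) aperture at Ku band (~12.45 GHz):
delta = dish_gain_dbi(1.2, 12.45) - dish_gain_dbi(0.61, 12.45)
print(round(delta, 1))  # ~5.9 dB, close to the 6 dB area-ratio limit
```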
I suspect that the modest C/N improvements may be partly explained by the fact that the stronger signal coming off the larger reflector is causing a little more intermodulation distortion in the LNBs than do the relatively weaker signals from the other dishes.
I just opened doctor j's table to its expanded version that includes the 99 and 103 spots that appear to be his (those with DirecTV signal percentages of 90% or greater), and the improvement in C/N ratio is just 0.7 and 1.2 dB on 99, and 0.6 and 0.4 dB on 103.
I don't always understand the specifics. I just do what I can and report the data as I see it.
In that thread there is also a pic of the three dishes side by side.
When comparing antenna gain, the term used is dB, not dBm. It is an expression of 10 times the logarithm of the power ratio (or 20 times the logarithm of the voltage ratio) of the signal received by the two antennas. The term dBm refers to a power level compared to one milliwatt. Another term often used is dBi, which compares the antenna gain to that of an isotropic radiator (a theoretical antenna which has uniform gain in all directions in free space).
Well, if we're getting down to this level: power isn't affected by impedance, but voltage is, so you must "normalize" to it [which for our coax is 75 ohms].
Before everyone grabs their calculator [yeah, right :lol: ] here's one: