Again someone had to PM me about this; as I've told you several times, you're being ignored.
I stand by my statement that SWiM splitters have no impact on the satellite signal readings and moreover that they cannot create a situation that manifests as rain fade. What you suggested is false and misleading.
I'm not misleading or suggesting false information. You on the other hand don't seem to grasp the concept.
The satellite signal (as reported in Satellite setup menu option) is uniquely a measure of the satellite downlink signal quality received at the LNB feed horn.
What this [quality] measures is the carrier-to-noise ratio, which, as long as the signal is ~10 dB above the noise floor, stays constant.
It does not evaluate the IF signal from the amplifier to the receiver. This is true of pretty much all consumer satellite setups (including broadband and SARS).
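The C/N point above is easy to state in code. A minimal Python sketch, using made-up carrier and noise levels (not anything measured off a DirecTV box):

```python
# Carrier-to-noise ratio: carrier level minus noise floor, both in dBm.
# The levels used below are illustrative, not DirecTV measurements.
def cn_ratio_db(carrier_dbm: float, noise_floor_dbm: float) -> float:
    return carrier_dbm - noise_floor_dbm

def quality_is_stable(carrier_dbm: float, noise_floor_dbm: float,
                      threshold_db: float = 10.0) -> bool:
    # Above ~10 dB of C/N, the reported "quality" reading stays flat.
    return cn_ratio_db(carrier_dbm, noise_floor_dbm) >= threshold_db

print(cn_ratio_db(-60.0, -75.0))        # 15.0 dB of headroom
print(quality_is_stable(-60.0, -75.0))  # True
```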
Well if you actually had DirecTV AND SWiM, you'd then know that the setup menu does have a screen to measure the IF. It's labeled "SWM" and shows the nine IFs.
The satellite downlink signal travels no farther than the LNB. Within the LNB, the signals are block converted down to the IF and retransmitted by the amplifier section. Adding or subtracting loads on the output of the IF amplifier doesn't impact the quality of the signal the dish reflects into the feed horn one iota.
Cable & splitter losses have nothing to do with the SAT feed to the LNB, and if the receiver were located at the LNB, those losses wouldn't be there. Since the receiver is what receives the signal, the losses down to the LNB plus the losses from the LNB to the receiver all add together. If they didn't, Sonora wouldn't have the business they do, and coax lengths and amps wouldn't matter. If "the load" on the LNB's IF didn't "impact" the quality of the signal, then coax runs could be hundreds of feet and the receivers would still work fine.
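The "losses all add" point can be sketched as a toy link budget in Python. The loss figures here are illustrative assumptions, not specs or measurements:

```python
# Toy IF link budget: in dB, every loss along the path simply subtracts.
# All numbers below are illustrative assumptions, not spec values.
def receiver_level_dbm(lnb_output_dbm: float, losses_db: list[float]) -> float:
    return lnb_output_dbm - sum(losses_db)

# LNB IF output, then 100' of coax (~6 dB assumed) and an 8-way
# splitter (~14 dB assumed insertion loss):
print(receiver_level_dbm(-30.0, [6.0, 14.0]))  # -50.0 dBm at the receiver
```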
If you placed a variable attenuator in the line ahead of a receiver, the satellite signal readings would remain constant until that receiver could no longer "see" IF signal. Even then, the satellite signal levels would remain more or less unchanged on any other receivers.
If you don't know or believe what I'm saying to be true, test it. I'm guessing you have access to or can assemble an unbalanced attenuator pad.
Since I do have DirecTV and SWiM, and know what I'm talking about, I have run tests like this to see how low the signal can get before the receiver starts showing the quality degrading. I found this to be at about -71 dBm, where the highest IF showed a 5% drop; I then ran out of attenuation to go farther. The receiver chip is spec'd to -70 dBm.
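That measurement boils down to a one-line threshold check. The -71 dBm figure is the one from the test above; treat it as a single data point, not a spec:

```python
# Quality degradation appeared at roughly -71 dBm in the test described
# above (the receiver chip itself is spec'd to -70 dBm).
DEGRADATION_DBM = -71.0

def shows_degradation(if_level_dbm: float) -> bool:
    return if_level_dbm <= DEGRADATION_DBM

print(shows_degradation(-60.0))  # False: plenty of margin
print(shows_degradation(-75.0))  # True: below the tuner's usable range
```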
As an aside, I would suggest that when you speak of gain or loss, you use percentages instead of dB, as those not intimate with the concept of decibels have trouble relating them to anything meaningful.
Percentages work well for bit error rates, but don't for losses, which is why decibels are used.
3 dB is easy @ 50%
6 dB is easy @ 25%
9 dB is easy @ 12.5%
10 dB is easy too @ 10%
20 dB isn't so much @ 1%
30 dB is sort of pointless @ 0.1%
Since we're dealing with dynamic ranges of 50 dB, percentages are basically meaningless.
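The table above is just the power-ratio formula, percent = 100 · 10^(-dB/10). A quick Python check (note that 3 dB is really 50.12%, rounded to 50% above):

```python
# Convert a power loss in dB to the percentage of power remaining.
def db_loss_to_percent_remaining(loss_db: float) -> float:
    return 100.0 * 10.0 ** (-loss_db / 10.0)

for db in (3, 6, 10, 20, 30, 50):
    # 3 dB -> ~50.12%, 10 dB -> 10%, 50 dB -> 0.001% (hence "meaningless")
    print(db, db_loss_to_percent_remaining(db))
```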
Now to bring this whole thing back to my post that you replied to:
Using a larger splitter than one needs adds more loss ahead of the receiver. Since the losses along the whole signal path add up, when the SAT signal into the LNB drops, the output of the LNB drops, and the losses between the LNB and the receiver pull the receiver's signal down further. Reducing the cable/splitter losses means the receiver will stay above its minimum detectable signal level longer.
Using a 4-way splitter instead of an 8-way splitter is much like having 50' less coax between the dish and the receiver.
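To put a rough number on that comparison: assuming ~3.5 dB difference between a 4-way and an 8-way splitter, and ~7 dB loss per 100' of RG6 at SWiM IF frequencies (both figures are my assumptions, so check your actual parts):

```python
# Illustrative splitter insertion losses in dB; real parts vary.
SPLITTER_LOSS_DB = {4: 10.5, 8: 14.0}
# Assumed RG6 coax loss at SWiM IF frequencies, in dB per foot.
COAX_DB_PER_FOOT = 7.0 / 100.0

extra_loss_db = SPLITTER_LOSS_DB[8] - SPLITTER_LOSS_DB[4]
equivalent_feet = extra_loss_db / COAX_DB_PER_FOOT
print(round(equivalent_feet))  # about 50 feet of coax
```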