During my testing with the AIM, I was more focused on signal levels and simulating rain fade.
With my second cup of coffee in hand, if "Lock" is related to CNR, it may be the AIM reporting that it has reached the minimum CNR, much like a menu in my TV.
Sorry for the belated response to this VOS.
According to the AIM manual I just downloaded, it describes "Lock" not specifically as minimum CNR but, in a call-out note, as:
"Lock" appears on the screen when the signal power is above the minimum level required to supply the IRD.
But I feel this may be an inaccurate reference for what is really CNR, since that would mean... what, a -60 to -65 dBm level achieves a "Lock" on the meter?
... LO offset may merely be the difference between the signal from the TP and the LO.
103a & 103b are using the same LO, but the offset [above] varies by 80-90 kHz.
The tuner may be where the offset is sorted out, but if it's too great, the tuner drifts to the adjacent channel.
Agreed, as the manual "effectively" describes the meaning of "offset" simply as:
Frequency offset of the transponder signal from its expected frequency
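Just to sanity-check that definition, the offset arithmetic can be sketched like this (the LO and transponder frequencies below are illustrative values I made up, not readings off the meter):

```python
# Rough sketch of the "LO offset" idea: the offset is simply the
# difference between where the transponder signal actually lands
# and where the tuner expects it after LO downconversion.
# All frequencies in MHz; the values below are illustrative only.

LO_FREQ = 10750.0        # assumed Ku-band LO frequency, MHz
TP_DOWNLINK = 12224.0    # assumed transponder downlink frequency, MHz

expected_if = TP_DOWNLINK - LO_FREQ   # IF the tuner expects to see
measured_if = expected_if + 0.085     # meter sees it ~85 kHz high

offset_khz = (measured_if - expected_if) * 1000.0
print(f"expected IF: {expected_if} MHz")   # ~1474 MHz
print(f"offset: {offset_khz:.0f} kHz")     # ~85 kHz
```

If 103a and 103b share one LO, any LO drift should move both offsets together, so an 80-90 kHz spread between them would point at the transponders themselves rather than the LNB.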
So does this mean, though, that the legacy LNB you chose to currently use on the dish was a trade-off: a higher frequency error (~3 MHz) and slightly lower power output in exchange for a better CNR (or noise figure)?
Hey, and BTW, comparing the AIM's figures for "SNR" (CNR) against the relative 0-100 signal levels on the meter in the various examples quoted in the manual always works out to 0.13 dB per point.
Do you think the AIM's 0-100 scale is equivalent to the receiver's (IRD's) 0-100 signal-strength screen?
So would a "100" on the Tp. SS screen mean (at least) a minimum 13 dB CNR?
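If that 0.13 dB-per-point relationship holds, converting between the 0-100 scale and an implied CNR is trivial; here's a sketch of the idea, assuming the scale really is linear (which is my guess from the manual's examples, not anything the manual states outright):

```python
# Sketch: convert the AIM/IRD 0-100 "signal strength" scale to an
# implied CNR, assuming a linear scale of 0.13 dB per point.
# The linearity and the constant are my assumptions, not documented fact.

DB_PER_POINT = 0.13

def scale_to_cnr_db(points: float) -> float:
    """Implied CNR (dB) for a 0-100 signal-strength reading."""
    return points * DB_PER_POINT

def cnr_db_to_scale(cnr_db: float) -> float:
    """0-100 reading that would correspond to a given CNR (dB)."""
    return cnr_db / DB_PER_POINT

print(scale_to_cnr_db(100))   # ~13 dB CNR for a "100" reading
print(cnr_db_to_scale(6.5))   # ~50 on the scale for 6.5 dB CNR
```

Of course this only works if both meters anchor their scales the same way, which is exactly the open question.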
It would be nice to attach a more definitive meaning to these common receiver SS readings than merely relative points on a 0-100 scale.