Insight: Your Test Tools Can Lie

You’d be surprised how often people make claims about equipment they’ve tested, or offer loose ‘evidence’ of testing, that has little or no basis beyond the ‘because that’s what the meter reads’ kind of blind assumption, or because the result happens to suit their point, purpose or belief.

I mean, in a general context, how often have you seen specs for cables or antennas that are (almost certainly, for what you actually receive) more snake oil bordering on science fiction than reality?

Well, even where you’re not the victim of anything malicious, your trusted sources (i.e. test gear, instrument displays and the like) can be just as misleading if their accuracy isn’t checked from time to time.

You’ll rarely see test gear used commercially by legitimate techs that hasn’t had its calibration checked at least once in its life. People who rely on instruments as part of a serious diagnostic process see the value in having calibration and general working condition assessed, because when your readings are rubbish you can create more unknowns than the actual fault on the equipment you’re testing. Add in the dreaded intermittent fault and you’ll be led up a false alley, miles from where you started.

Now, when you buy a new, sealed, boxed or packaged instrument, you have every reason to expect it to perform within its stated tolerances. In reality, if it’s only borderline within tolerance when you test it against a known good reference, don’t even entertain using it: go and get a replacement. Borderline results are a pretty good indicator that the accuracy is compromised, that it’s unlikely to be usable across its operational range, and that it won’t reliably tell you anything useful.

So it’s a given that any new or unknown instrument needs to be at least tested against a basic known condition or load, under the nominal conditions within which it’s supposed to be accurate.
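By way of illustration, here’s a minimal Python sketch of that basic sanity check: comparing what a meter reads on a known reference against a ±(% of reading + counts) style accuracy spec. The reference value, spec figures and reading are all invented for the example, not taken from any particular instrument’s datasheet.

```python
# Hypothetical sketch: check a meter reading of a known reference against a
# manufacturer-style accuracy spec. All values below are illustrative only.

def within_spec(reading, reference, pct_of_reading, counts, resolution):
    """True if |reading - reference| falls inside a +/-(% of reading + counts) allowance."""
    allowed = abs(reference) * pct_of_reading / 100.0 + counts * resolution
    return abs(reading - reference) <= allowed

# Example: a 10.000 V reference, an assumed spec of +/-(0.5% + 2 counts)
# on a range with 0.001 V resolution, and a meter that reads 10.037 V.
reading = 10.037
reference = 10.000

if within_spec(reading, reference, pct_of_reading=0.5, counts=2, resolution=0.001):
    print("Reading is inside the stated tolerance.")
else:
    print("Reading is outside tolerance - treat this meter as suspect.")
```

If the difference only just squeaks inside the allowance, treat it the way the previous paragraph suggests: as a warning rather than a pass.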

A great example I’ve come across from time to time is reflectometers/SWR units: a highly mixed bag, second only to RF power meters in the probability that what you’ve got is a liar of grand proportions.
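To see why a lying SWR unit matters, here’s a hedged sketch of the underlying arithmetic: SWR is derived from forward and reflected power, so even a modest error in the reflected-power detector noticeably skews the figure the meter displays. The power values are made up purely for illustration.

```python
import math

# Illustrative sketch (not tied to any particular meter): how an error in the
# reflected-power reading distorts the SWR a reflectometer reports.

def swr(forward_w, reflected_w):
    """SWR from forward/reflected power: SWR = (1 + g) / (1 - g), where g = sqrt(Pr/Pf)."""
    gamma = math.sqrt(reflected_w / forward_w)
    return (1 + gamma) / (1 - gamma)

forward = 100.0          # watts, assumed accurate
true_reflected = 1.0     # a genuine 1 W reflected gives SWR of about 1.22
misread_reflected = 4.0  # a miscalibrated detector reading 4 W gives SWR of 1.5

print(f"True SWR:     {swr(forward, true_reflected):.2f}")
print(f"Reported SWR: {swr(forward, misread_reflected):.2f}")
```

A few watts of detector error is all it takes to turn a perfectly healthy match into one that looks like it needs chasing, or vice versa.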

So it’s worth sourcing decent test gear, occasionally testing it against references you know are stable, and at least once in a blue moon having it properly verified. Compared with the time wasted chasing the consequences of bad readings, the cost of occasional testing and verification is negligible.

If your use of the gear is critical, only a fool saves money by skipping this year’s calibration. OK, you don’t necessarily have to go to that annual extreme unless there are fixed requirements governing your test gear, but working blind on the assumption that a twenty-year-old calibration label still reflects the current state of the instrument is simply crazy.

Clearly you’ll do what suits you, but at least be honest with yourself about the real state of what you’re using.

I only use a mixed bag of now quite old gear alongside known good modern gear because I’ve taken the time to ensure I can rely on it, and often the older, simpler, uncomplicated items are the best indicators. Whilst I use a multimeter or three of different types, I still keep a set of AVOs in sound working order because ultimately, when there’s a mysterious conflict in field readings, the truth as to why becomes apparent once the measurements are repeated with the AVO old-timers.
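For what that cross-check looks like in practice, here’s a small, purely illustrative sketch: repeat the same measurement on each instrument and flag anything that sits well away from the rest. The meter names, readings and tolerance are invented for the example.

```python
# Hedged sketch of the cross-check habit described above: the same point
# measured on several instruments, flagging any reading that disagrees with
# the consensus by more than an agreed allowance. All figures are made up.

readings = {
    "DMM A":          13.81,  # volts
    "DMM B":          12.90,  # the suspect one
    "AVO (analogue)": 13.79,
}
tolerance = 0.25  # volts - a generous allowance for differences between meters

values = sorted(readings.values())
median = values[len(values) // 2]

for name, value in readings.items():
    flag = "" if abs(value - median) <= tolerance else "  <-- disagrees, investigate"
    print(f"{name:15s} {value:6.2f} V{flag}")
```

The point isn’t that the script tells you which meter is right; it simply makes the disagreement impossible to ignore, so you go and find out why before trusting any of the numbers.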

Likewise, as well as getting into the test-and-evaluate habit with your test gear, make sure you’re confident you’re actually using it properly; it’s amazing how many people never really learnt how to use their instruments correctly.

Ultimately, if you are going to sign off a job or project as sound, make sure it is - even down to being certain the references and test gear used were trustworthy. If in doubt, repeat and evaluate.