Insulation testers typically offer a discrete set of selectable test voltages, with 250 V, 500 V, and 1000 V being common choices. Until recent improvements in microprocessor technology, it was largely assumed that the selected voltage was actually being realized across the test item. That may not have been so. Working in opposition to the test voltage is the current being drawn, and since these instruments are used to test insulation, the available current (typically only a couple of milliamps) is restricted for safety and economy.
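By Ohm's law, a current-limited source can only sustain its full output voltage above a certain load resistance. The sketch below (using an assumed 1 mA compliance current, consistent with the "couple of milliamps" figure above) computes that minimum resistance:

```python
def min_resistance_for_full_voltage(v_selected, i_max):
    """Return the lowest load resistance (in ohms) at which a tester
    limited to i_max amps can still develop v_selected volts.
    Ohm's law: R_min = V / I."""
    return v_selected / i_max

# A 1000 V tester limited to an assumed 1 mA can only reach full
# voltage on loads of 1 megohm and above.
print(min_resistance_for_full_voltage(1000, 1e-3))  # 1000000.0
```

Note how this figure lines up with the megohm threshold discussed below: the current limit itself sets the resistance at which full test voltage becomes achievable.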
Poor load curve
If the load under test is breaking down, demanding more current than the tester can supply, the voltage collapses accordingly. It is generally recognized that insulation resistance should not be less than a megohm (one million ohms) in working electrical equipment. A well-designed, quality tester exhibits a sharp voltage rise up to about a megohm and holds that voltage through all higher resistance values; only when the insulation is no longer doing its job does the test voltage decay. A poorly designed tester, however, struggles to reach full voltage, so at lower insulation values (those most critical for making decisions) something far less than the selected voltage may be applied.

Until recently, the astute technician had to look for a load curve in the specifications in order to make this determination. Modern testers, however, may display the actual applied test voltage. This voltage tends to load up slightly, but should never load down. A 1000 V test may actually run at, say, 1025 V, but should never run at 995 V; that is, unless the test item is breaking down and should not be in service. Accordingly, test voltage accuracy specifications are typically on the order of -0%, +5%. On a "good" test item, the voltage may load up, but never down.
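The load-curve behavior and the -0%, +5% accuracy spec described above can be sketched with a simple model. The 1 mA compliance current is an assumed figure for illustration; real testers differ:

```python
def applied_voltage(v_selected, i_max, r_load):
    """Model of a current-limited tester: the voltage collapses once
    the load would draw more than the compliance current, so
    V_applied = min(V_selected, I_max * R_load)."""
    return min(v_selected, i_max * r_load)

def within_spec(v_applied, v_selected, low_pct=0.0, high_pct=5.0):
    """Check a reading against a typical -0%, +5% accuracy spec:
    the voltage may load up slightly, but never down."""
    low = v_selected * (1 - low_pct / 100)
    high = v_selected * (1 + high_pct / 100)
    return low <= v_applied <= high

# Good insulation (2 megohms): full selected voltage is sustained.
print(applied_voltage(1000, 1e-3, 2_000_000))  # 1000.0
# Breaking-down insulation (100 kilohms): voltage collapses.
print(applied_voltage(1000, 1e-3, 100_000))    # 100.0

# 1025 V on a 1000 V test loads up within +5%: acceptable.
print(within_spec(1025, 1000))  # True
# 995 V loads down below -0%: the test item is suspect.
print(within_spec(995, 1000))   # False
```

The model makes the article's point concrete: on a healthy load the curve is flat at the selected voltage, and any reading below the selected voltage signals a breaking-down test item rather than a tester fault.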