ADC calibration is a method that compensates for internal DMM gain error and allows you to trade off measurement speed against long-term accuracy.
For DCV and resistance measurements at 6½- or 7½-digit resolutions, NI recommends enabling ADC calibration for the greatest accuracy. For ACV and current measurements, or at resolutions of 4½–5½ digits, ADC calibration is not required for satisfactory performance. When ADC calibration is enabled, every measurement cycle includes an additional phase that acquires the value of the high-precision internal reference. Because any ADC gain drift affects the input signal and the reference equally, the drift is removed mathematically from every measurement, yielding the greatest precision.
The following figure illustrates the ADC calibration cycle. During this cycle, the input is disconnected from the ADC, and the precision DC reference is measured. This measurement consists of an ADC calibration Auto Zero (REF LO) and an ADC calibration HI (REF HI). The normalized value of the reference voltage is calculated as Vref = REF HI − REF LO.
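The gain-drift cancellation described above can be sketched numerically. This is an illustrative model only, assuming a simple multiplicative gain-drift term; the function and constant names are invented for this sketch and are not part of the instrument firmware:

```python
# Sketch of reference-normalized gain correction (illustrative only; the
# drift model and all names here are assumptions, not instrument internals).

NOMINAL_VREF = 5.0  # assumed nominal value of the precision DC reference, in volts

def normalize(raw_input, ref_hi, ref_lo):
    """Remove ADC gain drift by scaling against the measured reference.

    raw_input : ADC reading of the input signal
    ref_hi    : ADC calibration HI reading (REF HI)
    ref_lo    : ADC calibration Auto Zero reading (REF LO)
    """
    vref_measured = ref_hi - ref_lo       # Vref = REF HI - REF LO
    gain = vref_measured / NOMINAL_VREF   # drifted ADC gain relative to nominal
    return raw_input / gain               # drift cancels out of the result

# A 0.2% gain drift scales the input reading and the reference readings
# equally, so dividing by the measured gain recovers the true input value.
drift = 1.002
reading = normalize(2.5 * drift, NOMINAL_VREF * drift + 0.1, 0.1)
```

Because both reference readings share the same offset (REF LO), subtracting them also removes any ADC offset from the gain estimate, which is why the Auto Zero phase is part of the cycle.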
If your application demands speed over accuracy, you can disable ADC calibration.
|Note NI recommends that you run self-calibration before taking a 6½- or 7½-digit measurement with ADC calibration disabled.|
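One way to follow the note above is with the nidmm Python API. This is a hedged sketch, not a definitive procedure: the resource name "Dev1", the 10 V range, and the 6½-digit setting are placeholder assumptions, and running it requires NI-DMM driver software and an attached (or simulated) instrument:

```python
# Hedged sketch: self-calibrate, then take a fast DCV reading with ADC
# calibration disabled, using the nidmm Python API. Requires the NI-DMM
# driver; "Dev1" is a placeholder resource name for illustration.
import nidmm

def read_with_adc_cal_disabled(session):
    """Run self-calibration, then take a DCV reading with ADC calibration off."""
    session.self_cal()  # recommended before a 6.5/7.5-digit reading with ADC cal off
    session.configure_measurement_digits(
        nidmm.Function.DC_VOLTS, range=10, resolution_digits=6.5)
    session.adc_calibration = nidmm.ADCCalibration.OFF  # trade accuracy for speed
    return session.read()

if __name__ == "__main__":
    with nidmm.Session("Dev1") as session:  # placeholder resource name
        print(read_with_adc_cal_disabled(session))
```

Setting `adc_calibration` back to `nidmm.ADCCalibration.AUTO` restores the driver's default behavior, in which the driver chooses whether to run the calibration phase based on the configured function and resolution.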