NI 4050/4060 Calibration Procedure

Calibration Executive 3.5.2 Help

Edition Date: June 2014

Part Number: 374563A-01


This topic contains information for calibrating the National Instruments NI 4050/4060 using Calibration Executive.

Calibration Executive Procedure Features:

  • Adjustment: supported
  • Manual Mode: supported
  • Selectable Test Points: unsupported

Approximate Test Time: 100 minutes


Test Equipment

The following table lists the test equipment required to calibrate the NI 4050/4060.

Instrument: Calibrator
Recommended Model: Fluke 5700A
Requirements: If this instrument is unavailable, use a high-precision voltage source that is four times as accurate as the analog-to-digital converter (ADC) on the device being calibrated, with at least 10 ppm (0.001%) accuracy for DC voltage specifications. Do not calibrate these devices using a Fluke 55xxA calibrator. The AC voltage test requires an output of 219 V at 30 Hz, which the Fluke 55xxA calibrators cannot supply; use only a Fluke 5700A/5720A.

Instrument: Chassis
Recommended Model: NI PXI-1042, NI PXI-1042Q
Requirements: Use with PXI modules.


Note  The Calibration Executive procedure runs in automated mode if you use IVI-supported instruments.


Connectors

For current measurements on the NI 4050, use an NI 4050 cable adapter and CSM-200 mA current shunt module, shown in the following figure.

Note  Current verification is optional.

Test Conditions

The following setup and environmental conditions are required to ensure the NI 4050/4060 meets published specifications.

  • Keep connections to the NI 4050/4060 short. Long cables and wires act as antennae, picking up extra noise that can affect measurements.
  • Keep relative humidity between 10% and 90%, noncondensing, or consult the DMM documentation for the optimum relative humidity.
  • Maintain the temperature between 18 and 28 °C.
  • Allow a warm-up time of at least 30 minutes to ensure that the measurement circuitry of the DMM is at a stable operating temperature.
  • (PXI) Ensure that the PXI chassis fan speed is set to HIGH, that the fan filters are clean, and that the empty slots contain filler panels.

Calibration Temperatures

To account for temperature changes, the tested specifications include the effects of temperature drift. For NI 4050/4060 DMMs, the following temperature drifts are valid:

  • 24-hour range—calibration temperature ±1 °C
  • 1-year range—calibration temperature ±10 °C
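The two drift bands above can be expressed as simple temperature ranges around the calibration temperature. The following sketch illustrates this; the 23 °C calibration temperature is a hypothetical example, not a value from this procedure:

```python
# Compute the valid temperature bands implied by the calibration
# temperature drift specifications. The calibration temperature used
# here (23 °C) is an illustrative assumption.

def drift_range(cal_temp_c, drift_c):
    """Return the (low, high) temperature band, in °C, around the
    calibration temperature for a given allowed drift."""
    return (cal_temp_c - drift_c, cal_temp_c + drift_c)

cal_temp = 23.0  # example calibration temperature in °C

day_band = drift_range(cal_temp, 1.0)    # 24-hour specs: ±1 °C
year_band = drift_range(cal_temp, 10.0)  # 1-year specs: ±10 °C

print(day_band)   # (22.0, 24.0)
print(year_band)  # (13.0, 33.0)
```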

Device Setup

To set up the NI 4050/4060 for calibration, complete the following steps:

  1. Install the NI 4050/4060 in the host computer according to the instructions in the NI Digital Multimeters Getting Started Guide.
  2. Configure the hardware using Measurement & Automation Explorer (MAX).
  3. Launch the Calibration Executive procedure and complete the setup wizard.
  4. Follow any further instructions you receive from Calibration Executive.
Note  If the NI 4050/4060 fails after calibration, refer to the Troubleshooting Guidelines section. If the device still fails after you complete the troubleshooting procedures, return it to NI for repair or replacement.

Troubleshooting Guidelines

This section describes common problems you might encounter when calibrating an NI 4050/4060.

If the NI 4050/4060 is not recognized in MAX, make sure that you followed the configuration guidelines. If MAX fails to recognize the NI 4050/4060 after you reconfigure, contact National Instruments technical support.

If the NI 4050/4060 fails after calibration, try the following:

  1. Check the connections and run the Calibration Executive procedure again.
  2. If the calibration still fails after step 1, input or output the failed test points using the test panel in MAX or the NI-DMM Soft Front Panel, located at Start»All Programs»National Instruments»NI-DMM»NI-DMM Soft Front Panel. For example, if the analog input (AI) failed at 1.99 VDC, input that value from the calibrator:
    1. Launch the test panel in MAX.
    2. Set the Input Limits to ±2.00 V and set the Data Mode to DC voltage.
    3. Check the Average Reading indicator.

If the Average Reading falls outside the range of limits shown in the calibration report after you complete steps 1 and 2, contact National Instruments for repair or replacement. If the Average Reading falls within the limits, you can change the calibration report to indicate the passing value.
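The pass/fail decision described above is a simple bounds check against the limits printed in the calibration report. The sketch below illustrates it; the limit values are placeholders for illustration, not actual NI 4050/4060 specifications:

```python
# Decide whether a re-measured Average Reading falls within the limits
# shown in the calibration report. The limits used below are
# hypothetical placeholders; take the real values from the report.

def within_limits(reading, low_limit, high_limit):
    """Return True if the reading passes (inclusive bounds)."""
    return low_limit <= reading <= high_limit

# Hypothetical example: the AI test point at 1.99 VDC, with report
# limits of 1.9898 V to 1.9902 V.
print(within_limits(1.99, 1.9898, 1.9902))  # True  -> passing value
print(within_limits(1.9910, 1.9898, 1.9902))  # False -> return for repair
```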

Note  If you have Microsoft Access 2000 or later installed on the computer, you can also modify the database containing the test results. By default, this database is located in Program Files\National Instruments\Calibration Executive\Databases\Calibration Reports.mdb.

Test Limit Equations

The following test limits are derived from the published specifications.

DC Voltage

TestLimits = TestValue ± (TestValue * % of Reading + Offset µV)

DC Current

TestLimits = TestValue ± (TestValue * % of Reading + Offset µA)

AC Voltage

TestLimits = TestValue ± [TestValue * (% of Reading + Frequency dependent Error) + Offset mV]

AC Current

TestLimits = TestValue ± [TestValue * (% of Reading + Frequency dependent Error) + Offset mA]

Resistance

TestLimits = TestValue ± (TestValue * % of Reading + Offset Ω)
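All of the equations above share the same form: a tolerance equal to a percentage of the reading plus a fixed offset. The following sketch evaluates the DC voltage case; the percentage and offset values are illustrative assumptions, not the device's published specifications:

```python
# Sketch of the DC voltage test-limit equation:
#   TestLimits = TestValue ± (TestValue * % of Reading + Offset µV)
# The specification numbers passed in below are placeholders; take the
# real % of Reading and Offset values from the published specifications.

def dc_volts_limits(test_value, pct_of_reading, offset_uv):
    """Return (low, high) test limits in volts.

    pct_of_reading is a percentage (0.003 means 0.003 %);
    offset_uv is the fixed offset term in microvolts."""
    tolerance = abs(test_value) * pct_of_reading / 100.0 + offset_uv * 1e-6
    return (test_value - tolerance, test_value + tolerance)

# Hypothetical 1.99 V test point with 0.003 % of reading and 20 µV offset:
low, high = dc_volts_limits(1.99, 0.003, 20.0)
print(low, high)  # tolerance is 79.7 µV around 1.99 V
```

The AC equations follow the same pattern, with the frequency-dependent error added to the % of Reading term before multiplying by the test value.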
