Setting Up Your Imaging System

NI Vision 2015 Concepts Help

Edition Date: June 2015

Part Number: 372916T-01


Before you acquire, analyze, and process images, you must set up your imaging system. Five factors characterize an imaging system: field of view, working distance, resolution, depth of field, and sensor size. The following figure illustrates these concepts.

1  Resolution
2  Field of View
3  Working Distance
4  Sensor Size
5  Depth of Field
6  Image
7  Pixel
8  Pixel Resolution
  1. Resolution—The smallest feature size on your object that the imaging system can distinguish
  2. Field of view—The area of the object under inspection that the camera can acquire
  3. Working distance—The distance from the front of the camera lens to the object under inspection
  4. Sensor size—The size of a sensor's active area, typically defined by the sensor's horizontal dimension
  5. Depth of field—The maximum object depth that remains in focus
  6. Image—The image under inspection
  7. Pixel—The smallest division that makes up a digital image
  8. Pixel resolution—The minimum number of pixels needed to represent the object under inspection

For additional information about the fundamental parameters of an imaging system, refer to the Application Notes section of the Edmund Industrial Optics Optics and Optical Instruments Catalog, or visit the Edmund Industrial Optics website.

Acquiring Quality Images

The manner in which you set up your system depends on the type of analysis and processing you need to do. Your imaging system should produce images with high enough quality so that you can extract the information you need from the images. Five factors contribute to overall image quality: resolution, contrast, depth of field, perspective, and distortion.


Resolution

There are two kinds of resolution to consider when setting up your imaging system: pixel resolution and resolution. Pixel resolution refers to the minimum number of pixels you need to represent the object under inspection. You can determine the pixel resolution you need based on the smallest feature you need to inspect. Try to have at least two pixels represent the smallest feature. Use the following equation to determine the minimum pixel resolution required by your imaging system:

(length of object's longest axis / size of object's smallest feature) × 2

If the object does not occupy the entire field of view, the image size will be greater than the pixel resolution.
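The pixel-resolution equation above can be evaluated directly. The following sketch is illustrative only; the function name and the example dimensions are assumptions, not part of NI Vision:

```python
import math

def min_pixel_resolution(longest_axis, smallest_feature):
    """Minimum pixels along the object's longest axis, with at
    least two pixels spanning the smallest feature (per the
    equation above). Both arguments use the same length unit."""
    return math.ceil(longest_axis / smallest_feature) * 2

# A 100 mm part whose smallest feature is 0.5 mm wide:
# (100 / 0.5) * 2 = 400 pixels along the longest axis.
print(min_pixel_resolution(100, 0.5))  # 400
```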

Resolution indicates the amount of object detail that the imaging system can reproduce. Images with low resolution lack detail and often appear blurry. Three factors contribute to the resolution of your imaging system: the field of view, the camera sensor size, and the number of pixels in the sensor. When you know these three factors, you can determine the required focal length of your camera lens.

Field of View

The field of view is the area of the object under inspection that the camera can acquire. The following figure describes the relationship between pixel resolution and the field of view.

Relationship between Pixel Resolution and Field of View

Figure A shows an object that occupies the field of view. Figure B shows an object that occupies less space than the field of view. If w is the size of the smallest feature in the x direction and h is the size of the smallest feature in the y direction, the minimum x pixel resolution is

minimum x pixel resolution = (field of view in x direction / w) × 2

and the minimum y pixel resolution is

minimum y pixel resolution = (field of view in y direction / h) × 2

Choose the larger pixel resolution of the two for your imaging application.
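The per-axis calculation and the "choose the larger" rule can be sketched as follows; the function name and the example field-of-view and feature sizes are assumptions for illustration:

```python
def min_resolution_xy(fov_x, fov_y, w, h):
    """Per-axis minimum pixel resolution for a smallest feature of
    size w (x direction) and h (y direction) within the field of
    view, using two pixels per smallest feature."""
    res_x = (fov_x / w) * 2  # minimum x pixel resolution
    res_y = (fov_y / h) * 2  # minimum y pixel resolution
    return res_x, res_y

# 64 x 48 mm field of view; smallest feature 0.2 mm wide, 0.4 mm tall:
res_x, res_y = min_resolution_xy(64, 48, 0.2, 0.4)
print(max(res_x, res_y))  # 640.0 -- choose the larger of the two
```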

Sensor Size and Number of Pixels in the Sensor

The camera sensor size is important in determining your field of view, which is a key element in determining your minimum resolution requirement. The sensor's diagonal length specifies the size of the sensor's active area. The number of pixels in your sensor should be greater than or equal to the pixel resolution. Choose a camera with a sensor that satisfies your minimum resolution requirement.
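The requirement that the sensor's pixel count meet or exceed the pixel resolution is a simple per-axis comparison. A minimal sketch, with an assumed helper name and example sensor dimensions:

```python
def sensor_satisfies(sensor_px_x, sensor_px_y, required_x, required_y):
    """True if the sensor provides at least the required minimum
    pixel resolution in both directions."""
    return sensor_px_x >= required_x and sensor_px_y >= required_y

# A 640 x 480 sensor against a 600 x 450 pixel requirement:
print(sensor_satisfies(640, 480, 600, 450))  # True
```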

Lens Focal Length

When you have determined the field of view and the appropriate sensor size, you can decide which type of camera lens meets your imaging needs. A lens is defined primarily by its focal length. The relationship among focal length, sensor size, working distance, and field of view is as follows:

focal length = (sensor size × working distance) / field of view

If you cannot change the working distance, you are limited in choosing a focal length for your lens. If you have a fixed working distance and your focal length is short, your images may appear distorted. However, if you have the flexibility to change your working distance, modify the distance so that you can select a lens with the appropriate focal length and minimize distortion.
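The focal-length equation above can be computed directly. The function name and example dimensions below are assumptions for illustration:

```python
def required_focal_length(sensor_size, working_distance, field_of_view):
    """focal length = (sensor size * working distance) / field of view.
    All arguments share the same length unit, and the result is in
    that unit as well."""
    return sensor_size * working_distance / field_of_view

# 8.8 mm sensor, 500 mm working distance, 100 mm field of view:
print(required_focal_length(8.8, 500, 100))  # 44.0 (mm)
```

In practice, lenses come in standard focal lengths, so if your working distance is adjustable you would typically pick the nearest standard lens and then fine-tune the working distance.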


Contrast

Resolution and contrast are closely related factors contributing to image quality. Contrast defines the differences in intensity values between the object under inspection and the background. Your imaging system should have enough contrast to distinguish objects from the background. Proper lighting techniques can enhance the contrast of your system.

Depth of Field

The depth of field of a lens is its ability to keep objects of varying heights in focus. If you need to inspect objects with various heights, choose a lens that can maintain the image quality you need as the objects move closer to and farther from the lens.


Perspective

Perspective errors often occur when the camera axis is not perpendicular to the object you are inspecting. Figure A shows an ideal camera position. Figure B shows a camera imaging an object from an angle.

1  Lens Distortion
2  Perspective Error
3  Known Orientation Offset

Perspective errors appear as changes in the object's magnification depending on the object's distance from the lens. Figure A shows a grid of dots. Figure B illustrates perspective errors caused by a camera imaging the grid from an angle.

Perspective and Distortion Errors

Try to position your camera perpendicular to the object you are trying to inspect to reduce perspective errors. If you need to take precise measurements from your image, correct perspective error by applying calibration techniques to your image.


Distortion

Nonlinear distortion is a geometric aberration caused by optical errors in the camera lens. A typical camera lens introduces radial distortion, which causes points to appear farther from the optical center of the lens than they really are. Figure C illustrates the effect of distortion on a grid of dots. When distortion occurs, information in the image is misplaced relative to the center of the field of view, but the information is not necessarily lost. Therefore, you can undistort your image through spatial calibration.

