The Ways to Improve the Accuracy of an Industrial Vision Measurement System

I. Introduction

The measurement of the three-dimensional surface or contour of a workpiece is widely used in industry, scientific research, national defense and other fields. The inspection of automobile bodies, aircraft fuselages, ship hulls, turbine blades and other manufactured parts, especially the curved-surface inspection of large workpieces, has always been a key technical problem in production. Under workshop conditions, such workpieces are generally measured by the profiling method, but the measurable section is small and the accuracy is low. Under measuring-room conditions, the coordinate measuring machine (CMM) offers high precision, but data acquisition is slow, the measurement cost is high, and measuring large workpieces is difficult to realize. In view of these limitations of contact measurement, non-contact methods such as laser triangulation, moiré projection and industrial vision measurement have received more and more attention, and their applications are becoming increasingly widespread.

Industrial vision measurement (also called digital close-range photogrammetry) is a stereo vision measurement technology [1]. Its measurement system is simple in structure and easy to move; data acquisition is fast, operation is convenient, and the measurement cost is low. It has the potential for real-time three-dimensional measurement and is especially suitable for measuring three-dimensional spatial points, dimensions, or the contours of large workpieces.

II. Measurement principle

The two-dimensional image of a three-dimensional object can be obtained with a CCD camera; that is, a perspective transformation from the actual spatial coordinate system to the camera image-plane coordinate system is realized. From two (or more) two-dimensional images taken by cameras from different directions, the three-dimensional surface contour of the object, or the position and size of three-dimensional spatial points, can be measured.

For convenience of explanation, the object-space coordinate system is denoted o-xyz, and the coordinate system of each CCD image plane is denoted o-xy.

The perspective transformation relationship of the system is illustrated by taking two cameras as an example. As shown in Figure 1, P is an arbitrary object point in three-dimensional space. Let its object-space coordinates be P(x, y, z), and let its image-point coordinates on the CCD image planes of camera I and camera II be P1(x1, y1) and P2(x2, y2) respectively.

Figure 1 object space coordinate system and image plane coordinate system of dual cameras

For camera I, the transformation relationship between image-point coordinates and object-point coordinates is [2]

$$
w_1 \begin{bmatrix} x_1 \\ y_1 \\ 1 \end{bmatrix} =
\begin{bmatrix} a_1 & a_2 & a_3 & a_4 \\ a_5 & a_6 & a_7 & a_8 \\ a_9 & a_{10} & a_{11} & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} \tag{1}
$$

where w1 is a non-zero scale parameter and a1, a2, …, a11 are the elements of the system transformation matrix; they are related to the placement of camera I and the parameters of imaging system I, and can be determined through system calibration.

For camera II, the transformation relationship between image-point coordinates and object-point coordinates is

$$
w_2 \begin{bmatrix} x_2 \\ y_2 \\ 1 \end{bmatrix} =
\begin{bmatrix} b_1 & b_2 & b_3 & b_4 \\ b_5 & b_6 & b_7 & b_8 \\ b_9 & b_{10} & b_{11} & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} \tag{2}
$$

where w2 is a non-zero scale parameter and b1, b2, …, b11 are the elements of the system transformation matrix, which are related to the placement of camera II and the parameters of imaging system II, and can also be determined through system calibration.

Formula (1) and formula (2) can be rewritten respectively as

$$
\begin{cases}
a_1 x + a_2 y + a_3 z + a_4 - a_9 x_1 x - a_{10} x_1 y - a_{11} x_1 z = x_1 \\
a_5 x + a_6 y + a_7 z + a_8 - a_9 y_1 x - a_{10} y_1 y - a_{11} y_1 z = y_1
\end{cases} \tag{3}
$$

$$
\begin{cases}
b_1 x + b_2 y + b_3 z + b_4 - b_9 x_2 x - b_{10} x_2 y - b_{11} x_2 z = x_2 \\
b_5 x + b_6 y + b_7 z + b_8 - b_9 y_2 x - b_{10} y_2 y - b_{11} y_2 z = y_2
\end{cases} \tag{4}
$$

where A = [a1, a2, …, a11]^T and B = [b1, b2, …, b11]^T. The ai and bi add up to 22 unknown parameters. Each known target point and its image points on the two CCD image planes yield four linear equations, so at least 6 known targets are required to solve for the 22 unknowns. Using 6 or more known targets, these parameters can be obtained from the above equations. This is the first step of the measurement work, called system calibration: computing the transformation matrices A and B of the measurement system formed by the two cameras.
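The calibration step just described is a linear least-squares problem. The following is a minimal sketch of it in Python; the function name and data layout are illustrative assumptions, not part of the original system:

```python
import numpy as np

def calibrate_dlt(obj_pts, img_pts):
    """Solve the 11 transformation parameters a1..a11 of one camera.

    obj_pts: (N, 3) known target-point coordinates (x, y, z), N >= 6
    img_pts: (N, 2) corresponding image-plane coordinates
    Each point contributes the two linear equations of formula (3);
    with N >= 6 the over-determined system is solved by least squares.
    """
    rows, rhs = [], []
    for (x, y, z), (u, v) in zip(obj_pts, img_pts):
        # a1 x + a2 y + a3 z + a4 - a9 u x - a10 u y - a11 u z = u
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z])
        rhs.append(u)
        # a5 x + a6 y + a7 z + a8 - a9 v x - a10 v y - a11 v z = v
        rows.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z])
        rhs.append(v)
    a, *_ = np.linalg.lstsq(np.asarray(rows, float),
                            np.asarray(rhs, float), rcond=None)
    return a  # [a1, ..., a11]
```

Running the same routine on the image points recorded by the second camera yields b1…b11; together the two vectors form the 22 calibration parameters.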

The second step of measurement is to obtain the object-space three-dimensional coordinates (x, y, z) of an unknown point P from the image-point coordinates P1(x1, y1) and P2(x2, y2) of the measured point on the two CCD image planes.

From equation (1) and equation (2), we can also obtain

$$
\begin{cases}
(a_1 - a_9 x_1)\,x + (a_2 - a_{10} x_1)\,y + (a_3 - a_{11} x_1)\,z = x_1 - a_4 \\
(a_5 - a_9 y_1)\,x + (a_6 - a_{10} y_1)\,y + (a_7 - a_{11} y_1)\,z = y_1 - a_8
\end{cases} \tag{5}
$$

$$
\begin{cases}
(b_1 - b_9 x_2)\,x + (b_2 - b_{10} x_2)\,y + (b_3 - b_{11} x_2)\,z = x_2 - b_4 \\
(b_5 - b_9 y_2)\,x + (b_6 - b_{10} y_2)\,y + (b_7 - b_{11} y_2)\,z = y_2 - b_8
\end{cases} \tag{6}
$$

From the above equations we can obtain the three unknowns x, y and z, namely the three-dimensional coordinate values of point P (four equations in three unknowns, solved for example by least squares). In this way, the two two-dimensional images captured by the two cameras can be used to measure the three-dimensional contour or dimensions of the object point by point.

The industrial vision measurement experimental system is shown in Figure 2. The system consists of two CCD cameras (resolution 510 (H) × 492 (V) pixels), a target with more than 6 reference points of known coordinates, an image acquisition card (3 channels, 8 bits, 256 gray levels, resolution 512 × 512 pixels), the measured object, a PC and the corresponding software.

Figure 2 industrial vision measurement experimental system

The target is used to calibrate the two-camera system; the spatial coordinates of each target point on it are known in advance through accurate measurement. The image acquisition card feeds the image signal received by the CCDs into the computer for processing.

By substituting the coordinates of the feature points on the target and the coordinates of the corresponding image points on the two CCD image planes into equations (3) and (4), the transformation matrices of the two imaging systems can be determined, i.e. the system calibration is completed. The points to be measured on the workpiece can be generated by attaching highly reflective marks or by projecting light spots with a laser. The coordinates of each point to be measured on the two image planes are substituted into equations (5) and (6), and its coordinates (x, y, z) in the actual spatial coordinate system are determined from the transformation matrices of the system. If the sampling density is high enough, the workpiece surface can be measured and reconstructed point by point. The computer controls the above components and carries out the image acquisition, calibration and coordinate computation.

III. Factors affecting accuracy and ways to improve it

In more than ten years of mature application of industrial vision measurement to three-dimensional point-position and distance measurement, the errors affecting the accuracy of the system have been found to arise mainly in the two steps of calibration and measurement. This paper analyzes the main factors affecting the accuracy of the system and the ways to improve it.

1. Resolution of the CCD camera and image acquisition card

Due to the large imaging magnification of the industrial vision measurement system, the resolution of the CCD camera and the image acquisition card has a great impact on the measurement accuracy of spatial points. A stereo vision system used for measurement should therefore adopt the highest possible resolution. The positioning error of image target points caused by the limited hardware resolution can be further reduced through software compensation, reaching sub-pixel level.

2. Distortion of the imaging system

The measurement model of the industrial vision measurement system assumes that the lens imaging is ideal; the objective distortion of the actual imaging system therefore leads to calibration errors. The lens distortion error can be expressed as

$$
\begin{cases}
\delta_x = x\,(k_1 r^2 + k_2 r^4) + p_1 (r^2 + 2x^2) + 2 p_2 x y \\
\delta_y = y\,(k_1 r^2 + k_2 r^4) + p_2 (r^2 + 2y^2) + 2 p_1 x y
\end{cases} \tag{7}
$$

where ki and pi are the parameters related to lens distortion, and r is the distance from the image-plane point to the optical axis.

Accurate system calibration parameters can be solved by compensating the errors of x and y in formulas (1) and (2) (with compensation values δx, δy) and then solving the resulting nonlinear equation system. For the dual-camera system, the measurement model [3] that compensates the distortion error requires 32 parameters in total, so at least 8 known feature points are required for calibration.
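The distortion terms of formula (7) follow the common radial-plus-tangential model; a small sketch of that model is given below. The exact series used in [3] may differ, so treat this as an illustration, not the paper's implementation:

```python
def lens_distortion(x, y, k1, k2, p1, p2):
    """Radial + tangential distortion corrections (dx, dy) of formula (7).

    x, y: image-plane coordinates measured from the optical axis
    k1, k2: radial distortion coefficients; p1, p2: tangential coefficients
    """
    r2 = x * x + y * y                 # r^2, squared distance to the axis
    radial = k1 * r2 + k2 * r2 * r2    # k1*r^2 + k2*r^4
    dx = x * radial + p1 * (r2 + 2 * x * x) + 2 * p2 * x * y
    dy = y * radial + p2 * (r2 + 2 * y * y) + 2 * p1 * x * y
    return dx, dy
```

Subtracting (dx, dy) from the measured image coordinates before substituting them into equations (3)–(6) yields the distortion-compensated model.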

3. Target design

For one camera, equations (3) and (4) represent a straight line determined by the object-space point (x, y, z) and its image point. Six feature points thus define six straight lines passing through the projection center of the lens. Because of nonlinearities such as distortion, the six lines do not meet exactly at one projection center, but form a loose bundle near it; the more reasonably the calibration points are arranged, the tighter the bundle and the smaller the influence of the nonlinearity. The calibration accuracy of the system is greatly improved when no three calibration points are coplanar with the lens projection center; in general it is sufficient to ensure that no three points on the target are collinear [4].
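A quick way to screen a candidate target layout is to verify that no three points are collinear, a common condition for well-conditioned calibration. A minimal check (the function name is an assumption):

```python
import itertools
import numpy as np

def no_three_collinear(points, tol=1e-9):
    """Return True if no three of the given 3D target points are collinear.

    Uses the cross product: three points are collinear exactly when the
    two difference vectors they span have a (near-)zero cross product.
    """
    pts = np.asarray(points, float)
    for i, j, k in itertools.combinations(range(len(pts)), 3):
        v1, v2 = pts[j] - pts[i], pts[k] - pts[i]
        if np.linalg.norm(np.cross(v1, v2)) < tol:
            return False
    return True
```

The check is O(N³) in the number of points, which is negligible for the 6–10 points a target typically carries.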

If possible, the distribution and extent of the target should match the range and shape of the measured object, because the closer the measured object is to the target, the higher the measurement accuracy. In general at least 6 target points are required, and more than 8 are required when compensating lens distortion. Adding target points appropriately can further improve the calibration accuracy.

At present, circular targets are generally selected, either black spots on a white background or white spots on a black background. The target should be stable and resistant to deformation, and a coordinate measuring machine should be used to measure the spatial positions of the target points accurately in advance; this measurement accuracy directly affects the calibration accuracy of the system.

4. The optical-axis intersection angle of the two cameras

The imaging system follows the perspective projection principle: all points on a spatial straight line through the projection center are imaged at the same image point. If the positioning error of a target-point center on the image plane lies within a circular area, it corresponds to a conical region in object space. The depth-direction measurement error of the object point caused by the image-plane positioning errors of the two cameras is therefore determined by the overlap of the two cones, as shown in Figure 3.

Figure 3 Schematic diagram of depth error analysis

In the figure, O is the spatial object point, A and A′ are the imaging planes of the two cameras, P and P′ are the two projection centers, θ is the intersection angle of the two optical axes, f is the camera focal length, δ is the positioning error of the target-point center, and L is the distance from the object point to the camera projection center. According to the geometric relationship, the depth measurement error is approximately

$$
\Delta z \approx \frac{\delta L}{f \sin(\theta/2)} \tag{8}
$$

For example, with δ = 0.001 mm, f = 16 mm, L = 2000 mm and θ = 60°, Δz = 0.25 mm.

With the increase of the intersection angle θ of the two optical axes, the depth measurement error caused by the positioning error of the image-plane target-point center decreases. According to the magnification formula, the error in the x and y directions is

$$
\Delta x = \Delta y = \frac{\delta L}{f} \approx 0.13\ \mathrm{mm}
$$

It can also be seen from equation (8) that the depth measurement error is proportional to the object distance L: the farther the object, the greater the error.
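The error formulas above are easy to evaluate numerically; the following short sketch (function names are illustrative) reproduces the worked example:

```python
import math

def depth_error(delta, L, f, theta_deg):
    """Depth error of eq. (8): dz = delta * L / (f * sin(theta / 2)).

    delta: image-plane positioning error (mm); L: object distance (mm);
    f: focal length (mm); theta_deg: optical-axis intersection angle (deg)
    """
    return delta * L / (f * math.sin(math.radians(theta_deg) / 2))

def lateral_error(delta, L, f):
    """In-plane (x, y) error from the magnification formula: delta * L / f."""
    return delta * L / f
```

With δ = 0.001 mm, L = 2000 mm, f = 16 mm and θ = 60°, `depth_error` gives 0.25 mm and `lateral_error` gives about 0.13 mm, matching the example; doubling θ toward 120° shrinks the depth error further, as the text notes.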

5. Positioning of target points on the image plane

In both system calibration and point-position measurement, the positioning accuracy of the image-spot centers of the target points on the image plane is the main factor affecting the final measurement accuracy. The factors affecting positioning accuracy include unsharp spot edges, uneven gray levels within the spot, and irregular spot shape; in addition, ambient light and circuit noise introduce image noise.

Image processing is usually used to sharpen the spot edges. The main algorithms for determining the center coordinates include the symmetry method, the geometric-center method, the energy-center method and quadric-surface fitting. The symmetry method is strongly affected by uneven spot intensity; the geometric-center method produces large centering errors when the spot edge is not determined accurately; the energy-center and quadric-surface-fitting methods can achieve sub-pixel centering accuracy, but are computationally expensive.
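The energy-center method amounts to a gray-level-weighted centroid over the spot window, which is how it reaches sub-pixel accuracy. A minimal sketch (assuming the spot has already been cropped to a small window):

```python
import numpy as np

def energy_center(spot):
    """Gray-weighted centroid (energy center) of an image spot.

    spot: 2D array of gray levels covering one target spot.
    Returns (cx, cy) in pixel units, with sub-pixel resolution, since the
    centroid is a continuous function of the gray levels.
    """
    spot = np.asarray(spot, float)
    total = spot.sum()
    ys, xs = np.mgrid[0:spot.shape[0], 0:spot.shape[1]]  # pixel index grids
    return (xs * spot).sum() / total, (ys * spot).sum() / total
```

In practice a background threshold is subtracted first so that image noise outside the spot does not bias the centroid.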

During measurement, the shape of the target or measuring point should be regular, and the ambient lighting should be as uniform as possible to minimize the effect of stray light. When the points to be measured are generated by laser illumination, the spot shape is sometimes irregular.

IV. Example of spatial point position measurement

The industrial vision measurement system shown in Figure 2 is used for spatial point-position measurement. The vertical distance between the measured object and the line connecting the two cameras is 2000 mm, and the intersection angle of the two optical axes is 53°. The system is calibrated with 6 targets (without considering lens distortion) and with 8 targets (with lens-distortion compensation), and the three-dimensional coordinates of several points are then measured. Each point is measured 10 times; the measured mean, standard deviation and deviation in the x, y and z directions, and the distance d from the measured point to the actual point, are given in the table below.

Table: Spatial point position measurement results

The measured results show that within a spatial range of 240 × 320 × 90 mm³, the maximum deviation between the measured and actual point positions is 0.2 mm, and 0.1 mm after compensating lens distortion; the measurement accuracy is significantly improved. When the measurement range is large and the measured point lies at the edge of the field of view, the measurement error increases.
