In modern mechanical engineering, gears are critical transmission components widely used in manufacturing, automation, and various industrial applications. Among them, the helical gear is particularly important for its smooth, quiet operation compared with spur gears. However, the helical gear's helix angle introduces challenges for precise measurement, especially with machine vision techniques: traditional measurement methods often struggle with shadows on the end-face images caused by the helix angle, which reduce accuracy. This paper addresses these issues by proposing an improved machine vision system and a novel contour feature point method for measuring key profile parameters of helical gears, such as the number of teeth, tip circle diameter, and root circle diameter. The goal is high-precision measurement, with absolute errors within 0.02 mm, achieved through advanced imaging hardware and robust image processing algorithms.
Helical gears are essential in applications requiring high torque and low noise, such as automotive transmissions, industrial machinery, and aerospace systems. Accurate measurement of helical gear parameters is crucial for quality control, performance optimization, and compliance with design specifications. Machine vision has emerged as a powerful tool for non-contact measurement, offering speed, automation, and repeatability. However, the unique geometry of helical gears, specifically their angled teeth, can cause optical distortions and shadows during image acquisition, which complicates edge detection and parameter extraction. Previous studies have focused on spur gears, and research on helical gears remains limited. This work aims to bridge that gap by developing a machine vision approach tailored to helical gear measurement.

In the literature, various machine vision methods have been proposed for gear measurement. For instance, some researchers have used image processing techniques to extract gear contours and fit them to circles or arcs for parameter calculation. Others have developed automated systems for gear inspection, but these often target spur gears and may not account for the helical gear’s complexities. Common issues include overlapping teeth in images, uneven lighting, and shadow effects from the helical angle. To mitigate these, approaches like gamma transformation, global low-light enhancement, and Otsu binarization have been employed, but they can lead to detail loss or incomplete contour extraction. This paper builds on these ideas by integrating high-resolution imaging hardware with optimized algorithms to overcome the shadow problem and enhance measurement accuracy for helical gears.
The core contribution of this work is a novel measurement system that combines a high-resolution dual-telecentric lens and a close-range backlight source to capture clear end-face images of helical gears. This setup minimizes shadows and ensures high-quality input for subsequent processing. Additionally, an Otsu-based image binarization method is applied to segment the gear from the background, effectively handling the shadow regions. For parameter measurement, a contour feature point method is introduced. This method involves connecting the gear center to the centroid of each tooth, extending these lines to intersect with the tip contour, and rotating them to intersect with the root contour. The intersection points are then used for circle fitting to determine the tip and root circle diameters. This approach reduces errors caused by contour irregularities and improves robustness compared to traditional methods like minimum enclosing circles.
To elaborate on the system design, the machine vision setup includes an industrial area-scan camera with a resolution of 4024×3036 pixels, paired with a dual-telecentric lens that offers minimal distortion and a large field of view. The lighting system uses a white LED backlight placed close to the helical gear to provide uniform illumination and reduce shadows. Experimental comparisons show that close-range backlighting significantly reduces shadow areas compared to distant sources, as demonstrated in earlier tests. The measurement process is divided into three modules: optimal exposure interval image acquisition, image processing, and parameter measurement. The exposure time is optimized through an automated routine that adjusts the time in steps, starting from 24000 μs with a step size of 1000 μs, to find the range where measurement errors are minimized. This ensures that the captured images have balanced brightness and contrast for accurate analysis.
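The exposure-sweep logic can be sketched as follows. This is a minimal sketch, not the authors' implementation: `find_optimal_exposure_interval` is an illustrative helper, and the per-exposure errors in `sweep` are hypothetical stand-ins for what would, in the real system, come from measuring a reference gear at each setting.

```python
def find_optimal_exposure_interval(errors_by_exposure, tolerance=0.02):
    """Return the (min, max) exposure range (in microseconds) whose
    absolute measurement errors stay within `tolerance` (mm).

    Assumes the acceptable exposures form one contiguous band, as
    observed experimentally; returns None if no setting qualifies.
    """
    good = sorted(t for t, err in errors_by_exposure.items() if err <= tolerance)
    if not good:
        return None
    return good[0], good[-1]

# Simulated sweep: start at 24000 us, step 1000 us (error values illustrative).
sweep = {24000: 0.051, 25000: 0.034, 26000: 0.018, 27000: 0.012,
         28000: 0.009, 29000: 0.008, 30000: 0.011, 31000: 0.015,
         32000: 0.019, 33000: 0.027, 34000: 0.045}
print(find_optimal_exposure_interval(sweep))  # (26000, 32000)
```

With these illustrative error values, the search reproduces the kind of contiguous optimal interval reported in the experiments.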
Image processing is a critical step in this methodology. First, the captured RGB image of the helical gear is converted to grayscale using a weighted average method to simplify computations. The grayscale transformation is given by:
$$ \text{Gray}(i,j) = 0.299 \times R(i,j) + 0.587 \times G(i,j) + 0.114 \times B(i,j) $$
where \((i,j)\) represents the pixel coordinates, and \(R(i,j)\), \(G(i,j)\), and \(B(i,j)\) are the red, green, and blue channel values, respectively. Next, bilateral filtering is applied to reduce noise while preserving edges. The bilateral filter weights pixels based on both spatial distance and intensity difference, as expressed by:
$$ h(i,j) = \frac{\sum_{k,l} f(k,l) w_s(k,l) w_r(k,l)}{\sum_{k,l} w_s(k,l) w_r(k,l)} $$
with spatial weight \(w_s(k,l) = \exp\left(-\frac{(i-k)^2 + (j-l)^2}{2\sigma_s^2}\right)\) and range weight \(w_r(k,l) = \exp\left(-\frac{|f(i,j) - f(k,l)|^2}{2\sigma_r^2}\right)\). Here, \(f(k,l)\) is the pixel value at \((k,l)\), \(\sigma_s\) and \(\sigma_r\) are standard deviations for spatial and range domains, and \(h(i,j)\) is the filtered output. After filtering, Otsu's method is used for image binarization, which automatically determines a threshold \(T\) to separate the helical gear from the background. This method maximizes the inter-class variance between the foreground and background pixels, yielding a binary image with minimal shadow interference. Small connected regions are removed using shape selection operators to isolate the gear's main body.
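The grayscale conversion and Otsu thresholding steps can be sketched in pure NumPy. This is a minimal illustration under stated assumptions, not the authors' implementation; the bilateral filtering step is omitted here (in practice a library routine such as OpenCV's `cv2.bilateralFilter` would run between the two steps shown), and the test image is a synthetic bimodal stand-in for a backlit gear.

```python
import numpy as np

def to_gray(rgb):
    """Weighted-average grayscale conversion: 0.299 R + 0.587 G + 0.114 B."""
    w = np.array([0.299, 0.587, 0.114])
    return np.rint(rgb.astype(np.float64) @ w).astype(np.uint8)

def otsu_threshold(gray):
    """Otsu's method: pick the threshold maximizing inter-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()                 # intensity probabilities
    omega0 = np.cumsum(p)                 # background class probability
    mu = np.cumsum(p * np.arange(256))    # cumulative mean
    mu_T = mu[-1]                         # global mean
    # sigma_B^2 = (mu_T * omega0 - mu)^2 / (omega0 * (1 - omega0))
    denom = omega0 * (1.0 - omega0)
    denom[denom == 0] = np.inf            # guard empty classes
    sigma_b2 = (mu_T * omega0 - mu) ** 2 / denom
    return int(np.argmax(sigma_b2))

# Synthetic "gear vs background" image: dark background, bright foreground.
img = np.zeros((64, 64, 3), dtype=np.uint8)
img[16:48, 16:48] = 200                   # bright square stands in for the gear
gray = to_gray(img)
T = otsu_threshold(gray)
binary = gray > T                         # foreground mask
```

On a backlit gear image the same threshold cleanly separates the silhouette from the illuminated background, which is why Otsu's method copes well with residual shadow regions.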
Edge detection is performed using sub-pixel accuracy algorithms to extract the helical gear’s contour. This contour serves as the basis for parameter measurement. The number of teeth is determined by creating a circular mask with a diameter equal to the average of the minimum enclosing circle and maximum inscribed circle diameters, then computing the difference with the gear region to count the teeth. For diameter measurement, the gear center is located using a caliper tool that fits circles to the inner hole. The contour feature point method then proceeds as follows: let the gear center be at coordinates \((x_c, y_c)\), and the centroids of the tooth regions be \((x_i, y_i)\) for \(i = 1, 2, \dots, Z\), where \(Z\) is the number of teeth. Lines are drawn from \((x_c, y_c)\) to each \((x_i, y_i)\), and extended to intersect the tip contour. The intersection points, denoted as \((x_{t,i}, y_{t,i})\), are collected. Similarly, each line is rotated clockwise by an angle of \(\pi / (2Z)\) radians to intersect the root contour, yielding points \((x_{r,i}, y_{r,i})\). These points are fitted to circles using least-squares fitting to estimate the tip and root circle diameters.
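As a minimal sketch of the feature-point step, the intersection of a center-to-centroid ray with a sampled contour can be approximated by matching polar angles about the gear center. The function name is illustrative and the circular tip contour below is synthetic; a real gear contour would come from the sub-pixel edge detection described above.

```python
import numpy as np

def ray_contour_intersection(center, through, contour):
    """Approximate where the ray from `center` through `through` meets a
    sampled contour: pick the contour point whose polar angle about the
    center is closest to the ray's angle."""
    cx, cy = center
    ray_ang = np.arctan2(through[1] - cy, through[0] - cx)
    ang = np.arctan2(contour[:, 1] - cy, contour[:, 0] - cx)
    diff = np.angle(np.exp(1j * (ang - ray_ang)))   # wrap to (-pi, pi]
    return contour[np.argmin(np.abs(diff))]

# Synthetic tip contour: circle of radius 22.1 around the center (0, 0).
theta = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
tip = np.column_stack([22.1 * np.cos(theta), 22.1 * np.sin(theta)])

# A tooth centroid lying at 30 degrees from the center.
centroid = np.array([10 * np.cos(np.pi / 6), 10 * np.sin(np.pi / 6)])
pt = ray_contour_intersection((0.0, 0.0), centroid, tip)
# pt lies on the tip contour in the centroid's direction.
```

Repeating this for all \(Z\) centroids yields the tip feature points \((x_{t,i}, y_{t,i})\); the rotated rays yield the root points the same way.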
The least-squares circle fitting minimizes the sum of squared distances from points to the circle. For a set of points \((x_j, y_j)\), the circle equation \((x - a)^2 + (y - b)^2 = r^2\) is solved for center \((a,b)\) and radius \(r\) by minimizing:
$$ \sum_{j} \left( (x_j - a)^2 + (y_j - b)^2 - r^2 \right)^2 $$
This approach ensures robust estimation even with noisy data. To validate the method, experiments were conducted on helical gears with known parameters. The exposure time was optimized as described, and measurements were compared against theoretical values and traditional methods. The results demonstrate the effectiveness of the proposed system.
In the experimental setup, multiple helical gear samples were used to test the measurement accuracy. The exposure time optimization revealed that the range of 26000 μs to 32000 μs yielded the smallest absolute errors for both tip and root circle diameters. Outside this range, errors increased due to over- or under-exposure. The following table summarizes the measurement results for a helical gear with theoretical tip diameter of 44.19 mm and root diameter of 35.19 mm, comparing the proposed contour feature point method with a traditional method based on minimum and maximum circles.
| Measurement Item | Theoretical Value (mm) | Traditional Method Value (mm) | Absolute Error (mm) | Proposed Method Value (mm) | Absolute Error (mm) |
|---|---|---|---|---|---|
| Tip Circle Diameter | 44.19 | 44.3077 | 0.1177 | 44.1716 | 0.0184 |
| Root Circle Diameter | 35.19 | 35.0860 | 0.1040 | 35.2076 | 0.0176 |
The table clearly shows that the proposed method reduces absolute errors to within 0.02 mm, significantly outperforming the traditional approach. This improvement is attributed to the contour feature point method’s ability to handle shadow-affected contours and provide more accurate intersection points for fitting. Additionally, the method was tested on a standard spur gear with a module of 2 and 18 teeth to verify its general applicability. The results are presented in the next table.
| Measurement Item | Theoretical Value (mm) | Traditional Method Value (mm) | Absolute Error (mm) | Proposed Method Value (mm) | Absolute Error (mm) |
|---|---|---|---|---|---|
| Tip Circle Diameter | 40.00 | 40.0904 | 0.0904 | 40.0159 | 0.0159 |
| Root Circle Diameter | 31.00 | 30.9025 | 0.0975 | 30.9846 | 0.0154 |
Again, the proposed method maintains errors below 0.02 mm, confirming its robustness for both helical and spur gears. The consistency in results underscores the advantage of using high-resolution imaging and optimized algorithms. Further analysis involves evaluating the impact of key parameters on measurement accuracy. For instance, the helical gear's helix angle can influence shadow formation, but the close-range backlight setup mitigates this effect. The Otsu binarization threshold \(T\) is computed dynamically from the image histogram, ensuring adaptability to different lighting conditions. The bilateral filter parameters \(\sigma_s\) and \(\sigma_r\) are set empirically to balance noise reduction and edge preservation; typical values are \(\sigma_s = 5\) and \(\sigma_r = 0.1\) for gear images.
To delve deeper into the contour feature point method, the rotation angle for root contour intersection is derived from the gear geometry. For a helical gear with \(Z\) teeth, the angular pitch is \(2\pi / Z\). Rotating the line by \(\pi / (2Z)\) ensures that the intersection occurs near the root region without interference from adjacent teeth. This angle can be adjusted based on the helical gear’s pressure angle or module, but in practice, \(\pi / (2Z)\) works well for standard gears. The mathematical formulation for line rotation is as follows: given a line from the center \((x_c, y_c)\) to a centroid \((x_i, y_i)\), the rotated line direction is obtained by applying a rotation matrix:
$$ \begin{bmatrix} x'_i \\ y'_i \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_i - x_c \\ y_i - y_c \end{bmatrix} + \begin{bmatrix} x_c \\ y_c \end{bmatrix} $$
where \(\theta = -\pi / (2Z)\) for clockwise rotation. The extended line equation is then used to find intersections with the contour, which are computed using numerical methods like line scanning or polygon clipping algorithms in image coordinates.
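A minimal sketch of this rotation step, using the matrix form and sign convention stated above (`rotate_about_center` is an illustrative helper, not from the paper):

```python
import numpy as np

def rotate_about_center(point, center, theta):
    """Rotate `point` about `center` by `theta` radians, matching the
    rotation-matrix form in the text."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return R @ (np.asarray(point) - np.asarray(center)) + np.asarray(center)

Z = 18                            # number of teeth
theta = -np.pi / (2 * Z)          # clockwise rotation toward the root region
p = rotate_about_center([10.0, 0.0], [0.0, 0.0], theta)
# The rotated direction is then re-extended to intersect the root contour.
```

The rotation preserves the distance to the center, so only the ray's direction changes before it is extended to the root contour.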
The image processing pipeline also includes steps for morphological operations to clean up the binary image, such as dilation and erosion, but these were found unnecessary due to the effectiveness of Otsu thresholding and bilateral filtering. The overall system is implemented in a software framework that automates the measurement process, from image acquisition to parameter output. This automation reduces human error and increases throughput, making it suitable for industrial inspection of helical gears.
Discussion of the results highlights several key insights. First, the dual-telecentric lens is crucial for minimizing perspective distortion, which is common in machine vision when measuring three-dimensional objects like helical gears; it ensures that the captured image represents the true end-face geometry without scaling issues. Second, the close-range backlight source eliminates most shadows by providing direct illumination from behind the gear, which is particularly important because the angled teeth can create complex shadow patterns. Third, the contour feature point method offers a novel way to exploit a geometric prior about helical gears, namely the symmetry of the teeth around the center, to improve measurement accuracy. Compared with fitting circles directly to the entire contour, which can be affected by outliers or missing edges, this method focuses on specific feature points that are less prone to noise.
Potential limitations of the approach include dependence on high-quality imaging hardware, which may increase cost, and sensitivity to extreme lighting conditions. However, the exposure optimization module addresses variability in ambient light. Future work could explore deep learning-based segmentation to handle more complex gear geometries or damaged teeth. Additionally, extending the method to measure other helical gear parameters, such as the helical angle or pitch, would be valuable. The current system focuses on profile parameters, but the same imaging setup could be adapted for comprehensive gear inspection.
In conclusion, this paper presents a robust machine vision-based method for measuring helical gear profile parameters with high precision. By integrating a high-resolution dual-telecentric lens, close-range backlighting, and advanced image processing techniques like Otsu binarization and bilateral filtering, the system effectively overcomes shadow issues caused by the helix angle. The proposed contour feature point method further enhances accuracy by using geometric intersections for circle fitting, resulting in absolute errors within 0.02 mm for both tip and root circle diameters. Experimental validation on helical and spur gears confirms the method's effectiveness and general applicability. This work contributes to the field of gear metrology by providing a reliable, automated solution for helical gear inspection, with potential benefits for quality control in manufacturing industries.
The implications of this research are significant for industries relying on helical gears, such as automotive and aerospace, where precise gear dimensions are critical for performance and safety. The machine vision system can be integrated into production lines for real-time inspection, reducing downtime and improving product consistency. Moreover, the methodology can be extended to other rotational symmetric components with similar measurement challenges. As machine vision technology continues to advance, combining hardware innovations with intelligent algorithms will further push the boundaries of non-contact measurement accuracy.
To summarize the technical contributions, the key equations and steps are encapsulated below. The grayscale transformation and bilateral filtering formulas form the foundation of image preprocessing. The Otsu thresholding method selects an optimal threshold \(T\) by maximizing inter-class variance \(\sigma_B^2\), defined as:
$$ \sigma_B^2 = \omega_0(\mu_0 - \mu_T)^2 + \omega_1(\mu_1 - \mu_T)^2 $$
where \(\omega_0\) and \(\omega_1\) are the probabilities of background and foreground classes, \(\mu_0\) and \(\mu_1\) are their mean intensities, and \(\mu_T\) is the total mean. For circle fitting, the least-squares solution involves solving a linear system derived from the circle equation. If we denote the circle parameters as \(A = -2a\), \(B = -2b\), and \(C = a^2 + b^2 - r^2\), the fitting minimizes:
$$ \sum_j (x_j^2 + y_j^2 + A x_j + B y_j + C)^2 $$
which yields a set of normal equations solvable via matrix inversion. These mathematical tools, combined with the hardware setup, enable precise measurement of helical gear parameters.
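This linear formulation can be sketched as a least-squares problem in \(A\), \(B\), and \(C\). The sketch below is illustrative rather than the paper's implementation, and uses synthetic points sampled from a circle of the theoretical tip diameter (44.19 mm):

```python
import numpy as np

def fit_circle(points):
    """Algebraic least-squares circle fit: solve for A, B, C in
    x^2 + y^2 + A x + B y + C = 0, then recover center and radius
    via a = -A/2, b = -B/2, r = sqrt(a^2 + b^2 - C)."""
    pts = np.asarray(points, dtype=np.float64)
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (A, B, C), *_ = np.linalg.lstsq(M, rhs, rcond=None)
    a, b = -A / 2.0, -B / 2.0
    r = np.sqrt(a**2 + b**2 - C)
    return (a, b), r

# Points sampled from a circle of diameter 44.19 mm centered at (1, 2).
t = np.linspace(0, 2 * np.pi, 20, endpoint=False)
pts = np.column_stack([1 + 22.095 * np.cos(t), 2 + 22.095 * np.sin(t)])
center, radius = fit_circle(pts)
print(round(2 * radius, 4))  # 44.19
```

Solving the overdetermined system with `lstsq` is numerically equivalent to the normal equations but better conditioned, which matters when the feature points span only part of the circle.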
In practice, the system’s performance was evaluated through repeated measurements to ensure repeatability. The standard deviation of tip diameter measurements was less than 0.005 mm, indicating high consistency. Factors such as camera calibration, lens distortion correction, and pixel-to-millimeter conversion were also considered in the implementation. The conversion factor was determined using a calibration target, ensuring accurate scaling from image pixels to real-world dimensions. For the helical gear samples used, the pixel size was approximately 0.01 mm per pixel, allowing sub-pixel accuracy through interpolation in edge detection.
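The pixel-to-millimeter conversion reduces to a single scale factor obtained from the calibration target. The numbers below are hypothetical but consistent with the roughly 0.01 mm-per-pixel figure quoted above:

```python
def pixel_to_mm_factor(known_length_mm, measured_length_px):
    """Scale factor from a calibration feature of known physical size."""
    return known_length_mm / measured_length_px

# Hypothetical calibration: a 10 mm reference feature spans 1000 px.
factor = pixel_to_mm_factor(10.0, 1000.0)   # 0.01 mm per pixel
diameter_mm = 4419.0 * factor               # a 4419 px diameter -> 44.19 mm
```

Because sub-pixel edge detection yields fractional pixel positions, the same factor converts those interpolated coordinates directly to millimeters.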
Overall, this research demonstrates that machine vision can achieve high-precision measurement of helical gears despite their complex geometry. The integration of tailored hardware and algorithms addresses specific challenges like shadows and contour irregularities. As industries move towards smarter manufacturing, such automated inspection systems will become increasingly vital for maintaining quality standards and optimizing production processes. The proposed method offers a practical solution that balances accuracy, speed, and cost-effectiveness, making it a valuable tool for gear manufacturers and researchers alike.
