Automated optical inspection for the runout tolerance of circular saw blades

Original Article
Published in The International Journal of Advanced Manufacturing Technology

Abstract

Circular saw blades are fundamental cutting tools used to cut off materials. Inspecting finished circular saw blades is important to ensure their manufacturing quality and sawing performance. Traditionally, a contact method is adopted to measure the runout of circular saw blades. To improve the quality of runout inspection, a non-contact method based on machine vision is required. In this paper, an automated optical inspection (AOI) system was developed exclusively for inspecting the runout tolerance of circular saw blades. Based on the integration of motion control and image processing techniques, calibration and automated inspection processes for the developed AOI system were established. Experiments on circular saw blade samples were conducted to test the feasibility and reliability of the system. The experimental results show that the developed AOI system, combined with the automated inspection process, achieved sufficient repeatability and was verified to be capable of inspecting the runout tolerance of certain circular saw blades.


References

  1. DeGarmo EP, Black JT, Kohser RA (2003) Materials and processes in manufacturing, 9th edn (updated). Wiley, Hoboken, pp 645–651

  2. ASME (2009) ASME Y14.5-2009: dimensioning and tolerancing. ASME, New York, pp 180–184

  3. Kurada S, Bradley C (1997) A review of machine vision sensors for tool condition monitoring. Comput Ind 34(1):55–72

  4. Jeon JU, Kim SW (1988) Optical flank wear monitoring of cutting tools by image processing. Wear 127(2):207–217

  5. Kurada S, Bradley C (1997) A machine vision system for tool wear assessment. Tribol Int 30(4):295–304

  6. Jurkovic J, Korosec M, Kopac J (2005) New approach in tool wear measuring technique using CCD vision system. Int J Mach Tools Manuf 45(9):1023–1030

  7. Shahabi HH, Ratnam MM (2009) Assessment of flank wear and nose radius wear from workpiece roughness profile in turning operation using machine vision. Int J Adv Manuf Technol 43(1–2):11–21

  8. Pfeifer T, Wiegers L (2000) Reliable tool wear monitoring by optimized image and illumination control in machine vision. Measurement 28(3):209–218

  9. Wang W, Wong YS, Hong GS (2005) Flank wear measurement by successive image analysis. Comput Ind 56(8):816–830

  10. Wang WH, Hong GS, Wong YS (2006) Flank wear measurement by a threshold independent method with sub-pixel accuracy. Int J Mach Tools Manuf 46(2):199–207

  11. Bradley C, Wong YS (2001) Surface texture indicators of tool wear—a machine vision approach. Int J Adv Manuf Technol 17(6):435–443

  12. Kim JH, Moon DK, Lee DW, Kim JS, Kang MC, Kim KH (2002) Tool wear measuring technique on the machine using CCD and exclusive jig. J Mater Process Technol 130–131:668–674

  13. Su JC, Huang CK, Tarng YS (2006) An automated flank wear measurement of microdrills using machine vision. J Mater Process Technol 180(1–3):328–335

  14. Duan G, Chen YW, Sukegawa T (2010) Automatic optical flank wear measurement of microdrills using level set for cutting plane segmentation. Mach Vis Appl 21(5):667–676

  15. Fan KC, Lee MZ, Mou JI (2002) On-line non-contact system for grinding wheel wear measurement. Int J Adv Manuf Technol 19(1):14–22

  16. Su JC, Tarng YS (2006) Measuring wear of the grinding wheel using machine vision. Int J Adv Manuf Technol 31(1–2):50–60

  17. Ramamoorthy B, Radhakrishnan V (1992) Computer-aided inspection of cutting tool geometry. Precis Eng 14(1):28–34

  18. Tien FC, Yeh CH, Hsieh KH (2004) Automated visual inspection for microdrills in printed circuit board production. Int J Prod Res 42(12):2477–2495

  19. Huang CK, Liao CW, Huang AP, Tarng YS (2008) An automatic optical inspection of drill point defects for micro-drilling. Int J Adv Manuf Technol 37(11–12):1133–1145

  20. Perng DB, Chen YC (2008) Advanced automated optical inspection system for fishtail collapse of microrouter. Nondestruct Test Eval 23(4):257–272

  21. Chen TH, Chang WT, Shen PH, Tarng YS (2010) Examining the profile accuracy of grinding wheels used for microdrill fluting by an image-based contour matching method. Proc Inst Mech Eng B J Eng Manuf 224(6):899–911

  22. Chang WT, Chen TH, Tarng YS (2011) Measuring characteristic parameters of form grinding wheels used for microdrill fluting by computer vision. Trans Can Soc Mech Eng 35(3):383–401

  23. Perng DB, Chen SH, Chang YS (2010) A novel internal thread defect auto-inspection system. Int J Adv Manuf Technol 47(5–8):731–743

  24. Chuang SF, Chang WT, Lin CC, Tarng YS (2010) Misalignment inspection of multilayer PCBs with an automated X-ray machine vision system. Int J Adv Manuf Technol 51(9–12):995–1008

  25. Lin AC, Hui-Chin C (2011) Automatic 3D measuring system for optical scanning of axial fan blades. Int J Adv Manuf Technol 57(5–8):701–717

  26. Perng DB, Liu HW, Chang CC (2011) Automated SMD LED inspection using machine vision. Int J Adv Manuf Technol 57(9–12):1065–1077

  27. Bamberger H, Hong E, Katz R, Agapiou JS, Smyth SM (2012) Non-contact, in-line inspection of surface finish of crankshaft journals. Int J Adv Manuf Technol 60(9–12):1039–1047

  28. National Instruments Corp (2007) NI Vision concepts manual. National Instruments Corp., Austin, pp 3.1–3.18, 11.1–11.22

  29. Jain R, Kasturi R, Schunck BG (1995) Machine vision. McGraw-Hill, New York, pp 140–185, 210–223, 309–405

  30. Goellner WJ (1984) Circular saw. US Patent 4463645

  31. Goellner WJ (1986) Circular saw. Canada Patent 1203453

  32. Advanced Machine & Engineering Corp. Speedcut saw blades. http://www.ame.com/fileadmin/user_upload/PDFs/02_MetalCutting_PDF/Speedcut_0608.pdf

  33. Pilana Tools (Pilana Saw Bodies s.r.o.) Tooth geometry of circular saw blades for cutting ferrous materials. http://www.pilana.com/?download=_/pk_sk_i1/sk-geometry-en.pdf

  34. Gonzalez RC, Woods RE (2002) Digital image processing, 2nd edn. Prentice-Hall, Upper Saddle River, pp 568–585

  35. Faires JD, Burden R (2003) Numerical methods, 3rd edn. Thomson Learning, Pacific Grove, pp 64–110, 340–348

  36. Yamaguchi F (1988) Curves and surfaces in computer aided geometric design. Springer, Berlin

  37. Zeid I (1991) CAD/CAM theory and practice. McGraw-Hill, New York

  38. Beckwith TG, Marangoni RD, Lienhard JH (2004) Mechanical measurements, 5th edn. Pearson Education Taiwan Ltd., Taipei, pp 45–125

  39. Chang WT, Su CH, Guo DX, Tang GR, Shiou FJ (2010) Automated optical inspection system for the runout tolerance of circular saw blades (in Chinese). Taiwan Patent Pending 099144624

  40. Chang WT, Su CH, Guo DX, Tang GR, Shiou FJ (2011) Automated optical inspection system for the runout tolerance of circular saw blades. US Patent Pending 13/052579


Author information

Corresponding author

Correspondence to Wen-Tung Chang.

Appendix

A general form of the applied edge detection method is described below. For a grayscale digital image consisting of an a×b array of pixels, its grayscale intensity function, G, can be represented by:

$$ G = G(x, y) \quad \text{for}\ 0 \leqslant x \leqslant (a - 1)\ \text{and}\ 0 \leqslant y \leqslant (b - 1) $$
(A1)

The gradient of the grayscale intensity function is defined by:

$$ \nabla G = \begin{Bmatrix} G_x(x, y) \\ G_y(x, y) \end{Bmatrix} = \begin{Bmatrix} \partial G(x, y)/\partial x \\ \partial G(x, y)/\partial y \end{Bmatrix} $$
(A2)

By using one-dimensional grayscale profile scanning [28, 29, 34] with y = b_i assigned, an edge coordinate (x_e, b_i) can be detected by finding the location where the locally extreme gradient along the x-direction exists and whose magnitude is larger than an assigned threshold value, t_e, that is:

$$ G_x(x_{\text{e}}, b_i) = \max\left( \left| G_x(x, b_i) \right| \right) > t_{\text{e}} \quad \text{for}\ a_1 \leqslant x \leqslant a_2,\ 0 \leqslant a_1, a_2 \leqslant (a - 1)\ \text{and}\ t_{\text{e}} > 0 $$
(A3)

which means that the edge coordinate (x_e, b_i) exists when G_x(x_e, b_i) is the extreme gradient in the assigned domain a_1 ≤ x ≤ a_2 and its magnitude is larger than t_e. Likewise, when x = a_i is assigned, an edge coordinate (a_i, y_e) can be detected by:

$$ G_y(a_i, y_{\text{e}}) = \max\left( \left| G_y(a_i, y) \right| \right) > t_{\text{e}} \quad \text{for}\ b_1 \leqslant y \leqslant b_2,\ 0 \leqslant b_1, b_2 \leqslant (b - 1)\ \text{and}\ t_{\text{e}} > 0 $$
(A4)

which means that the edge coordinate (a_i, y_e) exists when G_y(a_i, y_e) is the extreme gradient in the assigned domain b_1 ≤ y ≤ b_2 and its magnitude is larger than t_e. The detected edge coordinates (x_e, b_i) and (a_i, y_e) can merely achieve full-pixel accuracy. For each case, a threshold grayscale value, g_ex or g_ey, corresponding to the detected edge can be obtained by:

$$ g_{\text{ex}} = g_{\text{ex}}(b_i) = G(x_{\text{e}}, b_i) $$
(A5)

or

$$ g_{\text{ey}} = g_{\text{ey}}(a_i) = G(a_i, y_{\text{e}}) $$
(A6)
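As an illustration of Eqs. A1–A6, the following hypothetical Python/NumPy sketch (not the authors' implementation; the function name, synthetic image, and threshold value are all assumed for demonstration) scans one image row, locates the extreme gradient, applies the threshold t_e, and reads off the threshold grayscale value g_ex:

```python
import numpy as np

def detect_edge_row(G, b_i, a1, a2, t_e):
    """Scan row y = b_i of grayscale image G over columns a1..a2 and
    return the full-pixel edge column x_e where |dG/dx| is extreme
    (cf. Eq. A3), plus the threshold grayscale g_ex = G(x_e, b_i)
    (cf. Eq. A5). Returns (None, None) if no gradient exceeds t_e."""
    profile = G[b_i, a1:a2 + 1].astype(float)
    Gx = np.gradient(profile)            # central-difference dG/dx
    k = int(np.argmax(np.abs(Gx)))       # location of extreme gradient
    if np.abs(Gx[k]) <= t_e:
        return None, None                # no edge stronger than t_e
    x_e = a1 + k
    return x_e, G[b_i, x_e]

# Synthetic 5x10 image: dark background (20) stepping to a bright
# object (200) at column 6; the central-difference gradient at the
# step is (200 - 20)/2 = 90.
img = np.full((5, 10), 20, dtype=np.uint8)
img[:, 6:] = 200
x_e, g_ex = detect_edge_row(img, 2, 0, 9, t_e=50.0)
```

With t_e = 50, the step is detected at the last dark column of this profile (the central difference peaks on both sides of the step and the first extreme is kept); raising t_e above 90 suppresses the detection, mirroring the condition that the extreme gradient must exceed t_e in Eq. A3.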

Considering that continuous edges in an assigned domain have been detected, an averaged threshold grayscale value, g_ex(avg) or g_ey(avg), can be defined by:

$$ g_{\text{ex(avg)}} = \frac{\int g_{\text{ex}}\, dy}{\int dy} $$
(A7)

or

$$ g_{\text{ey(avg)}} = \frac{\int g_{\text{ey}}\, dx}{\int dx} $$
(A8)

The averaged threshold grayscale value, g_ex(avg) or g_ey(avg), represents a balanced threshold for re-detecting edges in an assigned domain under the effects of noise interference and non-uniform illumination on pixel intensity variation. In addition, by using the so-called curve fitting methods [35–37], a fitted one-dimensional grayscale profile for the neighborhood of (x_e, b_i), Ĝ, can be obtained by:

$$ \widehat{G} = \widehat{G}(x, b_i) \quad \text{for}\ (x_{\text{e}} - \Delta x) \leqslant x \leqslant (x_{\text{e}} + \Delta x) $$
(A9)

in which Δx is an assigned small pixel amount for controlling the range of the neighborhood. Then, a refined edge coordinate \( (x_{\text{e}}^*, b_i) \) can be obtained by solving the following equation:

$$ \widehat{G}(x_{\text{e}}^*, b_i) - g_{\text{ex(avg)}} = 0 $$
(A10)

When y = b_i is assigned, the obtained location of \( x_{\text{e}}^* \) can achieve sub-pixel accuracy. Similarly, a fitted one-dimensional grayscale profile for the neighborhood of (a_i, y_e), Ĝ, can be obtained by:

$$ \widehat{G} = \widehat{G}(a_i, y) \quad \text{for}\ (y_{\text{e}} - \Delta y) \leqslant y \leqslant (y_{\text{e}} + \Delta y) $$
(A11)

in which Δy is an assigned small pixel amount for controlling the range of the neighborhood. Then, a refined edge coordinate \( (a_i, y_{\text{e}}^*) \) can be obtained by solving the following equation:

$$ \widehat{G}(a_i, y_{\text{e}}^*) - g_{\text{ey(avg)}} = 0 $$
(A12)

When x = a i is assigned, the obtained location of \( y_{\text{e}}^* \) can also achieve sub-pixel accuracy.
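The sub-pixel refinement of Eqs. A9–A12 can be sketched as follows; this is a hypothetical Python/NumPy illustration (the appendix only requires some curve-fitting method [35–37]; a least-squares cubic polynomial is assumed here, and the function name and synthetic profile are invented for demonstration). The fitted polynomial plays the role of Ĝ in Eq. A9, and the root selection implements Eq. A10 for the row case:

```python
import numpy as np

def refine_edge_subpixel(G, b_i, x_e, g_avg, dx=3, deg=3):
    """Fit a least-squares polynomial profile G_hat(x) over the
    neighborhood (x_e - dx) <= x <= (x_e + dx) of row y = b_i
    (cf. Eq. A9), then solve G_hat(x) - g_avg = 0 (cf. Eq. A10)
    and return the real root nearest x_e as the sub-pixel edge."""
    xs = np.arange(x_e - dx, x_e + dx + 1)
    ys = G[b_i, xs[0]:xs[-1] + 1].astype(float)
    coeffs = np.polyfit(xs, ys, deg)      # fitted profile G_hat
    coeffs[-1] -= g_avg                   # roots of G_hat - g_avg
    roots = np.roots(coeffs)
    real = roots[np.isreal(roots)].real
    real = real[(real >= xs[0]) & (real <= xs[-1])]
    if real.size == 0:
        return float(x_e)                 # fall back to full-pixel edge
    return float(real[np.argmin(np.abs(real - x_e))])

# Single-row image with a smooth ramp; the averaged threshold
# g_avg = 100 is crossed exactly at x = 4 in this synthetic profile,
# so the refined edge should recover x_e* = 4.0.
row = np.array([[0, 0, 0, 50, 100, 150, 200, 200, 200]], dtype=float)
x_star = refine_edge_subpixel(row, 0, 4, 100.0)
```

The column case (Eqs. A11 and A12) is identical with the roles of x and y exchanged; on real images the threshold crossing generally falls between pixel centers, which is where the sub-pixel gain over the full-pixel detection appears.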

Cite this article

Chang, WT., Su, CH., Guo, DX. et al. Automated optical inspection for the runout tolerance of circular saw blades. Int J Adv Manuf Technol 66, 565–582 (2013). https://doi.org/10.1007/s00170-012-4350-6
