Indirect cutting tool wear classification using deep learning and chip colour analysis

Pagani L.; Parenti P.; Cataldo S.; Annoni M.
2020-01-01

Abstract

In the growing Industry 4.0 market, there is a strong need to implement automatic inspection methods to support manufacturing processes. Tool wear in turning is one of the biggest concerns, and most expert operators are able to infer it indirectly by analysing the removed chips. Automating this operation would enable the development of more efficient cutting processes, resulting in easier process planning management toward the Zero Defect Manufacturing paradigm. This paper presents a deep learning approach, based on image processing applied to turning chips, for indirectly identifying tool wear levels. The procedure extracts different indicators from the RGB and HSV image channels and trains a neural network to classify the chips according to the tool state condition. Images were collected with a high-resolution digital camera during an experimental cutting campaign involving tool wear analysis with direct microscope imaging. The sensitivity analysis confirmed that the most sensitive image channel is the hue value (H), which was used to train the network, leading to a proper classification rate in the range of 95%. The feasibility of the deep learning approach for indirectly assessing tool wear from the chip colour characterisation is confirmed. However, due to the strong effect on chip colour of variables such as the workpiece material and the cutting process parameters, the applicability is limited to stable production flows. An industrial implementation can be foreseen by populating suitably large databases and by implementing real-time chip segmentation analysis.
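The paper's code is not part of this record. Purely as an illustration of the kind of pipeline the abstract describes, the sketch below converts a chip image to HSV, extracts hue-channel histogram features, and fits a small feedforward neural network classifier. The library choices (OpenCV, scikit-learn), the synthetic stand-in data, the three wear classes, and the network size are all assumptions for the sketch, not the authors' implementation.

# Illustrative sketch only (not the authors' code): hue-channel features
# from chip images feeding a small neural-network wear classifier.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def hue_histogram_features(image_bgr, bins=32):
    """Return a normalised hue histogram from the HSV representation."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hue = hsv[:, :, 0]  # OpenCV encodes hue in [0, 179]
    hist = cv2.calcHist([hue], [0], None, [bins], [0, 180]).ravel()
    return hist / hist.sum()

# Synthetic stand-in data: in practice these would be segmented chip
# images labelled with the wear state measured under the microscope.
rng = np.random.default_rng(0)
X = np.array([hue_histogram_features(
        rng.integers(0, 256, (64, 64, 3), dtype=np.uint8))
    for _ in range(40)])
y = rng.integers(0, 3, 40)  # hypothetical labels: 0 fresh, 1 worn, 2 end of life

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict(X[:5]))

In a real deployment, the feature extraction step would operate on chips segmented from the camera images in real time, as the abstract's closing remark suggests.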
Chip analysis
Deep learning
Image processing
Monitoring
Tool wear
Vision inspection
File in this record:
File: Indirect cutting tool wear classification using deep learning and chip colour analysis.pdf
Access: open access
Description: Publisher's version
Size: 4.24 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1148294
Citations
  • PMC: ND
  • Scopus: 22
  • Web of Science: 18