Real Time Steel Surface Defect Detection Using DCNN

A
PROJECT REPORT
ON
DCNN
BY
Names of Student: - ID No
AT
CEERI, Pilani
(October, 2020)
Acknowledgement
I express sincere regards to my mentor, Dr. Dhiraj from CEERI, for guiding me
right from the start. I thank him for extending his valuable guidance, support,
and critical reviews of the project and, above all, for the moral support and
motivation he provided throughout.
BIRLA INSTITUTE OF TECHNOLOGY & SCIENCE
PILANI (RAJASTHAN)
Title of the project: Real time Steel surface defect detection using DCNN
Keywords: steel defect detection, deep learning, CNN, logistic regression, random forest, SVM
Date:
Abstract
Steel is one of the most successful alloys, with a wide range of applications in
several domains. However, the production of usable quality steel has never been
simple. The process involves the influence of raw materials, system control, the
rolling process, etc., which results in several defects over its surface. Because of
this, it is important to detect defects in as little time as possible to boost
production speeds. Once a defect is detected, this information can be used to
treat the steel piece further, based on the class of defect, to remove it. The steel
can have several types of defects such as scratches, scars, iron scales, black
burn, bright prints, inclusion, rolled-in scale, patches, crazing, pitted surface,
pollution and other defects. Apart from spoiling the appearance, these defects
reduce properties such as abrasion resistance, corrosion resistance, and fatigue
strength, which causes huge losses to the industries.
Until recent times, the detection process has been quite manual and slow. This
is changing with the use of artificial intelligence in real time. Thanks to fast
processors, it is possible to reduce human involvement and increase detection
speeds. The main objective of the project is to improve the defect detection rates
of these systems by comparing several feature extractors and classifiers.
Table of Contents
Acknowledgement
Abstract
Table of Contents
Introduction
Current Progress
Week 1:
Week 2:
Week 3:
Week 4:
Week 5:
Week 6:
Week 7:
Week 8:
Week 9:
Week 10:
Conclusion
Recommendation
References
Glossary
Introduction
CEERI (Central Electronics Engineering Research Institute) is a constituent
laboratory of the Council of Scientific and Industrial Research (CSIR India),
New Delhi. CEERI Pilani was established on 21st September, 1953 in the field of
Electronics, with the vision of Mr. G. D. Birla. Since its inception, it has been
working for the growth of electronics in the country and has established the
required infrastructure and well-experienced manpower in areas such as:
Microwave Tubes
Smart Sensors
Aim of the project
The aim of the project is to propose a superior algorithm to increase the accuracy
of steel defect detection and classification. Various feature extractors such
as LBP, LTP, CLBP, and AECLBP can be used in combination with several
classifiers such as SVM, logistic regression, random forest, and CNN to find
the combination that performs best.
Current Progress
The initial weeks were spent on the basics of Machine Learning and Deep
Learning. These concepts are important to build the backbone of the project.
Only after learning the basics and understanding the methods is it easier to find
the gaps in the current implementations, solve them, and propose a superior
method to detect and classify the defects in real time with higher accuracy.
Almost all the concepts that have been studied in machine learning are applied
in the later stages of the project.
The NEU steel surface defect dataset is planned to be used for the initial part
of the project. Upon obtaining satisfactory results, various other datasets can be
explored.
Week 1:
Aim
Get a better understanding of the topic and strengthen the basics.
1. Gradient Descent
2. Linear Regression
3. Basics of ML
Summary:
This week introduced supervised learning with the example of housing price
prediction, presented the notion of a cost function, and introduced gradient
descent as a method for learning the model parameters.
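The gradient descent procedure covered this week can be sketched as follows. This is a minimal illustrative version on toy data (the course exercises use MATLAB and housing prices; the function name and data here are my own):

```python
# Batch gradient descent for one-variable linear regression h(x) = theta0 + theta1*x.
# Each iteration moves the parameters against the gradient of the
# mean-squared-error cost J = (1/2m) * sum((h(x) - y)^2).

def gradient_descent(xs, ys, alpha=0.05, iters=5000):
    theta0, theta1 = 0.0, 0.0
    m = len(xs)
    for _ in range(iters):
        # partial derivatives of J w.r.t. theta0 and theta1
        d0 = sum((theta0 + theta1 * x - y) for x, y in zip(xs, ys)) / m
        d1 = sum((theta0 + theta1 * x - y) * x for x, y in zip(xs, ys)) / m
        theta0 -= alpha * d0
        theta1 -= alpha * d1
    return theta0, theta1

# toy data following y = 2x, so the fit should approach theta0 = 0, theta1 = 2
theta0, theta1 = gradient_descent([1, 2, 3, 4], [2, 4, 6, 8])
```

The learning rate `alpha` must be small enough for the updates to converge; too large a value makes the cost diverge, a point stressed in the course material.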
Week 2:
Aim
Learn linear regression with multiple variables.
1. Multivariate Linear Regression
2. Polynomial Regression
Summary:
I have covered linear regression with multiple variables. I have seen how linear
regression can be extended to accommodate multiple input features, including
polynomial terms.
I also went over how to use MATLAB and worked on programming the course
algorithms in practice.
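The multivariate case can be sketched like this, including the mean normalisation (feature scaling) covered this week. This is an illustrative Python version on toy data, not the course's MATLAB exercises:

```python
# Multivariate linear regression by gradient descent with mean normalisation:
# each feature is shifted by its mean and divided by its range so that all
# features have similar scales, which speeds up convergence.

def fit(rows, ys, alpha=0.1, iters=5000):
    ncols = len(rows[0])
    # per-feature (mean, range) statistics
    stats = []
    for j in range(ncols):
        col = [r[j] for r in rows]
        mu = sum(col) / len(col)
        rng = (max(col) - min(col)) or 1.0
        stats.append((mu, rng))
    scaled = [[(r[j] - stats[j][0]) / stats[j][1] for j in range(ncols)] for r in rows]
    theta = [0.0] * (ncols + 1)   # theta[0] is the intercept
    m = len(rows)
    for _ in range(iters):
        errs = [theta[0] + sum(t * x for t, x in zip(theta[1:], row)) - y
                for row, y in zip(scaled, ys)]
        theta[0] -= alpha * sum(errs) / m
        for j in range(ncols):
            theta[j + 1] -= alpha * sum(e * row[j] for e, row in zip(errs, scaled)) / m

    def predict(row):
        srow = [(row[j] - stats[j][0]) / stats[j][1] for j in range(ncols)]
        return theta[0] + sum(t * x for t, x in zip(theta[1:], srow))
    return predict

# toy data following y = 1 + 2*x1 + 3*x2
predict = fit([[1, 2], [2, 1], [3, 3], [4, 5]], [9, 8, 16, 24])
```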
Week 3:
Aim
Learn how classification problems are handled and how logistic regression is
implemented. Approaches from classical computer vision to CNNs have already
been applied in this domain.
1. Multiclass Classification
2. Logistic Regression
3. Regularization
Summary
This week, I have covered logistic regression. Logistic regression is a method
for classifying data into discrete outcomes. I covered the notion of classification,
the cost function for logistic regression, and how to apply logistic regression to
multiclass problems. A model that fits the training data well may still fail to
generalize to new examples it has not seen in practice; we introduced
regularization, which helps prevent models from overfitting the training
data.
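The core of logistic regression can be sketched as below: a sigmoid squashes a linear score into a probability, and gradient descent minimises the classification cost. This is an illustrative version on toy 1-D data, not the steel dataset:

```python
import math

# Logistic regression: P(y=1|x) = sigmoid(theta0 + theta1*x), trained by
# gradient descent on the logistic (cross-entropy) cost.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(xs, ys, alpha=0.5, iters=2000):
    theta0, theta1 = 0.0, 0.0
    m = len(xs)
    for _ in range(iters):
        # the gradient has the same form as linear regression,
        # but with the sigmoid hypothesis
        errs = [sigmoid(theta0 + theta1 * x) - y for x, y in zip(xs, ys)]
        theta0 -= alpha * sum(errs) / m
        theta1 -= alpha * sum(e * x for e, x in zip(errs, xs)) / m
    return theta0, theta1

# toy data: class 0 for small x, class 1 for large x;
# the learned decision boundary should fall between x=1 and x=2
t0, t1 = train([0, 1, 2, 3], [0, 0, 1, 1])
```

A prediction is made by thresholding the probability at 0.5; regularization would add a penalty term on `theta1` to the cost, shrinking the weights.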
Week 4:
Aim
Explore Neural Networks. Learn to test the data using the one-vs-rest method
for our multiclass problem.
1. Multiclass Classification
2. Neural Networks
3. Non-linear Hypothesis
Summary:
I have covered neural networks this week. A neural network is a model inspired
by how the brain works. It is widely used today in many applications: when a
phone interprets and understands voice commands, it is likely that a neural
network is helping to understand the speech; when we cash a check, the
machines that automatically read the digits also use neural networks.
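The one-vs-rest method mentioned in the aim can be sketched as follows: train one binary classifier per class (here a simple logistic unit), and predict the class whose classifier is most confident. The data, class names, and function names are illustrative, not from the project:

```python
import math

# One-vs-rest multiclass classification: one binary logistic classifier per
# class; prediction = class with the highest predicted probability.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_binary(xs, ys, alpha=0.5, iters=2000):
    t0, t1 = 0.0, 0.0
    m = len(xs)
    for _ in range(iters):
        errs = [sigmoid(t0 + t1 * x) - y for x, y in zip(xs, ys)]
        t0 -= alpha * sum(errs) / m
        t1 -= alpha * sum(e * x for e, x in zip(errs, xs)) / m
    return t0, t1

def train_one_vs_rest(xs, labels):
    # relabel: 1 for "this class", 0 for "rest", once per class
    classes = sorted(set(labels))
    return {c: train_binary(xs, [1 if l == c else 0 for l in labels])
            for c in classes}

def predict(models, x):
    return max(models, key=lambda c: sigmoid(models[c][0] + models[c][1] * x))

# three toy classes along the x axis
models = train_one_vs_rest([0, 1, 5, 6, 10, 11], ['a', 'a', 'b', 'b', 'c', 'c'])
```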
Week 5:
Aim
Learn to train the Neural Network (NN). Algorithms like backpropagation are
used to train the Network model.
Summary:
I have learned how to train Neural Networks. The Neural Network is one of the
most powerful learning algorithms (when a linear classifier doesn't work, this is
what people usually turn to), and the 'backpropagation' algorithm was used to
train the models. I have also gotten a chance to implement this algorithm
myself.
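The key idea of backpropagation is to push the error backwards through the chain rule. A minimal sketch for a 2-2-1 sigmoid network with squared-error loss is shown below, computing only the output-layer gradient for brevity; the weights are hand-set and illustrative, not from the project:

```python
import math

# Forward pass and backpropagated gradient for a tiny 2-2-1 network:
# hidden layer a1 = sigmoid(W1 x + b1), output a2 = sigmoid(w2 . a1 + b2),
# loss L = 0.5 * (a2 - y)^2.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, x):
    w1, b1, w2, b2 = w
    z1 = [sum(wi * xi for wi, xi in zip(row, x)) + b for row, b in zip(w1, b1)]
    a1 = [sigmoid(z) for z in z1]
    z2 = sum(wi * ai for wi, ai in zip(w2, a1)) + b2
    return a1, sigmoid(z2)

def loss(w, x, y):
    _, a2 = forward(w, x)
    return 0.5 * (a2 - y) ** 2

def backprop_w2(w, x, y):
    """Gradient of the loss w.r.t. the output-layer weights w2 (chain rule)."""
    a1, a2 = forward(w, x)
    delta2 = (a2 - y) * a2 * (1 - a2)   # dL/dz2
    return [delta2 * ai for ai in a1]   # dL/dw2_i = dL/dz2 * a1_i

w = ([[0.1, -0.2], [0.3, 0.4]], [0.0, 0.1], [0.5, -0.5], 0.2)
x, y = [1.0, 2.0], 1.0
grad = backprop_w2(w, x, y)
```

A standard sanity check, also taught in the course, is gradient checking: the backpropagated gradient should match a finite-difference estimate of the loss.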
Week 6:
Aim
Learn how to improve algorithm accuracy and how to tune our model.
Summary
This week focused on learning how to tell when a learning algorithm is doing
poorly, and on the 'best practices' for how to 'debug' a learning algorithm and
go about improving its performance.
I have also covered machine learning system design. To optimize a machine
learning system, it is important to know where the biggest improvements can be
made. I covered how to evaluate the performance of a machine learning system
with multiple parts, and also how to deal with skewed data.
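For skewed data, plain accuracy is misleading, which is why precision and recall are used instead. A short sketch (toy labels, not project results; in a defect dataset the positive class could be "defective", which is often rare):

```python
# Precision and recall for a binary classifier on skewed classes.
# precision = TP / (TP + FP): of the samples flagged positive, how many were right.
# recall    = TP / (TP + FN): of the actual positives, how many were found.

def precision_recall(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# a classifier that always predicts "negative" gets 90% accuracy on this
# 9:1 skewed data, yet its recall (and precision) are zero:
y_true = [0] * 9 + [1]
p, r = precision_recall(y_true, [0] * 10)
```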
Week 7:
Aim
Learn about Support Vector Machines (SVM). Since SVM is one of the
classifiers to test our steel defect data on, it is important to learn the basics and
ways to improve it.
1. Kernels
2. SVM
Summary
This week, I learned about the support vector machine (SVM) algorithm. SVMs
are considered by many to be the most powerful 'black box' learning algorithm,
and, by posing a cleverly-chosen optimization objective, one of the most widely
used learning algorithms today.
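The kernels studied this week give the SVM its non-linear power. The most common one, the Gaussian (RBF) kernel, measures similarity between a point and a landmark: 1 when they coincide, approaching 0 as they move apart. A minimal sketch:

```python
import math

# Gaussian (RBF) kernel: similarity(x, landmark) = exp(-||x - l||^2 / (2*sigma^2)).
# sigma controls how quickly the similarity falls off with distance.

def rbf_kernel(x, landmark, sigma=1.0):
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, landmark))
    return math.exp(-sq_dist / (2 * sigma ** 2))

same = rbf_kernel([1.0, 2.0], [1.0, 2.0])   # identical points -> similarity 1
far = rbf_kernel([1.0, 2.0], [9.0, 9.0])    # distant points -> near 0
```

In an SVM, each training example can act as a landmark, so the decision boundary becomes a weighted combination of these similarity bumps.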
Week 8:
Aim
Implement the first part of the paper by Kechen Song, that is, implement the
LBP feature extractor with an SVM classifier.
1. LBP
2. Kernels
3. SVM
Summary:
The LBP operator compares each pixel with its neighbours to form a new image.
If the corresponding neighbour has a value greater than or equal to the
current pixel value, then the true or '1' value is given to that position; else,
false or '0' is given to it. By using SVM as a classifier provided with the self-
implemented LBP features, I was able to achieve an accuracy of
93.33%.
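The thresholding described above can be sketched for the basic 3x3 LBP operator (the project's implementation was in MATLAB; this is an illustrative Python version on a toy image, not the NEU data):

```python
# 3x3 Local Binary Pattern: each interior pixel's 8 neighbours are thresholded
# against the centre (1 if neighbour >= centre, else 0) and the 8 bits are
# read clockwise into a code in [0, 255].

def lbp(image):
    h, w = len(image), len(image[0])
    # clockwise neighbour offsets starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    out = [[0] * (w - 2) for _ in range(h - 2)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            centre = image[i][j]
            code = 0
            for di, dj in offsets:
                code = (code << 1) | (1 if image[i + di][j + dj] >= centre else 0)
            out[i - 1][j - 1] = code
    return out

img = [[5, 5, 5],
       [5, 4, 5],
       [5, 5, 5]]
codes = lbp(img)   # every neighbour >= the centre 4, so all 8 bits are set
```

In practice, the histogram of these codes over an image (or over image blocks) forms the feature vector that is fed to the SVM.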
Week 9:
Aim
Implement the complete paper using LTP, CLBP, and AECLBP with SVM on our
dataset.
1. SVM
2. LBP
3. LTP
4. CLBP
5. AECLBP
Summary
26
Since the codes for methods such as LTP, CLBP, and AECLBP are not
available (as they are relatively new and not widely used in practical
applications), I implemented them myself and tested them on our dataset. After
running these several times and tweaking the performance, I was able to achieve
accuracies of 89.17%, 97.78% and 95.83% using LTP, CLBP, and AECLBP
respectively. All these images have been tested without noise. Compared with
the 93.33% accuracy that was obtained by LBP, CLBP has given the best result
so far.
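To illustrate how LTP differs from LBP: it adds a tolerance t around the centre value, giving three states (+1, 0, -1) that are conventionally split into an "upper" and a "lower" binary pattern. This sketch follows the general idea in the literature, not the exact MATLAB implementation used in the project:

```python
# Local Ternary Pattern for one pixel's neighbourhood:
# state +1 if neighbour >= centre + t, -1 if neighbour <= centre - t, else 0.
# The +1 states form the "upper" binary code, the -1 states the "lower" one.

def ltp_codes(neighbours, centre, t=2):
    upper = lower = 0
    for v in neighbours:
        upper <<= 1
        lower <<= 1
        if v >= centre + t:
            upper |= 1          # +1 state -> bit in the upper pattern
        elif v <= centre - t:
            lower |= 1          # -1 state -> bit in the lower pattern
    return upper, lower

# 8 neighbours of a centre pixel of value 10, with tolerance t = 2:
up, lo = ltp_codes([13, 10, 9, 7, 12, 11, 8, 10], 10, t=2)
```

The tolerance band makes LTP less sensitive to small intensity fluctuations than LBP, which is why these variants are studied for noise robustness.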
Week 10:
Aim
Implement all stages of the paper, including LBP, LTP, CLBP, and AECLBP,
using several classifiers like SVM, CNN, Logistic Regression, and Random
Forest.
1. SVM
2. Logistic Regression
3. CNN
4. Random Forest
Summary
It took some time to understand and implement new algorithms for classification
other than the SVM that had been used previously. The new methods
implemented in MATLAB are logistic regression, CNN, and Random Forest.
However, even after trying these various classification algorithms, SVM appears
to give the best results.
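The comparison performed this week can be sketched as a simple evaluation loop: score every classifier on the same held-out data and keep the most accurate one. The classifiers below are trivial stand-ins for illustration, not the actual SVM/CNN/Random Forest models:

```python
# Evaluate several classifiers on the same test set and pick the best,
# mirroring the feature-extractor x classifier comparison in the project.

def accuracy(clf, xs, ys):
    return sum(1 for x, y in zip(xs, ys) if clf(x) == y) / len(ys)

def compare(classifiers, xs, ys):
    scores = {name: accuracy(clf, xs, ys) for name, clf in classifiers.items()}
    best = max(scores, key=scores.get)
    return best, scores

# toy test set: class 1 for x > 2.5
xs = [0, 1, 2, 3, 4, 5]
ys = [0, 0, 0, 1, 1, 1]
classifiers = {
    "always_zero": lambda x: 0,                      # naive baseline
    "threshold_at_2.5": lambda x: 1 if x > 2.5 else 0,
}
best, scores = compare(classifiers, xs, ys)
```

In the project the same pattern applies with LBP/LTP/CLBP/AECLBP features feeding each classifier, and with cross-validation rather than a single split for a fairer comparison.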
Conclusion
After trying several methods of feature extraction (LBP, LTP, CLBP, AECLBP) and classification (SVM,
Random Forest, Logistic Regression, CNN), the best result obtained so far was using CLBP with SVM.
However, more research has to be done to test these results under various conditions, such as the
addition of noise.
Recommendation
It is recommended to try various settings to improve the current models. A greater number of
classifiers and feature extractors can also be tried to reach a better solution. Hurdles such as noise
can be added to check the robustness of the conclusion.
References
Song, Kechen, and Yunhui Yan. "A Noise Robust Method Based On Completed Local Binary Patterns For Hot-
Rolled Steel Strip Surface Defects". Applied Surface Science, vol 285, 2013, pp. 858-864. Elsevier BV,
doi:10.1016/j.apsusc.2013.09.002.
Glossary