A Quality Control Application On A Smart Factory Prototype Using Deep Learning Methods
Abstract—The number of smart factories is increasing day by day to reach the vision of Industry 4.0. Computer vision and image processing have important roles in systems whose aim is unmanned production. In industrial automation applications, computer vision is mostly used at the quality control stage. At this stage, there are many applications that use image-processing methods for object detection and classification, but deep learning-based applications are rarely seen. In this work, a visual quality control automation application is proposed by using a camera placed over the assembly line of a smart factory model. The product is detected in an image obtained from the assembly line and then classified as “okay” or “not okay” using deep learning methods. After the deep learning-based quality control, the “okay” products continue their production stages and the “not okay” products are separated from the production line by a PLC, which controls the line. This application shows that deep learning methods in automation applications will have an important role in the transition to Industry 4.0.

Keywords – deep learning, industry 4.0, smart factory, object detection, object recognition.
I. INTRODUCTION

In recent years, the number of deep learning applications has increased significantly in parallel with the developments in computer technology. The capability and accuracy of computer vision applications have made great progress with deep learning methods. Behrendt et al. [1] developed a deep learning-based method for real-time object detection and recognition; their study, based on the You Only Look Once (YOLO) method, used two different neural networks for detection and recognition. Gordon et al. [2] developed a method based on a Recurrent Neural Network (RNN) for object recognition, and their study shows that RNN methods can achieve high success in object recognition. Image processing, object detection, and object recognition tasks play an important role in the transition to unmanned production in intelligent systems developed under the vision of Industry 4.0. There are many industrial automation applications using image-processing methods in the literature. For example, Karadöl and Aybek [3] processed images obtained from a cornfield in the MATLAB environment to determine the weeds; the obtained data were transferred from MATLAB to a PLC over an OPC server. Kervancıoğlu et al. [4] automated a PLC-controlled electro-pneumatic sorting robot for the quality control of produced parts using image-processing methods.

Tekinalp et al. [5] made an application for the separation of olives according to color in real time, using MATLAB and PLC-controlled systems with image-processing methods. Cuşkun et al. [6] designed a 4-axis multi-functional robot mechanism to find the positions and angles of plates with different features using image-processing methods. In addition, computer vision techniques are used in many applications in the agriculture and food industry [7]. In their study, Kazemi and Kharrati [8] were able to classify objects on a moving production line and separate them with a PLC-controlled robot.

Even though deep learning methods are not commonly encountered in industrial automation applications, the use of artificial intelligence, and consequently of deep learning-based applications, will gradually increase as the process of adaptation to Industry 4.0 progresses, because deep neural networks such as ResNet-152 have achieved classification accuracy better than human-level recognition accuracy in some studies [9-12]. In this study, the aim is to perform the visual quality control of the final product, obtained at the end of the assembly line of the smart factory model, with a deep learning method. For this purpose, images of three different products produced at the end of the assembly line are taken with a webcam and transferred to MATLAB. First, an object detection algorithm based on a machine learning technique (a neural network) is applied to the image to find the object; then an object recognition algorithm based on deep learning methods is applied to the detected region to classify the final product as “OK” or “NOT OK”.
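To make this two-stage flow concrete, a minimal MATLAB sketch of the acquisition, detection, and classification steps is given below. The variable names (detector, net), the detector type, and the 227x227 input size are illustrative assumptions, not the exact implementation used in this study.

% Minimal sketch (assumed names and sizes): acquire a frame from the camera
% over the assembly line, detect the product region, and classify the crop.
cam = webcam(1);                          % USB webcam over the line
img = snapshot(cam);                      % capture one frame

% 'detector' is assumed to be a pretrained object detector loaded from disk,
% e.g. the result of trainACFObjectDetector or trainRCNNObjectDetector.
[bboxes, scores] = detect(detector, img);

if ~isempty(bboxes)
    [~, idx] = max(scores);                       % keep the most confident region
    productROI = imcrop(img, bboxes(idx, :));     % crop the detected product
    productROI = imresize(productROI, [227 227]); % assumed CNN input size
    % 'net' is assumed to be a CNN trained on "OK" / "NOT OK" images.
    label = classify(net, productROI);            % categorical: OK / NOT OK
end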
The result of the classification is sent to the PLC, which controls the assembly line, over the TCP/IP protocol. The PLC adjusts the position of the points on the belt according to the data obtained from the PC: it allows the “OK” products to continue on the line, while it separates the products classified as “NOT OK” onto the discharge line. In addition, a human machine interface (HMI) panel program is written in TIA Portal to monitor and control all of these processes and to save some of their data.
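A minimal sketch of this PC-to-PLC link is shown below; the IP address, port number, and one-byte message format are illustrative assumptions, since the actual addressing is defined in the TIA Portal project of the smart factory model.

% Minimal sketch (assumed IP, port and message format): send the result of
% the classification to the PLC that controls the assembly line over TCP/IP.
plc = tcpclient('192.168.0.1', 2000);   % PLC address/port (assumption)

if label == "OK"
    write(plc, uint8(1));               % let the product continue on the line
else
    write(plc, uint8(0));               % divert the product to the discharge line
end

clear plc;                              % close the connection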
The rest of the paper is organized as follows. We define the smart factory model and its components in Section II. The object detection and recognition parts of our model, the dataset, and the experiments and their results are explained in Section III. Finally, the conclusion is given in Section IV.
Fig. 8. Sample images of the “OK” class from the dataset, with different colors.

Fig. 9. Sample images of “OK” and “NOT OK” products taken during the quality control.

IV. CONCLUSION

Deep learning techniques have become more popular in recent years thanks to the advances in the computing power of computers. This study on the smart factory model shows that deep learning techniques can be used in many areas and can provide effective solutions in the field of quality control automation, where machine learning applications are frequently used. A CNN that is trained with a robust data set is not affected by varying environmental conditions in object detection and recognition applications. Thus, unlike traditional industrial object recognition applications, a CNN can work under variable lighting conditions. The number of automation applications in which computer vision is used can be increased, and the transition to the Industry 4.0 process can be accelerated, with the use of CNNs. Also, production efficiency can be increased by eliminating human errors.
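As an illustration of how such a robust data set could be prepared, the following sketch generates brightness-shifted copies of the training images so that the CNN sees several lighting conditions during training; the folder layout and scaling factors are assumptions, not the exact procedure of this study.

% Illustrative sketch: create brightness-shifted copies of the training images
% so the classifier is exposed to different lighting conditions.
% The folder names and scaling factors below are assumptions.
imds = imageDatastore('dataset', 'IncludeSubfolders', true, ...
                      'LabelSource', 'foldernames');   % subfolders: OK, NOT_OK

outDir = 'dataset_augmented';
for k = 1:numel(imds.Files)
    img = readimage(imds, k);
    labelDir = fullfile(outDir, char(imds.Labels(k)));
    if ~exist(labelDir, 'dir'), mkdir(labelDir); end
    for factor = [0.7 1.0 1.3]                          % darker / original / brighter
        augmented = im2uint8(im2double(img) * factor);  % scale and clip intensities
        [~, name, ext] = fileparts(imds.Files{k});
        imwrite(augmented, fullfile(labelDir, sprintf('%s_b%.1f%s', name, factor, ext)));
    end
end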
In addition, this study shows that, by using a larger data set, visual quality control automation applications can also be applied to products that are more complex. Moreover, the quality control test can be carried out in real time, without stopping the product while it is moving on the assembly line.
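A sketch of how an extended, folder-organized data set could be loaded, split, and used to train a small CNN classifier is given below; the layer configuration and training options are illustrative assumptions only, not the configuration used in this work.

% Illustrative sketch: load a folder-organized data set (one subfolder per
% class), split it, and train a small CNN classifier. Layer sizes and
% training options are assumptions for demonstration only.
imds = imageDatastore('dataset_augmented', 'IncludeSubfolders', true, ...
                      'LabelSource', 'foldernames');
[trainImds, valImds] = splitEachLabel(imds, 0.8, 'randomized');

% Resize images on the fly to the assumed network input size.
trainDs = augmentedImageDatastore([227 227 3], trainImds);
valDs   = augmentedImageDatastore([227 227 3], valImds);

layers = [
    imageInputLayer([227 227 3])
    convolution2dLayer(5, 16, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    convolution2dLayer(3, 32, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(2)              % two classes: OK / NOT OK
    softmaxLayer
    classificationLayer];

options = trainingOptions('sgdm', ...
    'MaxEpochs', 10, ...
    'InitialLearnRate', 1e-3, ...
    'ValidationData', valDs, ...
    'Verbose', false);

net = trainNetwork(trainDs, layers, options);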
ACKNOWLEDGMENT

The authors thank Bilecik Seyh Edebali University, Industry Based Vocational Training and Research Center (EDMEM), for letting them use Festo's “Pick and Place”, “Muscle Press”, and “Separating” training prototype modules in this study.

REFERENCES
[1] K. Behrendt, L. Novak, R. Botros, “A deep learning approach to traffic lights: Detection, tracking, and classification,” in Robotics and Automation (ICRA), 2017 IEEE International Conference on, pp. 1370–1377, 2017.
[2] D. Gordon, A. Farhadi, D. Fox, “Re3: Real-Time Recurrent Regression Networks for Object Tracking,” arXiv preprint arXiv:1705.06368, 2017.
[3] H. Karadöl, A. Aybek, “Mısır Arazisinde Yabancı Otların Belirlenmesine Yönelik Matlab ve PLC Arası OPC Haberleşme Kullanılarak Geliştirilen Bir Kontrol Sistemi” [A Control System Developed Using OPC Communication Between Matlab and PLC for the Determination of Weeds in Corn Fields], Tekirdağ Ziraat Fakültesi Dergisi, 14(02), pp. 129-137, 2017.
[4] E. Kervancıoğlu, A. Adıyan, L. Çetin, E. Uyar, “Görüntü İşlemeye Dayalı Elektro-Pnömatik Parça Tasnif Robotu” [Electro-Pneumatic Part Sorting Robot Based on Image Processing], V. Ulusal Hidrolik Pnömatik Kongresi, İzmir, pp. 397-404, 23-26 October 2008.
[5] Z. Tekinalp, S. Öztürk, M. Kuncan, “OPC Kullanılarak Gerçek Zamanlı Haberleşen Matlab ve PLC Kontrollü Sistem” [Matlab and PLC Controlled System Communicating in Real Time Using OPC], Otomatik Kontrol Ulusal Toplantısı, Malatya, pp. 465-470, 26-28 September 2013.
[6] Y. Cuşkun, F. Duman, H. Basık, F. Gün, K. Kaplan, H. M. Ertunç, “Görüntü İşleme Tabanlı 4 Eksenli Çok Amaçlı Robot Mekanizması” [Image Processing Based Multi-Purpose 4-Axis Robot Mechanism], Elektrik-Elektronik ve Biyomedikal Mühendisliği Konferansı, Bursa, pp. 247-251, 1-3 December 2016.
[7] J. F. S. Gomes, F. R. Leta, “Applications of computer vision techniques in the agriculture and food industry: a review,” Eur Food Res Technol, 235, pp. 989–1000, Springer-Verlag Berlin Heidelberg, 2012.
[8] S. Kazemi, H. Kharrati, “Visual Processing and Classification of items on Moving Conveyor with Pick and Place Robot using PLC,” Intell Ind Syst, 3, pp. 15–21, Springer Science+Business Media Singapore, 2017.
[9] K. Simonyan, A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” arXiv preprint arXiv:1409.1556, 2014.
[10] C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, A. Rabinovich, “Going deeper with convolutions,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015, pp. 1–9.
[11] K. He, X. Zhang, S. Ren, J. Sun, “Deep residual learning for image recognition,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016, pp. 770–778.
[12] M. Z. Alom, T. M. Taha, C. Yakopcic, S. Westberg, P. Sidike, M. S. Nasrin, M. Hasan, B. C. Van Essen, A. A. S. Awwal, V. K. Asari, “A State-of-the-Art Survey on Deep Learning Theory and Architectures,” Electronics, 8, 292, doi:10.3390/electronics8030292, 5 March 2019.
[13] A. Krizhevsky, I. Sutskever, G. E. Hinton, “ImageNet Classification with Deep Convolutional Neural Networks,” NIPS, 2012.