
Automated Sewing System Enabled by Machine Vision for Smart Garment Manufacturing

Subyeong Ku, HyunWoong Choi, Ho-Young Kim, and Yong-Lae Park

IEEE ROBOTICS AND AUTOMATION LETTERS, VOL. 8, NO. 9, SEPTEMBER 2023

Abstract—This letter presents an automated sewing system designed for smart garment manufacturing, incorporating machine vision capabilities into a custom-built sewing machine. The vision system captures an image of the fabric pattern placed between two acrylic plates with a small opening, utilizing a deep learning model to detect and segment the opening, which represents the area of interest on the plate. Subsequently, a specialized algorithm detects a narrow seam line within the segmented image and generates a stitching path alongside the seam line, ensuring a consistent distance. The sewing machine then accurately stitches along the generated path automatically. The vision system utilized in this study achieves a spatial resolution of 68 µm per pixel. The custom-built sewing machine, controlled by an external computer, exhibits a spatial resolution of 10 µm, a translation speed of 60 mm/s, and an adjustable stitching interval ranging from 1 mm to 5 mm. The subsystems and components are interconnected using the Robot Operating System (ROS), enabling seamless communication and integration. The proposed system eliminates the need for human intervention, facilitating automated garment production. This innovative system is expected to play a critical role in realizing the vision of smart garment manufacturing.

Index Terms—Artificial intelligence, automated sewing, computer vision, factory automation, machine learning, smart manufacturing.

Manuscript received 7 March 2023; accepted 8 July 2023. Date of publication 31 July 2023; date of current version 3 August 2023. This letter was recommended for publication by Associate Editor W. Zhang and Editor J. Yi upon evaluation of the reviewers' comments. This work was supported in part by the SNU-Hojeon Garment Smart Factory Research Center funded by Hojeon Ltd. and in part by the National Research Foundation under Grant RS-2023-00208052. (Subyeong Ku and HyunWoong Choi contributed equally to this work.) (Corresponding author: Yong-Lae Park.)

The authors are with the Department of Mechanical Engineering, Seoul National University, Seoul 08826, South Korea, also with the Institute of Advanced Machines and Design, Seoul National University, Seoul 08826, South Korea, and also with the Institute of Engineering Research, Seoul National University, Seoul 08826, South Korea (e-mail: [email protected]; [email protected]; [email protected]; [email protected]).

Digital Object Identifier 10.1109/LRA.2023.3300284

I. INTRODUCTION

SMART manufacturing represents technological innovation driven by the development of networks, robotics, sensors, artificial intelligence (AI), and the Internet of Things (IoT) [1], [2], enabling autonomous manufacturing based on the flow of data. For example, qualitative information on products, such as consumer preferences and needs, is digitalized into data to design and produce customized products for individual consumers. Based on the digitized design data, robots perform automated production as well as assistance to human operators, while measurement systems, such as sensors and computer vision, check the quality of the output or monitor each process. Beyond the automation of individual processes or facilities, machines need to be systematically connected by data, enabling flexible manufacturing for smart factories.

Many industries, such as automobiles, pharmaceuticals, and semiconductors, have been adopting smart manufacturing environments [3], [4], [5]. Due to its high labor intensity, the garment industry has recently embraced the implementation of automated systems equipped with advanced production equipment and cutting-edge sensing technologies [6], [7]. The garment industry embarks on smart manufacturing by leveraging social commerce platforms that enable consumers to select clothing items of their choice and by utilizing augmented reality (AR) technology for virtual try-on experiences. Based on each consumer's fitting data obtained with the AR technology in virtual space, sewing patterns, the basic fabric pieces that construct the final clothes, are designed and prepared. By combining these patterns, individualized clothes are produced.

To realize smart manufacturing in the garment industry, computer vision is not only critical to guiding and monitoring operations but also important in inspecting the quality of the intermediate parts as well as the final products. More specifically, computer vision is used to recognize the shapes and the sizes of patterns and to locate them in the desired places [8], to generate sewing paths [6], [9], to monitor the sewing and assembly processes [10], [11], and to check the quality of the work [12], [13]. Therefore, computer vision is a key element to systematically connect and automate multiple processes seamlessly.

Computer vision is also highly useful in generating sewing paths, one of the most critical tasks for automated production. However, due to technical difficulties, it has been limited to generating paths for simple overlapped patterns [6], patches, or logos [9] using simple edge detection algorithms. Most of the main processes for producing clothes involve stitching of two different patterns. For example, a simple T-shirt consists of patterns of a collar, cuffs, back and front bodies, and left and right sleeves, which are first stitched for connection and then tightly secured by top stitching (Fig. 1(a) and (b)) [14], [15]. The first stitch (i.e., basting stitch) can be easily sewn along the profile marked on the two overlapped patterns. However, the second stitch (i.e., top stitch) must be sewn along the seam line, formed by turning over the overlapped upper pattern and spreading it apart from the lower one. A predetermined distance from the seam line should be maintained during the top stitch. In this case, the two patterns usually have the same color, making it difficult to distinguish the seam line using conventional computer vision methods. Moreover, environmental conditions, such as uncontrolled illumination, dust in the air, or any unwanted substances on the patterns, make it difficult to detect the seam line and to perform post-processing.


Fig. 1. (a) Description of the top stitch process and (b) its actual image. (c) Main components of the automated sewing machine. (d) Acrylic template used to hold the fabrics, with a window (i.e., opening) for sewing.

Fig. 2. Simplified architecture of YOLOv5. (a) Backbone for extracting features. (b) Neck for mixing features. (c) Head for inference of bounding boxes, masks, and class prediction.

Therefore, seam line detection and path generation for the top stitch are the biggest challenges in computer vision for the automation of garment manufacturing.

Although manual sewing is the most common process in conventional garment factories, it is prone to yield variations in quality depending on the skill levels of the operators [16]. To automate the sewing process, a pattern former, which is an automatic sewing machine with a motorized stage, has been utilized (Fig. 1(c)) [17]. A pattern former, consisting of a sewing unit and an x-y stage, automatically sews the pattern on the stage following the path provided by the operator. Pattern formers are in general suitable for stitching patterns with planar shapes and large areas. A template, made of two thin acrylic plates, is used to fix and hold the overlapped target fabric layers (Fig. 1(d)). The template has a window that exposes the seam line and the area to be sewn. In spite of these automatic features, pattern formers currently available in the market have their own path generation software and interfaces provided by the manufacturers, making it difficult to use them for autonomous and seamless production. Even if a sewing path is generated by a computer, human interventions are unavoidable due to the lack of capability for autonomous recognition of the patterns and generation of the sewing path.

In this study, we propose an automated sewing system integrated with machine vision and image processing algorithms to detect the seam line and to generate the corresponding top stitch path. In addition, we present an instance segmentation model that infers the template window from the captured image as a preprocessing step for the algorithms. This area is detected by the proposed instance segmentation model, which is transfer learned [18] from a segmentation model called YOLOv5 [19]. Based on the inferred template window, the seam line was detected, and the top stitch path, with a designated distance from the seam line, was generated by the proposed algorithms. The top stitch path was successfully generated by the proposed smoothing algorithm, regardless of dirt on the pattern or changes in illumination. In addition, we built a programmable sewing machine that automatically performs the process of top stitching along the sewing path generated by the integrated vision system, without any inputs or interventions from the human operator. With an additional inspection process after stitching, the quality of the stitched path can be evaluated and the resulting data can be generated for monitoring in the proposed system.

II. ALGORITHMS

A. Instance Segmentation Based on YOLOv5

We employed the YOLOv5 model to detect the template window, which contains the seam line, in an image. By using instance segmentation, instead of object detection that only finds a bounding box around the template window, it is possible to infer the area occupied by the target in the image. Since the image was taken in a controlled environment, the accuracy of segmentation was expected to be high. Therefore, we used a one-stage detector rather than a two-stage detector to reduce the inference time. The architecture of the YOLOv5 instance segmentation model consists of three main components: a backbone, a neck, and a head, as shown in Fig. 2. In the backbone, features are extracted from the input image by convolution and subsampling (Fig. 2(a)). In the neck, each output of the backbone becomes the input to the layer with the corresponding resolution, and the features are mixed through upsampling (Fig. 2(b)). In the head, masks and classes are predicted at each image scale (Fig. 2(c)), and the results are combined into a single image.

Fig. 3. Vision setup for taking images of the template.


Fig. 4. Fabrics used in experiments. (a) Black, (b) orange, and (c) blue colored fabric, and their magnified images near the seam line. Blue and black arrows indicate the start and the end points of the seam line, respectively.

To fine-tune the YOLOv5 model, 69 images including the template window were taken using the vision environment (Fig. 3). Six images were assigned to the validation dataset, another six images were assigned to the test dataset, and the rest of the images were assigned to the training dataset. The physical setup consists of a monochromatic camera (BFS-U3-123S6M-C, FLIR) and illumination (EuroBrite Bar Lights, Advanced Illumination). Three types of fabrics, colored black, orange, and blue, were used for training, and each type has its own weaving pattern and seam line (Fig. 4(a), (b), and (c)). Since the light reflected from the fabric is captured, the image changes depending on the color of the fabric, and thus we chose fabrics ranging from dark to bright colors. In addition, images were taken while changing the exposure time from 10,000 to 100,000 µs to enable detection under various illumination conditions. The size of the image was 4096 × 3000 pixels, and our system had a resolution of 68 µm per pixel.
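As a concrete illustration of this data handling, the following is a minimal sketch of the 57/6/6 train/validation/test split described above. The directory layout, file names, and the fine-tuning command in the comments are assumptions, not the authors' actual scripts:

import random
import shutil
from pathlib import Path

# Minimal sketch of the dataset split described above: 69 captured
# images, 6 for validation, 6 for testing, and the remaining 57 for
# training. Directory layout and file names are assumptions.
random.seed(0)
images = sorted(Path("captures").glob("*.png"))
random.shuffle(images)

splits = {"val": images[:6], "test": images[6:12], "train": images[12:]}
for name, files in splits.items():
    out = Path("datasets/template_window/images") / name
    out.mkdir(parents=True, exist_ok=True)
    for f in files:
        shutil.copy(f, out / f.name)

# Fine-tuning could then use the YOLOv5 v7.0 segmentation pipeline, e.g.:
#   python segment/train.py --data template_window.yaml \
#       --weights yolov5s-seg.pt --img 1024
# (the weights variant, image size, and data YAML name are assumptions)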
Before training the model, we performed a sequence of image preprocessing steps (Fig. 5). Since the input image size was large, which was not suitable for training the model, the image was resized to 1000 × 750 pixels while maintaining its aspect ratio (Fig. 5(i)). Then, contrast-limited adaptive histogram equalization (CLAHE) [20] was applied to the image. The CLAHE divides the image into a grid of tiles, places a limit on the pixel intensity, redistributes the values over the limit, and equalizes the histogram of each tile. Since the installed light sources did not cover the entire area of the template, an intensity gradation appeared along the length of the window in the image. Therefore, the CLAHE was applied to reduce the difference in brightness between the center and the edge of the image, and then a Gaussian blur was applied to reduce the sharpness (Fig. 5(ii)). In the image, the template window occupied 150 pixels in height, which was very small compared to the total height of 3,000 pixels. Therefore, a 4 × 4 tiling was applied to magnify small features (Fig. 5(iii)). The image was divided into 16 tiles, and each area was resized to 1000 × 750 pixels and added to the dataset.

Fig. 5. Sequence of preprocessing. (i) Resizing the image. (ii) Applying contrast-limited adaptive histogram equalization and Gaussian blur. (iii) Image tiling.
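This preprocessing chain maps naturally onto OpenCV primitives. Below is a minimal sketch of steps (i)-(iii), assuming a grayscale capture; the CLAHE clip limit, tile grid, and blur kernel size are assumptions, since the paper does not report them:

import cv2

# Sketch of the preprocessing sequence in Fig. 5 using OpenCV. The
# resize targets and the 4 x 4 tiling follow the text; the CLAHE and
# blur parameters are assumptions.
def preprocess(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)  # 4096 x 3000 capture

    # (ii) CLAHE evens out the center-to-edge brightness gradation,
    # then a Gaussian blur reduces the sharpness.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    img = cv2.GaussianBlur(clahe.apply(img), (5, 5), 0)

    # (i) Downscaled full view, keeping the 4:3 aspect ratio.
    full = cv2.resize(img, (1000, 750))

    # (iii) 4 x 4 tiling: 16 tiles, each resized to 1000 x 750, to
    # magnify the template window (~150 px high in the full frame).
    h, w = img.shape
    tiles = [cv2.resize(img[r * h // 4:(r + 1) * h // 4,
                            c * w // 4:(c + 1) * w // 4], (1000, 750))
             for r in range(4) for c in range(4)]
    return full, tiles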
We applied data augmentation to add variance to the images of the dataset: rotation, to detect the template window with various orientations; mosaic, for small-feature detection; and cropping, to vary the size and the position of the template window. The YOLOv5 model was fine-tuned with the processed images using a GPU (A100-SXM4-40GB, NVIDIA). For inference, on the other hand, only the CLAHE and the Gaussian blur were applied to the input image to predict the template window.

B. Seam Line Detection and Path Generation

Fig. 6. Schematic of the seam line detection principle and pixel intensity plots at different d and p locations.

The seam line is detected from the difference in the amount of light reflected by the fabric under the illumination (Fig. 6(a)). The light sources were placed asymmetrically with respect to the camera to amplify the difference in reflected light. p1 represents the curved part of the folded upper pattern, which becomes flattened as it goes away from the seam line. p2 and p3 represent the seam line and the flat part of the bottom pattern, respectively. Ideally, the incident light on the seam line is minimally reflected. Thus, the pixel intensity has a local minimum at this point (p2) (Fig. 6(b)-(i), (b)-(ii), and (b)-(iii)). Since the light source did not cover the entire area of the template, the average value of the pixel intensity differed according to the value of d, the pixel distance from the center of the light source. However, regardless of the value of d, the local minimum intensity in the template window was obtained at the pixel position p near the seam line, except at both boundaries.


We developed an image processing algorithm for seam line detection based on the following assumptions: i) in the detected template window, each column of the image array has a single seam line point; ii) the seam line points of adjacent columns are at similar locations; and iii) the seam line point in each column has the minimum intensity. With these assumptions, the seam line detection was possible using only the pixel intensities, without applying additional image processing techniques (e.g., thresholding, edge detection), and regardless of the illumination conditions. An algorithm to find the points of interest (e.g., p1, p2, and p3) with the local minimum intensity in the template window is first developed, as shown in Algorithm 1.

Algorithm 1: Suggest Possible Seam Line Points.
Input: width of detected template window w, adjacent pixel distance adj_d, i-th column of template window array C_i, number of seam line points g, rejected group of indexes R, temporary array A
Output: seam line candidates L
Initialization: adj_d ← 3, g ← 5
 1: for i = 0 to w do
 2:   j = 0, array A
 3:   while A length ≠ g do
 4:     k_j ← index of j-th minimum intensity in C_i, j ← j + 1
 5:     if C_i[k_j − 1] = 0 or C_i[k_j + 1] = 0 then
 6:       append k_j to R
 7:     else
 8:       for u = 1 to adj_d do
 9:         if k_j + u ∈ R or k_j − u ∈ R then
10:           append k_j to R
11:         else
12:           append k_j to A and R
13:         end if
14:       end for
15:     end if
16:   end while
17:   append A to L
18: end for
19: return L

This algorithm found g indexes of pixels with the minimum intensity in every column included in the detected template window. Ideally, the index with the minimum intensity in the template window is near the seam line, but we kept g (g = 5) candidates for the seam line points. At the boundaries of the template window, the pixel intensities were 0. Since these values were not of interest, they were excluded from the candidates of the seam line points, and all the indexes connected to these indexes were excluded, too. In addition, if a found index was within the adjacent distance adj_d (adj_d = 3) of an already rejected index, this index was also excluded from the seam line points since it was considered connected. By adjusting only two parameters, we were able to find candidates of the seam line points in the image.
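For concreteness, the following is a direct NumPy transcription of Algorithm 1, assuming the masked template window is a 2-D grayscale array with zero intensity outside the segmented region; variable names mirror the pseudocode:

import numpy as np

def suggest_seam_candidates(window, g=5, adj_d=3):
    # window: 2-D grayscale array of the masked template window,
    # zero-valued outside the segmented region (see Section IV-B).
    h, w = window.shape
    L = []
    for i in range(w):
        col = window[:, i]
        accepted, rejected = [], set()
        for k in np.argsort(col, kind="stable"):  # rows by intensity
            k = int(k)
            if len(accepted) == g:
                break
            # Boundary rows of the masked window touch zero pixels.
            if k in (0, h - 1) or col[k - 1] == 0 or col[k + 1] == 0:
                rejected.add(k)
            # Rows within adj_d of an already-seen row are "connected".
            elif any(k + u in rejected or k - u in rejected
                     for u in range(1, adj_d + 1)):
                rejected.add(k)
            else:
                accepted.append(k)
                rejected.add(k)
        L.append(accepted)
    return L  # per-column lists of up to g candidate rows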
From the suggested seam line points L, the actual seam line points are found based on the connectivity between adjacent columns (Algorithm 2). Since the seam line generated by folding the upper pattern is a smooth curve, the actual seam line among the candidates has the minimum sum of the changes in pixel position.

Algorithm 2: Determine Seam Line.
Input: width of detected template window w, number of seam line points g, seam line candidates L, j-th seam line candidate SL_j, j-th group connectivity score s_j, j-th group minimum pixel difference min_j, k-th pixel index difference Δp_k, (i−1)-th pixel index in SL_j p_{i−1}[j], j-th group seam line index p_j
Output: u-th seam line SL_u
Initialization: g ← 5, SL_j[0] ← L[0][j]
 1: for i = 1 to w do
 2:   for j = 0 to g do
 3:     min_j ← 9999, p_{i−1}[j] ← SL_j[i − 1]
 4:     for k = 0 to g do
 5:       Δp_k = abs(L[i][k] − p_{i−1}[j])
 6:       if Δp_k < min_j then
 7:         min_j ← Δp_k, p_j ← L[i][k]
 8:       end if
 9:     end for
10:     append p_j to SL_j, s_j ← s_j + min_j
11:   end for
12: end for
13: u ← argmin(s_1, s_2, ..., s_g)
14: return SL_u

The top stitch path was generated from the determined seam line SL_u. Before generating the path, the determined seam line was smoothed, because the seam line did not form a smooth curve due to noise caused by dust or substances on the patterns, even when the connectivity was minimized. Therefore, we divided the template window into n intervals, smoothed each interval, and merged them. Rather than applying a moving average, which is significantly affected by outliers, we applied a Savitzky-Golay smoothing filter (ŷ) [21], a finite-impulse-response method, instead of time-consuming local regression. After smoothing the seam line, each point ts_j of the path TS was generated and merged, using the distance per pixel r of the vision system and the predetermined gap for the top stitch d_g. d_g was set to 1.6 mm, the smallest gap required for precise clothing production (Algorithm 3).

Algorithm 3: Smoothing and Generating Top Stitch Path.
Input: width of detected template window w, seam line SL, smoothing window size m, number of windows n, smoothed seam line point ŷ, temporary array A, resolution of vision system r, j-th top stitch point ts_j, predetermined gap d_g
Output: top stitch path TS
Initialization: m ← 300
 1: n = integer(w/m)
 2: for i = 0 to n do
 3:   A = SL[i·m : (i + 1)·m]
 4:   apply Savitzky-Golay filter to A, result ŷ
 5:   for j = 0 to m do
 6:     ts_j = ŷ[j] + d_g/r, append ts_j to TS
 7:   end for
 8: end for
 9: return TS
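A compact sketch of Algorithms 2 and 3 follows, using SciPy's savgol_filter for the Savitzky-Golay step. The 300-column interval, the 68 µm/px resolution, and the 1.6 mm gap follow the text; the filter order and window cap are assumptions, and each column is assumed to yield at least one candidate:

import numpy as np
from scipy.signal import savgol_filter

def determine_seam_line(L, g=5):
    # Algorithm 2: track each of the g starting candidates across the
    # columns, always jumping to the nearest candidate in the next
    # column; keep the track with the smallest total jump (score).
    w = len(L)
    tracks = [[L[0][j]] for j in range(g)]
    scores = [0.0] * g
    for i in range(1, w):
        for j in range(g):
            diffs = [abs(c - tracks[j][-1]) for c in L[i]]
            k = int(np.argmin(diffs))
            tracks[j].append(L[i][k])
            scores[j] += diffs[k]
    return tracks[int(np.argmin(scores))]

def generate_top_stitch(seam, r_mm=0.068, gap_mm=1.6, m=300):
    # Algorithm 3: piecewise Savitzky-Golay smoothing over m-column
    # intervals, then a constant offset of gap_mm converted to pixels.
    seam = np.asarray(seam, dtype=float)
    path = []
    for i in range(0, len(seam), m):
        chunk = seam[i:i + m]
        n = len(chunk)
        win = min(51, n if n % 2 else n - 1)  # odd window <= chunk size
        smooth = savgol_filter(chunk, win, polyorder=3) if win > 3 else chunk
        path.extend(smooth + gap_mm / r_mm)
    return np.array(path)  # stitch row per column, in pixels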


Fig. 7. Components of the automated sewing system.

Fig. 8. Schematic diagram of communication in the automated sewing system.

III. SYSTEM

The automated sewing system is composed of a vision system and a custom-built sewing machine. As mentioned in Section II, the vision system includes the camera and the light sources to detect the seam line of the patterns. This section describes the details of the sewing machine, including the hardware specifications. In addition, the communication methods between the components of the system and the operational sequences are introduced.

A. Automatic Sewing Machine

To automatically sew along the generated top stitch path, we developed an automatic sewing machine using a commercial pattern former (UAS-H700-D, UNICORN). The operational unit of the developed machine consists of three motors: two servo motors (SGM7J-04AFA21, SGM7J-08AFA21, YASKAWA) that are attached to the x-y stage to control the position of the template, and one servo motor (SGM7J-08AFA21, YASKAWA) that controls the position of the sewing needle and trims the thread. By connecting the motor controllers to an external PC, it is possible to control the position and the speed of the motors without using the factory-installed program. The sewing machine also has four pneumatically guided cylinders that hold the template against vibration and a presser foot that flattens the patterns during sewing (Fig. 7).

The maximum travel distance of the stage is 1,230 mm in the x direction, parallel to the long side of the template window, and 720 mm in the y direction. The spatial resolution of the x-y stage is 10 µm in both directions. The maximum RPM of the motor of the sewing needle is 600. The average speed of linear translation is proportional to the RPM of the needle motor and the sewing interval. The x-y stage has a maximum translation speed of 60 mm/s when the RPM is 600 and the sewing interval is 5 mm.

B. Communication

The entire system can be divided into three sub-systems: the sewing machine, the camera, and the lighting. To facilitate seamless communication between these sub-systems, we employed the Robot Operating System (ROS) with TCP/IP communication [22]. In the ROS, each sub-system is represented as a node, and communication is established through messages on ROS topics. The system consists of four nodes: the sewing machine node, the vision node, the light node, and the graphical user interface (GUI) node (Fig. 8). The sewing machine node controls the motors of the x-y stage and monitors the motor encoder values. The vision node captures the images through the camera and executes the proposed algorithms. The light node manages the state of the LED light sources, and the operator can monitor and control the top stitching process through the GUI node.
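As an illustration of this node structure, here is a minimal sketch of the sewing machine node subscribing to a path topic via rospy. The topic name and message type are assumptions; the paper only specifies that the four nodes exchange messages on ROS topics over TCP/IP:

import rospy
from std_msgs.msg import Float32MultiArray

# Minimal sketch of the sewing machine node receiving the top stitch
# path published by the vision node (topic and message type assumed).
def on_path(msg):
    # Interpret the flat array as (x, y) stitch points in millimeters;
    # the real node would drive the x-y stage along them.
    points = list(zip(msg.data[0::2], msg.data[1::2]))
    rospy.loginfo("received top stitch path with %d points", len(points))

rospy.init_node("sewing_machine_node")
rospy.Subscriber("top_stitch_path", Float32MultiArray, on_path)
rospy.spin()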
IV. EXPERIMENTS

We conducted experiments to evaluate the performance of the inference of the trained segmentation model, the seam line detection and top stitch path generation using the proposed algorithms, and the sewing machine and its stitching quality.

A. Instance Segmentation

For evaluating the trained model, the black (Fig. 9(a)) and the blue (Fig. 9(b)) fabrics were used, because the amount of reflected light varied with the colors. The CLAHE and the Gaussian blur were applied to the captured images, which were then input to the trained model (Fig. 9(a) and (b), middle). The time taken for the segmentation was about 0.3 s when using a GPU (GeForce 3080 Ti, NVIDIA). The trained model detected the template window regardless of the optical noise reflected on the glossy acrylic template, the color and the unique weaving pattern of the fabric, and the brightness of the illumination.

B. Seam Line Detection and Top Stitch Path Generation

We conducted an experiment to check whether the seam line was detected and the top stitch path was properly generated by the proposed algorithms, and compared the result with the outcomes of conventional edge detection algorithms. Using the segmented area as a mask, only the template window was isolated from the image and became the region of interest (ROI). By setting the ROI, the cropped image of the template window, seam line detection and post-processing were simplified. The result of applying the seam line detection algorithm and the generated top stitch path can be seen in Fig. 9(a) and (b), bottom. We compared the result with those of conventional edge detection algorithms: the Canny [23], [24], Laplacian [25], and Sobel [26] operators were applied to the ROI of the black pattern (Fig. 9(c)).
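This baseline comparison reduces to three standard OpenCV calls on the masked ROI; the thresholds and kernel sizes below are assumptions, as the paper does not report the operator settings:

import cv2

# Sketch of the edge-operator baselines applied to the masked ROI.
roi = cv2.imread("roi_black.png", cv2.IMREAD_GRAYSCALE)

canny = cv2.Canny(roi, 50, 150)
laplacian = cv2.convertScaleAbs(cv2.Laplacian(roi, cv2.CV_16S, ksize=3))
sobel_y = cv2.convertScaleAbs(cv2.Sobel(roi, cv2.CV_16S, 0, 1, ksize=3))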


Fig. 9. Sequence and result of seam line detection and top stitch path generation. Black and blue fabrics with exposure times of (a) 30,000 µs and (b) 40,000 µs, respectively. The sequence consists of input and preprocessing (top), segmentation of the template window (middle), and post-processing, i.e., masking the template window, smoothing, and top stitch path generation (bottom). (c) Comparison with conventional edge detection algorithms. L and C represent the left end and the center of the image, respectively. Results of seam line detection on (d) the embossed fabric, (e) a curved seam line on the fabric with horizontal stripes, and (f) and (g) diagonal seam lines. (h) White letter insertion for quantifying performance. (i) Photos with and without the letter. (j) Intensity plots of the reflected light for the first and the last columns, respectively.

In the black pattern, neither the Canny nor the Laplacian operator was able to find the seam line. In the case of the Sobel operator, whose output is shown in green, the red seam line found by the proposed algorithm overlapped the green area. All the conventional algorithms were affected by the image brightness and required complex image processing before and after applying the operators to detect the seam line.

The proposed detection algorithm was able to detect the seam lines on the embossed fabric (Fig. 9(d)), the curved seam on the fabric with horizontal stripes (Fig. 9(e)), and the diagonal seams (Fig. 9(f) and (g)).

To quantitatively evaluate the performance of the proposed detection algorithm, a white letter was inserted into the gap between the patterns (Fig. 9(h)). Since the white color is highly reflective, it has a high pixel intensity when photographed, making it easier to find the ground truth from the contrast near the seam line (Fig. 9(i)). Image processing was used to find the seam line boundary (Fig. 9(j)). We found the ground truths for the black, the blue, and the orange patterns and compared them with the results from the proposed algorithm. The average position errors for the black, the blue, and the orange patterns were 0.05, 0.05, and 0.09 mm, respectively, with standard deviations of 0.04, 0.04, and 0.09 mm, respectively.

We tested the robustness of the segmentation model and the proposed algorithm for seam line detection (Fig. 10(a)). An unwound thread at the edge of the pattern makes the start point for sewing inconsistent, and dirt on the seam line makes the top stitch path jagged. The unwound thread was rejected by the trained segmentation model, and the generated top stitch path remained smooth with the proposed algorithm, even with the presence of dirt on the seam line (Fig. 10(b)). The experimental results show that the developed sewing machine plays the key role of the pattern former while precisely operating along the sewing path generated by the vision system.


Fig. 10. Robustness test for the proposed algorithm. (a) A single strand of the edge thread is unwound (left), and a large amount of dirt is placed on the template window and the seam line (right). (b) Results of segmentation, seam line detection, and top stitch path generation.

Fig. 11. Test results of the automatic sewing machine. The goal position and the measured position from the absolute encoder in the (a) x-axis and (b) y-axis, and the corresponding visual results in the (c) x-axis and (d) y-axis. (e) Position and velocity measurement in the x-axis. (f) Sewing results with stitching intervals from 1 mm to 5 mm.

Fig. 12. Automation procedure for stitching. (a) Left and right images of the template (top) and the stitched image with the detected seam line and the generated top stitch path (bottom). (b) Before and (c) after sewing. The magnified image shows the consistent distance (1.58 mm) between the top stitch and the seam lines. (d) Detected curved seam line and (e) the result of the top stitching.

C. Top Stitching With Automatic Sewing Machine

We conducted an experiment to evaluate the performance of the custom-built sewing machine in terms of the spatial resolution, the maximum translation velocity, and the stitching interval. The sewing machine was controlled to move 1 mm in the x and y directions by moving 100 steps in 10 µm increments, with simultaneous measurement by an absolute encoder (Fig. 11(a) and (b)). The inset plots indicate that when the measured position of the x-y stage reached the target position, the next goal position was input to the x-y stage. This sequence was repeated 100 times to move 1 mm in the x and y directions. As a result of analyzing the images taken by the vision system before and after the operation, the system moved 1 mm in each axis and showed a spatial resolution of approximately 10 µm (Fig. 11(c) and (d)).

Next, we tested the maximum translation speed under the condition of stitching. We moved the stage 100 mm in the x direction under the maximum RPM (600) of the sewing needle, and the result is shown in Fig. 11(e). The black and blue curves show the position in the x-axis and the speed, respectively. The stage moved 100 mm in 1.67 s with an average speed of 60 mm/s.

Since the optimal stitching interval is different for each fabric, this length needs to be adjustable by the operator. We sewed the pattern while changing the stitching interval from 1 mm to 5 mm, and the results were captured with the vision system (Fig. 11(f)). The actual stitching interval showed a length similar to the input value.

V. APPLICATION

We set up an automated sewing system by integrating the vision system with the sewing machine and connecting the components through the ROS. The integrated system generated the top stitch path based on the captured template image by itself and sewed along the path without any interventions of the human operator. We demonstrated the automated stitching using the integrated system. First, each part of the pattern was captured (Fig. 12(a)), and the top stitch path was generated using the proposed algorithm. A process of image merging was required, since the patterns used to make actual clothes would be longer than a single capture of the vision system. The top stitch path was generated at a distance of 1.58 mm (1/16 in) from the seam line in the merged image, and the custom-built machine sewed along the path. 1.58 mm is the smallest gap between the seam line and the top stitch line used in the typical garment production field [27], [28]. The images of the patterns with the straight seam line before and after the top stitch are shown in Fig. 12(b) and (c), respectively. A curved seam line was also detected by the proposed algorithm (Fig. 12(d)), and the top stitch was possible along the generated curved path (Fig. 12(e)).
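The paper does not detail the image merging step; a plausible minimal sketch, assuming the stage displacement between the two captures is known and converting it to a pixel offset with the 68 µm/px resolution, is:

import numpy as np

def merge_captures(left, right, stage_dx_mm, r_mm=0.068):
    # Place the right capture at the pixel offset implied by the known
    # stage displacement; the right image overwrites the overlap region.
    # This merging strategy is an assumption, not the authors' method.
    dx_px = int(round(stage_dx_mm / r_mm))
    h = min(left.shape[0], right.shape[0])
    merged = np.zeros((h, dx_px + right.shape[1]), dtype=left.dtype)
    merged[:, :left.shape[1]] = left[:h]
    merged[:, dx_px:dx_px + right.shape[1]] = right[:h]
    return merged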


VI. DISCUSSION

In most cases, the proposed algorithm was able to detect the seam line by adjusting only two parameters (adj_d and g). The proposed algorithm finds the points with locally minimal intensities and estimates the seam line by connecting them. Therefore, if the pattern is matte and dark with a low reflectivity, the seam line is difficult to distinguish. In addition, if the pattern has large mesh patterns with high air permeability, multiple local minimums may appear along the stripes, requiring additional image processing.

Our proposed system and algorithms were designed to automate legacy production equipment as a starting point for smart garment manufacturing. To realize an advanced system for smart manufacturing, individual production machines need to be physically connected. Therefore, we are currently working on a system that integrates a mobile platform, a collaborative robot, and a robotic gripper to autonomously handle and transport various types of fabric patterns for multi-step sewing processes, in order to fully automate the production process.

VII. CONCLUSION

We developed an automated sewing system by integrating a custom-built sewing machine with a machine vision system, which does not need any interventions or assistance from the human operator. All the components were systematically connected through the ROS. In the vision part, a trained deep learning model and the proposed algorithms were executed sequentially. The template window was detected and segmented in the captured image by the trained deep learning model, and the top stitch path was generated from the seam line by the proposed algorithms. The custom-built sewing machine was controlled by an external PC and showed a spatial resolution of 10 µm, a maximum translation speed of 60 mm/s, and an adjustable stitching interval from 1 mm to 5 mm. Through this integrated system, automated pattern sewing and simultaneous monitoring were possible. By repeating the image processing on the result after top stitching, the quality of the stitched path was evaluated and the resulting data were generated. The automation of the sewing machine, the quality assessment of the output through the vision system, and the generation of data for use in the next process will enable seamless production by significantly increasing the connectivity of multiple processes in garment manufacturing. Therefore, we expect our system to play an essential role in achieving smart manufacturing in the future garment industry.

REFERENCES

[1] B. Chen, J. Wan, L. Shu, P. Li, M. Mukherjee, and B. Yin, "Smart factory of industry 4.0: Key technologies, application case, and challenges," IEEE Access, vol. 6, pp. 6505-6519, 2018.
[2] H. Lasi, P. Fettke, H.-G. Kemper, T. Feld, and M. Hoffmann, "Industry 4.0," Bus. Inf. Syst. Eng., vol. 6, no. 4, pp. 239-242, 2014.
[3] C. Cronin, A. Conway, and J. Walsh, "Flexible manufacturing systems using IIoT in the automotive sector," Procedia Manuf., vol. 38, pp. 1652-1659, 2019.
[4] N. S. Arden, A. C. Fisher, K. Tyner, L. X. Yu, S. L. Lee, and M. Kopcha, "Industry 4.0 for pharmaceutical manufacturing: Preparing for the smart factories of the future," Int. J. Pharm., vol. 602, 2021, Art. no. 120554.
[5] J. Moyne and J. Iskandar, "Big data analytics for smart manufacturing: Case studies in semiconductor manufacturing," Processes, vol. 5, no. 3, 2017, Art. no. 39.
[6] S. Lee et al., "Implementation of an automated manufacturing process for smart clothing: The case study of a smart sports bra," Processes, vol. 9, no. 2, 2021, Art. no. 289.
[7] S. Ku, J. Myeong, H.-Y. Kim, and Y.-L. Park, "Delicate fabric handling using a soft robotic gripper with embedded microneedles," IEEE Robot. Automat. Lett., vol. 5, no. 3, pp. 4852-4858, Jul. 2020.
[8] E. Torgerson and F. W. Paul, "Vision-guided robotic fabric manipulation for apparel manufacturing," IEEE Control Syst. Mag., vol. 8, no. 1, pp. 14-20, Feb. 1988.
[9] "Brother electronic controlled programmable sewing machine with vision camera system," Brochure, Brother Industries, Ltd., Nagoya, Aichi, Japan, 2018. [Online]. Available: https://www.brother-usa.com/products/7905387
[10] W.-K. Jung et al., "Appropriate smart factory for SMEs: Concept, application and perspective," Int. J. Precis. Eng. Manuf., vol. 22, pp. 201-215, 2021.
[11] J.-Y. Lee, D.-H. Lee, J.-H. Park, and J.-H. Park, "Study on sensing and monitoring of sewing machine for textile stream smart manufacturing innovation," in Proc. IEEE Int. Conf. Mechatron. Mach. Vis. Pract., 2017, pp. 1-3.
[12] Y. Li, W. Zhao, and J. Pan, "Deformable patterned fabric defect detection with Fisher criterion-based deep learning," IEEE Trans. Automat. Sci. Eng., vol. 14, no. 2, pp. 1256-1264, Apr. 2017.
[13] H. Kim, W.-K. Jung, Y.-C. Park, J.-W. Lee, and S.-H. Ahn, "Broken stitch detection method for sewing operation using CNN feature map and image-processing techniques," Expert Syst. Appl., vol. 188, 2022, Art. no. 116014.
[14] A. Shahriar, "The optimization of knitted T-shirt for rapid production process," Int. J. Textile Sci., vol. 8, no. 1, pp. 16-25, 2019.
[15] R. E. Glock and G. I. Kunz, Apparel Manufacturing: Sewn Product Analysis. Upper Saddle River, NJ, USA: Pearson/Prentice Hall, 2005.
[16] G. Li, C. M. Haslegrave, and E. Corlett, "Factors affecting posture for machine sewing tasks: The need for changes in sewing machine design," Appl. Ergonom., vol. 26, no. 1, pp. 35-46, 1995.
[17] T. Gries and V. Lutz, "Application of robotics in garment manufacturing," in Automation in Garment Manufacturing, R. Nayak and R. Padhye, Eds. Sawston, U.K.: Woodhead Publishing, 2018, pp. 179-197.
[18] L. Torrey and J. Shavlik, "Transfer learning," in Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods, and Techniques. Hershey, PA, USA: IGI Global, 2010, pp. 242-264.
[19] G. Jocher et al., "ultralytics/YOLOv5: V6.1 - TensorRT, TensorFlow Edge TPU and OpenVINO export and inference," 2022. [Online]. Available: https://doi.org/10.5281/zenodo.6222936
[20] K. Zuiderveld, "Contrast limited adaptive histogram equalization," in Graphics Gems IV, 1994, pp. 474-485.
[21] A. Savitzky and M. J. Golay, "Smoothing and differentiation of data by simplified least squares procedures," Anal. Chem., vol. 36, no. 8, pp. 1627-1639, 1964.
[22] M. Quigley et al., "ROS: An open-source robot operating system," in Proc. ICRA Workshop Open Source Softw., 2009, vol. 3, Art. no. 5.
[23] J. Canny, "A computational approach to edge detection," IEEE Trans. Pattern Anal. Mach. Intell., vol. PAMI-8, no. 6, pp. 679-698, Nov. 1986.
[24] Y. Han et al., "Smart skin: Vision-based soft pressure sensing system for in-home hand rehabilitation," Soft Robot., vol. 9, no. 3, pp. 473-485, 2022.
[25] X. Wang, "Laplacian operator-based edge detectors," IEEE Trans. Pattern Anal. Mach. Intell., vol. 29, no. 5, pp. 886-890, May 2007.
[26] N. Kanopoulos, N. Vasanthavada, and R. L. Baker, "Design of an image edge detection filter using the Sobel operator," IEEE J. Solid-State Circuits, vol. 23, no. 2, pp. 358-367, Apr. 1988.
[27] D. Aasen, R. S. Mong, and P. Fendley, "Topological defects on the lattice: I. The Ising model," J. Phys. A: Math. Theor., vol. 49, no. 35, 2016, Art. no. 354001.
[28] J. H. Hammond and A. Raman, Sport Obermeyer Ltd. Boston, MA, USA: Harvard Bus. Sch., 1994.
