
Outline CNN-Based Approaches ResNet-Based Approaches Further Discussion

Intelligent Control and Fault Diagnosis


Lecture 9: History-Based IFD (CNN,
ResNet)

Farzaneh Abdollahi

Department of Electrical Engineering

Amirkabir University of Technology

Farzaneh Abdollahi, Intelligent Fault Diagnosis, Winter 2024, Lecture 9

CNN-Based Approaches
Convolution Stage
Pooling

ResNet-Based Approaches

Further Discussion
Performance Evaluation Indices


Convolutional NN [1, 2]

▶ CNN is a supervised deep learning method
▶ A convolution layer includes
▶ Convolution Stage: convolve the input with a kernel (filter)
▶ Detector Stage: apply an activation function such as ReLU, sigmoid, etc.
▶ Pooling Stage: reduce the size






Convolution Stage
▶ Consider a 6 × 6 gray-scale image
▶ Convolve it with a 3 × 3 filter (kernel)
▶ The result is a 4 × 4 matrix

3 0 1 2 7 4
1 5 8 9 3 1       1 0 -1        -5  -4   0   8
2 7 2 5 1 3   ∗   1 0 -1   =   -10  -2   2   3
0 1 3 1 7 8       1 0 -1         0  -2  -4  -7
4 2 1 6 2 8                     -3  -2  -3 -16
2 4 5 2 3 9

▶ (n × n) ∗ (f × f) = (n − f + 1) × (n − f + 1)
▶ f is usually an odd number
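The slide's example can be reproduced with a short pure-Python sketch of valid "convolution" (cross-correlation, as deep-learning texts use the term); the helper name conv2d is illustrative, not lecture code:

```python
def conv2d(image, kernel, stride=1):
    """Valid cross-correlation of a square image with a square kernel."""
    n, f = len(image), len(kernel)
    out_size = (n - f) // stride + 1
    out = [[0] * out_size for _ in range(out_size)]
    for i in range(out_size):
        for j in range(out_size):
            # multiply the f x f window by the kernel and sum
            out[i][j] = sum(
                image[i * stride + a][j * stride + b] * kernel[a][b]
                for a in range(f) for b in range(f)
            )
    return out

image = [[3, 0, 1, 2, 7, 4],
         [1, 5, 8, 9, 3, 1],
         [2, 7, 2, 5, 1, 3],
         [0, 1, 3, 1, 7, 8],
         [4, 2, 1, 6, 2, 8],
         [2, 4, 5, 2, 3, 9]]
kernel = [[1, 0, -1]] * 3   # vertical-edge detector from the slide

result = conv2d(image, kernel)
# result == [[-5, -4, 0, 8], [-10, -2, 2, 3], [0, -2, -4, -7], [-3, -2, -3, -16]]
```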


▶ How to choose a proper filter?
▶ Consider the elements of the filter as weights of a NN
▶ Use the BP algorithm to learn the weights properly

w1 w2 w3
w4 w5 w6
w7 w8 w9

▶ In the Convolution stage:
▶ Features similar to the defined kernel are captured.
▶ Connections are sparse (this makes the problem simpler):
▶ Each output value depends only on a small number of inputs (local)
▶ There is no dense connection between two layers; they are sparse
▶ Parameters are shared:
▶ A filter can be useful in different parts of the input (image)



Some other operations in Convolutional NN

▶ Padding
▶ By convolution:
▶ The size of the image shrinks
▶ Some info at the edges of the image is thrown away (the pixels at the
edge of the image contribute less to the final matrix)
▶ To avoid these problems we can pad the image with additional borders
▶ Padding by 1 (p = 1)

Padding

▶ For convolution we have (n × n) ∗ (f × f) = (n − f + 1) × (n − f + 1)
▶ By padding, we will have
(n+2p) × (n+2p) ∗ (f × f) = (n+2p−f+1) × (n+2p−f+1)
▶ Same conv.: to keep the output the same size as the input image: p = (f − 1)/2





Some other operations in Convolutional NN

▶ Strided Convolution (stride of 2)

2 3 7 4 6 2 9
6 6 9 8 7 4 3        3 4 4       91 100  83
3 4 8 3 8 9 7   ∗    1 0 2   =   69  91 127
7 8 3 6 6 3 4       -1 0 3       44  72  74
4 2 1 8 3 4 6
3 2 4 1 9 8 3
0 1 3 9 2 1 4

Strided Convolution

▶ (n+2p) × (n+2p) ∗ (f × f) = ((n+2p−f)/s + 1) × ((n+2p−f)/s + 1)
▶ s: stride
▶ If (n+2p−f)/s is not an integer, round it down:
(n+2p) × (n+2p) ∗ (f × f) = ⌊(n+2p−f)/s + 1⌋ × ⌊(n+2p−f)/s + 1⌋
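The size formula can be wrapped in a tiny helper to sanity-check the examples in these slides; the function name is illustrative:

```python
import math

def conv_output_size(n, f, p=0, s=1):
    """Output width of an n x n input convolved with an f x f filter,
    padding p and stride s: floor((n + 2p - f) / s) + 1."""
    return math.floor((n + 2 * p - f) / s) + 1

# 6x6 image, 3x3 filter, no padding, stride 1 -> 4x4
assert conv_output_size(6, 3) == 4
# "same" convolution: p = (f - 1) / 2 keeps the output size equal to the input
assert conv_output_size(6, 3, p=1) == 6
# 7x7 image, 3x3 filter, stride 2 -> 3x3, as in the strided example
assert conv_output_size(7, 3, s=2) == 3
```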


Convolution over Volumes


▶ The number of channels of the image and of the filter should be the same


Convolution over Volumes


▶ We can apply several filters (vertical, horizontal, 60-degree, and so on)
to an image simultaneously

▶ (n × n × nc) ∗ (f × f × nc) = (n − f + 1) × (n − f + 1) × nc′
▶ nc: number of channels; nc′: number of filters

Types of layers in a Convolutional Network

▶ Convolution (Conv)
▶ Pooling (Pool)
▶ Fully Connected (FC)
▶ It is a simple single-layer NN with bias


Pooling Layer: Max Pooling

▶ Pooling reduces the size of the feature map and the number of training parameters
▶ It helps prevent overfitting
▶ It speeds up computation
▶ Pooling makes the representation translation invariant, since it behaves
like taking a summary of the info (image)
▶ Pooling can be considered as down-sampling
▶ Good for images that are not centered
▶ The pooling dimensions can be chosen dynamically


Pooling Layer: Max Pooling

▶ With a 2 × 2 filter (f = 2), stride 2 (s = 2)
▶ Take the max at each step
▶ If the image has more than one channel, the number of channels nc of
the input and of the pooling output is the same
▶ In the slide's example (f = 2, s = 2), a 4 × 4 input pools down to

9 2
6 3


Pooling Layer: Average (Mean) Pooling

▶ It is less popular compared to max pooling
▶ Take the average of the pixels at each step
▶ In the slide's example (f = 2, s = 2):

3.75 1.25
4    2
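Both pooling variants can be sketched in a few lines of pure Python. The slide's input grids are not in this transcript, so the 4 × 4 input below is illustrative (its max-pool output happens to match the slide's 9 2 / 6 3); the helper name pool2d is an assumption:

```python
def pool2d(image, f=2, s=2, mode="max"):
    """f x f pooling with stride s; pooling has no trainable parameters."""
    n = len(image)
    out_size = (n - f) // s + 1
    out = []
    for i in range(out_size):
        row = []
        for j in range(out_size):
            window = [image[i * s + a][j * s + b] for a in range(f) for b in range(f)]
            row.append(max(window) if mode == "max" else sum(window) / len(window))
        out.append(row)
    return out

x = [[1, 3, 2, 1],   # illustrative input; the lecture's grid is not in the transcript
     [2, 9, 1, 1],
     [1, 3, 2, 3],
     [5, 6, 1, 2]]

pool2d(x)                 # max pooling -> [[9, 2], [6, 3]]
pool2d(x, mode="avg")     # average pooling -> [[3.75, 1.25], [3.75, 2.0]]
```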


Pooling Layer

▶ Good for shrinking the size rapidly (nH, nW)
▶ Usually no padding is considered for these layers
▶ No training is required for this layer
▶ An nH × nW × nC input results in ⌊(nH − f)/s + 1⌋ × ⌊(nW − f)/s + 1⌋ × nC


A Convolutional NN

▶ nH, nW ↓, nC ↑
▶ Each Conv and pooling is considered as a layer
▶ At the end we have FC layers
▶ The last layer is softmax
▶ Consider the training set (x(1), y(1)), ..., (x(m), y(m))
▶ Using the BP algorithm, optimize the params to minimize the cost fcn:
J = (1/m) Σ_{i=1..m} L(ŷ(i), y(i))

Why Convolutional NN

▶ This conv NN has 5 × 5 params per filter + 1 for bias = 26; with 6 filters
⇝ 26 × 6 = 156 params
▶ If we use a simple NN: 3072 × 4704 ≈ 14M params!!
▶ To deal with big data and avoid overfitting, Conv NN can help
▶ Param Sharing: a feature detector (like a vertical edge detector) that is
useful in one part of the image is probably useful in another part of the image
▶ Sparsity of Connections: at each layer, each output value depends only
on a small number of inputs
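The parameter counts above can be checked with quick arithmetic, assuming (as the 3072 and 4704 suggest) a 32 × 32 × 3 input mapped to 28 × 28 × 6 activations:

```python
# conv layer: six 5x5 filters, each with one bias term
conv_params = (5 * 5 + 1) * 6        # 156 trainable parameters

# fully connected alternative: every input unit (32*32*3 = 3072)
# connected to every output unit (28*28*6 = 4704)
fc_params = (32 * 32 * 3) * (28 * 28 * 6)   # 14,450,688 weights, ~14M
```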

CNN Pros and Cons

▶ CNN can capture the shift-invariant properties of input data.
Therefore, compared with the stacked AE and DBN, CNN-based
diagnosis can directly learn features from the raw monitoring data
without preprocessing such as the frequency-domain transformation.
▶ By sharing weights, the number of training parameters in diagnosis
models is reduced. Therefore, it can accelerate convergence and
restrain overfitting.
▶ Its performance is subject to training with sufficient labeled samples.
▶ Computational complexity: convolutions are expensive


Example [3]
▶ An unsupervised anomaly-detection convolutional AE (CAE) is
proposed
▶ They claimed that combining these two approaches
▶ exponentially reduces the computational cost
▶ decreases the required training data through its ability to extract
essential features from spatial input data
▶ identifies errors larger than the usual pre-trained reconstruction errors
▶ 1-D convolution is applied


Configuration of each layer

[3]

Data Set

▶ Data set sampled from an industrial gas turbine in operation.
▶ The data sets consist of sensor signals from a vibration monitoring
system (VMS) and a distributed control system (DCS) over 30 days,
collected simultaneously without specifying date and time
▶ In VMS, 8 signals are collected from the vibration transmitters attached
to the bearing housing of the turbine rotor.
▶ In DCS, signals are provided from the 3 temperature and 2 pressure
transmitters attached across the combustors and rotor casing.
▶ The sampling interval is one second.
▶ The total data volume is approximately 33.11 million values, consisting of
signal labels in 13 columns and sample values in 2,547,211 rows.


Data Set

[3]

Pre-Processing
▶ The data includes sudden spike or dip patterns under the threshold
alarm level.
▶ They had no labels
▶ The interquartile range (IQR) is used as a measure of the dispersion
in the data.
▶ The IQR contains the second and third quartiles
▶ Any values falling below Q1 − 1.5 IQR or above Q3 + 1.5 IQR are
considered anomalies.
▶ Another column is added to the data (14th column)
▶ any rows that include anomalies are labeled 1
▶ rows without anomalies are labeled 0
▶ This labeling is only used for comparison with other supervised
approaches
▶ For the CAE model the 14th column is not considered
▶ The set has been normalized to the range 0 to 1
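The IQR labeling rule above can be sketched in numpy; the function name and the sample values below are illustrative assumptions, not from the paper:

```python
import numpy as np

def iqr_anomaly_labels(x, k=1.5):
    """Label 1 for samples outside [Q1 - k*IQR, Q3 + k*IQR], else 0."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return ((x < q1 - k * iqr) | (x > q3 + k * iqr)).astype(int)

# illustrative signal with two injected spike/dip anomalies
signal = np.array([1.0, 1.1, 0.9, 1.2, 1.0, 9.0, 1.1, -6.0])
labels = iqr_anomaly_labels(signal)   # -> [0, 0, 0, 0, 0, 1, 0, 1]
```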

Hyperparameters set up

▶ CAE: layers compiled in Keras.
▶ Window size: 32.
▶ Stride: 1.
▶ Mini-batch size: 128.
▶ Kernel size: 5 for all convolutions.
▶ Max pooling size: 2.
▶ L2 regularization: 0.001
▶ Learning rate: 0.01.
▶ Adam as the gradient descent optimizer
▶ The model is trained for 15 epochs.
▶ The test data set size is 20% of the total data volume.


ResNet-Based Approaches [4, 5]

▶ ResNet is made by stacking several residual blocks.
▶ A residual block consists of:
▶ Forward channels made by stacking some convolutional layers.
▶ For example, two convolutional layers handle the input features by

g(x_i^{l−1} | θ_l) = σ_r(x_i^{l−1} ∗ k_{l1} + b_{l1}) ∗ k_{l2} + b_{l2}

θ_l = {k_{l1}, k_{l2}, b_{l1}, b_{l2}}: training params in the l-th residual block
▶ A shortcut connection calculates the sum of the forward-channel output
and the input features
▶ The output of the residual block is

x_i^l = σ_r(g(x_i^{l−1} | θ_l) + x_i^{l−1})
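The residual-block equations above can be sketched in numpy using 1-D "same" convolutions, ReLU as σ_r, and an identity shortcut; the function names and toy kernel values are illustrative assumptions, not the lecture's implementation:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv1d_same(x, k, b):
    """'Same'-size 1-D convolution so the shortcut sum is well defined."""
    return np.convolve(x, k, mode="same") + b

def residual_block(x, k1, b1, k2, b2):
    """x_l = relu(g(x_{l-1}) + x_{l-1}), where
    g(x) = relu(x * k1 + b1) * k2 + b2  (two stacked conv layers)."""
    g = conv1d_same(relu(conv1d_same(x, k1, b1)), k2, b2)
    return relu(g + x)   # shortcut adds the input features

x = np.array([1.0, -2.0, 3.0, 0.5])
k1 = np.array([0.2, 0.5, 0.2])
k2 = np.array([0.1, 0.8, 0.1])
out = residual_block(x, k1, 0.0, k2, 0.0)
# the identity shortcut requires the output shape to match the input shape
```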

ResNet-Based Approaches [5, 6]

▶ In stacking residual blocks, the output of the previous block is the
input of the next one.
▶ The learned features are finally mapped into the target class by
fully-connected layers.
▶ ResNet is developed to obtain higher generalization performance on
the basis of the CNN architecture. Therefore, ResNet-based
diagnosis models inherit the advantages of CNN-based diagnosis
models.
▶ It can possibly provide higher diagnosis accuracy, especially for
dynamically complicated operation conditions such as variable-speed
or variable-load conditions
▶ ResNet avoids vanishing and exploding gradients


Performance Evaluation Indices

▶ MSE: calculates the average squared difference between the vector
of predicted values Ŷ and the correct values Y:

MSE = (1/n) Σ_{i=1..n} (Y_i − Ŷ_i)²

▶ Indicates how well the model has estimated the correct values.
▶ The better the estimation, the closer to 0


Performance Evaluation Indices

▶ Precision: how much the trained model has made relevant estimations
rather than irrelevant ones:

p = TP / (TP + FP)

▶ TP: the true positives, i.e., the sum of examples correctly
classified as positive.
▶ FP: the false positives, i.e., the sum of negative examples
incorrectly classified as positive.
▶ Recall: how much the trained model has returned most of the relevant results:

r = TP / (TP + FN)

▶ FN: the false negatives, i.e., the sum of positive examples
incorrectly classified as negative.

Performance Evaluation Indices

▶ Accuracy: how well the trained model predicts overall:

A = (TP + TN) / (TP + FP + TN + FN)

▶ TN: the true negatives, i.e., the sum of examples correctly
classified as negative.


Performance Evaluation Indices

▶ F1 score: a test accuracy for binary classification on imbalanced
data sets.
▶ Precision and recall are related in a trade-off
▶ The F1 score is derived for a balanced assessment with a single
number between precision and recall (a weighted average of
the precision and recall of the test result):

f = 2·p·r / (p + r)

▶ A higher F1 score, close to 1 in the fraction between 0 and 1,
implies higher classification performance.
▶ In most classification cases, the decision threshold is typically set to 0.5
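The four indices can be computed directly from confusion-matrix counts; the counts below are an illustrative example, not results from the lecture:

```python
def precision(tp, fp):
    """Fraction of positive predictions that are correct."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Fraction of actual positives that are found."""
    return tp / (tp + fn)

def accuracy(tp, fp, tn, fn):
    return (tp + tn) / (tp + fp + tn + fn)

def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# illustrative confusion-matrix counts for a binary fault classifier
tp, fp, tn, fn = 80, 10, 100, 20
p, r = precision(tp, fp), recall(tp, fn)
acc = accuracy(tp, fp, tn, fn)
score = f1(p, r)   # p ~ 0.889, r = 0.800, score ~ 0.842
```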


Performance Visualization Tools

▶ Besides line graphs, other popular tools for visualization of results in
ML are:
▶ The confusion matrix: represents the accuracy of a classification
model.


Performance Visualization Tools

▶ Bar charts plot numeric values for each level of a categorical feature as
bars, for easy comparison.
▶ We can show the possible error of each bar by adding whiskers to the end
of each bar to indicate variability in the individual data points
▶ Since there are many choices of uncertainty measure (e.g. standard
deviation, confidence interval, interquartile range), it is important to
indicate what the error bar represents in a caption or annotation


Performance Visualization Tools

▶ Correlation matrix: aids in understanding relationships between
variables; it represents how different variables interact with each other.
▶ A value closer to 0 means low (or no) correlation,
▶ A value closer to 1 means highly correlated features,
▶ A value closer to −1 means highly inversely correlated features.
▶ Heatmap: used to compare the performance of various models or
algorithms on a given task or dataset.
▶ For instance, you can use them to assess the accuracy scores of
different models on different subsets or folds of data
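As a sketch of reading a correlation matrix, numpy's corrcoef on three illustrative sensor channels shows the +1 and −1 extremes described above (the channel values are assumptions for illustration):

```python
import numpy as np

# three illustrative sensor channels: ch2 tracks ch1, ch3 moves oppositely
ch1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
ch2 = 2 * ch1 + 1          # perfectly correlated with ch1 -> +1
ch3 = -ch1                 # perfectly anti-correlated with ch1 -> -1

corr = np.corrcoef([ch1, ch2, ch3])   # 3 x 3 correlation matrix
# corr[0, 1] == 1.0 and corr[0, 2] == -1.0
```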


Performance Visualization Tools

▶ Box plot: represents the distribution of a data set. This type of
graph shows key statistics of your data, including the median,
quartiles, and outliers
▶ Minimum: the minimum value in the dataset, excluding outliers.
▶ First Quartile (Q1): 25% of the data lies below the first (lower)
quartile.
▶ Median (Q2): the mid-point of the dataset; half of the values lie
below it and half above.
▶ Third Quartile (Q3): 75% of the data lies below the third (upper)
quartile.
▶ Maximum: the maximum value in the dataset, excluding outliers.


Box plot


Further Discussion

▶ Although deep IFD provides acceptable performance, these
approaches always rely on the assumption:
"The labeled data are sufficient and contain complete
information about the health states"
▶ This assumption is usually impractical because of
▶ Imbalanced data: collecting data containing sufficient information to
reflect the complete health states is difficult, since machines mostly
work in the healthy state while faults seldom happen.
▶ Unlabeled data: it is not practical to frequently stop the machines and
inspect the health states.


Further Discussion

▶ Solution: Transfer Learning [7]
▶ Reuse the knowledge from one or more diagnosis tasks in other
related but different ones
▶ For example, diagnosis knowledge from laboratory-used
bearings may help recognize the health states of bearings in
engineering scenarios.
▶ In such a scenario, simulate diverse faults and collect sufficient labeled
data from laboratory-used bearings.
▶ Then the trained IFD diagnosis models are reused in engineering
scenarios


References I

[1] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification
with deep convolutional neural networks," Commun. ACM, vol. 60,
pp. 84–90, 2017.
[2] T. Ince, S. Kiranyaz, L. Eren, M. Askar, and M. Gabbouj, "Real-time
motor fault detection by 1-D convolutional neural networks," IEEE
Trans. Ind. Electron., vol. 63, pp. 7067–7075, 2016.
[3] G. Lee, M. Jung, M. Song, and J. Choo, "Unsupervised anomaly
detection of the gas turbine operation via convolutional auto-encoder,"
in 2020 IEEE International Conference on Prognostics and Health
Management (ICPHM), 2020.


References II

[4] Y. Lei, B. Yang, X. Jiang, F. Jia, N. Li, and A. K. Nandi, "Applications
of machine learning to machine fault diagnosis: A review and roadmap,"
Mechanical Systems and Signal Processing, vol. 138, 2020.
[5] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image
recognition," in IEEE Conference on Computer Vision and Pattern
Recognition, pp. 770–778, 2016.
[6] L. Su, L. Ma, N. Qin, D. Huang, and A. Kemp, "Fault diagnosis of
high-speed train bogie by residual-squeeze net," IEEE Trans. Ind.
Inform., vol. 15, pp. 3856–3863, 2019.


References III

[7] S. J. Pan and Q. Yang, "A survey on transfer learning," IEEE Trans.
Knowl. Data Eng., vol. 22, pp. 1345–1359, 2010.
