Ventricular Arrhythmia Classification Using Convolutional Neural Networks

Abstract

This project applies a convolutional neural network (CNN) to the classification of electro-
cardiogram (ECG) signals labelled as SR (Sinus Rhythm), VT (Ventricular Tachycardia)
and VF (Ventricular Fibrillation). Because CNNs are well known for classifying 2-
dimensional images, the Short Time Fourier Transform is used to convert the ECG
signals into spectrogram images, which are fed into a CNN we built ourselves. We created
images from segments of 2 seconds (or s in this report) to 10 s, with window sizes of 1 s, 2 s
and 3 s and overlaps of 0, 25% and 50%. With a 2 s segment size, 2 s window size and 25%
overlap, we obtained the best sensitivities on SR, VT and VF, which are 97.6%,
76.9% and 78.1% respectively. Comparing the results of different window sizes and input
image sizes, we conclude that the classifiers for the selected window sizes (1 s, 2 s
and 3 s) perform similarly and that the smaller images (192 × 192 pixels compared with
256 × 256 pixels) achieve better results.
Acknowledgements

I would like to express my sincere gratitude to my supervisor, Professor Zoran Cvetkovic,
for giving me the chance to work on such an interesting project; without his expert
guidance, insight and patience, the completion of this project would not have been possible.
Furthermore, I wish to thank my friends and family for supporting and encouraging
me during this period of my life.
Contents

1 Introduction and State of the Art 1


1.1 Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.3 Project Objective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.4 Literature Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

2 Technical Background 4
2.1 Arrhythmia . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
2.1.1 Sinus Rhythms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
2.1.2 Ventricular Tachycardia . . . . . . . . . . . . . . . . . . . . . . . . 4
2.1.3 Ventricular Fibrillation . . . . . . . . . . . . . . . . . . . . . . . . 5
2.2 Short Time Fourier Transformation . . . . . . . . . . . . . . . . . . . . . . 6
2.3 Resolution Issue . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.4 Spectrogram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.5 Convolutional Neural Network . . . . . . . . . . . . . . . . . . . . . . . . 9
2.5.1 Image Input Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.5.2 Convolutional Layer . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.5.3 Pooling Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.5.4 ReLu Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.5.5 Fully Connected Layer . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.5.6 Layer Pattern . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.6 Data Balancing and Approaches . . . . . . . . . . . . . . . . . . . . . . . 13

3 Project Design 14
3.1 Data Transformation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.1.1 Signal Processing . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.1.2 Image Re-processing . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.2 CNN Architecture and Parameter . . . . . . . . . . . . . . . . . . . . . . . 15

3.3 Result Measurement/Confusion Matrix . . . . . . . . . . . . . . . . . . . . 15
3.3.1 Sensitivity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.3.2 Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.3.3 Precision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

4 Development 18
4.1 Programming Language and Library . . . . . . . . . . . . . . . . . . . . . 18
4.1.1 MATLAB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
4.1.2 Important Libraries . . . . . . . . . . . . . . . . . . . . . . . . . . 18
4.2 Spectrogram Generation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
4.2.1 Segment Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
4.2.2 Spectrogram Images Generation . . . . . . . . . . . . . . . . . . . 19
4.2.3 Re-process images . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
4.3 Convolutional Neural Network . . . . . . . . . . . . . . . . . . . . . . . . 20
4.3.1 Training Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
4.3.2 Network Architecture . . . . . . . . . . . . . . . . . . . . . . . . . 21
4.4 Confusion Matrix Generation . . . . . . . . . . . . . . . . . . . . . . . . . 23

5 Experiment Procedure and Result 25


5.1 Experiment Procedures . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.1.1 Data Preprocessing . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.1.2 Network Construction . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.2 Experiment Result and Analysis . . . . . . . . . . . . . . . . . . . . . . . 26

6 Conclusion 30
6.1 Project Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
6.2 Chapters Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
6.3 Achievement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
6.4 Future Works and Extension . . . . . . . . . . . . . . . . . . . . . . . . . 31
6.4.1 Fine-tuning Spectrogram Parameters . . . . . . . . . . . . . . . . . 31

6.4.2 Different Window Function . . . . . . . . . . . . . . . . . . . . . . 32
6.4.3 The Application of 1-dimensional Data for CNN . . . . . . . . . . 32

References 33

A Appendix 35
A.1 Project Gantt Chart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
A.2 Preprocessing Example Code . . . . . . . . . . . . . . . . . . . . . . . . . 36
A.2.1 Preprocessing Example Code for 192 x 192 image data sets . . . . 36
A.2.2 Preprocessing Example Code for 256 x 256 image data sets . . . . 80
A.3 Training Scripts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
A.3.1 Training Scripts for 192 x 192 images data sets . . . . . . . . . . . 109
A.3.2 Training Scripts for 256 x 256 images data sets . . . . . . . . . . . 111

List of Figures

1 Example of Sinus Rhythm ECG tracing . . . . . . . . . . . . . . . . . . . 5


2 Comparison of 1s and 3s window size spectrogram images . . . . . . . . . 8
3 Example of Spectrogram with window size 3s . . . . . . . . . . . . . . . . 9
4 Convolutional layer example . . . . . . . . . . . . . . . . . . . . . . . . . . 11
5 Example of Confusion Matrix . . . . . . . . . . . . . . . . . . . . . . . . . 16
6 The snapshot of spectrogram function. . . . . . . . . . . . . . . . . . . . . 20
7 The snapshot of code for reprocessing the image. . . . . . . . . . . . . . . 20
8 Comparison of spectrogram images before and after processing . . . . . 21
9 Training Option in MATLAB . . . . . . . . . . . . . . . . . . . . . . . . . 21
10 CNN Architecture and Parameters in MATLAB . . . . . . . . . . . . . . 22
11 Example of confusion matrix . . . . . . . . . . . . . . . . . . . . . . . . . 24
12 CNN architecture for 1D data . . . . . . . . . . . . . . . . . . . . . . . . . 32

List of Tables

1 Experiment Result of 192 × 192 size image . . . . . . . . . . . . . . . . . 27


2 Experiment Result of the selected samples with 3 seconds window size . . 28
3 Comparison between different window sizes of 2-6 seconds samples . . . . 29
4 Comparison between different dimension spectrograms from selected 2-6
seconds samples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29

1 Introduction and State of the Art

1.1 Problem Statement

As the World Health Organization states, cardiovascular disease has been one of the
leading causes of human death in both developed and developing countries. Therefore,
many researchers are trying to come up with appropriate and effective medical treatments
for cardiac arrhythmia. One of the key factors in their success is the ability to distinguish
different forms of arrhythmia and to define them. To date, precise diagnostic criteria
have been established only in natural language, which is a problem for computer-aided
diagnosis because automatic discrimination between these diseases cannot be programmed
without a numeric form. Among these diseases, Ventricular Fibrillation (VF) and Ventricular
Tachycardia (VT) are particularly problematic for computer-aided diagnosis since VF and VT
are not always unequivocal. This is because VT is not always monomorphic, and a range of
transient and persistent forms of polymorphic VT can occur. Discrimination between
VF and polymorphic ventricular tachycardia (VT) or Torsade de Pointes (TDP) is very
challenging, but it is deemed critical since different treatments are needed for VT and VF,
and a false classification may damage the patient's health or even cause
death.[1][2]
As computer-aided diagnosis is one trend in machine learning applications, the aforementioned
issue has encouraged plenty of studies seeking a reliable approach to classifying
VT and VF using fragments of the electrocardiogram (ECG) signal. It is interesting to
note that even though many of these studies claim to have obtained very high accuracy
on this task, the death rate of these diseases is still extremely high; this could result
from a few causes, for example, the difficulty of reproducing the research because
of data authentication. [1][2]

1.2 Introduction

This project aims to classify normal Sinus Rhythms, Ventricular Fibrillation and Ven-
tricular Tachycardia by using a convolutional neural network as the classifier. The original
data is a 1-dimensional signal dataset sampled at 100 Hz, and this contributes
to one of the major difficulties of developing this project. Because CNNs perform well
in image classification with 2-dimensional data, the data is visualized as spectrograms.
The reason for choosing this method and how it can be developed will be explained in
later sections. In addition, the essential parameters, namely segment size (how many
samples form one spectrogram image), window size (the size of the window function)
and overlap size (the percentage by which each window overlaps with adjacent windows),
which are tuned to achieve better sensitivity for each class (SR, VT, VF) and better overall
accuracy, will be introduced in later sections as well.

1.3 Project Objective

The objective of this project is to evaluate the CNN classification performance on ECG
signals labeled as SR, VF, and VT in terms of sensitivity. The expected sensitivities for
SR, VF, and VT are over 90%, 50%, and 50% respectively.

1.4 Literature Survey

To understand, design and develop this project, we researched and applied concepts
and methodologies from several studies. This section lists the important studies
that guided the design and construction of this project.
Alwan et al. applied a Support Vector Machine to classify ECG signals labelled as Sinus
Rhythms, Ventricular Tachycardia and Ventricular Fibrillation. The study picks only
6000 seconds of each class from the original signal to deal with the unbalanced data, and
they obtained 91% overall accuracy and 86% sensitivity for each class with a
2 second observation length.[1]
Research by Yuan and Cao used a convolutional neural network to classify electro-
encephalogram (EEG) signals and evaluated the condition of patients suffering from
brain damage. A novel method using the Short-time Fourier Transform (STFT) to generate
spectrograms from the EEG signals, and using the spectrogram images as the input
dataset of the CNN, is proposed in this paper. Achieving the highest accuracies of 96% and
94% for classification of normal and brain-death EEG signals, the approach is con-
sidered a remarkable success.[3] Similar approaches have also been applied to other
datasets such as audio signals.[4]
Research by Kostiantyn Pylypenko suggests that preprocessing is necessary when
the dimension of the input dataset is huge, because such preprocessing has computational
benefits and also helps to deal with overfitting. [5]
According to Tsinalis et al., the input of a CNN can also be the raw 1-dimensional signal
rather than only 2-dimensional data. They successfully applied a convolutional neural net-
work to EEG signals as input and compared the results with the outcomes from using
spectrogram images as input. Although the raw signal does not outperform the spec-
trogram in this case, the network is still able to learn usable features. As there is a growing
interest in applying CNNs to 1-dimensional data, this study appears to be very appealing. [6]

2 Technical Background

This section lists and describes all the technical knowledge the reader has to acquire to
understand the report clearly.

2.1 Arrhythmia

This section explains the definitions of Sinus Rhythm, Ventricular Tachycardia and Ven-
tricular Fibrillation, as well as their characteristics and symptoms.

2.1.1 Sinus Rhythms

Sinus Rhythm, or SR, sometimes referred to as normal sinus rhythm, is considered a normal
and regular heart rate and rhythm. The shape of the ECG tracing in Figure 1 shows the main
traits of SR. As the figure illustrates, the first peak, noted P, is the atrial contraction,
also called depolarization, which is caused by the atria squeezing. After that, there is
an intense wave, noted as the QRS wave, caused by ventricular depolarization; it is much
stronger because the ventricles beat more strongly than the atria. Then a T wave appears
because of the repolarization of the ventricles; the atria also repolarize, but this activity is
overwhelmed by the ventricular depolarization. If a patient's ECG shows the characteristics
of SR, no specific treatment is urgently required.

2.1.2 Ventricular Tachycardia

Ventricular Tachycardia, also known as V-Tach or VT, is a fast yet regular heart rate
arising from the lower chambers of the heart, namely the ventricles, which are the main
pumps of the heart, pushing blood to the rest of the body. In normal conditions, the heart
typically beats at 60-100 beats per minute (bpm); between heart beats the ventricular
walls relax so the heart fills with blood, and when the heart beats the walls contract and
the blood is pushed to the rest of the body. In VT, however, the heart beats very fast,
sometimes up to 250 bpm, and as a result the heart cannot circulate enough blood to
the body. Although this might seem counterintuitive, since in a normal situation we would
assume that the faster the heart beats the more blood is sent to the body, the truth is that
blood fills the heart between beats, and if the heart beats too fast there is not enough time
for it to fill with enough blood, so the blood circulated to the rest of the body is insufficient.
When people suffer from VT they might feel shortness of breath, chest pain, light-headedness
or dizziness. In extreme cases, the supply of oxygen to the brain is so reduced that the person
might even faint or pass out. Therefore, VT is a very serious condition and requires immediate
medical attention. [7]

Figure 1: Example of Sinus Rhythm ECG tracing

2.1.3 Ventricular Fibrillation

Ventricular Fibrillation, also known as V-Fib or VF, is an abnormal, deadly heart rhythm
caused by the ventricles losing the ability to contract and circulate blood to the rest of
the body. It occurs when the heart loses coordinated electrical conduction in the ventricles
and instead produces rapid, random and chaotic signals, which leads to the ventricular walls
spasming. When this happens, blood cannot be circulated to the body and brain;
instead, it stays in the heart and deprives all other organs of oxygen. If the VF
is not reversed by electrical shock immediately, the patient will suffer permanent
brain damage or even death, because the brain and the body are not getting enough
oxygen. [8]

2.2 Short Time Fourier Transformation

To better comprehend the Short Time Fourier Transform (STFT), a clear understanding
of the Discrete Fourier Transform (DFT) is necessary. The DFT can be regarded as the
Continuous Fourier Transform applied to a signal which is known only at N samples
separated by the sampling time T. The original Fourier transform is:

F(j\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-j\omega t}\, dt    (2.1)

where f(t) is the continuous signal. In the DFT, the N samples mentioned above can be
denoted as f(0), f(1), ..., f(N-1), which results in:

f(0)e^{-j0} + f(1)e^{-j\omega T} + ... + f(N-1)e^{-j\omega (N-1)T}    (2.2)

or,

F(j\omega) = \sum_{k=0}^{N-1} f[k]\, e^{-j\omega k T}    (2.3)

In principle, we could apply Equation 2.3 for every \omega (the radial frequency, equal to
2\pi/T), but since only the N samples are significant in the DFT, we consider only the
corresponding outputs. As the DFT assumes the data are periodic, \omega can be set to
the fundamental frequency and its harmonics:

\omega = 0, \frac{2\pi}{NT}, ..., \frac{2\pi}{NT} \times (N-1)    (2.4)

Therefore, Equation 2.3 can be transformed into

F[n] = \sum_{k=0}^{N-1} f[k]\, e^{-j\frac{2\pi}{N} n k}    (2.5)

where F[n] is the DFT coefficient of the sequence f[k].

The Short Time Fourier Transform is a type of Fourier Transform which separates
the data into several equal-length segments and applies the DFT to each of these segments.
Additionally, each segment is multiplied by a window function before the DFT, and these
windows usually overlap, typically by 25% to 50%, to reduce the effect of spectral
leakage. [9]
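To make Equation 2.5 concrete, the following MATLAB sketch (an illustration added here, not part of the project code) evaluates the DFT coefficients directly from the summation and checks them against MATLAB's built-in fft function; the test sequence is arbitrary.

% Minimal sketch: DFT coefficients F[n] computed directly from Equation 2.5
% and compared against MATLAB's built-in fft. The test sequence is arbitrary.
N = 8;                              % number of samples
k = 0:N-1;
f = sin(2*pi*2*k/N) + 0.5;          % example sequence f[k]

F = zeros(1, N);
for n = 0:N-1
    F(n+1) = sum(f .* exp(-1j*2*pi*n*k/N));   % Equation 2.5
end

disp(max(abs(F - fft(f))));         % difference is numerically negligible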

2.3 Resolution Issue

One thing to keep in mind when using the STFT is the resolution issue arising from the
uncertainty principle: the window size affects the way the information is presented. A
trade-off between good frequency resolution and good time resolution has to be made.
The principle implies that a longer window gives better frequency resolution but poorer
time resolution, and vice versa. An example is shown in Figure 2. The image with the
3 s window size (the right part) has more DFT coefficients and displays more frequencies
than the one with the 1 s window size, located in the left part. The 1 s window image, in
contrast, presents the location of changes in time more precisely, while its frequency
resolution is reduced. This particular principle is one of the primary reasons why we
created spectrogram images with different window sizes.

Figure 2: Comparison of 1s and 3s window size spectrogram images

2.4 Spectrogram

A spectrogram can be described as a visual presentation of the intensity of the frequencies
of a signal as they change over time. As Figure 3 illustrates, the X-axis represents
frequency, in this case the normalized frequency. Normalized frequency ranges from 0
to 1 and is expressed in cycles/sample. For example, with a sampling rate of 800 Hz,
a component at 500 Hz corresponds to 500/800, which is about 0.62, on this scale.
The Y-axis, on the other hand, represents time, which in this case is expressed in sample
numbers. The sampling frequency of the ECG signals in this project is 100 Hz, which
means 100 samples correspond to 1 s. The segment duration therefore determines the
extent of the Y-axis; for instance, a 3 s figure has a Y-axis reaching 300. Additionally,
the colour of the spectrogram stands for the intensity or magnitude: brighter colours
mean high intensity while dark colours stand for low intensity. Furthermore, one can
notice horizontal bands when looking closely at the provided spectrogram images. Those
bands are, in fact, the result of each window function, and their sizes are affected by the
overlap of the windows. For instance, Figure 3 shows the spectrogram resulting from a
3 s segment size (300 samples are used to generate one image) with a 1 s window function
and 50% overlap on each window. In the example, it can be seen that the length of the
first and last bands is around 75, which is different from the other bands with a length
of 50. This is the result of the overlap operation: when a 50% overlap is specified, it is
split into two overlapping parts of 25% at each end of the window, and since the first
and last windows overlap with a neighbouring window on only one side, these two windows
appear with a length of 75 while the others appear with a length of 50 due to the overlaps
at both ends. [10]

Figure 3: Example of Spectrogram with window size 3s

2.5 Convolutional Neural Network

The convolutional neural network, also known as CNN, has proven very effective in fields
such as image recognition and classification. The basic idea of a CNN is that, given an
image, which the computer treats as an array of numbers, the network calculates numbers
indicating the probability of the input belonging to each class and assigns the input to the
class with the highest probability.
A CNN is a type of feed-forward artificial neural network where the neurons inside each
layer have learnable weights and biases. The connectivity pattern between the neurons
in CNN layers is inspired by the organization of the animal and human visual cortex. When
a person looks at a picture of a human, he or she is able to identify the object that the
picture presents if the image has identifiable features such as hands or fingers.
Computers, on the other hand, can perform tasks like image classification only by first
looking for low-level features such as edges and curves. Then, through a sequence of
different layers, computers can build up abstract concepts such as fingers.
A CNN stacks multiple layers: the input layer, the output layer and the hidden layers,
which are convolutional layers, pooling layers, ReLU layers and fully connected layers.
In the following sections, each layer is described in detail.[11]
For a better understanding of convolutional neural networks, there are two terms that
have to be understood first: Convolution Kernels and Receptive Fields.
A convolution kernel, also known as a filter, is essentially a small matrix of real-valued
entries which is convolved with the input volume to generate an Activation Map. The
map is created by taking the dot product of the kernel with the selected field. After
recording the dot product, the kernel slides by a stride value until the entire input
volume has been processed. This process happens in every colour channel. Figure 4
illustrates how this operation works for an input matrix of size 7 × 7, to which a
3 × 3 filter with stride equal to 1 is applied, generating an activation map of 5 × 5. The
result highlighted in green is obtained from the dot product of the red region with the
convolution kernel; the calculation is presented in Equation 2.6.[12] [13]

(1, 1, 0) \cdot (1, 0, 1) + (1, 1, 0) \cdot (0, 1, 0) + (1, 1, 1) \cdot (1, 0, 1) = 1 + 1 + 2 = 4    (2.6)

Figure 4: Convolutional layer example

Through a few hidden layers, a regular neural network receives an input and transforms it
into an output. Each of these hidden layers consists of a number of neurons, and each
neuron is fully connected to all the neurons in the previous layer.
A CNN instead connects each neuron to a relatively small 2-dimensional spatial region
named the Receptive Field (Width × Height) and extends it through the full depth of the
input volume by default. The connection of each neuron to its paired Receptive Field is,
however, full, meaning that the neuron connects to every pixel inside the field. For example,
if we have an input of size 192 × 192 × 3 and the receptive field is 3 × 3, then each neuron
in this layer will have 27 weights (3 × 3 × 3). These small regions enable the local
convolution operation of the CNN and produce the activation maps. [12] [13]
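As a minimal sketch of the sliding-kernel operation described above (illustrative only, with made-up input and kernel values; it is not the project's code), the following MATLAB lines produce the 5 × 5 activation map of a 3 × 3 kernel slid over a 7 × 7 input with stride 1.

% Minimal sketch: a 3x3 kernel slid over a 7x7 input with stride 1 yields
% a 5x5 activation map; each entry is the dot product of the kernel with
% one receptive field. The input and kernel values are arbitrary.
input7  = magic(7);                              % example 7x7 input
kernel3 = [1 0 1; 0 1 0; 1 0 1];                 % example 3x3 filter

% conv2 flips the kernel, so rotate it by 180 degrees to obtain the sliding
% dot product (cross-correlation) described in the text.
activationMap = conv2(input7, rot90(kernel3, 2), 'valid');
disp(size(activationMap));                       % prints 5 5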

2.5.1 Image Input Layer

Images are typical inputs of a CNN, and each one is treated by the computer as a matrix of
pixel values. The Image Input layer is therefore the CNN layer that receives images as
the CNN input. Depending on its bit depth, each pixel in an image encodes a range of
values; the most common pixel size is 8 bits, or 1 byte, giving a range from 0 to 255.
Additionally, for colour images, especially RGB (Red, Green, Blue) images, the separate
colour channels (3 if the input is RGB) add a depth dimension to the input data, making
it 3-dimensional. As a result, any given RGB image of 240 × 240 (Width × Height) pixels
requires 3 matrices: besides the width and height we need another dimension for the colour
channel. Images with this 3-dimensional structure are therefore referred to as the Input
Volume, in this case 240 × 240 × 3. [12] [13]

2.5.2 Convolutional Layer

Also known as the Conv layer, the convolutional layer is the crucial part of the CNN and
performs its core operations, training the network and refining the neurons. As the name
suggests, it applies convolution operations to the input as described in the convolution
kernel section: for each input it applies the filters, named convolution kernels, and moves
each filter around by a specified number of positions (the stride) over the whole input
volume. Consisting of a number of neurons, the Conv layer arranges them in a 3-dimensional
manner with Width, Height and Depth, where the depth column matches the depth of the
input volume, and fully connects each neuron to a small region of the input named the
Receptive Field, as stated before. Similar to all feed-forward neural networks, the CNN
trains and updates the weights through the back-propagation algorithm on each learning
iteration, with a learning rate which defines how much the parameters change every
iteration. [12] [13]
2. Technical Background 12

2.5.3 Pooling Layer

Also known as the down-sampling layer, this layer is conventionally inserted between two
Conv layers to reduce the spatial dimension, Width × Height, of the input volume of the
Conv layer placed afterwards, while the depth dimension remains exactly the same. As
the name suggests, this layer performs a downsampling operation; although information is
lost due to the size reduction, this is actually beneficial in neural networks because it not
only reduces the computational overhead for later layers but also helps with overfitting.
The most common pooling layer applies 2 × 2 filters with a stride of 2 and picks out the
maximum value in the selected region to form the output, discarding 75% of the activation
map generated by the previous layer. Apart from the max operation, the pooling layer can
perform other operations as well, such as average pooling. In summary, the pooling layer
accepts an input volume of size W × H × D and downsamples this input using two
hyperparameters, the spatial extent F and the stride S, then generates an output of size
[(W − F)/S + 1] × [(H − F)/S + 1] × D. It is worth mentioning that the most common
pooling layers are F=3, S=2 (in which the filters overlap with each other) and F=2, S=2;
larger receptive fields than these are usually counterproductive. [12] [13]
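As a worked illustration using the parameters adopted later in this project (5 × 5 filters, 30 feature maps, stride 1, no padding, followed by F = 2, S = 2 pooling), and assuming the analogous output-size formula applies to the convolutional layer, a 192 × 192 × 3 input gives:

(192 − 5)/1 + 1 = 188, so the Conv layer produces a 188 × 188 × 30 volume;
(188 − 2)/2 + 1 = 94, so the pooling layer reduces it to 94 × 94 × 30.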

2.5.4 ReLu Layer

ReLU is the abbreviation of Rectified Linear Unit; in this layer all the negative values in
the activation map are changed to 0, which helps with generalization. [11]

2.5.5 Fully Connected Layer

As its neurons are fully connected to all outputs of the previous layer, like a regular Neural
Network, the Fully Connected layer takes the output matrix of the previous layer and
generates an output vector. The vector generated by the Fully Connected layer has the
same number of dimensions as the number of classes the network has to predict. In this
layer, all the values vote for the final result, and for each class the associated weight on
each value is different, thus resulting in different outcomes. The class with the highest
value usually becomes the CNN's predicted class, and this is commonly the last stage of
a CNN. [12] [13]

2.5.6 Layer Pattern

The most common CNN pattern is to stack a few Conv+ReLU layers followed by a
Pooling layer and repeat this pattern (normally between 1 and 3 times) until the input
image has been processed down to a small spatial size. The last fully connected layer holds
the class predicted by the CNN. [12] [13]

2.6 Data Balancing and Approaches

Typically, imbalanced data refers to the situation where a minority class is significantly
smaller than the majority class, which is very common among medical datasets such as ECG
or EEG, because there are always far more normal recordings than pathological ones. Such
imbalanced data can make classification algorithms suffer, because a typical classifier
becomes more sensitive to the majority class and less sensitive to the minority class. The
preprocessing of imbalanced data before feeding it into a classifier is called data balancing.
One thing to note is that an exactly equal amount of data per class is in fact very rare, so
small differences can be ignored while significant differences are not tolerable.
In the ECG signals we use for this project, we have much more SR data than VT and VF,
so preprocessing of the imbalanced data is necessary. There is more than one way to deal
with imbalanced data; in this report we emphasize two, namely subsampling[1] and the
confusion matrix. As one of the most common and straightforward strategies for coping
with imbalanced data, subsampling in this project means picking the same amount of
samples from each of the 3 classes to achieve balanced data. As accuracy is not the best
measurement for imbalanced data, we choose another metric, the confusion matrix, to see
the details of accurate and inaccurate predictions, as well as other measurements such as
sensitivities. The details of the confusion matrix are explained in the design chapter.

3 Project Design

3.1 Data Transformation

In this project, we process the original signal data in 2 steps. First, the STFT is used to
transform the 1-dimensional signal data into 2-dimensional spectrogram images. Then, the
unnecessary parts of the images are discarded and the images are resized for performance
reasons.

3.1.1 Signal Processing

The original data of this project is a 1-dimensional ECG signal, and the classifier we chose
is a convolutional neural network, which is typically used to classify 2-dimensional data
such as images. It is therefore necessary to reprocess the original 1-dimensional data into
2-dimensional images. We are not aware of research applying CNNs to ECG signals in this
way, but we found one study using the same classifier, CNN, on electroencephalogram
(EEG) signals and chose it as the guideline for how to process the signal data. In that
study, Yuan and Cao classified brain-damage and normal EEG data with a CNN by
transforming every 20 seconds of signal into spectrogram images using the STFT method
on the MATLAB platform. Furthermore, to make the most of the finite data, each window
in the STFT overlaps the adjacent windows by 20%. Following this research, we decided
to transform our ECG signal into spectrogram images to train the CNN, with the
parameters, namely the number of samples used to generate one image (segment size),
the window size and the overlap percentage, left to be determined by experiment in order
to achieve the optimal result. [3]

3.1.2 Image Re-processing

Due to the input layer format of the CNN, the produced spectrograms have to be re-sized
in order to be fed into the network.[5] Yuan and Cao suggest an image size of 256 × 256 [3],
which is the maximum spatial size of the CNN input layer used, while another study,
conducted by Kostiantyn Pylypenko, argued that re-sizing the images to 192 × 192
actually leads to better performance in terms of running time and classification result.[5]
To obtain the best result we can, some images in this project are re-sized to both 256 × 256
and 192 × 192 to compare the results and provide guidance for future work.

3.2 CNN Architecture and Parameter

The CNN architecture and parameters are defined with MATLAB-provided functions,
which are explained in detail in the Development chapter, Section 4.3.

3.3 Result Measurement/Confusion Matrix

Because of the imbalanced original data, and wishing to explore more aspects of the
experiment outcomes, we choose the confusion matrix as the measurement of the
classification results.
As Kevin Markham describes, the confusion matrix is a table that lists all the test data
with their true values and predicted values. One thing to note is that a confusion matrix
can be larger than 2 × 2; in our case the matrix has size 3 × 3. For easier understanding,
Figure 5 gives an example of the confusion matrix of a binary classifier, and the further
explanation is based on this example.

Figure 5: Example of Confusion Matrix

There are four basic terms to understand:
True Positives (TP): the classifier predicted yes and the actual value is yes as well; this is
the cell in the second row and second column.
True Negatives (TN): the classifier predicted no and the actual value is also no; this is the
cell in the first row and first column.
False Positives (FP): the classifier predicted yes but the actual value is no; this is the cell
in the second row and first column.
False Negatives (FN): the classifier predicted no but the actual value is yes; this is the cell
in the first row and second column.
Based on these four basic terms, we can define several rates used to describe the
classification performance of the classifier. [14]

3.3.1 Sensitivity

Sensitivity, also known as Recall, measures how often the classifier predicts yes when the
actual value is yes; the analogous rate for the no class (how often the classifier predicts no
when the actual value is no) is usually called specificity.

TP / actual yes = 100/105 = 0.95    (3.1)

TN / actual no = 50/60 = 0.83    (3.2)

3.3.2 Accuracy

Accuracy measures how often, overall, the classifier makes correct predictions.

(TP + TN) / Total = (100 + 50)/165 = 0.91    (3.3)

The Error Rate, also known as the misclassification rate, measures how often the classifier
gives a wrong prediction. It can be calculated as one minus the accuracy, or as in
Equation 3.4.

(FP + FN) / Total = (10 + 5)/165 = 0.09    (3.4)

3.3.3 Precision

This rate measures how often the classifier is correct when it predicts a given class, so the
denominator is the number of predictions of that class rather than the number of actual
cases.

TP / predicted yes = 100/110 = 0.91    (3.5)

TN / predicted no = 50/55 = 0.91    (3.6)

[14]
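The per-class sensitivity, precision and overall accuracy described above can be read directly off a confusion matrix. The following MATLAB sketch (our illustration; the counts are invented and are not experimental results) computes them from a 3 × 3 matrix whose rows are true classes and whose columns are predicted classes.

% Minimal sketch: per-class sensitivity and precision plus overall accuracy
% from a confusion matrix. Rows are true classes, columns are predicted
% classes; the counts below are invented for illustration only.
C = [480  10  10;      % SR
      15 390  95;      % VF
      20 100 380];     % VT

sensitivity = diag(C) ./ sum(C, 2);    % TP / actual samples of each class
precision   = diag(C) ./ sum(C, 1)';   % TP / predictions of each class
accuracy    = sum(diag(C)) / sum(C(:));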

4 Development

This section describes the development tools used for the project as well as explanations
of the essential functions.
To develop this project, a Windows 10 computer with an Intel Core(TM) i7-5500U
(2.40 GHz) CPU, 16.0 GB of DDR3 RAM and an NVIDIA 940M Graphics Processing
Unit (GPU) with 4 GB of VRAM was used.

4.1 Programming Language and Library

4.1.1 MATLAB

The programming language chosen for this project is MATLAB (version R2017a-9.2.0.538062),
which stands for MATrix LABoratory and was originally written to provide easy access
to the matrix software developed by the LINPACK (linear system package) and EISPACK
(eigensystem package) projects. Integrating visualization, a programming environment and
computation, MATLAB is not only a high-performance programming language but also a
language environment that contains sophisticated data structures and built-in debugging
tools. Due to these traits, MATLAB is an excellent tool for research. Most importantly,
MATLAB has powerful libraries for neural networks, signal processing and image
processing, which are extremely necessary and helpful for this project due to the problem
domain. [15][16]

4.1.2 Important Libraries

Collections of specific applications packaged together are referred to as Toolboxes in
MATLAB. To complete this project, we chose the MATLAB toolbox called Neural Network
Toolbox, which provides apps, algorithms, and pretrained models for creating and training
neural networks. In addition, the Parallel Computing and Signal Processing Toolboxes
have been used to utilize the GPU in computation and to process the signal data, for
example for spectrogram image generation.
4. Development 19

4.2 Spectrogram Generation

This section describes how the spectrogram images are generated from the given ECG
signals. The main steps are choosing the number of samples for generating one spectrogram
image (segment size), generating the images and re-processing the images for training
performance.

4.2.1 Segment Size

The number of samples in one spectrogram image is defined by the variable segment_size.
For each class (SR, VT and VF) we define start_point, which is initially 1 and then
advances to the previous end point after each iteration, and end_point, which equals
start_point plus segment_size. After defining the variables, we use a while loop to apply
this interval iteratively to SR, VT and VF to pick around 6000 s (600,000 samples) from
each of the three classes. One thing to notice is that the original data is stored in cells,
and as each cell contains a different number of samples, the remaining samples in a cell
may be fewer than the segment size we defined. In that situation, we discard the remaining
data in this cell and jump to the next cell, with start_point reset to 1.
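A minimal sketch of this segmentation loop is given below; the variable names (ecg_cells, segment_size, start_point, end_point) and the placeholder data are assumptions for illustration rather than the project's exact code.

% Minimal sketch of the segmentation loop: walk through the cells of one
% class, cut consecutive segments of segment_size samples and discard the
% tail of a cell that is shorter than one segment. The data is a placeholder.
ecg_cells    = {randn(1, 1500), randn(1, 950)};   % stand-in for the real ECG cells
segment_size = 200;                               % e.g. 2 s at 100 Hz
max_samples  = 600000;                            % roughly 6000 s per class
collected    = 0;

for c = 1:numel(ecg_cells)
    signal      = ecg_cells{c};
    start_point = 1;
    end_point   = start_point + segment_size - 1;
    while end_point <= numel(signal) && collected < max_samples
        segment = signal(start_point:end_point);
        % ... generate and save a spectrogram for this segment (next section)
        collected   = collected + segment_size;
        start_point = end_point + 1;
        end_point   = start_point + segment_size - 1;
    end
end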

4.2.2 Spectrogram Images Generation

The STFT generates the spectrogram based on several given parameters; the spectro-
gram(x, window, noverlap) function is used to create all the required spectrogram images.
x is the input signal, which in our case is defined by start_point and end_point, picking
segment_size samples from the cells of each class. The parameter window, defined by the
variable window_size, is used to split the signal into pieces and apply a window function
to each piece. The last parameter, noverlap, is the number of samples by which adjacent
pieces overlap; it is defined by multiplying different percentages with window_size for the
purpose of searching for the optimal result. The parameter values of this function change
according to the images we aim to produce, while the Hamming window function is used
for all the experiments. Figure 6 is a snapshot of an example spectrogram call.

Figure 6: The snapshot of spectrogram function.
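A hedged sketch of the corresponding call is shown below, using the Figure 3 example of a 3 s segment, a 1 s Hamming window and 50% overlap; the segment itself is a placeholder rather than real ECG data.

% Minimal sketch: spectrogram of one 3 s segment (300 samples at 100 Hz)
% using a 1 s Hamming window with 50% overlap, as in the Figure 3 example.
% Called without output arguments, spectrogram plots into the current figure.
segment     = randn(1, 300);                 % placeholder for one segment
window_size = 100;                           % 1 s window at 100 Hz
noverlap    = round(0.50 * window_size);     % 50% overlap
spectrogram(segment, hamming(window_size), noverlap);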

Figure 7: The snapshot of code for reprocessing the image.

4.2.3 Re-process images

Typically, the figures produced by the spectrogram(x, window, noverlap) function are
larger than the input size defined for the CNN image input layer, because they include
other elements, such as axis annotations, which do not contain any useful information for
the CNN. Consequently, we remove the unnecessary parts and re-size the images in order
to use them as input images for the CNN. Fortunately, this can be completed simply by
leaving xlabel and ylabel empty and setting the figure position and paper position to the
target size. Figure 8 shows a comparison of the spectrogram images before and after
processing, while Figure 7 is a snapshot of the code for reprocessing the images.

Figure 8: Comparison of spectrogram images before and after processing
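The following MATLAB sketch illustrates the idea; the property values are assumptions for illustration, the exact pixel size obtained can depend on the rendering environment, and the project's own code is the one shown in Figure 7.

% Sketch of the re-processing step: plot a spectrogram, strip the axis
% annotations and save the figure at a fixed size (here 192 x 192) so it
% can be used as CNN input. Property values are illustrative assumptions.
spectrogram(randn(1, 300), hamming(100), 50);           % placeholder plot
xlabel(''); ylabel('');                                  % empty axis labels
set(gca, 'XTick', [], 'YTick', []);                      % remove tick marks
set(gca, 'Units', 'normalized', 'Position', [0 0 1 1]);  % axes fill the figure
set(gcf, 'Units', 'pixels', 'Position', [100 100 192 192], ...
         'PaperPositionMode', 'auto');
print(gcf, '-dpng', '-r0', 'SR_example_192.png');        % save as a PNG image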

4.3 Convolutional Neural Network

This section focuses on how the CNN is created and how it is trained with the images
generated in the previous section.

4.3.1 Training Options

By providing a list of ready-made functions designed specifically for setting up and training
a convolutional neural network, the Neural Network Toolbox reduces our workload
significantly. Before building the network, we specified a few training options, such as the
learning rate, by applying the trainingOptions(solverName,Name,Value) function, which
returns a set of training options specified by name-value argument pairs. Figure 9 is a
screenshot of the training options we specified, where we set the network to be trained
using stochastic gradient descent, the maximum number of training epochs to 30 and the
initial learning rate to 0.0001, which remains the same throughout the training process.

Figure 9: Training Option in MATLAB
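A sketch of the corresponding call is shown below; 'sgdm' is MATLAB's stochastic gradient descent with momentum solver, and the values follow the options described above.

% Training options described above: stochastic gradient descent (with
% momentum in MATLAB's 'sgdm' solver), at most 30 epochs, and a constant
% initial learning rate of 0.0001.
options = trainingOptions('sgdm', ...
    'MaxEpochs', 30, ...
    'InitialLearnRate', 0.0001);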

4.3.2 Network Architecture

The Neural Network Toolbox provides functions for every CNN layer; here we only describe
the ones we need and used for this project. Figure 10 shows the CNN architecture and
parameters used for this project.

Figure 10: CNN Architecture and Parameters in MATLAB

Image input layer: this layer is defined by the function imageInputLayer(inputSize), where
inputSize specifies a vector of two integers standing for [Height,Width] or three numbers
standing for [Height,Width,Channels]. To obtain the optimal result we use images of two
sizes, as mentioned before, so the image input layers have two inputSize values, 192 × 192 × 3
and 256 × 256 × 3.
Convolutional layer: this layer is defined by the function convolution2dLayer(filterSize,numFilters),
in which filterSize defines the width and height of the filters; if only one number (a scalar
value) is given, the width and height have the same value. The other parameter, numFilters,
the number of filters, is an integer which determines the number of feature maps generated
by the Conv layer. Having thirty filters with height and width of 5, our Conv layer is defined
as convolution2dLayer(5,30), which does not define the stride and padding variables and
leaves them at their default values of 1 and 0 respectively.
ReLU layer: this layer is defined by the function reluLayer(), which performs the typical
threshold operation where all negative values are set to 0.
Pooling layer: this layer is defined by the function maxPooling2dLayer(poolSize), which
performs max pooling by dividing the input into regions with both Width and Height equal
to poolSize and returning the largest value in each region. In this project, all the pooling
layers are defined as maxPooling2dLayer(2,'Stride',2), which performs the max pooling
operation on 2 × 2 regions having no overlap with each other.
Fully connected layer: this layer is defined by the function fullyConnectedLayer(outputSize),
where outputSize should be equal to the number of classes to be classified if this is the
last layer placed before the softmax layer. Thus, in our case, the outputSize is 3 because
the signal data for our project are labeled as SR, VT and VF.
The last two layers of the CNN are the softmax layer and the classification layer, defined
by softmaxLayer() and classificationLayer() respectively.
Instead of using a pretrained CNN provided by the Neural Network Toolbox, we created a
new CNN from the ground up. The architecture pattern of the CNN in this project follows
the suggestion mentioned in Section 2.5.6, which means the CNN architecture we use has
two Conv-ReLU stacks, each followed by a pooling layer.
After defining the parameters of each layer and the architecture, the network is trained
with trainNetwork(imds,layers,options), provided by the Neural Network Toolbox, where
the imds parameter is the set of labeled images for the classification, and layers and options
are the CNN layers and the training options we defined earlier.
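Assembling the functions above, a hedged reconstruction of the layer stack and the training call might look as follows; this is our sketch of the pattern, not a copy of the project script shown in Figure 10, and the datastore folder path is a placeholder.

% Hedged reconstruction of the CNN described above: two Conv-ReLU stacks,
% each followed by max pooling, then fully connected, softmax and
% classification layers. The image folder path is a placeholder.
imds = imageDatastore('spectrograms_192', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');

layers = [
    imageInputLayer([192 192 3])
    convolution2dLayer(5, 30)            % 5x5 filters, 30 feature maps
    reluLayer()
    maxPooling2dLayer(2, 'Stride', 2)
    convolution2dLayer(5, 30)
    reluLayer()
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(3)               % three classes: SR, VT, VF
    softmaxLayer()
    classificationLayer()];

options = trainingOptions('sgdm', 'MaxEpochs', 30, 'InitialLearnRate', 0.0001);
net = trainNetwork(imds, layers, options);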

4.4 Confusion Matrix Generation

Because it is sufficient for illustrating sensitivity, precision and overall accuracy, the
confusion matrix has been chosen as the tool for showing the results. MATLAB provides
a specific function for this, plotconfusion(targets,outputs), which plots the confusion
matrix based on a targets matrix and an outputs matrix. In this function, targets is an
N × M matrix which represents the actual class labels, where N is the number of classes
and M is the number of all testing samples. Each column in targets contains a single 1
and N−1 zeros, indicating the correct class and the other classes respectively. Similarly,
outputs is specified by an N × M matrix as well, which stands for the predictions made
by the CNN on the same testing samples. The difference between targets and outputs is
that in targets each column contains only the values 0 or 1, while in outputs each column
can contain probabilities between 0 and 1, and the elements in one column sum to 1. [17]
The difficulty in applying this function directly to our project is that all the testing data
we use are class labels, not numeric matrices. Thus, we implemented a function named
confusionmattransform() to transform all the data into the format required by
plotconfusion(targets,outputs). Figure 11 shows one example of a confusion matrix
generated by MATLAB, in which 1, 2, 3 stand for SR, VF and VT respectively.

Figure 11: Example of confusion matrix
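A hedged sketch of the kind of conversion confusionmattransform() performs is shown below; the label vectors are placeholders and this is a reconstruction of the idea rather than the project's exact implementation.

% Sketch: convert categorical label vectors into the N x M one-hot matrices
% expected by plotconfusion (N classes, M test samples). The label vectors
% below are placeholders for the real test labels and CNN predictions.
trueLabels      = categorical({'SR','VF','VT','SR','VF'});
predictedLabels = categorical({'SR','VF','VT','VF','VF'});

classNames = categories(trueLabels);            % {'SR';'VF';'VT'}
numSamples = numel(trueLabels);
targets    = zeros(numel(classNames), numSamples);
outputs    = zeros(numel(classNames), numSamples);
for i = 1:numSamples
    targets(trueLabels(i)      == classNames, i) = 1;
    outputs(predictedLabels(i) == classNames, i) = 1;
end

plotconfusion(targets, outputs);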

5 Experiment Procedure and Result

This chapter reports the essential experimental procedures in sequence and presents the
important results together with an analysis of those results.

5.1 Experiment Procedures

5.1.1 Data Preprocessing

As mentioned before, to obtain the optimal result, different segment lengths for forming
one spectrogram image are used, which is the first step of our experiment. So far, we have
generated spectrograms for segment sizes from 2 s (200 samples) to 10 s (1000 samples).
Furthermore, the images are generated by the STFT, which means that the window overlap
may affect the spectral leakage and consequently affect the experiment results. Therefore,
we also generate spectrograms with overlaps of 0, 25% and 50% for every segment size.

5.1.2 Network Construction

The final CNN architecture of this project is two stacks of Conv-ReLU (the details are
explained in Section 2.5.6), each followed by a pooling layer, and the last three layers are
the fully connected layer, softmax layer and classification layer. The filter sizes are set to
5, while the number of filters is set to 30, which is limited by the computational capability
of the resources. The stride value for the Conv layers is set to 1, while the pooling layers
always perform max pooling on 2 × 2 regions with a stride of 2. Moreover, the learning
rate, number of epochs and optimization method stay at 0.0001, 30 and stochastic gradient
descent through all the experiments. Figure 9 and Figure 10 show the training options and
the CNN; all the results presented in the next section are obtained from this CNN.

5.2 Experiment Result and Analysis

This section lists the results obtained from all experiments in terms of the sensitivity of
each class and the overall accuracy.
Table 1 shows the experiment outcomes using 192 × 192 × 3 spectrogram images with
1 s and 2 s window sizes. According to the presented data, all the overall accuracies are
considerably high, with the lowest value at 76.90%. Also, all sensitivities of SR exceed 95%,
while the sensitivities for classifying VF vary significantly, from 91.6% down to 62.0%,
with different segment sizes. With an even larger gap between outcomes, the sensitivities
of VT range from slightly below 50.0% to 88.0%. Also, when looking closely into Table 1,
one may notice that within one training run, when VF achieves very high sensitivity, the
sensitivity of VT is usually low. For example, the experiment with 10 s segment size,
1 s window size and 25% overlap gives the highest VF sensitivity of 92.0%, while the
sensitivity of VT drops dramatically to 48.7%. Thus, when we pick the best result to
present, it is necessary to make a trade-off between the VF and VT sensitivities. On this
basis, the classifier acting on 2 s segment size, 2 s window size and 25% overlap is
considered to give the best result, with sensitivities of SR, VF and VT reaching 97.6%,
78.1% and 76.9% respectively and an overall accuracy of 84.20%.
As the uncertainty principle suggests that the window size has a significant influence on
how the information is presented in the spectrogram, we conducted another 12 experiments
with a 3 s window size on selected samples; the outcomes are shown in Table 2.
To compare the outcomes of the 1 s, 2 s and 3 s window sizes on the selected samples,
Table 3 was created. As the outcomes indicate, classifiers with a 3 s window size perform
similarly to those with 2 s and 1 s window sizes for the same segment size and overlap.
To check the theory raised by Kostiantyn Pylypenko that smaller-sized inputs lead to
better classification performance,[5] we compared 5 sets of images with different sizes
(256 × 256 pixels and 192 × 192 pixels) and present the comparison in Table 4. In general,
the classifiers trained on the smaller images give better outcomes.

Segment Size  Window Size  Overlap  SR Sens. (%)  VF Sens. (%)  VT Sens. (%)  Accuracy (%)
2 seconds     1 second     0.00     96.10         83.50         66.30         82.00
2 seconds     1 second     0.25     98.30         85.10         64.30         82.50
2 seconds     1 second     0.50     98.50         78.10         72.50         83.10
2 seconds     2 seconds    0.00     98.30         74.80         78.80         84.00
2 seconds     2 seconds    0.25     97.60         78.10         76.90         84.20
2 seconds     2 seconds    0.50     98.00         76.50         76.80         83.80
3 seconds     1 second     0.00     98.60         70.60         70.20         79.80
3 seconds     1 second     0.25     98.60         68.80         71.80         79.70
3 seconds     1 second     0.50     99.20         69.60         73.20         80.70
3 seconds     2 seconds    0.00     98.40         73.80         66.80         79.70
3 seconds     2 seconds    0.25     98.20         72.40         70.80         80.50
3 seconds     2 seconds    0.50     98.60         78.60         67.40         81.50
4 seconds     1 second     0.00     95.50         56.30         82.40         78.10
4 seconds     1 second     0.25     99.20         72.50         70.10         80.60
4 seconds     1 second     0.50     98.90         63.20         80.00         80.70
4 seconds     2 seconds    0.00     97.10         83.70         66.40         82.40
4 seconds     2 seconds    0.25     97.90         77.30         69.30         81.50
4 seconds     2 seconds    0.50     97.90         78.70         72.50         83.00
5 seconds     1 second     0.00     97.30         72.30         71.00         80.20
5 seconds     1 second     0.25     98.00         72.70         68.70         79.80
5 seconds     1 second     0.50     95.40         67.30         71.70         78.20
5 seconds     2 seconds    0.00     99.00         79.30         63.00         80.40
5 seconds     2 seconds    0.25     97.00         66.00         76.70         79.90
5 seconds     2 seconds    0.50     97.70         74.00         66.70         79.40
6 seconds     1 second     0.00     99.60         74.40         56.80         76.90
6 seconds     1 second     0.25     95.20         86.80         53.20         78.40
6 seconds     1 second     0.50     99.20         70.40         70.00         79.90
6 seconds     2 seconds    0.00     99.60         78.40         66.00         81.30
6 seconds     2 seconds    0.25     99.60         72.00         78.40         82.90
6 seconds     2 seconds    0.50     100.00        84.00         67.20         83.70
7 seconds     1 second     0.00     96.30         72.00         65.40         77.90
7 seconds     1 second     0.25     98.10         80.80         59.30         79.40
7 seconds     1 second     0.50     99.10         72.00         65.40         78.80
7 seconds     2 seconds    0.00     99.10         80.80         62.10         80.70
7 seconds     2 seconds    0.25     98.60         77.10         66.40         80.70
7 seconds     2 seconds    0.50     99.10         86.40         59.30         81.60
8 seconds     1 second     0.00     98.90         73.30         62.60         78.30
8 seconds     1 second     0.25     97.90         59.90         78.60         78.80
8 seconds     1 second     0.50     99.50         66.30         71.10         79.00
8 seconds     2 seconds    0.00     99.10         80.80         62.10         80.70
8 seconds     2 seconds    0.25     99.50         66.80         73.30         79.90
8 seconds     2 seconds    0.50     98.40         68.40         72.70         79.90
9 seconds     1 second     0.00     96.80         73.10         65.00         79.00
9 seconds     1 second     0.25     98.80         91.60         49.40         79.90
9 seconds     1 second     0.50     95.20         66.90         83.10         81.70
9 seconds     2 seconds    0.00     96.80         75.10         65.00         79.00
9 seconds     2 seconds    0.25     97.60         75.90         74.10         82.50
9 seconds     2 seconds    0.50     98.20         68.70         77.10         81.30
10 seconds    1 second     0.00     98.00         84.70         61.30         81.30
10 seconds    1 second     0.25     98.00         92.00         48.70         77.60
10 seconds    1 second     0.50     97.30         70.00         73.30         80.20
10 seconds    2 seconds    0.00     97.30         62.00         88.00         82.40
10 seconds    2 seconds    0.25     97.30         82.70         62.70         80.90
10 seconds    2 seconds    0.50     97.30         73.30         74.70         81.80

Table 1: Experiment Result of 192 × 192 size image



Segment Size  Window Size  Overlap  SR Sens. (%)  VF Sens. (%)  VT Sens. (%)  Accuracy (%)
3 seconds     3 seconds    0.00     100.00        73.40         75.40         82.90
3 seconds     3 seconds    0.25     100.00        70.40         75.80         82.10
3 seconds     3 seconds    0.50     99.80         75.20         72.00         82.30
4 seconds     3 seconds    0.00     98.90         76.50         74.40         83.30
4 seconds     3 seconds    0.25     98.70         73.30         76.80         82.90
4 seconds     3 seconds    0.50     98.70         70.70         77.90         82.40
5 seconds     3 seconds    0.00     97.00         68.70         73.00         79.60
5 seconds     3 seconds    0.25     98.00         68.00         73.70         79.90
5 seconds     3 seconds    0.50     98.30         78.70         66.70         81.20
6 seconds     3 seconds    0.00     99.20         78.40         66.40         81.30
6 seconds     3 seconds    0.25     99.60         79.60         67.60         82.30
6 seconds     3 seconds    0.50     99.60         73.60         78.00         83.70

Table 2: Experiment Result of the selected samples with 3 seconds window size

Segment Size  Overlap  Window Size  SR Sens. (%)  VF Sens. (%)  VT Sens. (%)  Accuracy (%)
3 seconds     0.00     1 second     98.60         70.60         70.20         79.80
3 seconds     0.00     2 seconds    98.40         73.80         66.80         79.70
3 seconds     0.00     3 seconds    100.00        73.40         75.40         82.90
3 seconds     0.25     1 second     98.60         68.80         71.80         79.70
3 seconds     0.25     2 seconds    98.20         72.40         70.80         80.50
3 seconds     0.25     3 seconds    100.00        70.40         75.80         82.10
3 seconds     0.50     1 second     99.20         69.60         73.20         80.70
3 seconds     0.50     2 seconds    98.60         78.60         67.40         81.50
3 seconds     0.50     3 seconds    99.80         75.20         72.00         82.30
4 seconds     0.00     1 second     95.50         56.30         82.40         78.10
4 seconds     0.00     2 seconds    97.10         83.70         66.40         82.40
4 seconds     0.00     3 seconds    98.90         76.50         74.40         83.30
4 seconds     0.25     1 second     99.20         72.50         70.10         80.60
4 seconds     0.25     2 seconds    97.90         77.30         69.30         81.50
4 seconds     0.25     3 seconds    98.70         73.30         76.80         82.90
4 seconds     0.50     1 second     98.90         63.20         80.00         80.70
4 seconds     0.50     2 seconds    97.90         78.70         72.50         83.00
4 seconds     0.50     3 seconds    98.70         70.70         77.90         82.40
5 seconds     0.00     1 second     97.30         72.30         71.00         80.20
5 seconds     0.00     2 seconds    99.00         79.30         63.00         80.40
5 seconds     0.00     3 seconds    97.00         68.70         73.00         79.60
5 seconds     0.25     1 second     98.00         72.70         68.70         79.80
5 seconds     0.25     2 seconds    97.00         66.00         76.70         79.90
5 seconds     0.25     3 seconds    98.00         68.00         73.70         79.90
5 seconds     0.50     1 second     95.40         67.30         71.70         78.20
5 seconds     0.50     2 seconds    97.70         74.00         66.70         79.40
5 seconds     0.50     3 seconds    98.30         78.70         66.70         81.20
6 seconds     0.00     1 second     99.60         74.40         56.80         76.90
6 seconds     0.00     2 seconds    99.60         78.40         66.00         81.30
6 seconds     0.00     3 seconds    99.20         78.40         66.40         81.30
6 seconds     0.25     1 second     97.60         64.80         73.60         78.70
6 seconds     0.25     2 seconds    99.60         72.00         78.40         82.90
6 seconds     0.25     3 seconds    99.60         79.60         67.60         82.30
6 seconds     0.50     1 second     99.20         70.40         70.00         79.90
6 seconds     0.50     2 seconds    98.40         74.00         71.20         81.60
6 seconds     0.50     3 seconds    99.60         73.60         78.00         83.70

Table 3: Comparison between different window sizes of 2-6 seconds samples


Segment Size  Window Size  Figure Dimension  Overlap  SR Sens. (%)  VF Sens. (%)  VT Sens. (%)  Accuracy (%)
2 seconds     2 seconds    192 x 192         0.00     98.30         74.80         78.80         84.00
2 seconds     2 seconds    256 x 256         0.00     99.20         73.70         76.90         83.30
3 seconds     2 seconds    192 x 192         0.25     98.60         68.80         71.80         79.70
3 seconds     2 seconds    256 x 256         0.25     98.60         70.40         68.60         79.20
4 seconds     2 seconds    192 x 192         0.50     97.90         78.70         72.50         83.00
4 seconds     2 seconds    256 x 256         0.50     96.40         73.30         77.30         83.00
5 seconds     1 second     192 x 192         0.00     97.30         72.30         71.00         80.20
5 seconds     1 second     256 x 256         0.00     97.30         45.30         83.70         75.40
6 seconds     2 seconds    192 x 192         0.25     99.60         72.00         78.40         82.90
6 seconds     2 seconds    256 x 256         0.25     99.20         66.80         82.40         82.80

Table 4: Comparison between different dimension spectrograms from selected 2-6 seconds
samples

6 Conclusion

6.1 Project Summary

This project set out to solve the problem of classifying Sinus Rhythms, Ventricular
Tachycardia and Ventricular Fibrillation with a convolutional neural network. To use the
original ECG signal data with a CNN, which expects 2-dimensional input data such as
images, the Short Time Fourier Transform is applied to transform the signals into spec-
trograms. Another important trait of the original data is that it is imbalanced, meaning
SR has many more samples than VT and VF. To address this problem, subsampling has
been applied, which means that we picked 6000 s of samples from each class to manually
balance the data. The processed images are then fed into the CNN we created to obtain
the confusion matrix, which shows the sensitivity of each class and the overall accuracy.
We successfully achieved the objective of obtaining over 90% sensitivity on SR and over
50% on VT and VF.

6.2 Chapters Overview

Chapter 1 presents the introduction to cardiovascular disease as well as the state of the
art in classifying SR, VT and VF. It also introduces this project, the approach used to
solve the problem, and the goal.
Chapter 2 provides the technical background that helps the reader understand this project
in terms of necessity and methodology.
Chapter 3 describes how the encountered problems were solved and how the important
design decisions were made.
Chapter 4 provides a detailed explanation of how each design is actually implemented
with the chosen programming language and toolkit. In this chapter, the critical functions,
either provided by the toolkit or written by us, that are used to solve major problems are
listed and explained, together with their parameters.
Chapter 5 gives the information about how the experiments were conducted. More
importantly, all the results, including the sensitivity of each class and the overall accuracy,
obtained in each experiment are listed in tables, together with an analysis of the obtained
results.
Chapter 6 summarizes the work that has been done and provides directions in which this
project could be improved, as well as new approaches worth trying.

6.3 Achievement

Obtaining detection sensitivities of 97.6% for SR and 78.1% and 76.9% for VF and VT
respectively, this project is deemed to have successfully achieved its objective, which was
to obtain over 90% sensitivity for SR and 50% for both VF and VT. Also, this project
compares different window sizes on the same samples and concludes that the classifiers
using the selected window sizes (1 s, 2 s and 3 s) do not behave drastically differently.
Last but not least, a comparison between different input image sizes was made in this
project; from the results we can conclude that the smaller-sized images have a positive
effect on the training performance.

6.4 Future Works and Extension

This section lists and explains potential future work that could be undertaken to improve
performance or to establish a new approach to this particular classification problem.

6.4.1 Fine-tuning Spectrogram Parameters

Due to time and resource limitations, the segment sizes used in the experiments were
limited to 2 s to 10 s, and the window sizes were set to only 1 s, 2 s and 3 s on selected
samples. As the results chapter indicates, performance varies with segment size. Moreover,
according to the uncertainty principle, the window size affects both the frequency
resolution and the time resolution. Further experiments could therefore be conducted with
more segment and window sizes in search of a better result; a sketch of such a parameter
sweep is given below.
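
As an illustration of how the parameter space could be enlarged, the following minimal
sketch (assuming the 100 Hz sampling rate and the spectrogram call used in the appendix
scripts; the grid values here are examples, not settings that have already been tested)
enumerates candidate segment, window and overlap combinations in the units expected by
the existing preprocessing code:

% Hypothetical parameter sweep; the grid values are illustrative only.
fs = 100;                               % sampling rate assumed throughout this report
segment_sizes_s = [2 4 8 12 15];        % candidate segment lengths in seconds
window_sizes_s  = [0.5 1 2 4];          % candidate STFT window lengths in seconds
overlaps        = [0 0.25 0.5 0.75];    % fraction of the window that overlaps
for seg = segment_sizes_s
    for win = window_sizes_s
        for ov = overlaps
            segment_size = seg * fs;            % samples per segment
            window_size  = win * fs;            % samples per window
            noverlap     = round(ov * window_size);
            % e.g. spectrogram(x(1:segment_size), hamming(window_size), noverlap)
            fprintf('segment %d s, window %.1f s, overlap %d samples\n', ...
                seg, win, noverlap);
        end
    end
end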

Figure 12: CNN architecture for 1D data

6.4.2 Di↵erent Window Function

As stated in the image-generation section of the development chapter, the Hamming window
function was used to generate all of the images. Future experiments could use different
window functions when generating the spectrograms in pursuit of a better result, as in
the sketch below.
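
Only the window argument of the spectrogram call would need to change. The following
minimal sketch (assuming the same Signal Processing Toolbox window functions available
alongside hamming; the Kaiser beta value is chosen arbitrarily here) shows the
substitution for a few common windows:

window_size = 200;                      % 2 s window at 100 Hz, as in the appendix scripts
w_hamming   = hamming(window_size);     % window used throughout this project
w_hann      = hann(window_size);        % candidate alternatives
w_blackman  = blackman(window_size);
w_kaiser    = kaiser(window_size, 5);   % beta = 5, illustrative only
% e.g. spectrogram(x, w_hann, ceil(window_size/4)) instead of
%      spectrogram(x, w_hamming, ceil(window_size/4))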

6.4.3 The Application of 1-dimensional Data for CNN

Having achieved remarkable results in image classification, CNNs are normally used on 2D
data such as images, which is why much of the research that applies CNNs to signals first
converts the 1D data into a 2D image such as a spectrogram. However, there is a small but
interesting trend of applying CNNs directly to raw 1D data, which could inspire new
directions for future work on this subject. One existing study, by Orestis Tsinalis et al.,
uses the raw EEG signal as input to a CNN and reports an overall accuracy of more than 70%.
Figure 12 shows the architecture of the CNN used in that study, which could serve as a
guideline for future work on this project given the similarity of the data set and the
classifier [6]. A minimal sketch of how such a 1D input could be handled with the toolkit
used in this project follows.
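
The sketch below is not the architecture of [6]; it is only a minimal illustration,
assuming MATLAB's Neural Network Toolbox layers used elsewhere in this project, of how a
raw segment could be treated as a 1 x N "image" so that the 2D layers perform convolution
along time only. The layer sizes and filter counts are hypothetical.

% Hypothetical 1D CNN sketch: a 2 s segment at 100 Hz becomes a 1 x 200 x 1 input.
layers = [
    imageInputLayer([1 200 1])
    convolution2dLayer([1 9], 16, 'Padding', [0 4])   % 1D convolution along the time axis
    reluLayer
    maxPooling2dLayer([1 4], 'Stride', [1 4])
    convolution2dLayer([1 9], 32, 'Padding', [0 4])
    reluLayer
    maxPooling2dLayer([1 4], 'Stride', [1 4])
    fullyConnectedLayer(3)                            % SR, VT and VF
    softmaxLayer
    classificationLayer];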

References

[1] Y. Alwan, Z. Cvetkovic, and M. J. Curtis, "Classification of human ventricular arrhythmia in high dimensional representation spaces," arXiv preprint arXiv:1312.5354, 2014.

[2] Y. Alwan, Z. Cvetkovic, and M. J. Curtis, "Methods for improved discrimination between ventricular fibrillation and tachycardia," 2011.

[3] L. Yuan and J. Cao, "Patients EEG data analysis via spectrogram image with a convolution neural network," Intelligent Decision Technologies 2017, Smart Innovation, Systems and Technologies, p. 1321, 2017.

[4] S. Dieleman and B. Schrauwen, "End-to-end learning for music audio," 2014.

[5] K. Pylypenko, "Right whale detection using artificial neural network and principal component analysis," 2015.

[6] O. Tsinalis, P. M. Matthews, Y. Guo, and S. Zafeiriou, "Automatic sleep stage scoring with single-channel EEG using convolutional neural networks," 2015.

[7] "Ventricular tachycardia (Vtach) (video) - Khan Academy." https://www.khanacademy.org/test-prep/nclex-rn/rn-cardiovascular-diseases/rn-dysrhythmia-and-tachycardia/v/ventricular-tachycardias. (Accessed on 08/24/2017).

[8] "What is ventricle fibrillation (Vfib)? - Khan Academy." https://www.khanacademy.org/test-prep/nclex-rn/rn-cardiovascular-diseases/rn-dysrhythmia-and-tachycardia/v/ventricular-fibrillation. (Accessed on 08/24/2017).

[9] J. O. Smith III, Mathematics of the Discrete Fourier Transform (DFT), 2002.

[10] J. O. Smith III, Spectral Audio Signal Processing.

[11] "A beginner's guide to understanding convolutional neural networks - Adit Deshpande, CS undergrad at UCLA ('19)." https://adeshpande3.github.io/adeshpande3.github.io/A-Beginner's-Guide-To-Understanding-Convolutional-Neural-Networks/. (Accessed on 08/23/2017).

[12] "Convolutional neural networks (CNNs): an illustrated explanation - XRDS." http://xrds.acm.org/blog/2016/06/convolutional-neural-networks-cnns-illustrated-explanation/. (Accessed on 08/23/2017).

[13] "CS231n: Convolutional neural networks for visual recognition." http://cs231n.github.io/convolutional-networks/. (Accessed on 08/23/2017).

[14] "Simple guide to confusion matrix terminology." http://www.dataschool.io/simple-guide-to-confusion-matrix-terminology/. (Accessed on 08/23/2017).

[15] D. Houcque, "Introduction to MATLAB for engineering students," 2005.

[16] "Introduction to MATLAB." https://www.math.utah.edu/~wright/misc/matlab/matlabintro.html. (Accessed on 08/23/2017).

[17] "Plot classification confusion matrix - MATLAB plotconfusion - MathWorks United Kingdom." https://uk.mathworks.com/help/nnet/ref/plotconfusion.html. (Accessed on 08/23/2017).

A Appendix

A.1 Project Gantt Chart



A.2 Preprocessing Example Code

A.2.1 Preprocessing Example Code for 192 x 192 image data sets

Note: XXX is replaced with the segment size during the implementation.
On a 1-second window with no overlap between windows:

load('RHYTHMS.mat');                  % cell arrays SR, VF and VT, one cell per recording
segment_size = XXX;                   % samples per segment (XXX is substituted per experiment)
window_size = 100;                    % STFT window length in samples (1 s)
limit = 600000;                       % per-class cap on the number of samples used
expected_capped_input_no = limit / segment_size;

% --- Sinus Rhythm (SR) ---
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        % slice the next segment, render its spectrogram and save it as a PNG
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        % current recording exhausted: move on to the next cell
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

% --- Ventricular Fibrillation (VF) ---
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

% --- Ventricular Tachycardia (VT) ---
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 1-second window with 25% overlap between windows:

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 100;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 1-second window with 50% overlap between windows:

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 100;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('22 08 17/6 2 0.25 x256/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('22 08 17/6 2 0.25 x256/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('22 08 17/6 2 0.25 x256/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 2-second window with no overlap between windows:

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 200;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 2-second window with 25% overlap between windows:

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 200;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 2-second window with 50% overlap between windows:

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 200;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 3-second window with no overlap between windows:

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 300;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 3-second window with 25% overlap between windows:

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 300;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 3-second window with 50% overlap between windows:

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 300;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size, 1);
for i = 1:sr_size
    sr_cell_size(i) = size(SR{i}, 1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum / segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(SR{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size, 1);
for i = 1:vf_size
    vf_cell_size(i) = size(VF{i}, 1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum / segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VF{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size, 1);
for i = 1:vt_size
    vt_cell_size(i) = size(VT{i}, 1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum / segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1, index}, 1)
        h = figure('Visible', 'off');
        spectrogram(VT{1, index}(start_pos:end_pos, 1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type', 'axes'), 'fontsize', 5);
        set(h, 'PaperUnits', 'points');
        set(h, 'PaperPosition', [0 0 92 92]);
        set(h, 'Position', [0 0 92 92]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h, 'tag', 'Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

A.2.2 Preprocessing Example Code for 256 x 256 image data sets

Note: XXX is replaced with the segment size during the implementation.
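The 1 second and 2 second windows below use window_size values of 100 and 200 samples respectively, which implies a 100 Hz sampling rate for the recordings; the value substituted for XXX is then simply the segment duration multiplied by that rate. A minimal sketch of the relationship, assuming a 100 Hz sampling rate (fs is not defined anywhere in the listings and is inferred here):

fs = 100;                             % assumed sampling rate in Hz, inferred from the window sizes
segment_seconds = 2;                  % segment duration in seconds
segment_size = segment_seconds * fs;  % the value substituted for XXX in the scripts below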
On a 1 second window with no window overlap

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 100;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size,1);
for i=1:sr_size
    sr_cell_size(i) = size(SR{i},1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum/segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1,index},1)
        h = figure('Visible','off');
        spectrogram(SR{1,index}(start_pos:end_pos,1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size,1);
for i=1:vf_size
    vf_cell_size(i) = size(VF{i},1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum/segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1,index},1)
        h = figure('Visible','off');
        spectrogram(VF{1,index}(start_pos:end_pos,1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size,1);
for i=1:vt_size
    vt_cell_size(i) = size(VT{i},1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum/segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1,index},1)
        h = figure('Visible','off');
        spectrogram(VT{1,index}(start_pos:end_pos,1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
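The listings in this subsection differ only in window_size and in the third argument to spectrogram, which MATLAB interprets as the number of overlapping samples between adjacent windows. A short sketch of how the overlap percentages in the headings map onto that argument (the mapping is implied by the listings rather than stated explicitly):

window_size = 100;                   % 100 samples for a 1 second window, 200 for a 2 second window
noverlap_none = 0;                   % no window overlap
noverlap_25 = ceil(window_size)/4;   % 25% window overlap, as written in the scripts
noverlap_50 = ceil(window_size/2);   % 50% window overlap
spectrogram(x, hamming(window_size), noverlap_25)  % x stands for one extracted segment

The figure size of [0 0 123 123] points appears to have been chosen so that, at MATLAB's default 150 dpi raster resolution, saveas writes an approximately 256 x 256 pixel PNG (123 x 150/72 is about 256), with [0 0 92 92] similarly giving roughly 192 x 192 pixels; this is an inference rather than something stated in the scripts.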

On a 1 second window with 25% window overlap

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 100;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size,1);
for i=1:sr_size
    sr_cell_size(i) = size(SR{i},1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum/segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1,index},1)
        h = figure('Visible','off');
        spectrogram(SR{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size,1);
for i=1:vf_size
    vf_cell_size(i) = size(VF{i},1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum/segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1,index},1)
        h = figure('Visible','off');
        spectrogram(VF{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size,1);
for i=1:vt_size
    vt_cell_size(i) = size(VT{i},1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum/segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1,index},1)
        h = figure('Visible','off');
        spectrogram(VT{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 1 second window with 50% window overlap

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 100;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size,1);
for i=1:sr_size
    sr_cell_size(i) = size(SR{i},1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum/segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1,index},1)
        h = figure('Visible','off');
        spectrogram(SR{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('22_08_17/6_2_0.25_x256/SR/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size,1);
for i=1:vf_size
    vf_cell_size(i) = size(VF{i},1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum/segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1,index},1)
        h = figure('Visible','off');
        spectrogram(VF{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('22_08_17/6_2_0.25_x256/VF/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size,1);
for i=1:vt_size
    vt_cell_size(i) = size(VT{i},1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum/segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1,index},1)
        h = figure('Visible','off');
        spectrogram(VT{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('22_08_17/6_2_0.25_x256/VT/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 2 second window with no window overlap

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 200;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size,1);
for i=1:sr_size
    sr_cell_size(i) = size(SR{i},1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum/segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1,index},1)
        h = figure('Visible','off');
        spectrogram(SR{1,index}(start_pos:end_pos,1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size,1);
for i=1:vf_size
    vf_cell_size(i) = size(VF{i},1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum/segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1,index},1)
        h = figure('Visible','off');
        spectrogram(VF{1,index}(start_pos:end_pos,1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size,1);
for i=1:vt_size
    vt_cell_size(i) = size(VT{i},1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum/segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1,index},1)
        h = figure('Visible','off');
        spectrogram(VT{1,index}(start_pos:end_pos,1), hamming(window_size), 0)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 2 second window with 25% window overlap

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 200;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size,1);
for i=1:sr_size
    sr_cell_size(i) = size(SR{i},1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum/segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1,index},1)
        h = figure('Visible','off');
        spectrogram(SR{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size,1);
for i=1:vf_size
    vf_cell_size(i) = size(VF{i},1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum/segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1,index},1)
        h = figure('Visible','off');
        spectrogram(VF{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size,1);
for i=1:vt_size
    vt_cell_size(i) = size(VT{i},1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum/segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1,index},1)
        h = figure('Visible','off');
        spectrogram(VT{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size)/4)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

On a 2 second window with 50% window overlap

load('RHYTHMS.mat');
segment_size = XXX;
window_size = 200;
limit = 600000;
train_ratio = 75;
expected_capped_input_no = limit / segment_size;
expected_capped_train_no = int16(expected_capped_input_no*(train_ratio/100));
sr_size = size(SR);
sr_size = sr_size(2);
sr_cell_size = zeros(sr_size,1);
for i=1:sr_size
    sr_cell_size(i) = size(SR{i},1);
end
sr_sum = sum(sr_cell_size);
expected_sr_input_no = int16(sr_sum/segment_size);
expected_sr_train_no = int16(expected_sr_input_no*(train_ratio/100));
sr_imported_count = 0;
sr_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while sr_imported_count < expected_capped_input_no
    if end_pos < size(SR{1,index},1)
        h = figure('Visible','off');
        spectrogram(SR{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/SR/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        sr_imported_segment = sr_imported_segment + segment_size;
        sr_imported_count = sr_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        sr_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vf_size = size(VF);
vf_size = vf_size(2);
vf_cell_size = zeros(vf_size,1);
for i=1:vf_size
    vf_cell_size(i) = size(VF{i},1);
end
vf_sum = sum(vf_cell_size);
expected_vf_input_no = int16(vf_sum/segment_size);
expected_vf_train_no = int16(expected_vf_input_no*(train_ratio/100));
vf_imported_count = 0;
vf_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vf_imported_count < expected_capped_input_no
    if end_pos < size(VF{1,index},1)
        h = figure('Visible','off');
        spectrogram(VF{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VF/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vf_imported_segment = vf_imported_segment + segment_size;
        vf_imported_count = vf_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vf_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end

vt_size = size(VT);
vt_size = vt_size(2);
vt_cell_size = zeros(vt_size,1);
for i=1:vt_size
    vt_cell_size(i) = size(VT{i},1);
end
vt_sum = sum(vt_cell_size);
expected_vt_input_no = int16(vt_sum/segment_size);
expected_vt_train_no = int16(expected_vt_input_no*(train_ratio/100));
vt_imported_count = 0;
vt_imported_segment = 0;
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while vt_imported_count < expected_capped_input_no
    if end_pos < size(VT{1,index},1)
        h = figure('Visible','off');
        spectrogram(VT{1,index}(start_pos:end_pos,1), hamming(window_size), ceil(window_size/2))
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 123 123]);
        set(h,'Position',[0 0 123 123]);
        tmpstr = sprintf('Output/VT/%d.png', name);
        ch = findall(h,'tag','Colorbar');
        delete(ch);
        saveas(h, tmpstr);
        vt_imported_segment = vt_imported_segment + segment_size;
        vt_imported_count = vt_imported_count + 1;
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        vt_imported_segment = 0;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
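The SR, VF and VT loops above are identical apart from the cell array and output folder, so the preprocessing could be collapsed into a single helper; a minimal sketch of one possible refactoring (export_class_spectrograms is a hypothetical name, not part of the original scripts):

function export_class_spectrograms(rhythm_cells, out_dir, segment_size, window_size, noverlap, fig_points, max_count)
% Slice each recording in rhythm_cells into segment_size-sample segments and
% save one spectrogram image per segment into out_dir, up to max_count images.
index = 1;
name = 1;
start_pos = 1;
end_pos = segment_size;
while name <= max_count
    if end_pos < size(rhythm_cells{1,index},1)
        h = figure('Visible','off');
        spectrogram(rhythm_cells{1,index}(start_pos:end_pos,1), hamming(window_size), noverlap)
        xlabel('');
        ylabel('');
        set(findobj('type','axes'),'fontsize',5);
        set(h,'PaperUnits','points');
        set(h,'PaperPosition',[0 0 fig_points fig_points]);
        set(h,'Position',[0 0 fig_points fig_points]);
        delete(findall(h,'tag','Colorbar'));
        saveas(h, sprintf('%s/%d.png', out_dir, name));
        name = name + 1;
        start_pos = end_pos;
        end_pos = end_pos + segment_size;
    else
        index = index + 1;
        start_pos = 1;
        end_pos = segment_size;
    end
    close all
end
end

Called, for example, as export_class_spectrograms(VT, 'Output/VT', segment_size, window_size, ceil(window_size/2), 123, expected_capped_input_no).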

A.3 Training Scripts

A.3.1 Training Script for 192 x 192 image data sets

Note: XXX is replaced with the data set name during the implementation and YYY is
replaced with the desired confusion matrix output directory.

dataset_name = XXX;
figure_path = YYY;

digitDatasetPath = fullfile(dataset_name);
digitData = imageDatastore(digitDatasetPath, 'IncludeSubfolders', true, 'LabelSource', 'foldernames');

CountLabel = digitData.countEachLabel;
img = readimage(digitData, 1);
size(img)
trainingNumFiles = ceil(CountLabel{1,2}*0.75);
rng(1) % For reproducibility
[trainDigitData, testDigitData] = splitEachLabel(digitData, trainingNumFiles, 'randomize');
layers = [imageInputLayer([192 192 3])
          convolution2dLayer(5,30)
          reluLayer
          maxPooling2dLayer(2,'Stride',2)
          convolution2dLayer(5,30)
          reluLayer
          maxPooling2dLayer(2,'Stride',2)
          fullyConnectedLayer(3)
          softmaxLayer
          classificationLayer()];

options = trainingOptions('sgdm', 'MaxEpochs', 30, 'InitialLearnRate', 0.0001);
convnet = trainNetwork(trainDigitData, layers, options);
YTest = classify(convnet, testDigitData);
TTest = testDigitData.Labels;

t = confusionmat_transform(TTest);
y = confusionmat_transform(YTest);

h = figure('Visible','off');
plotconfusion(t, y);
strname = strcat(dataset_name, '.fig');
output = strcat(figure_path, strname);
savefig(h, output);
close all
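The training scripts call confusionmat_transform, a helper that is not reproduced in this appendix. plotconfusion expects its target and output arguments as one-hot matrices of size (number of classes) x (number of samples), so a minimal sketch of one possible implementation, under that assumption:

function onehot = confusionmat_transform(labels)
% Convert a categorical label vector (e.g. testDigitData.Labels or the output
% of classify) into the one-hot classes-by-samples matrix expected by plotconfusion.
labels = categorical(labels(:));
classes = categories(labels);
onehot = zeros(numel(classes), numel(labels));
for k = 1:numel(classes)
    onehot(k,:) = (labels == classes{k})';
end
end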

A.3.2 Training Script for 256 x 256 image data sets

Note: XXX is replaced with the data set name during the implementation and YYY is
replaced with the desired confusion matrix output directory.

dataset_name = XXX;
figure_path = YYY;

digitDatasetPath = fullfile(dataset_name);
digitData = imageDatastore(digitDatasetPath, 'IncludeSubfolders', true, 'LabelSource', 'foldernames');

CountLabel = digitData.countEachLabel;
img = readimage(digitData, 1);
size(img)
trainingNumFiles = ceil(CountLabel{1,2}*0.75);
rng(1) % For reproducibility
[trainDigitData, testDigitData] = splitEachLabel(digitData, trainingNumFiles, 'randomize');
layers = [imageInputLayer([256 256 3])
          convolution2dLayer(5,30)
          reluLayer
          maxPooling2dLayer(2,'Stride',2)
          convolution2dLayer(5,30)
          reluLayer
          maxPooling2dLayer(2,'Stride',2)
          fullyConnectedLayer(3)
          softmaxLayer
          classificationLayer()];

options = trainingOptions('sgdm', 'MaxEpochs', 30, 'InitialLearnRate', 0.0001);
convnet = trainNetwork(trainDigitData, layers, options);
YTest = classify(convnet, testDigitData);
TTest = testDigitData.Labels;

t = confusionmat_transform(TTest);
y = confusionmat_transform(YTest);

h = figure('Visible','off');
plotconfusion(t, y);
strname = strcat(dataset_name, '.fig');
output = strcat(figure_path, strname);
savefig(h, output);
close all
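Per-class sensitivity can also be read directly off the predicted and true test labels produced by the script above; a minimal sketch using confusionmat, where the row order of the matrix follows the returned order vector:

[C, order] = confusionmat(TTest, YTest);   % rows are true classes, columns are predicted classes
sensitivity = diag(C) ./ sum(C, 2);        % per-class sensitivity (recall)
table(order, sensitivity)                  % lists each class label with its sensitivity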
