
Automatic Detection of Cardiac Arrhythmias Using Ensemble Learning

Abdolrahman Peimankar*, Mona Jafar Jajroodi*, and Sadasivan Puthusserypady
Department of Health Technology, Technical University of Denmark, Kongens Lyngby, Denmark
[email protected], [email protected], [email protected]

*Both authors contributed equally to this work.
This research work is supported by the Innovation Fund Denmark, REAFEL (IFD Project No: 6153-00009B).

Abstract—Objectives: The electrocardiogram (ECG) is an important clinical tool for the diagnosis of cardiac abnormalities. Physicians make diagnoses by visual examination of ECGs. Analysing huge amounts of ECGs, however, can be very time consuming and cumbersome. Hence, developing analytic software to automatically analyse these ECG signals and efficiently detect the common cardiac arrhythmias is of great importance. Methods: An ensemble learning approach is proposed for automatic processing of ECG signals and classification of arrhythmias. Twenty-six features (based on wavelets, heartbeat intervals, and RR-intervals) are extracted, and three algorithms, namely Random Forest (RF), Adaptive Boosting (AdaBoost), and Artificial Neural Network (ANN), are utilized for classification. Results: The proposed method is evaluated on ECG signals from 44 recordings of the MIT-BIH arrhythmia database. The overall classification accuracies of the RF, AdaBoost, and ANN are 96.16%, 96.16%, and 94.49%, respectively. Additionally, the overall classification accuracy of the ensemble model is improved to 96.18%. Conclusion: Experimental results show that the ensemble model improves the overall accuracy of ECG heartbeat classification. Significance: This paper proposes an accurate and easy-to-use approach to classify heartbeats into one of the five classes recommended by the ANSI/AAMI standard, which can be used in real time within a tele-health monitoring framework.

Index Terms—electrocardiogram (ECG), heartbeat classification, ensemble learning, feature extraction.

I. INTRODUCTION

Cardiac arrhythmia is an important issue due to its high prevalence, associated high mortality, and high cost of treatment [1]–[3]. According to the World Health Organization (WHO), cardiovascular diseases are among the major causes of death in the world [3].

In order to diagnose cardiac arrhythmia and provide important information about the heart's condition, the most basic and available method is electrocardiography (ECG) [2], [4], [5]. However, it can be cumbersome for a physician to visually analyse these long ECGs in a short time. For instance, the diagnosis of some arrhythmias takes up to several hours or even days by visual analysis [1], [4]. Therefore, to facilitate the analysis of long ECG recordings, there is a need to use computational techniques and develop effective algorithms for automatic analysis and classification of cardiac rhythms. This can significantly simplify the diagnosis of cardiac arrhythmia, which is a great assistance to clinicians [1], [4]–[6].

In the past decades, several techniques have been proposed for automatic ECG-based heartbeat classification and arrhythmia detection. Faezipour et al. proposed a wavelet-based technique to classify ECG beats [4]. A linear discriminant analysis based algorithm was introduced in [1]. In [5], the authors presented a neural network (NN) based method for automatic classification of five different ECG classes (Table I). Support vector machine (SVM) based methods have also been applied to classify ECG signals [3], [7], [8]. Melgani and Bazi proposed an SVM-based method to automatically classify ECG beats and compared it with two other classifiers (k-Nearest Neighbours (kNN) and a Radial Basis Function (RBF) NN classifier) [7]. Mondéjar-Guerra et al. proposed a new method for ECG classification based on an ensemble of SVMs [3].

Ensemble learning is one of the most popular machine learning techniques and can be applied to different types of problems such as classification and regression [9]. It has been used in electrical engineering as well as in biomedical engineering [10], [12]. Every ensemble learning model contains three main parts: (1) creating a training set by sampling from the dataset, (2) training a group of different classification algorithms, and (3) combining the predictions of the classification algorithms. One of the most desirable advantages of using ensemble learning is obtaining a more accurate classification model by bypassing the selection of a single weak classifier [9].

In this study, twenty-six features (Table II) are extracted and used as inputs to three classifiers in order to distinguish between normal heartbeats and four different arrhythmia classes. The three classification algorithms used are Random Forests (RF) [13], Adaptive Boosting (AdaBoost) [14], and Artificial Neural Networks (ANN) [15]. Each of these three single classification algorithms is trained using a 5-fold cross-validation technique. The outputs of these classification algorithms are then combined using Dempster-Shafer theory (DST) [16] to improve the performance.

The rest of the paper contains five sections. Section II briefly presents the methods used in this study. The ensemble heartbeat classification algorithm is explained in Section III. Section IV presents the experimental results as well as a comparison with other algorithms, and lastly, Section V provides the conclusion of the paper.

II. MATERIALS AND METHODS

A. Database

The ECG data for this work are collected from the MIT-BIH arrhythmia database [17], [18]. It contains 48 records, each about 30 min long (sampled at 360 Hz). As per the AAMI recommendation, four recordings (#102, 104, 107, and 217), which contain paced beats, were removed from the analysis since these beats do not preserve appropriate signal quality. In addition, the AAMI convention is used to combine the beats into five classes, as shown in Table I [19].

TABLE I: Heartbeat categories in the MIT-BIH database

Group | Class
N | Normal (N) beat; Left (L) and right (R) bundle branch block beats; Atrial escape (e) beat; Nodal (junctional, j) escape beat
S | Atrial premature (A) beat; Aberrated atrial premature (a) beat; Nodal (junctional, J) premature beat; Supraventricular premature (S) beat
V | Premature ventricular contraction (V); Ventricular escape (E) beat
F | Fusion of ventricular and normal (F) beat; Fusion of paced and normal (f) beat
Q | Unclassifiable (U) beat
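The paper works with PhysioNet's WFDB Toolbox for MATLAB/Octave [17]; purely as an illustration of how such records and beat labels can be obtained, the sketch below uses the Python wfdb package (an assumption, not the authors' toolchain) and maps the MIT-BIH beat symbols to the five groups of Table I.

```python
# Minimal sketch (not from the paper): load one MIT-BIH record with the Python
# wfdb package and map its beat annotation symbols to the five groups of Table I.
import wfdb

TABLE_I_GROUPS = {            # MIT-BIH beat symbol -> group (Table I)
    'N': 'N', 'L': 'N', 'R': 'N', 'e': 'N', 'j': 'N',
    'A': 'S', 'a': 'S', 'J': 'S', 'S': 'S',
    'V': 'V', 'E': 'V',
    'F': 'F', 'f': 'F',
    'Q': 'Q',                 # 'Q' is the MIT-BIH code for an unclassifiable beat (U in Table I)
}

record = wfdb.rdrecord('100', pn_dir='mitdb')    # ~30 min, 360 Hz, 2 leads
ann = wfdb.rdann('100', 'atr', pn_dir='mitdb')   # reference beat annotations

signal = record.p_signal[:, 0]                   # first lead
fs = record.fs                                   # 360 Hz
r_peaks = [s for s, sym in zip(ann.sample, ann.symbol) if sym in TABLE_I_GROUPS]
labels = [TABLE_I_GROUPS[sym] for sym in ann.symbol if sym in TABLE_I_GROUPS]
```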
B. Preprocessing

1) Signal filtering: The ECG signals are filtered using a band-pass, minimum-order FIR filter [20] with a pass band of 0.5-40 Hz (the input signals are long enough to accommodate such a filter), in order to remove baseline wander and high-frequency noise.
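The paper specifies only a minimum-order FIR design with a 0.5-40 Hz pass band; the following is a minimal sketch of an equivalent zero-phase band-pass step with SciPy, where the fixed filter length is an illustrative assumption rather than the authors' minimum-order design.

```python
# Illustrative 0.5-40 Hz band-pass FIR filtering of an ECG signal sampled at 360 Hz.
from scipy.signal import firwin, filtfilt

def bandpass_ecg(signal, fs=360.0, low=0.5, high=40.0, numtaps=1001):
    """Remove baseline wander (<0.5 Hz) and high-frequency noise (>40 Hz)."""
    taps = firwin(numtaps, [low, high], pass_zero=False, fs=fs)  # linear-phase band-pass FIR
    return filtfilt(taps, [1.0], signal)                         # zero-phase filtering

# filtered = bandpass_ecg(signal)   # 'signal' as in the loading sketch above
```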
2) QRS complex detection and heartbeat segmentation: One cycle of an ECG signal consists of three component waveforms: the QRS complex, the P-wave, and the T-wave. The accurate locations of the peaks and of the beginning and end points of these component waveforms are used to segment the heartbeats [8]. The WFDB Toolbox from PhysioNet is used to detect the QRS complexes and to locate the onset, peak, and offset of the P, QRS, and T waves [17], [21]. Next, the signals are segmented into beats: each segment starts at the P-wave onset and ends at the T-wave offset. If no P-wave is present, the segment starts at the QRS onset and ends at the T-wave offset.

C. Feature Extraction

1) Wavelet transform: The wavelet transform provides a characterization in both the frequency and temporal domains [8]. In this paper, the Daubechies wavelet function (db2) is utilized because it is more similar to the ECG signal morphology than other wavelet functions. The number of decomposition levels is four, which means that the ECG signals are decomposed into the details D1-D4 as well as one approximation coefficient A4. These five decompositions are called subbands. Next, four statistical features are extracted from the wavelet coefficients of each subband: the minimum, maximum, mean, and variance. This gives twenty wavelet features in total.
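A small sketch of how the twenty wavelet features (F1-F20 in Table II) can be computed with PyWavelets, assuming the heartbeat has already been segmented; the db2 wavelet and the four decomposition levels follow the description above.

```python
# Wavelet features F1-F20: 4-level db2 decomposition of one heartbeat segment
# gives the subbands A4, D4, D3, D2, D1; min, max, mean and variance of the
# coefficients in each subband yield 4 x 5 = 20 features.
import numpy as np
import pywt

def wavelet_features(beat):
    """beat: 1-D numpy array containing one segmented heartbeat."""
    coeffs = pywt.wavedec(beat, 'db2', level=4)    # [A4, D4, D3, D2, D1]
    feats = []
    for band in coeffs:
        feats.extend([band.min(), band.max(), band.mean(), band.var()])
    return np.array(feats)                          # 20 wavelet features
```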
2) RR-interval features: The RR-interval is the distance between the R-peaks of successive heartbeats. Three RR-interval based features are extracted: the pre-RR interval, the post-RR interval, and the local average RR-interval. The pre- and post-RR intervals are the RR-intervals between a given heartbeat and the previous and the following heartbeat, respectively. The local average RR-interval is defined as the average of the ten RR-intervals neighbouring a heartbeat, i.e., the five RR-intervals to its left and the five RR-intervals to its right.

3) Heart-beat morphology: Three morphology features are extracted from each heartbeat segment: the QRS duration (time interval from QRS onset to QRS offset), the T-wave duration (time interval from QRS offset to T-wave offset), and a P-wave flag. The P-wave flag is a Boolean variable indicating the presence or absence of a P-wave in the heartbeat segment. The twenty-six features are summarised in Table II.

TABLE II: Feature description

Features | Feature description
F1-F5 | Minimum of wavelet coefficients
F6-F10 | Maximum of wavelet coefficients
F11-F15 | Mean of wavelet coefficients
F16-F20 | Variance of wavelet coefficients
F21 | Post-RR interval
F22 | Pre-RR interval
F23 | Local average RR-interval
F24 | QRS duration
F25 | T-wave duration
F26 | P-wave flag
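The RR-interval features (F21-F23) can be computed directly from the R-peak locations; the sketch below is a minimal illustration, and the edge handling for the first and last beats is an assumption the paper does not specify. The morphology features (F24-F26) additionally require the P/QRS/T onsets and offsets provided by the WFDB delineation tools and are not reproduced here.

```python
# RR-interval features F21-F23 for the i-th beat, given R-peak sample indices.
import numpy as np

def rr_features(r_peaks, i, fs=360.0, k=5):
    """Return (pre-RR, post-RR, local average RR) in seconds for beat i."""
    rr = np.diff(np.asarray(r_peaks)) / fs            # RR-interval series
    pre_rr = rr[i - 1] if i > 0 else rr[0]            # interval to the previous beat
    post_rr = rr[i] if i < len(rr) else rr[-1]        # interval to the following beat
    lo, hi = max(0, i - k), min(len(rr), i + k)       # ~5 intervals on each side
    local_avg = rr[lo:hi].mean()
    return pre_rr, post_rr, local_avg
```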



D. Beat Classification

1) Random Forest (RF): RF is one of the most commonly used ensemble machine learning approaches [22]. The algorithm builds a collection of decision trees in the learning phase and makes its decision by majority voting [23]. To find an optimal number of trees, different values were tried and their classification accuracies compared; accordingly, 20 trees are used in the RF in this study, as this gave the best accuracy.

2) Adaptive Boosting (AdaBoost): This algorithm was introduced in 1997 [14]. AdaBoost is more sensitive to noisy data and outliers but less susceptible to over-fitting than RF. In this method, decision trees are used as base classifiers, and higher weights are assigned to the misclassified cases in the next iteration. In this way, incorrectly classified samples have a higher chance of being chosen for training the next decision trees.

3) Artificial Neural Network (ANN): In an ANN, structured layers of nodes are connected in a way similar to the neurons in the human brain. It is capable of self-learning from data and can be trained for pattern recognition, data classification, and prediction. The input is broken down through layers of abstraction, so the network can be trained over many samples to identify the patterns in the data. The connections between the elements, together with their strengths and weights, define the behaviour of the network. The connection weights are adjusted automatically during training (based on a specified learning rule) until the NN performs the desired task accurately [15].
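The paper does not state which implementations of RF, AdaBoost, and ANN were used; the sketch below sets up rough scikit-learn stand-ins evaluated with 5-fold cross-validation. Apart from the 20 trees of the RF, all hyper-parameters are illustrative assumptions.

```python
# Illustrative scikit-learn counterparts of the three classifiers (not the
# authors' implementation), evaluated with 5-fold cross-validation.
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

classifiers = {
    'RF': RandomForestClassifier(n_estimators=20, random_state=0),    # 20 trees (Sec. II-D)
    'AdaBoost': AdaBoostClassifier(n_estimators=50, random_state=0),  # tree-based boosting
    'ANN': MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0),
}

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
# for name, clf in classifiers.items():
#     scores = cross_val_score(clf, X, y, cv=cv)   # X: (n_beats, 26) features, y: Table I labels
#     print(name, scores.mean())
```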
E. Dempster-Shafer combination rule

Dempster-Shafer theory (DST) is one of the most common techniques for information fusion; it captures the degree of certainty among various information sources [16], [24]. Three important functions used in DST are: (i) the mass probability function (m), (ii) the belief function (Bel), and (iii) the plausibility function (Pl). The function m is critical in classification problems and should fulfil the following requirements [24]:

m : 2^X \to [0, 1],   (1)

m(\emptyset) = 0,   (2)

\sum_{A \subseteq X} m(A) = 1,   (3)

where X and \emptyset denote the universal set and the empty set, respectively. DST combines two mass probabilities to define more precise evidence, as follows:

(m_1 \oplus m_2)(A) = \frac{1}{1 - P} \sum_{B \cap C = A,\; A \neq \emptyset} m_1(B)\, m_2(C),   (4)

where P = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C).
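A minimal sketch of Dempster's rule of combination, Eq. (4), for two mass functions over the class set {N, S, V, F, Q}. How the paper converts the classifier outputs into mass assignments is not detailed, so singleton masses derived from normalized class scores are assumed here for illustration.

```python
# Dempster's rule of combination (Eq. 4) for two mass functions given as
# dicts mapping frozenset hypotheses to masses that sum to 1.
from itertools import product

def combine_masses(m1, m2):
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc                       # P in Eq. (4)
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Example: two classifiers assigning singleton masses over the five classes (assumed inputs)
m_rf  = {frozenset({c}): p for c, p in zip('NSVFQ', [0.70, 0.10, 0.10, 0.05, 0.05])}
m_ada = {frozenset({c}): p for c, p in zip('NSVFQ', [0.60, 0.20, 0.10, 0.05, 0.05])}
fused = combine_masses(m_rf, m_ada)                   # fused evidence for the final decision
```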
III. ENSEMBLE HEARTBEAT CLASSIFICATION

Figure 1 illustrates the flowchart of the proposed ensemble model for heartbeat classification. The algorithm is described in the following steps:

Fig. 1: Flowchart of the ECG classification technique (ECG from MIT-BIH → filtering → QRS complex detection → segmentation → feature extraction → ADASYN → cross validation with training/validation sets → training of all classifiers → evaluation of the trained classifiers → combination of the classifier outputs using DST).

1) Reducing noise: The artifacts and baseline wander are filtered out of the ECG records.
2) Detecting QRS complexes: The P-waves, QRS complexes, and T-waves in each ECG signal are detected using the WFDB toolbox from PhysioNet.
3) Segmentation: All the signals are segmented into heartbeats, as described in Section II-B.
4) Feature extraction: The aim of feature extraction is to reduce the pattern vector to a lower dimension while keeping the most discriminatory information of the original vector. Three different types of features are extracted, as described in Section II-C, based on RR-intervals, heartbeat morphology, and the wavelet transform.
5) Synthetic dataset generation: The dataset is imbalanced; the number of normal heartbeats is much larger than the number of heartbeats in the other classes. Using the imbalanced dataset may increase the chance of biased classification [25]. In order to balance the dataset and consequently improve the performance of the classification algorithms, the adaptive synthetic over-sampling technique (ADASYN) is used (see the sketch after this list). It consists of three steps: (i) the degree of class imbalance is computed in order to determine the number of synthetic samples needed for the minority class; (ii) within a minority class, the Euclidean distance is used to find the K nearest neighbours of each sample; and (iii) the synthetic samples for the minority class are generated as d_i = x_i + \lambda (x_{k_i} - x_i), where x_i is a sample from the minority class, x_{k_i} is one of its K nearest neighbours, and \lambda \in [0, 1] is a random number.
6) Cross validation: The dataset is split into training and validation sets using 5-fold cross validation.
7) Training the classifiers: The 5-fold training sets are used to train the classifiers (RF, AdaBoost, and ANN).
8) Evaluating the trained classifiers: The performance of the algorithms is evaluated on the 5-fold validation sets.
9) Combining the classifier outputs: Lastly, the predictions of the classifiers are combined using the DST technique.



Fig. 2: Scatter plots of two arbitrary features for two classes (N and F): (a) imbalanced, and (b) synthetically balanced.

IV. RESULTS AND DISCUSSION

A. Experimental Validation

In this study, the MIT-BIH arrhythmia database from PhysioNet [18] is used to investigate the performance of the developed algorithm. The number of heartbeats in each class before and after creating synthetic data is given in Table III. As an example, Figs. 2a and 2b illustrate the distributions of the imbalanced and balanced heartbeats for two arbitrary features (maximum of wavelet coefficients and QRS duration) for two classes (N and F), respectively. The scatter plot in Fig. 2b shows a more balanced distribution of the features compared to Fig. 2a.

TABLE III: Number of beats in the imbalanced and balanced datasets.

Class | Imbalanced beats | Balanced beats
N | 89727 | 89727
S | 2766 | 97773
V | 6755 | 93815
F | 798 | 99764
Q | 418 | 100056

The three classification algorithms are trained using the 5-fold cross-validation technique with the synthetically balanced features as inputs. The average accuracies on the training and validation sets are reported in Table IV. According to this table, the performance of AdaBoost is similar to that of RF, while the ANN achieves the lowest accuracy.

TABLE IV: Accuracy of the individual classifiers.

Algorithm | AdaBoost | RF | ANN
Training (%) | 96.21 | 96.21 | 94.50
Validation (%) | 96.16 | 96.16 | 94.49

Next, the outputs of these three classifiers are merged using DST. This slightly improves the classification accuracy of the best classifiers, from 96.16% to 96.18%. The confusion matrix of the proposed ensemble model on the validation set is shown in Fig. 3. For example, the number of misclassifications for class N is higher than for the other classes, and these beats are mostly predicted as the S and F classes. From Fig. 3 it can also be seen that the S, V, F, and Q classes are mainly misclassified as class N, class F, class N, and class S, respectively.

Fig. 3: Confusion matrix of the proposed ensemble model on the validation set.

TABLE V: Average performance of the ensemble algorithm on the training and validation sets over all classes.

Measure (%) | Acc | F1-score | Se | Sp
Training | 96.20 | 90.39 | 90.41 | 97.63
Validation | 96.18 | 90.35 | 90.37 | 97.62

In addition, the sensitivity (Se), specificity (Sp), and F1-score of the proposed ensemble algorithm are reported in Table V. These metrics are defined as follows:

Se = \frac{TP}{TP + FN},   (5)

Sp = \frac{TN}{TN + FP},   (6)

F\text{-}score = (1 + \beta^2)\,\frac{PPV \cdot Se}{\beta^2 \cdot PPV + Se},   (7)

Acc = \frac{TP + TN}{TP + FN + FP + TN},   (8)

where TP, TN, FN, and FP represent the number of true positive, true negative, false negative, and false positive cases, respectively. PPV in Eq. (7) is the positive predictive value, defined as PPV = \frac{TP}{TP + FP}. The F-score can be used as a single measure of the performance of the test for the positive classes. With \beta = 1 it is called the balanced F-score (F1-score); its range is between 0 (lowest classification ability) and +1 (highest classification ability).
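For reference, a short sketch computing Eqs. (5)-(8) from a multi-class confusion matrix, assuming a one-vs-rest reduction for the per-class TP, TN, FP, and FN counts (the paper does not spell out this reduction).

```python
# Per-class Se, Sp, PPV, F1 (beta = 1) and overall accuracy from a confusion matrix.
import numpy as np

def per_class_metrics(cm):
    """cm[i, j]: number of beats of true class i predicted as class j."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp
    fn = cm.sum(axis=1) - tp
    tn = cm.sum() - tp - fp - fn
    se = tp / (tp + fn)                     # Eq. (5), sensitivity
    sp = tn / (tn + fp)                     # Eq. (6), specificity
    ppv = tp / (tp + fp)                    # positive predictive value
    f1 = 2 * ppv * se / (ppv + se)          # Eq. (7) with beta = 1
    acc = tp.sum() / cm.sum()               # Eq. (8), overall accuracy
    return se, sp, ppv, f1, acc
```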



Moreover, the performance of the proposed ensemble model on the validation set, using different metrics for the five classes, is illustrated in Fig. 4. In general, the model performs best on class Q. However, as shown in Fig. 4, its performance is high for the other classes as well. For example, the sensitivity of the algorithm is highest on class Q and lowest on class N, while the other classes lie in the same range.

Fig. 4: Performance of the ensemble model for the five different classes.

B. Comparison with Other Heartbeat Classification Models

Table VI lists some of the main models published in the literature that we considered. All the references in this table used the same MIT-BIH database in an intra-patient scheme to assess the performance of their algorithms. The proposed method outperforms most of the other models listed in Table VI. Although the proposed ensemble algorithm is outperformed by [30] and [31], the numbers of classes (3 and 4, respectively) classified by these two models are smaller than the standard five classes considered in this study.

TABLE VI: Comparison with published algorithms in the literature using the MIT-BIH database in an intra-patient scheme.

Reference | Number of classes | Feature | Classifier | Acc (%)
Chen et al. [26] | 2 | RR-interval | Set of rules | 95
Tsipouras et al. [27] | 9 | RR-interval | Deterministic automata | 96
Güler and Übeylı [28] | 4 | Wavelet | Ensemble of NNs | 96
Yeh et al. [29] | 9 | Morphological, RR-interval | Clustering | 94
Kallas et al. [30] | 3 | Kernel PCA | SVM | 97
Mehmet [31] | 4 | Higher-order statistics, Wavelet | Minimum Distance, kNN, Bayes | 98
Korürek and Nizam [32] | 6 | RR-interval, ECG-segments | Clustering, kNN | 94
Lin et al. [33] | 7 | Wavelet | Adaptive wavelet network | 90
This work | 5 | RR-interval, Morphological, Wavelets | Ensemble learning | 96

V. CONCLUSIONS

In this study, an ensemble method has been proposed to detect cardiac arrhythmias in ECG recordings. First, three sets of features (RR-interval, heartbeat morphology, and wavelet transform features) were extracted from the ECG records. Next, these features were used as inputs to three classification algorithms (AdaBoost, RF, and ANN). Each of these three classification algorithms was trained using a 5-fold cross-validation technique. Lastly, DST was used to combine the outputs of the classifiers to enhance the overall accuracy. The obtained results show that the performance of the ensemble model is slightly better than that of the best single classification algorithm for the detection of cardiac arrhythmias. Another advantage of the ensemble model is its robustness: it is less likely to over-fit than the single classification algorithms.

REFERENCES

[1] P. de Chazal and R. B. Reilly, "A patient-adapting heartbeat classifier using ECG morphology and heartbeat interval features," IEEE Transactions on Biomedical Engineering, vol. 53, no. 12, pp. 2535-2543, 2006.
[2] Ö. Yıldırım, P. Pławiak, R.-S. Tan, and U. R. Acharya, "Arrhythmia detection using deep convolutional neural network with long duration ECG signals," Computers in Biology and Medicine, vol. 102, pp. 411-420, 2018.
[3] V. Mondéjar-Guerra, J. Novo, J. Rouco, M. Penedo, and M. Ortega, "Heartbeat classification fusing temporal and morphological information of ECGs via ensemble of classifiers," Biomedical Signal Processing and Control, vol. 47, pp. 41-48, 2019.
[4] M. Faezipour, A. Saeed, S. C. Bulusu, M. Nourani, H. Minn, and L. Tamil, "A patient-adaptive profiling scheme for ECG beat classification," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 5, pp. 1153-1165, 2010.



[5] M. K. Das and S. Ari, "ECG beats classification using mixture of features," International Scholarly Research Notices, vol. 2014, 2014.
[6] E. J. d. S. Luz, W. R. Schwartz, G. Cámara-Chávez, and D. Menotti, "ECG-based heartbeat classification for arrhythmia detection: A survey," Computer Methods and Programs in Biomedicine, vol. 127, pp. 144-164, 2016.
[7] F. Melgani and Y. Bazi, "Classification of electrocardiogram signals with support vector machines and particle swarm optimization," IEEE Transactions on Information Technology in Biomedicine, vol. 12, no. 5, pp. 667-677, 2008.
[8] C. Ye, B. V. K. Vijaya Kumar, and M. T. Coimbra, "Heartbeat classification using morphological and dynamic features of ECG signals," IEEE Transactions on Biomedical Engineering, vol. 59, no. 10, pp. 2930-2941, 2012.
[9] R. Polikar, "Ensemble based systems in decision making," IEEE Circuits and Systems Magazine, vol. 6, no. 3, pp. 21-45, 2006.
[10] L. Shi, L. Xi, X. Ma, M. Weng, and X. Hu, "A novel ensemble algorithm for biomedical classification based on Ant Colony Optimization," Applied Soft Computing, vol. 11, no. 8, pp. 5674-5683, 2011.
[11] D. West, S. Dellana, and J. Qian, "Neural network ensemble strategies for financial decision applications," Computers & Operations Research, vol. 32, no. 10, pp. 2543-2559, 2005.
[12] A. Peimankar, S. J. Weddell, T. Jalal, and A. C. Lapthorn, "Evolutionary multi-objective fault diagnosis of power transformers," Swarm and Evolutionary Computation, vol. 36, pp. 62-75, 2017.
[13] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5-32, 2001.
[14] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," Journal of Computer and System Sciences, vol. 55, no. 1, pp. 119-139, 1997.
[15] S. Haykin, Neural Networks: A Comprehensive Foundation. Prentice Hall PTR, 1994.
[16] G. Shafer, A Mathematical Theory of Evidence, vol. 42. Princeton University Press, 1976.
[17] I. Silva and G. B. Moody, "An open-source toolbox for analyzing and processing PhysioNet databases in MATLAB and Octave," Journal of Open Research Software, vol. 2, no. 1, 2014.
[18] MIT-BIH Arrhythmia Database. (2010). [Online]. Available: www.physionet.org/physiobank/database/mitdb
[19] Testing and Reporting Performance Results of Cardiac Rhythm and ST Segment Measurement Algorithms. Association for the Advancement of Medical Instrumentation, 1998.
[20] A. V. Oppenheim, A. S. Willsky, and I. T. Young, Signals and Systems. Prentice-Hall, 1983.
[21] A. L. Goldberger, L. A. Amaral, L. Glass, J. M. Hausdorff, P. C. Ivanov, R. G. Mark, J. E. Mietus, G. B. Moody, C. K. Peng, and H. E. Stanley, "PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals," Circulation, vol. 101, no. 23, pp. e215-e220, 2000.
[22] T. K. Ho, "Random decision forests," Proceedings of the 3rd IEEE International Conference on Document Analysis and Recognition, vol. 1, 1995.
[23] T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer Series in Statistics, 2009.
[24] L. A. Klein, Sensor and Data Fusion: A Tool for Information Assessment and Decision Making, vol. 324. Bellingham: SPIE Press, 2004.
[25] H. He et al., "ADASYN: Adaptive synthetic sampling approach for imbalanced learning," 2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence), IEEE, 2008.
[26] S.-W. Chen, P. M. Clarkson, and Q. Fan, "A robust sequential detection algorithm for cardiac arrhythmia classification," IEEE Transactions on Biomedical Engineering, vol. 43, no. 11, pp. 54-64, 1996.
[27] M. G. Tsipouras, C. Voglis, and D. I. Fotiadis, "A robust sequential detection algorithm for cardiac arrhythmia classification," IEEE Transactions on Biomedical Engineering, vol. 54, no. 11, pp. 2089-2105, 2007.
[28] I. Güler and E. Übeylı, "ECG beat classifier designed by combined neural network model," Pattern Recognition, vol. 38, no. 2, pp. 199-208, 2005.
[29] Y.-C. Yeh, C. W. Chiou, and H.-J. Lin, "Analyzing ECG for cardiac arrhythmia using cluster analysis," Expert Systems with Applications, vol. 39, no. 1, pp. 1000-1010, 2012.
[30] M. Kallas, C. Francis, L. Kanaan, D. Merheb, P. Honeine, and H. Amoud, "Multi-class SVM classification combined with kernel PCA feature extraction of ECG signals," 2012 19th International Conference on Telecommunications (ICT), pp. 1-5, 2012.
[31] E. Mehmet, "ECG beat classification using neuro-fuzzy network," Pattern Recognition Letters, vol. 25, no. 15, pp. 1715-1722, 2004.
[32] M. Korürek and A. Nizam, "A new arrhythmia clustering technique based on Ant Colony Optimization," Journal of Biomedical Informatics, vol. 41, no. 6, pp. 874-881, 2008.
[33] C.-H. Lin, Y.-C. Du, and T. Chen, "Adaptive wavelet network for multiple cardiac arrhythmias recognition," Expert Systems with Applications, vol. 34, no. 4, pp. 2601-2611, 2008.

