Fig. 2. Workflow of supervised machine learning algorithm [4]
B. Unsupervised Learning
Unsupervised learning algorithms learn features from the data without labeled examples. When new data is introduced, they use the previously learned features to recognize the class of the data. Unsupervised learning is mainly used for clustering and feature reduction. An example workflow of unsupervised learning is given in Fig. 9.
Fig. 11. Pseudo code for k-means clustering [13]
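To make the clustering step concrete, the following is a minimal Python sketch of the standard k-means iteration (assignment step followed by centroid update); the function name kmeans and its parameters are illustrative choices, not taken from the pseudo code of [13]:

    import numpy as np

    def kmeans(X, k, n_iters=100, seed=0):
        # X: (n_samples, n_features) data matrix; k: number of clusters.
        rng = np.random.default_rng(seed)
        # Initialize centroids by picking k distinct training points.
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iters):
            # Assignment step: attach each point to its nearest centroid.
            dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # Update step: move each centroid to the mean of its points,
            # keeping the old centroid if a cluster has become empty.
            new_centroids = np.array([
                X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
                for j in range(k)
            ])
            if np.allclose(new_centroids, centroids):
                break  # converged: centroids stopped moving
            centroids = new_centroids
        return centroids, labels

The two alternating steps converge to a local optimum of the within-cluster squared error, which is why restarts with different random seeds are common in practice.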
Fig. 21. Reinforced Neural Network [25]

H. Instance-Based Learning
In instance-based learning, the learner memorizes particular patterns from the training data and tries to apply the same patterns to newly fed data; hence the name instance-based. It is a type of lazy learner, which waits for the test data to arrive and then acts on it together with the training data. The complexity of the learning algorithm therefore increases with the size of the data. A well-known example of instance-based learning, the k-nearest neighbor algorithm, is described below [26].

1) K-Nearest Neighbor: In k-nearest neighbor (KNN), well-labeled training data is fed into the learner. When test data is introduced, the learner compares it with the training data and selects the k most similar instances from the training set. The majority class among these k instances is assigned as the class of the test data [27]. The pseudo code for KNN is given in Fig. 22.

Fig. 22. Pseudo code for K-Nearest Neighbor [28]
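Since Fig. 22 contains only pseudo code, a minimal runnable Python sketch of the same majority-vote procedure is given below; the function name knn_predict and the toy data are illustrative, not taken from [28]:

    import numpy as np
    from collections import Counter

    def knn_predict(X_train, y_train, x_test, k=3):
        # Distance from the test point to every training point.
        dists = np.linalg.norm(X_train - x_test, axis=1)
        # Indices of the k most similar (closest) training instances.
        nearest = np.argsort(dists)[:k]
        # The majority class among the k neighbors becomes the prediction.
        return Counter(y_train[i] for i in nearest).most_common(1)[0][0]

    # Toy usage: two labeled clusters; the query point lands near class "B".
    X_train = np.array([[1.0, 1.0], [1.2, 0.8], [8.0, 8.0], [7.5, 8.2]])
    y_train = np.array(["A", "A", "B", "B"])
    print(knn_predict(X_train, y_train, np.array([7.9, 7.8])))  # prints B

Because KNN defers all work to prediction time (lazy learning), every query scans the entire training set, which is why the cost grows with the size of the data, as noted above.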
REFERENCES
[1] W. Richert, L. P. Coelho, "Building Machine Learning Systems with Python", Packt Publishing Ltd., ISBN 978-1-78216-140-0
[2] M. Welling, "A First Encounter with Machine Learning"
[3] M. Bowles, "Machine Learning in Python: Essential Techniques for Predictive Analytics", John Wiley & Sons Inc., ISBN 978-1-118-96174-2
[4] S. B. Kotsiantis, "Supervised Machine Learning: A Review of Classification Techniques", Informatica 31 (2007), Pages 249-268
[5] L. Rokach, O. Maimon, "Top-Down Induction of Decision Trees Classifiers – A Survey", IEEE Transactions on Systems, Man, and Cybernetics
[6] D. Lowd, P. Domingos, "Naïve Bayes Models for Probability Estimation"
[7] https://webdocs.cs.ualberta.ca/~greiner/C-651/Homework2_Fall2008.html
[8] D. Meyer, "Support Vector Machines – The Interface to libsvm in package e1071", August 2015
[9] S. S. Shwartz, Y. Singer, N. Srebro, "Pegasos: Primal Estimated sub-Gradient Solver for SVM", Proceedings of the 24th International Conference on Machine Learning, Corvallis, OR, 2007
[10] http://www.simplilearn.com/what-is-machine-learning-and-why-it-matters-article
[11] P. Harrington, "Machine Learning in Action", Manning Publications Co., Shelter Island, New York, 2012
[12] http://pypr.sourceforge.net/kmeans.html
[13] K. Alsabti, S. Ranka, V. Singh, "An efficient k-means clustering algorithm", Electrical Engineering and Computer Science, 1997
[14] M. Andrecut, "Parallel GPU Implementation of Iterative PCA Algorithms", Institute of Biocomplexity and Informatics, University of Calgary, Canada, 2008
[15] X. Zhu, A. B. Goldberg, "Introduction to Semi-Supervised Learning", Synthesis Lectures on Artificial Intelligence and Machine Learning, Vol. 3, No. 1, Pages 1-130, 2009
[16] X. Zhu, "Semi-Supervised Learning Literature Survey", Computer Sciences, University of Wisconsin-Madison, No. 1530, 2005
[17] R. S. Sutton, "Introduction: The Challenge of Reinforcement Learning", Machine Learning, 8, Pages 225-227, Kluwer Academic Publishers, Boston, 1992
[18] L. P. Kaelbling, M. L. Littman, A. W. Moore, "Reinforcement Learning: A Survey", Journal of Artificial Intelligence Research, 4, Pages 237-285, 1996
[19] R. Caruana, "Multitask Learning", Machine Learning, 28, Pages 41-75, Kluwer Academic Publishers, 1997
[20] D. Opitz, R. Maclin, "Popular Ensemble Methods: An Empirical Study", Journal of Artificial Intelligence Research, 11, Pages 169-198, 1999
[21] Z. H. Zhou, "Ensemble Learning", National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China
[22] https://en.wikipedia.org/wiki/Boosting_(machine_learning)
[23] https://en.wikipedia.org/wiki/Bootstrap_aggregating
[24] V. Sharma, S. Rai, A. Dev, "A Comprehensive Study of Artificial Neural Networks", International Journal of Advanced Research in Computer Science and Software Engineering, ISSN 2277-128X, Volume 2, Issue 10, October 2012
[25] S. B. Hiregoudar, K. Manjunath, K. S. Patil, "A Survey: Research Summary on Neural Networks", International Journal of Research in Engineering and Technology, ISSN 2319-1163, Volume 03, Special Issue 03, Pages 385-389, May 2014
[26] https://en.wikipedia.org/wiki/Instance-based_learning
[27] P. Harrington, "Machine Learning in Action", Manning Publications Co., Shelter Island, New York, ISBN 9781617290183, 2012
[28] J. M. Keller, M. R. Gray, J. A. Givens Jr., "A Fuzzy K-Nearest Neighbor Algorithm", IEEE Transactions on Systems, Man, and Cybernetics, Vol. SMC-15, No. 4, August 1985