
Introduction to Artificial Intelligence (AI)
4. Learning Algorithms
Motivation
Real-world example:
• Fish-packing plant: separate sea bass from salmon using optical sensing
• Features: physical differences such as length, lightness, width, number and shape of fins, and position of the mouth
• Noise: variations in lighting, position of the fish on the conveyor, and "static" due to the electronics of the camera itself
Motivation
Histograms for the length feature for the two categories
Motivation
Histograms for the lightness feature for the two categories
Decision boundary
The two features of lightness and width for sea bass and salmon
How would our system automatically determine the decision boundary?
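
One way to make this concrete is the minimal sketch below: it fits a linear decision boundary over the two features (lightness, width) with scikit-learn's LogisticRegression. The synthetic data, class means, and parameter values are illustrative assumptions, not measurements from the slides.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative synthetic data: one row per fish, columns = (lightness, width).
# Real measurements would come from the optical-sensing pipeline.
rng = np.random.default_rng(0)
salmon = rng.normal(loc=[3.0, 4.0], scale=0.6, size=(100, 2))    # label 0
sea_bass = rng.normal(loc=[6.0, 7.0], scale=0.6, size=(100, 2))  # label 1
X = np.vstack([salmon, sea_bass])
y = np.array([0] * 100 + [1] * 100)

# The learned weights define a linear boundary: w1*lightness + w2*width + b = 0.
clf = LogisticRegression().fit(X, y)
print("weights:", clf.coef_, "bias:", clf.intercept_)
print("prediction for lightness=5, width=6:", clf.predict([[5.0, 6.0]]))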
Loss
Loss is a function computed from the errors over the training data.
Error is the difference between a single actual value and a single predicted value.
Loss
Regression loss functions

Root mean squared error (RMSE) loss – Python code:

import numpy as np

def rmse(predictions, targets):
    # squared error per sample, averaged, then square-rooted
    differences = predictions - targets
    differences_squared = differences ** 2
    mean_of_differences_squared = differences_squared.mean()
    rmse_val = np.sqrt(mean_of_differences_squared)
    return rmse_val

Mean absolute error (MAE) loss – Python code:

import numpy as np

def mae(predictions, targets):
    # absolute error per sample, averaged
    differences = predictions - targets
    absolute_differences = np.absolute(differences)
    mean_absolute_differences = absolute_differences.mean()
    return mean_absolute_differences
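
A brief illustrative usage of the two functions above; the arrays are made-up example values, not data from the slides:

import numpy as np

predictions = np.array([2.5, 0.0, 2.1, 7.8])
targets = np.array([3.0, -0.5, 2.0, 7.5])
print("RMSE:", rmse(predictions, targets))   # penalizes large errors more strongly
print("MAE:", mae(predictions, targets))     # average absolute deviation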
Learning algorithms
(Side notes: 1969 – Strassen's algorithm for matrix multiplication. All algorithms of this era are implemented on the CPU.)

• 1950s–1970s: Early Foundations
  • 1957: Perceptron (Frank Rosenblatt) – one of the earliest neural networks, designed for binary classification.
  • 1960s: K-nearest neighbors (KNN) – a simple instance-based learning method developed for classification tasks.
• 1980s: The Rise of Neural Networks
  • 1980: Multi-layer perceptron training by backpropagation – developed by Paul Werbos, later popularized in the 1980s for training neural networks.
• 1990s: Advancements in Ensemble Methods and Optimization
  • 1995: Random decision forests (Tin Kam Ho) – a decision-tree-based ensemble learning technique that reduces overfitting, later formalized as Random Forests by Leo Breiman (2001).
  • 1995: Support Vector Machines gain practical relevance with the advent of kernel methods.
Learning algorithms
(Side notes: 2006 – NVIDIA releases CUDA. 2009 – Andrew Ng utilizes GPUs to accelerate the training of large neural networks.)

• 2000s: Kernel Methods and Probabilistic Models
  • 2001: AdaBoost – an adaptive boosting method developed by Yoav Freund and Robert Schapire.
• 2010s: Deep Learning Revolution
  • 2012: AlexNet (Krizhevsky et al.) – a deep convolutional neural network that won the ImageNet competition, leading to breakthroughs in computer vision.
  • 2014: Generative Adversarial Networks (GANs) (Ian Goodfellow et al.) – introduced a new framework for generating synthetic data through adversarial learning.
  • 2017: Transformers (Vaswani et al.) – revolutionized natural language processing (NLP) by eliminating the need for recurrent neural networks.
• 2020s: Scalable AI and Further Innovations
  • 2020: GPT-3 (OpenAI) – a large-scale transformer-based model demonstrating significant progress in language understanding and generation.
Random forest
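
As a minimal sketch of the random-forest idea described in the timeline, the following uses scikit-learn's RandomForestClassifier; the synthetic dataset and parameter values are illustrative assumptions, not from the lecture.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic dataset.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of decision trees: each tree is trained on a bootstrap sample
# and considers a random subset of features at each split; predictions are
# combined by majority vote, which reduces the overfitting of individual trees.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
print("test accuracy:", forest.score(X_test, y_test))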
AdaBoost
• "Boosting": a family of algorithms which converts weak learners into strong learners.
• The weak learners are typically decision stumps or shallow decision trees.

H(x) = \operatorname{sign}\left( \sum_{i=1}^{n} \alpha_i h_i(x) \right)

  h_i : the weak learners
  \alpha_i : the weight of learner h_i
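
A minimal sketch of this weighted vote, assuming scikit-learn's AdaBoostClassifier (whose default base learner is a decision stump); the synthetic dataset and parameter values are illustrative assumptions:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Boosting: weak learners are fitted sequentially, with misclassified samples
# re-weighted after each round; the final prediction is the sign of the
# alpha-weighted sum of the weak learners' outputs, as in H(x) above.
booster = AdaBoostClassifier(n_estimators=50, random_state=1)
booster.fit(X_train, y_train)
print("test accuracy:", booster.score(X_test, y_test))
print("first learner weights (alpha_i):", booster.estimator_weights_[:5])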
AdaBoost
AdaBoost
• Weak learners for image recognition: Haar filters (common rectangular features)
• 160,000+ possible features associated with each 24 x 24 window (see the sketch below)
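
A minimal sketch of evaluating one such feature, assuming a grayscale patch stored as a NumPy array; the function name and the random example patch are illustrative, and in practice these rectangle sums are computed in constant time from an integral image:

import numpy as np

def two_rect_haar_feature(window):
    # Simple two-rectangle Haar-like feature: sum of pixel values in the
    # left half of the window minus the sum in the right half.
    w = window.shape[1]           # expected to be a 24 x 24 window
    left = window[:, : w // 2].sum()
    right = window[:, w // 2 :].sum()
    return left - right

# Illustrative usage on a random "image patch".
rng = np.random.default_rng(0)
patch = rng.random((24, 24))
print("feature value:", two_rect_haar_feature(patch))

Varying the rectangle shapes, sizes, and positions inside the window is what produces the 160,000+ candidate features mentioned above; AdaBoost then selects a small subset of them as weak learners.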
Cascade filter
Cascade filter
Prepare data
• Negative images: images which do not contain the target object
• Positive images: images which contain the target object
• A ratio of 2:1 or higher between negative and positive samples is considered acceptable
(A sketch of applying a trained cascade follows at the end of this section.)
Cascade filter
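
Once a cascade has been trained on the positive and negative images above, applying it looks roughly like the sketch below. It uses OpenCV's CascadeClassifier with the pretrained frontal-face Haar cascade that ships with OpenCV purely as an illustration; the input/output file names and detection parameters are assumptions, not part of the lecture.

import cv2

# Load a trained cascade (here OpenCV's bundled frontal-face cascade).
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
cascade = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("input.jpg")                      # illustrative file name
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Each candidate window passes through the cascade's stages; windows rejected
# by an early stage are discarded cheaply, and only surviving windows are
# reported as detections.
detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in detections:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("output.jpg", image)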