4. Learning Algorithms
Artificial Intelligence (AI)
Motivation
Real-world example:
• Fish packing plant: separate sea bass from salmon using optical sensing
• Features: physical differences such as length, lightness, width, number and shape of fins, position of the mouth
• Noise: variations in lighting, position of the fish on the conveyor, “static” due to the electronics
Motivation
[Figure: the two features of lightness and width for sea bass and salmon]
Error is the difference between a single actual value and a single predicted value.
Loss
Python code: mean absolute error (MAE)

import numpy as np

def mae(predictions, targets):
    # Mean absolute loss: average of |prediction - target|
    differences = predictions - targets
    absolute_differences = np.absolute(differences)
    mean_absolute_differences = absolute_differences.mean()
    return mean_absolute_differences
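A quick sanity check of the mae function above, on made-up values (the arrays are illustrative, not from the slides):

```python
import numpy as np

def mae(predictions, targets):
    # Same mean-absolute-error loss as defined above.
    return np.absolute(predictions - targets).mean()

predictions = np.array([2.0, 3.0, 5.0])
targets = np.array([1.0, 3.0, 4.0])
loss = mae(predictions, targets)  # (1 + 0 + 1) / 3
```

Because each element contributes its absolute difference equally, a single large outlier shifts MAE linearly rather than quadratically (as squared-error losses do).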
Learning algorithms

2000s: Kernel Methods and Probabilistic Models
• 2001: AdaBoost – an adaptive boosting method developed by Yoav Freund and Robert Schapire.
• 2006: NVIDIA releases CUDA.
• 2009: Andrew Ng utilized GPUs to accelerate the training of large neural networks.

2010s: Deep Learning Revolution
• 2012: AlexNet (Krizhevsky et al.) – a deep convolutional neural network that won the ImageNet competition, leading to breakthroughs in computer vision.
• 2014: Generative Adversarial Networks (GANs) (Ian Goodfellow et al.) – introduced a new framework for generating synthetic data through adversarial learning.
• 2017: Transformers (Vaswani et al.) – revolutionized natural language processing (NLP) by eliminating the need for recurrent neural networks.

2020s: Scalable AI and Further Innovations
• 2020: GPT-3 (OpenAI) – a large-scale transformer-based model demonstrating significant progress in language understanding and generation.
Random forest
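The slide's illustration is not reproduced here. As a rough sketch of the idea (not the slide's exact example): a random forest trains many trees on bootstrap resamples of the data and combines them by majority vote. The version below uses decision stumps as the "trees" to keep the code short; the data at the end is made up for illustration.

```python
import numpy as np

def fit_stump(X, y):
    # Exhaustive search for the (feature, threshold, sign) with lowest error.
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(sign * (X[:, j] - t) >= 0, 1, -1)
                err = np.mean(pred != y)
                if best is None or err < best[0]:
                    best = (err, (j, t, sign))
    return best[1]

def stump_predict(stump, X):
    j, t, sign = stump
    return np.where(sign * (X[:, j] - t) >= 0, 1, -1)

def forest_fit(X, y, n_trees=51, seed=0):
    # Bagging: each tree sees a bootstrap resample (sampling with replacement).
    rng = np.random.default_rng(seed)
    forest = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), size=len(X))
        forest.append(fit_stump(X[idx], y[idx]))
    return forest

def forest_predict(forest, X):
    # Majority vote over all trees.
    votes = np.sum([stump_predict(s, X) for s in forest], axis=0)
    return np.where(votes >= 0, 1, -1)

# Toy usage on made-up 1-D data: negatives below zero, positives above.
X = np.array([[-2.0], [-1.5], [-1.0], [1.0], [1.5], [2.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
forest = forest_fit(X, y)
```

Averaging many trees trained on different resamples is what gives the forest lower variance than any single tree.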
AdaBoost
Weak learners: decision stumps or decision trees
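The boosting loop itself is not spelled out on the slide. A minimal sketch of the standard AdaBoost procedure with decision stumps as weak learners (an illustrative implementation, not the slide's code; the demo data at the end is made up):

```python
import numpy as np

def fit_weighted_stump(X, y, w):
    # Pick the (feature, threshold, sign) stump with lowest weighted error.
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(sign * (X[:, j] - t) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, (j, t, sign))
    return best

def stump_predict(stump, X):
    j, t, sign = stump
    return np.where(sign * (X[:, j] - t) >= 0, 1, -1)

def adaboost_fit(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)      # start with uniform sample weights
    ensemble = []                # list of (alpha, stump)
    for _ in range(n_rounds):
        err, stump = fit_weighted_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # stump's vote weight
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(alpha * stump_predict(s, X) for alpha, s in ensemble)
    return np.where(score >= 0, 1, -1)

# Toy usage: 1-D data labeled by a simple threshold.
X = np.arange(10.0).reshape(-1, 1)
y = np.where(X[:, 0] >= 5, 1, -1)
ensemble = adaboost_fit(X, y, n_rounds=5)
```

The weighted error drives both the stump's vote weight alpha and the reweighting step, so each round the next stump is forced to focus on the examples the ensemble currently gets wrong.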
AdaBoost
Weak learners for image recognition: Haar filters (common features)
160,000+ possible features associated with each 24 x 24 window
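Haar-like features are differences of sums over rectangular regions, usually computed from an integral image so that any rectangle sum costs only four lookups. A minimal sketch (illustrative, not the Viola-Jones implementation):

```python
import numpy as np

def integral_image(img):
    # ii[r, c] = sum of img[:r, :c]; the extra zero row/column simplifies lookups.
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    # Sum of the h-by-w rectangle with top-left corner (r, c): four lookups.
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def haar_two_rect(ii, r, c, h, w):
    # Two-rectangle Haar feature: left half minus right half (w must be even).
    half = w // 2
    return rect_sum(ii, r, c, h, half) - rect_sum(ii, r, c + half, h, half)
```

Enumerating all positions and scales of a handful of such rectangle templates inside a 24 x 24 window is what yields the 160,000+ candidate features mentioned above; AdaBoost then selects the few that discriminate best.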
Cascade filter
Prepare data: a proportion of 2:1 or higher between negative and positive samples is considered acceptable.
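The cascade's key mechanism is early rejection: each stage is a cheap classifier, and a window survives only if every stage accepts it, so most negative windows are discarded after the first one or two stages. A schematic sketch (the stage functions and thresholds below are made up for illustration):

```python
def run_cascade(stages, window):
    # stages: list of (score_fn, threshold) pairs, cheapest first.
    for score_fn, threshold in stages:
        if score_fn(window) < threshold:
            return False      # reject immediately; later stages never run
    return True               # accepted by every stage

# Toy usage: "windows" are plain numbers, stages apply increasingly strict checks.
stages = [
    (lambda x: x, 0.0),       # stage 1: cheap coarse filter
    (lambda x: x * x, 4.0),   # stage 2: stricter check on survivors
]
detections = [w for w in (-3.0, 1.0, 5.0) if run_cascade(stages, w)]
```

Ordering the stages from cheapest to most expensive is what makes the cascade fast: the full cost is paid only for the rare windows that look like positives all the way through.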