
COE 292

Introduction to Artificial Intelligence

Introduction to Deep Learning


These slides are adapted from material developed by Dr. Akram F. Ahmed and Dr. Aiman El-Maleh, COE Department, KFUPM.

Outline
What is Deep Learning
Why Deep Learning
Neural Networks
Neural Networks History
Applications of Deep Learning
What is Deep Learning
 Artificial Intelligence: Techniques that enable computers to mimic
human behavior
 Machine Learning: Ability to learn and improve from experience
without explicitly being programmed
 Typical ML pipeline: Extract features → optimize model → inference
 Deep Learning: Machine learning framework based on learning
multiple levels of representation/abstraction using multi-layer neural
networks. It shows impressive performance on many Artificial
Intelligence tasks.
 Typical DL pipeline: Optimize model → inference (the features/representations are learned by the network itself)
Advantages of Deep Learning
 Making the most of your data
 Models data
 Finds good “representations” for data
 Identifies “features”
 Such representations help with:
 Understanding model
 Discrimination and classification
 Compressing data/dimension reduction
 Identify patterns
 Find anomalies
 Predict/restore missing data
 Hierarchical
 Online/adaptive learning
 Lessens the need for a deep mathematical grasp
Why Deep Learning Took off
 Big Data
 Larger data sets
 Easier collection and storage
 Hardware Computing Power
 Graphics Processing Units (GPUs), Google TPU,
Intel Nervana, Movidius HW accelerators
 Open-source software
 Improved techniques, new models, Toolboxes
 Five decades of research in machine learning
 Resources and efforts from large corporations
 Better media coverage and success cases
Neural Computation
 Neural Computation is a general Machine Learning approach that
involves processing information in a similar manner to the networks of
neurons (i.e. Neural Networks) found in human/animal brains
 Artificial Neural Networks (ANNs) are networks of Artificial Neurons,
and hence constitute crude approximations to parts of real brains
 They are massively parallel, which makes them efficient, robust, fault
and noise tolerant
Artificial Neural Networks (ANNs)
 ANNs can learn from training data and generalize to new situations and have
high expressive power
 Neural networks have become one of the major thrust areas recently in various
AI tasks including pattern recognition, prediction, and analysis problems
 In many problems they have established the state of the art, often exceeding
previous benchmarks by large margins
 An ANN is usually characterized by
 The way the neurons are connected to each other
 The method used to determine the connection strengths (weights)
 The activation function
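As an aside, a minimal sketch of a single artificial neuron can make these three ingredients concrete. The code below is illustrative only: the function names, the example weights, and the choice of a sigmoid activation are assumptions, not something specified in these slides.

import numpy as np

def sigmoid(z):
    # A common choice of activation function.
    return 1.0 / (1.0 + np.exp(-z))

def artificial_neuron(x, w, b, activation=sigmoid):
    # One artificial neuron: a weighted sum of the inputs plus a bias,
    # passed through an activation function.
    return activation(np.dot(w, x) + b)

# Example: a neuron with two inputs.
x = np.array([0.5, -1.0])   # inputs
w = np.array([2.0, 1.0])    # connection strengths (weights)
b = 0.1                     # bias
print(artificial_neuron(x, w, b))   # a value between 0 and 1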
Brain vs Computer
 There are approximately 10 billion neurons in the human cortex,
compared with tens of thousands of processors in the most powerful
parallel computers
 Each biological neuron is connected to several thousands of other
neurons, similar to the connectivity in powerful parallel computers
 The typical operating speed of biological neurons is measured in
milliseconds (10⁻³ s), while a silicon chip can operate in nanoseconds
(10⁻⁹ s)
 The human brain is extremely energy efficient, using approximately
10⁻¹⁶ joules per operation per second, whereas the best computers
today use around 10⁻⁶ joules per operation per second
Brain vs Computer
 Tasks that are easy for brains are not easy for computers and vice versa
 Brains
 Recognizing faces
 Retrieving information based on partial descriptions
 Organizing information (the more information the better the brain operates)
 Computers
 Arithmetic
 Deductive logic
 Retrieving information based on arbitrary features
 Brains must operate very differently from conventional computers
Neural Network History
 Neural networks originally began as computational models
of the brain - or more generally, models of cognition
 The earliest model of cognition was associationism
 Mid 1800’s: The brain is a mass of interconnected neurons
 1873: The information is in the connections (Alexander Bain)
 The more recent model of the brain is connectionist
 Neurons connect to neurons
 The workings of the brain are encoded in these connections
 The machine has many non-linear processing units
 Current neural network models are connectionist machines
Modeling the Brain
 A neuron:

 Signals come in through the dendrites into the Soma


 A signal goes out via the axon to other neurons
 Only one axon per neuron
 Adult neurons do not undergo cell division
Perceptron
Simplified Mathematical Model
 The perceptron forms a weighted sum ∑ of its inputs x1, x2, …, xN using weights W1, W2, …, WN, and produces output Y
 Threshold logic: fire if the combined input exceeds the threshold T
 Provided a learning algorithm (d(x): desired output, Y: actual output):
 Update the weights whenever the perceptron output is wrong
 Convergence was proved for linearly separable classes
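A minimal Python sketch of this model and its learning rule follows, assuming a step (threshold) activation and using the OR function as a linearly separable training set; the variable names, learning rate, and epoch count are illustrative choices, not part of the original slides.

import numpy as np

def predict(x, w, T):
    # Threshold logic: fire (output 1) if the weighted sum exceeds T.
    return 1 if np.dot(w, x) > T else 0

def train_perceptron(X, d, T=0.5, lr=0.1, epochs=20):
    # Perceptron learning rule: adjust the weights only when the
    # actual output Y differs from the desired output d(x).
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            y = predict(x, w, T)
            w += lr * (target - y) * x   # no change when y == target
    return w

# OR is linearly separable, so the rule converges.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
d = np.array([0, 1, 1, 1])
w = train_perceptron(X, d)
print([predict(x, w, 0.5) for x in X])   # expected: [0, 1, 1, 1]

On a linearly separable problem such as OR the updates stop once every training example is classified correctly; on XOR they never would, which is the limitation addressed by multi-layer perceptrons later in these slides.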
Perceptron
 The perceptron can mimic primitive Boolean gates
 AND
 OR
(In the slide figure, inputs x1 and x2 feed a single threshold neuron whose output is y = x1 ˅ x2)
Perceptron
 The perceptron can mimic primitive Boolean gates
 NOT

 No solution for XOR gate!!
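To make this concrete, the sketch below hand-picks weights and thresholds for AND, OR, and NOT (the particular values are illustrative assumptions), and then brute-forces a small grid of weights and thresholds to suggest that no single threshold unit reproduces XOR, consistent with XOR not being linearly separable.

import itertools
import numpy as np

def unit(x, w, T):
    # Single threshold unit: fire if the weighted input sum exceeds T.
    return 1 if np.dot(w, x) > T else 0

# Hand-picked (illustrative) weights and thresholds for the primitive gates.
AND = lambda x1, x2: unit([x1, x2], [1, 1], 1.5)
OR  = lambda x1, x2: unit([x1, x2], [1, 1], 0.5)
NOT = lambda x1:     unit([x1],     [-1],  -0.5)

print([AND(*p) for p in itertools.product([0, 1], repeat=2)])  # [0, 0, 0, 1]
print([OR(*p)  for p in itertools.product([0, 1], repeat=2)])  # [0, 1, 1, 1]
print([NOT(0), NOT(1)])                                        # [1, 0]

# Brute-force search over a coarse grid of weights and thresholds:
# no single unit reproduces XOR, because XOR is not linearly separable.
xor_table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
grid = np.arange(-2, 2.25, 0.25)
found = any(
    all(unit(list(p), [w1, w2], T) == y for p, y in xor_table.items())
    for w1 in grid for w2 in grid for T in grid
)
print("single-unit XOR found:", found)   # expected: False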


Multi-Layer Perceptrons
 Multi-layer perceptrons can model arbitrarily complex Boolean
functions
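As a small worked example (an illustrative construction, not the only one), XOR, which no single perceptron can compute, needs only one hidden layer of threshold units, using the decomposition XOR(x1, x2) = OR(x1, x2) AND NOT(AND(x1, x2)):

import numpy as np

def unit(x, w, T):
    # Single threshold unit: fire if the weighted input sum exceeds T.
    return 1 if np.dot(w, x) > T else 0

def xor_mlp(x1, x2):
    # Two-layer perceptron for XOR:
    #   hidden unit h1 computes OR(x1, x2),
    #   hidden unit h2 computes AND(x1, x2),
    #   the output unit computes h1 AND (NOT h2).
    h1 = unit([x1, x2], [1, 1], 0.5)     # OR
    h2 = unit([x1, x2], [1, 1], 1.5)     # AND
    return unit([h1, h2], [1, -1], 0.5)  # h1 AND (NOT h2)

print([xor_mlp(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# expected: [0, 1, 1, 0]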
Applications of Deep Learning Networks
China’s Watchful Eye

Beijing bets on facial recognition in a big drive for total surveillance – Washington Post, 1/8/2018
Applications of Deep Learning Networks
Deep Art: Combining Content and Style from Different Images
 Coarse-scale Content from one
image, Fine-scale Style from
another image
 Dynamic Capacity Networks (DCNs) learn sophisticated multi-scale representations
Applications of Deep Learning Networks
Generative Adversarial Nets (GANs) for Natural Image Translation

(https://arxiv.org/pdf/1703.10593.pdf)
Applications of Deep Learning Networks

Google Translate App

• Translate between 103 languages by typing
• Offline: Translate 52 languages when you have no Internet
• Instant camera translation: Use your camera to translate text instantly in 30 languages
• Camera Mode: Take pictures of text for higher-quality translations in 37 languages
• Conversation Mode: Two-way instant speech translation in 32 languages
Applications of Deep Learning Networks
• 2013: DeepMind uses deep reinforcement learning to beat humans at some Atari games
• 2016: DeepMind's AlphaGo system beats Go grandmaster Lee Sedol 4-1
• 2017: AlphaZero learns to play Go and chess from scratch
Applications of Deep Learning Networks

Deep learning crucial for the global success of automotive autonomy – Automotive World, 6/26/2018
Acknowledgments
 Slides have been used from:
 https://www.cs.cmu.edu/~bhiksha/courses/deeplearning/Spring.2019/www/
 https://www.cs.colorado.edu/~mozer/Teaching/syllabi/DeepLearningFall2017/
 https://fleuret.org/ee559/
 https://canvas.northwestern.edu/courses/75723/assignments/syllabus
 http://slazebni.cs.illinois.edu/fall18/
 https://www.cs.bham.ac.uk/~jxb/inc.html
 http://elec576.rice.edu/schedule-and-syllabus/