Chapter #5 - Deep Learning

Fahad Bin Sultan University

College of Computing
MSC 536 - Selected Topic in Artificial Intelligence
CEN 526 - Artificial Intelligence

Spring 2022-2023
Deep Learning

Dr. Mohammad Hujooj



Deep learning
Deep learning is part of a broader family of machine learning methods based on artificial neural networks (ANNs) with representation learning. Learning can be supervised, semi-supervised, or unsupervised.

Deep-learning architectures such as deep neural networks,
deep belief networks, deep reinforcement learning,
recurrent neural networks, convolutional neural networks
and transformers have been applied to fields including
computer vision, speech recognition, natural language
processing, machine translation, bioinformatics, drug
design, medical image analysis, climate science, material
inspection and board game programs, where they have
produced results comparable to and in some cases
surpassing human expert performance.
Deep learning (continued)

Artificial neural networks (ANNs) were inspired by information processing and distributed communication nodes in biological systems. ANNs have various differences from biological brains. Specifically, artificial neural networks tend to be static and symbolic, while the biological brain of most living organisms is dynamic (plastic) and analog.



Deep learning (continued)

The adjective "deep" in deep learning refers to the use of multiple layers in the network. Early work showed that a linear perceptron cannot be a universal classifier, but that a network with a nonpolynomial activation function and one hidden layer of unbounded width can. Deep learning is a modern variation that is concerned with an unbounded number of layers of bounded size, which permits practical application and optimized implementation while retaining theoretical universality under mild conditions. In deep learning, the layers are also permitted to be heterogeneous and to deviate widely from biologically informed connectionist models, for the sake of efficiency, trainability, and understandability.
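To make the width-versus-depth point concrete, here is a minimal sketch, not from the slides, written in PyTorch with arbitrary sizes: a single hidden layer with a nonpolynomial activation such as ReLU, made wide enough, can fit a nonlinear target that a purely linear model cannot.

```python
# Illustrative sketch (not from the slides): one very wide hidden layer
# with a nonpolynomial activation (ReLU) fitting the nonlinear target sin(x).
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.linspace(-3.14, 3.14, 256).unsqueeze(1)   # inputs
y = torch.sin(x)                                     # nonlinear target function

model = nn.Sequential(
    nn.Linear(1, 256),   # one wide hidden layer
    nn.ReLU(),           # nonpolynomial activation
    nn.Linear(256, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final MSE: {loss.item():.5f}")  # typically very small after training
```

A deep network reaches similar expressiveness by stacking many layers of bounded width instead of relying on one unboundedly wide layer.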
Definition

Deep learning is a class of machine learning algorithms that uses multiple layers to progressively extract higher-level features from the raw input. For example, in image processing, lower layers may identify edges, while higher layers may identify concepts relevant to a human, such as digits, letters, or faces.
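As a rough illustration of this layering, and not part of the original slides (the layer counts and sizes here are arbitrary), a stack of convolutional layers in PyTorch shows how each layer builds on the output of the one before it:

```python
# Illustrative sketch: each successive layer works on the previous layer's
# output, so later layers can represent higher-level features
# (edge-like patterns -> simple shapes -> more abstract parts).
import torch
import torch.nn as nn

feature_extractor = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),    # lower layer: edge-like features
    nn.ReLU(),
    nn.Conv2d(8, 16, kernel_size=3, padding=1),   # middle layer: simple shapes
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher layer: more abstract parts
    nn.ReLU(),
)

x = torch.randn(1, 1, 28, 28)   # one grayscale image, e.g., a handwritten digit
features = feature_extractor(x)
print(features.shape)           # torch.Size([1, 32, 28, 28])
```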

Viewed from another angle, deep learning refers to computer-simulating or automating the human learning process from a source (e.g., an image of dogs) to a learned object (dogs). On this view, notions of "deeper" and "deepest" learning also make sense. The deepest learning refers to fully automatic learning from a source to a final learned object. Deeper learning refers to a mixed learning process: a human learning process from a source to a learned semi-object, followed by a computer learning process from that semi-object to the final learned object.
Deep Learning vs. Machine Learning vs. AI: An In-Depth Guide

Deep Learning vs. Machine Learning vs. AI: An In-Depth Guide (continued)

▪ Self-driving cars
▪ Touched-up selfies
▪ Netflix recommendations
▪ Chatbots that write like people
▪ Virtual assistants that talk like friends
▪ Every last one of the billions of daily Google searches…



Deep Learning vs. Machine Learning vs. AI: An In-Depth Guide (continued)

Artificial intelligence (AI) is all around you, and it's only getting more pervasive. But if you start looking into how it works, you'll immediately run into a few questions about the concepts surrounding it: What is deep learning vs. machine learning? And how do artificial intelligence, machine learning, and deep learning relate to one another?



Deep Learning vs. Machine Learning vs. AI: An In-Depth Guide (continued)

There are simple answers to these questions as well as complex ones, and then there's the math behind it all, which is so complicated that it's probably best to set aside for now. At any level of understanding, these answers are important, particularly for brands and marketers. After all, AI increasingly governs interactions with customers, as illustrated by the rise of voice AI technologies.



Deep Learning vs. Machine Learning vs. AI: An In-Depth Guide (continued)
At ReadSpeaker, we use deep learning to create advanced synthetic speech that gives voice to consumer touchpoints like voicebots, smart home devices, AI assistants, and conversational AI platforms of all descriptions. It's the perfect illustration of a practical use for deep learning. But first, if you want to understand machine learning, AI, and deep learning, start with a few key definitions.



What do artificial intelligence, machine learning, and deep learning mean?

Wikipedia's definition of artificial intelligence is broadly accepted. According to the site, "AI is intelligence demonstrated by machines." Wikipedia contrasts this with the "natural intelligence displayed by humans and animals, which involves consciousness and emotionality." As for intelligence itself, that's simply an ability to obtain information and use it adaptively.




• Machine learning is "the study of computer algorithms that improve automatically through experience," says computer scientist Tom Mitchell, who literally wrote the book on machine learning.




• Deep learning is a form of machine learning conducted (for the most part) through deep neural networks (DNNs). There are other ways to perform deep learning, but DNNs are far and away the most common in practical use today.



With these definitions in mind, we can ask our main question: What's the relationship between machine learning, deep learning, and AI? Deep learning is a subset of machine learning, and machine learning is a subset of AI. However, deep learning has become so dominant in AI circles that, when someone mentions AI, it's extremely likely that they're also talking about deep learning; you can assume that any discussion of AI is also a discussion of deep learning (and therefore machine learning, too).



What is deep learning?

In order to understand deep learning, you need to know about deep neural networks. There are other machine learning models that achieve what we call "deep learning," but neural networks have eclipsed all the rest to the extent that you can safely assume any mention of deep learning is based on the neural network model; so much so that an effective (if not scientifically accurate) definition of deep learning could be "machine learning through deep neural network architecture."




The neural network computation model was inspired by the connections of neurons within the human brain, but that's only a rough analogy. When you get down to the details, human brains and neural networks are extremely different. Still, the metaphor is helpful for understanding the broad strokes of neural networks: neural networks mimic the way the human brain works to process information.



What is deep learning? (continued)

Our brains learn by establishing repeatable patterns of electrochemical connections between neurons. A neural network does something similar. The neural networks widely used in the 1990s consist of three layers of networked processors, or artificial neurons: the input layer, a hidden layer, and an output layer. Each neuron receives input data, performs an operation on that data, and exports the result of the operation as an input to the next processing layer.
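A minimal sketch of such a three-layer network, not taken from the slides and with arbitrary layer sizes, can be written in a few lines of PyTorch:

```python
# Illustrative sketch: a classic three-layer network -- input layer,
# one hidden layer, and an output layer (sizes are arbitrary).
import torch
import torch.nn as nn

mlp = nn.Sequential(
    nn.Linear(784, 128),  # input layer -> hidden layer (e.g., 28x28 flattened pixels)
    nn.ReLU(),            # each hidden neuron applies a nonlinear operation
    nn.Linear(128, 10),   # hidden layer -> output layer (e.g., 10 classes)
)

x = torch.randn(32, 784)   # a batch of 32 flattened inputs
logits = mlp(x)            # each layer's output feeds the next layer
print(logits.shape)        # torch.Size([32, 10])
```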



What is deep learning? (continued)

These neural networks are also called multilayer perceptrons (MLPs), and they efficiently solved some of the most vexing problems of their era. But with the increasing availability of larger datasets, and long strands of research breaking into practice, neural networks evolved alongside the dawning era of Big Data.



Abstraction of local features gives DNN its power.
Deep neural networks process abstract representations of features, which are more robust than local representations. A DNN extracts multiple abstract representations as data passes through its multiple layers: the more layers, the more abstract representations the system can extract.
For example, say our goal is to train a neural network to classify images of dogs and cats accurately. If we feed millions of images into the neural network, the local features of these images may include:
▪ Shapes of eyes
▪ Shapes of ears
▪ Fur color
▪ Fur pattern
Abstraction of local features gives DNN its power (continued)
See the trouble? These local features are themselves highly variable; there's no single color of fur that can differentiate a cat from a dog. You can't make a one-to-one connection between a single local feature and the designation "cat." Instead, we need to study abstract features: not just certain eye shapes, but the whole placement and appearance of eyes within a face. That helps the system categorize images as "cat" or "dog" based on abstract relationships between local representations.
Abstraction of local features gives DNN its power (continued)

Multiple hidden layers allow a DNN to learn more of these abstract features that add up to more informed classifications of cats and dogs. Programmers may not even know which features the DNN is extracting; we just know that when we pass enormous volumes of data (labeled images of cats and dogs) through the network, it maps a processing path toward more accurate categorization of new images.
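As an illustrative sketch only (none of this comes from the slides; the architecture, the sizes, and the data loader are assumptions), a small convolutional classifier and training loop for the cat/dog example might look like this:

```python
# Illustrative sketch: a small convolutional network for the cat/dog example.
# The hidden layers learn their own features; the programmer never specifies
# "eye shape" or "fur color" explicitly.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # earlier layers: local features
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # deeper layers: more abstract features
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 2),                   # two classes: cat, dog (assumes 64x64 inputs)
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_one_epoch(loader):
    # `loader` yields (images, labels) batches of labeled cat/dog photos;
    # it stands in for the "enormous volumes of data" mentioned above.
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# A stand-in for a real DataLoader: random 64x64 RGB images with random
# 0/1 labels, purely so the sketch runs end to end.
fake_loader = [(torch.randn(8, 3, 64, 64), torch.randint(0, 2, (8,))) for _ in range(4)]
train_one_epoch(fake_loader)
```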



Abstraction of local features gives DNN its power (continued)

To sum up, deep neural networks provide deep learning by passing data through multiple hidden processing layers to conduct more complex processes. It's a powerful technology, so powerful, theoretically, that we have to ask another question: Why are we only beginning to use this computing architecture now, decades after it was theorized?



How did deep learning go from theory to practical application?
There are three reasons deep learning took so long to go from theory to everyday use.

1. Hardware. When deep neural networks first emerged, we didn't have hardware that was efficient enough to train DNNs. That changed with the development of modern graphics processing units (GPUs), the development of which was hastened along by video game consoles, not the needs of academics. The GPU on a $200 video game console can be ideal hardware for training complex DNN models, much better than even an advanced central processing unit, or CPU.
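For illustration only (not from the slides), moving a model and its data onto a GPU is typically a one-line change per object in a framework like PyTorch:

```python
# Illustrative sketch: run on a GPU when one is available, otherwise the CPU.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(784, 10).to(device)    # move the model's weights to the chosen device
x = torch.randn(32, 784, device=device)  # keep the data on the same device
y = model(x)                             # the forward pass runs on that device
print(device, y.shape)
```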
How did deep learning go from theory to practical application? (continued)

2. Big data. Large datasets weren't as available in the early days of neural networks. To train a complex DNN model in a stable way, you need extraordinary amounts of data. In the early days of AI, that data wasn't available. Now there are easily accessible platforms that allow researchers to collect billions and billions of data points every single day.



How did deep learning go from theory to practical application? (continued)

3. Deep learning algorithms. Most importantly, scientists hadn't yet devised the algorithms we use to keep highly complex DNN models stable during training. Early algorithms led to unstable models, which weren't dependable enough for practical use. Thanks to the work of enterprising scientists, today's deep learning algorithms can train stable DNN models to achieve consistent results.



What's next for deep learning and artificial intelligence in Text-to-Speech (TTS) applications?

Deep neural networks don’t just play into the best
contemporary vocoders; they’re also introducing a whole
new approach to speech synthesis. Researchers at
Google, ReadSpeaker, and other tech companies are
working on end-to-end TTS models.

What does that mean?



What's next for deep learning and artificial intelligence in Text-to-Speech (TTS) applications? (continued)

Today's TTS systems generate speech in multiple steps; they start with linguistic pre-processing, create an acoustic model, and only then transmit predictive data to the vocoder. Each of these steps may require tweaking by computational linguists; in other words, most neural TTS still requires human intervention. The human knowledge embedded in these linguistic pipelines opens the door for bias and errors.
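A hedged sketch of that multi-step structure follows; every function here is a toy stand-in invented for illustration, not a real TTS component or library call, and the stage bodies are trivial placeholders.

```python
# Hypothetical sketch of a conventional multi-step TTS pipeline.
# The three stages mirror the text; the bodies are toy stubs.

def preprocess_text(text: str) -> list[str]:
    # Linguistic pre-processing: in a real system, text normalization,
    # phoneme prediction, and prosody annotation; here just a token list.
    return text.lower().split()

def acoustic_model(features: list[str]) -> list[float]:
    # Acoustic model: in a real system, a neural network predicting acoustic
    # features (e.g., mel spectrogram frames); here one dummy value per token.
    return [float(len(token)) for token in features]

def vocoder(acoustic_features: list[float]) -> bytes:
    # Vocoder: in a real system, turns acoustic features into a waveform;
    # here just a placeholder byte string.
    return bytes(int(v) % 256 for v in acoustic_features)

def text_to_speech(text: str) -> bytes:
    # Each stage hands its output to the next; each is a separate point
    # where human tuning (and therefore bias or error) can enter.
    return vocoder(acoustic_model(preprocess_text(text)))

print(text_to_speech("Hello from a toy TTS pipeline"))
```

An end-to-end model, by contrast, would learn a single mapping from characters to audio, collapsing these hand-tuned stages.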



What's next for deep learning and artificial intelligence in Text-to-Speech (TTS) applications? (continued)

An end-to-end TTS model seeks to minimize human intervention by predicting accurate pronunciation directly from characters. Google's Tacotron is an experimental end-to-end TTS model that, while not stable enough for commercial use yet, is proving the possibility of such a system. It could be that end-to-end TTS is the next step in the evolution of synthetic speech.



Other research areas (many of which we're pursuing in the ReadSpeaker VoiceLab) include:
▪ Creating more efficient neural TTS models that limit costs.
▪ Developing a more compact, robust TTS model to bring DNN-powered synthetic speech to smaller devices and low-resource computing environments.
▪ TTS that's controllable for emotional speaking styles.
▪ Multilingual TTS, so one voice model can speak multiple languages.
▪ Adjustable speaking styles within a single TTS model.
▪ Expressing the hidden, underlying aspects of speech that even linguists haven't mapped out yet.



Assignment (Moodle)

Suggest a system in your work that includes the use of deep learning, and explain how this system can be applied. You may use algorithms, pseudocode, graphs, etc.



THE END
