
UNIT 4

Link (MIT lectures): https://www.youtube.com/playlist?list=PLtBw6njQRU-rwp5__7C0oIVt26ZgjG9NI

Link (Code Basics): https://www.youtube.com/watch?v=zfiSAzpy9NM
Let the machine detect and learn features/patterns through hierarchical learning.
Example of Deep Learning
Deep learning
“Deep learning is a collection of statistical techniques of machine
learning for learning feature hierarchies that are actually based on
artificial neural networks.”

Deep-learning architectures such as deep neural networks, deep belief networks, deep reinforcement learning, recurrent neural networks, convolutional neural networks and Transformers have been applied to fields including computer vision, speech recognition, natural language processing, machine translation, bioinformatics, drug design, medical image analysis, climate science, material inspection and board game programs, where they have produced results comparable to, and in some cases surpassing, human expert performance.



Deep learning is a class of machine learning that uses numerous nonlinear processing units to perform feature extraction and transformation. The output of each layer is taken as input by the next layer.

Deep learning models are capable of identifying the relevant features themselves with little guidance from the programmer, and they are very helpful in dealing with the problem of dimensionality. Deep learning algorithms are used especially when we have a huge number of inputs and outputs.
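To make the layer-to-layer idea concrete, here is a minimal NumPy sketch (not from the notes): each layer is a nonlinear processing unit, and its output is fed as input to the next layer. The layer sizes, the ReLU nonlinearity and the random (untrained) weights are illustrative assumptions only.

import numpy as np

def relu(x):
    return np.maximum(0, x)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 8))           # one input sample with 8 features

layer_sizes = [8, 16, 16, 4]          # input -> two hidden layers -> output
activation = x
for fan_in, fan_out in zip(layer_sizes[:-1], layer_sizes[1:]):
    W = rng.normal(size=(fan_in, fan_out)) * 0.1   # untrained random weights
    b = np.zeros(fan_out)
    activation = relu(activation @ W + b)          # output becomes next layer's input

print(activation.shape)               # (1, 4): result after the final layer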

Deep learning is implemented with the help of neural networks, and the motivation behind neural networks is the biological neuron, which is nothing but a brain cell.



We provide the raw image data to the input layer. The input layer determines patterns of local contrast, i.e., it differentiates on the basis of colours, luminosity, etc. The first hidden layer then determines face features, i.e., it fixates on the eyes, nose, lips, etc., and matches those features against the correct face template. The second hidden layer then determines the actual face, after which the result is sent to the output layer.
Deep Computer Vision: CNN
https://www.youtube.com/watch?v=NmLK_WQBxB4&list=PLtBw6njQRU-rwp5__7C0oIVt26ZgjG9NI&index=3
TensorFlow
• TensorFlow is an open-source platform for machine learning
and a symbolic math library that is used for neural networks.
Keras
• Keras is an open-source neural network library that runs on top of CNTK, Theano or TensorFlow. It is designed to be fast and easy to use, and it can be used to construct any deep learning model of our choice.
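As a minimal sketch of how TensorFlow and Keras fit together, the snippet below defines and compiles a small fully connected network using the Keras API bundled with TensorFlow. The layer sizes and the 10-class softmax output are assumptions chosen for illustration (e.g. a digit classifier), not something specified in these notes.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),                   # e.g. a 28x28 grayscale image
    tf.keras.layers.Flatten(),                        # flatten to a 784-value vector
    tf.keras.layers.Dense(128, activation="relu"),    # hidden layer
    tf.keras.layers.Dense(10, activation="softmax"),  # class probabilities
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.summary()
# Training would then be a single call, e.g. model.fit(x_train, y_train, epochs=5)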
Softmax Activation Function
• Converts a vector of real-valued scores into probabilities that sum to 1.
• Typically used only in the output layer of a neural network.
• The class with the highest probability is taken as the predicted output.
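A small sketch of the idea: softmax turns raw scores (logits) into probabilities that sum to 1, and the largest probability marks the predicted class. The logit values here are made up purely for illustration.

import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))     # subtract the max for numerical stability
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
print(probs)                      # approx. [0.659 0.242 0.099]
print(probs.sum())                # 1.0
print(np.argmax(probs))           # 0: the class with the highest probability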
Filters are just Feature Detectors
Max pooling takes a feature map and keeps only the strongest activation in each pooling window.
Basically, this helps make feature detection position invariant.
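A minimal Keras sketch of this pairing: a convolution layer whose learned filters act as feature detectors, followed by a max-pooling layer that downsamples each feature map. The input size and the number of filters are illustrative assumptions.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),                            # e.g. one-channel images
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),        # 32 filters -> 32 feature maps
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),               # downsample each feature map
])

model.summary()
# Conv2D output: (None, 26, 26, 32); after max pooling: (None, 13, 13, 32).
# Pooling keeps the strongest response in each 2x2 window, which is what makes
# the detected features less sensitive to small shifts in position.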
Recurrent Neural Network (RNN)
• A Recurrent Neural Network (RNN) is a type of neural network where the output from the previous step is fed as input to the current step.

• The main and most important feature of an RNN is its hidden state, which remembers some information about a sequence.

• The hidden state is also referred to as the memory state, since it remembers previous inputs to the network. An RNN uses the same parameters for each input, as it performs the same task on all inputs or hidden layers to produce the output.

• This reduces the number of parameters, unlike other neural networks.
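A minimal Keras sketch of a recurrent layer: the same weights are reused at every time step, and a hidden state carries information across the sequence. The sequence length, feature size, number of units and the binary output are illustrative assumptions.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 4)),                     # 10 time steps, 4 features each
    tf.keras.layers.SimpleRNN(16),                     # hidden (memory) state of size 16
    tf.keras.layers.Dense(1, activation="sigmoid"),    # e.g. a binary prediction
])

model.summary()
# The SimpleRNN layer has (4 + 16 + 1) * 16 = 336 parameters regardless of the
# sequence length, because the same parameters are applied at every step.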


• Case Study
Link 1: Diabetic Retinopathy
https://www.coursera.org/lecture/machine-learning-duke/motivaion-diadetic-retinopathy-C183X

Link 2: Case Study: Smart Speaker
https://www.coursera.org/lecture/ai-for-everyone/case-study-smart-speaker-ahvm7

Link 3: Self-Driving Cars
https://neptune.ai/blog/self-driving-cars-with-convolutional-neural-networks-cnn

RNN:
https://www.youtube.com/watch?v=dqoEU9Ac3ek
