AAM Ans
Deep learning models take a long time to train from scratch on complex
tasks, which is inefficient. For example, training a model for an NLP task
can take days or even weeks. With transfer learning, however, we can
largely avoid this training cost.
Reusing a pretrained deep learning model on a new task is known as transfer
learning. Utilizing a pretrained model's knowledge (features, weights, etc.) to
train a fresh model lets you overcome issues such as having less data for
the new task.
The primary idea behind transfer learning in machine learning is to use what
has been learned in one task to enhance generalization in another. As a
transfer learning example, if you initially trained a model to classify cat
images, you could reuse that model's knowledge to recognize other images,
such as dogs. The weights that a network has learned on "task A" are
applied to a new "task B."
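The idea above can be sketched in miniature. This is an illustrative toy, not a real deep learning pipeline: "task A" and "task B" here are two related, hypothetical binary classification problems, and the model is a plain logistic regression whose learned weights from task A are reused as the starting point for task B.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(shift):
    # toy task: label a point by which side of a (shifted) line it is on
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > shift).astype(float)
    return X, y

def train(X, y, w=None, epochs=50, lr=0.1):
    # plain logistic regression trained by gradient descent
    if w is None:
        w = np.zeros(X.shape[1])          # training "from scratch"
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))      # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)  # gradient step
    return w

def accuracy(X, y, w):
    return np.mean(((X @ w) > 0) == y)

# Task A: plenty of data, trained thoroughly.
Xa, ya = make_task(shift=0.0)
w_a = train(Xa, ya, epochs=200)

# Task B: related task, but only 20 labeled examples available.
Xb, yb = make_task(shift=0.2)
Xb, yb = Xb[:20], yb[:20]

w_scratch = train(Xb, yb, epochs=5)                  # cold start
w_transfer = train(Xb, yb, w=w_a.copy(), epochs=5)   # warm start from task A

print("scratch:", accuracy(Xb, yb, w_scratch))
print("transfer:", accuracy(Xb, yb, w_transfer))
```

With only a few training steps on the small task-B dataset, the warm-started model already benefits from what was learned on task A, which is the essence of reusing weights across tasks.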
If as many data items as possible are placed in the training set when
constructing a model/classifier, the classifier's error rate and error estimate
will be very low and its accuracy high. This is the sign of a good
classifier/model.
Example –
A student ‘ABC’ is coached by a teacher. The teacher teaches her all possible
topics that might appear in the exam. Hence, she makes very few mistakes in
the exam and performs well.
If more training data are used to construct a classifier, it will perform well
on any data drawn from the test set used to evaluate it.
Example –
A student ‘ABC’ is coached by a teacher. The teacher teaches her some topics
that might appear in the exam. If student ‘ABC’ is then given a number of
exams on the basis of this coaching, her weak and strong points can be
determined accurately.
If more test data are used to evaluate the constructed classifier, its error
rate, error estimate, and accuracy can be determined accurately.
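The train/test idea above can be sketched as follows. This is a minimal illustration, not from the notes: the data, the 80/20 split ratio, and the trivial threshold classifier are all assumptions chosen to keep the example self-contained.

```python
import random

random.seed(1)

# toy labeled data: a number is labeled True if it is greater than 50
data = [(x, x > 50) for x in range(100)]
random.shuffle(data)

# place most items in the training set, hold the rest out for testing
split = int(0.8 * len(data))          # assumed 80/20 split
train_set, test_set = data[:split], data[split:]

# "train" a trivial classifier: threshold at the midpoint between classes
pos = [x for x, label in train_set if label]
neg = [x for x, label in train_set if not label]
threshold = (min(pos) + max(neg)) / 2

def classify(x):
    return x > threshold

# estimate the error rate ONLY on held-out test items
errors = sum(classify(x) != label for x, label in test_set)
error_rate = errors / len(test_set)
print(f"error rate on test set: {error_rate:.2f}")
```

A larger training set gives the classifier more to learn from, while a larger held-out test set gives a more reliable estimate of its error rate, mirroring the two points made above.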