Aids Viva

The document discusses AdaBoost and Random Forest, two popular machine learning algorithms. AdaBoost improves accuracy by combining weak learners and focusing on misclassified instances, while Random Forest uses multiple decision trees to enhance predictive accuracy. Both algorithms have advantages and limitations, with applications in various fields such as banking, medicine, and marketing.

GPT: https://chatgpt.com/share/671958ca-52a8-8007-b22f-c4aef4d8c555
Bayesian Network: https://www.geeksforgeeks.org/differences-between-bayesian-networks-and-neural-networks/

AdaBoost: AdaBoost (Adaptive Boosting) is a powerful ensemble learning algorithm used in supervised machine learning for classification tasks. It works by combining multiple weak learners, typically decision trees, to form a strong classifier. The key idea behind AdaBoost is to sequentially train weak classifiers, giving more importance (weight) to instances that were misclassified by the previous ones. This iterative process continues until the model reaches a specified number of weak classifiers or achieves a desirable level of accuracy. AdaBoost adjusts the weight of each classifier based on its performance, ensuring that future classifiers focus on harder-to-classify examples, making the final model more accurate and robust.
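
A minimal sketch of this idea, assuming scikit-learn is available and using an illustrative synthetic dataset (the dataset, split, and parameter values below are assumptions for demonstration, not part of the notes above):

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical toy data: 1000 samples, 20 features, binary labels.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each boosting round fits a depth-1 tree ("stump") to re-weighted data,
# so later stumps concentrate on the examples earlier stumps misclassified.
# Note: the weak-learner argument is "estimator" in recent scikit-learn
# versions (it was "base_estimator" in older releases).
clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # the weak learner
    n_estimators=50,      # number of boosting rounds
    learning_rate=1.0,    # shrinks each stump's contribution
    random_state=42,
)
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))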

Advantages of AdaBoost

AdaBoost offers several advantages that make it a popular choice in machine learning:

1. Improved Accuracy

AdaBoost can significantly improve the accuracy of weak learners, even when using simple models. By focusing on misclassified instances, it adapts to the difficult regions of the data distribution.

2. Resistance to Overfitting

AdaBoost tends to be less prone to overfitting compared to some other ensemble methods, thanks to its focus on misclassified instances.

Limita�ons and Challenges

While AdaBoost is an effec�ve algorithm, it is important to be aware of its barriers and challenges:

1. Sensi�vity to Noisy Data

AdaBoost may be sensi�ve to noisy facts and outliers because it offers greater weight to misclassified
�mes. Outliers can dominate the studying system and result in subop�mal consequences.

2. Computa�onally Intensive

Training AdaBoost may be computa�onally intensive, especially while using a massive wide variety of
suscep�ble learners. This could make it much less appropriate for real-�me applica�ons.

4. Model Selec�on

Selec�ng the proper vulnerable learner and tuning hyperparameters may be difficult, as the
Performance of AdaBoost is no�ceably dependent on these alterna�ves.
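
One common way to handle the model-selection problem is a cross-validated grid search. A hedged sketch, assuming scikit-learn and an illustrative synthetic dataset (the grid values are assumptions chosen for demonstration):

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Search over the number of boosting rounds and the learning rate,
# two hyperparameters that strongly affect AdaBoost's behaviour.
param_grid = {
    "n_estimators": [50, 100, 200],
    "learning_rate": [0.1, 0.5, 1.0],
}
search = GridSearchCV(AdaBoostClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)
print("Best parameters:", search.best_params_)
print("Best CV accuracy:", search.best_score_)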

Practical Applications

AdaBoost has found applications in a wide range of domains, including but not limited to:

1. Face Detection

AdaBoost has been used in computer vision for tasks like face detection, where it helps identify faces in images or videos.

2. Speech Recognition

In speech recognition, AdaBoost can be used to improve the accuracy of phoneme or word recognition systems.

Random Forest:
Random Forest is a popular machine learning algorithm that belongs to the supervised learning technique. It can be used for both Classification and Regression problems in ML. It is based on the concept of ensemble learning, which is the process of combining multiple classifiers to solve a complex problem and improve the performance of the model.

As the name suggests, Random Forest is a classifier that contains a number of decision trees trained on various subsets of the given dataset and combines their predictions to improve the predictive accuracy on that dataset. Instead of relying on a single decision tree, the random forest takes the prediction from each tree and predicts the final output based on the majority vote of those predictions.

A greater number of trees in the forest generally leads to higher accuracy and helps prevent the problem of overfitting.
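
A minimal sketch of Random Forest classification, assuming scikit-learn and the illustrative synthetic dataset below (dataset and parameter values are assumptions for demonstration):

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# n_estimators controls how many decision trees are built; each tree is
# trained on a bootstrap sample with a random subset of features considered
# at each split, and the forest predicts by majority vote over the trees.
forest = RandomForestClassifier(n_estimators=100, random_state=42)
forest.fit(X_train, y_train)
print("Test accuracy:", forest.score(X_test, y_test))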

Applications of Random Forest: There are mainly four sectors where Random Forest is most commonly used:

1. Banking: The banking sector mostly uses this algorithm for the identification of loan risk.

2. Medicine: With the help of this algorithm, disease trends and disease risks can be identified.

3. Land Use: We can identify areas of similar land use with this algorithm.

4. Marketing: Marketing trends can be identified using this algorithm.

Advantages: 1. Random Forest is capable of performing both Classification and Regression tasks.

2. It is capable of handling large datasets with high dimensionality.

3. It enhances the accuracy of the model and helps prevent the overfitting issue.

Disadvantages: 1. Although Random Forest can be used for both classification and regression tasks, it is not as well suited to Regression tasks.
