Algorithm Report

XGBoost:

XGBoost (Extreme Gradient Boosting) is a popular machine learning algorithm that is used for
supervised learning problems, such as regression and classification. It is an ensemble method that
combines the predictions of multiple weak learners to make a strong prediction.

XGBoost works by creating a sequence of decision trees, where each new tree tries to correct the
errors of the ensemble built so far. Rather than reweighting misclassified samples (as AdaBoost
does), each new tree is fit to the gradient of the loss with respect to the current predictions, so it
concentrates on the examples the ensemble currently gets most wrong.

XGBoost uses a gradient boosting framework, which means that it optimizes an objective function by
iteratively adding new trees to the model. The objective function measures the difference between
the predicted values and the actual values, and the goal is to minimize this difference.
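As a toy illustration of this loop, here is a pure-Python sketch of gradient boosting for squared-error regression, using one-split "stumps" in place of full trees for brevity. This is not XGBoost's actual implementation (it omits regularization, second-order gradients, and much else), just the core fit-the-residuals idea:

```python
# Minimal gradient boosting for squared-error regression.
# Each round fits a "stump" (a single split) to the residuals of the
# current ensemble, then adds it scaled by a small learning rate.

def fit_stump(xs, residuals):
    """Find the split threshold that minimizes squared error on the residuals."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def gradient_boost(xs, ys, n_rounds=50, lr=0.3):
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(n_rounds):
        # Residuals are the negative gradient of squared error.
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

xs = [1, 2, 3, 4, 5, 6]
ys = [1.0, 1.2, 0.9, 7.8, 8.1, 8.0]
model = gradient_boost(xs, ys)
```

After 50 rounds the ensemble's training error is small, even though each individual stump is a very weak learner.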

In order to prevent overfitting, XGBoost applies regularization techniques such as L1 and L2
penalties on the leaf weights, and it also prunes tree branches whose contribution to the objective
does not outweigh the complexity penalty.
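For reference, the regularized objective XGBoost minimizes can be written (in the notation of the XGBoost paper) as:

```latex
\mathcal{L} = \sum_{i} l(\hat{y}_i, y_i) + \sum_{k} \Omega(f_k),
\qquad
\Omega(f) = \gamma T + \tfrac{1}{2}\lambda \lVert w \rVert^2
```

where l is the training loss, f_k is the k-th tree, T is its number of leaves (penalized by gamma, which is what drives pruning), and w is the vector of its leaf weights (the L2 term; an L1 penalty on w can be added as well).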

XGBoost has become popular in machine learning competitions due to its ability to handle large
datasets and its high prediction accuracy. It is also easy to use and exposes a number of
hyperparameters that can be tuned to improve its performance on different types of data.

Random Forest:

Random Forest is a popular ensemble learning algorithm used for classification, regression, and
other machine learning tasks. It is a combination of decision trees, where each decision tree is
trained on a different subset of the input data.

Random Forest works by building many decision trees, each trained on a bootstrap sample of the
training data (sampling with replacement). The trees are trained independently, and at every split
node a tree considers only a random subset of the features, which decorrelates the trees and helps
to reduce the variance of the combined model.

When making a prediction, the Random Forest algorithm combines the predictions of all the decision
trees. For classification tasks, the algorithm takes the majority vote of the predicted classes, while
for regression tasks, it takes the average of the predicted values.
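A pure-Python sketch of this bagging-plus-voting scheme for classification, again using single-split stumps as the "trees" and picking one random feature per stump to mimic feature subsampling (a toy illustration, not a production Random Forest):

```python
import random

# Toy random forest: each "tree" is a stump trained on a bootstrap
# sample, splitting on one randomly chosen feature. Prediction is
# the majority vote across all stumps.

def fit_stump(rows):
    feat = random.randrange(len(rows[0][0]))  # random feature choice
    best = None
    for x, _ in rows:
        t = x[feat]
        left = [y for xx, y in rows if xx[feat] <= t]
        right = [y for xx, y in rows if xx[feat] > t]
        lmaj = max(set(left), key=left.count) if left else 0
        rmaj = max(set(right), key=right.count) if right else lmaj
        errs = (sum(y != lmaj for xx, y in rows if xx[feat] <= t)
                + sum(y != rmaj for xx, y in rows if xx[feat] > t))
        if best is None or errs < best[0]:
            best = (errs, t, lmaj, rmaj)
    _, t, lmaj, rmaj = best
    return lambda x: lmaj if x[feat] <= t else rmaj

def random_forest(rows, n_trees=25):
    trees = []
    for _ in range(n_trees):
        boot = [random.choice(rows) for _ in rows]  # bootstrap sample
        trees.append(fit_stump(boot))
    def predict(x):
        votes = [t(x) for t in trees]
        return max(set(votes), key=votes.count)     # majority vote
    return predict

random.seed(0)
data = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.3, 0.3), 0),
        ((0.8, 0.9), 1), ((0.9, 0.8), 1), ((0.7, 0.9), 1)]
predict = random_forest(data)
```

Individual stumps trained on different bootstrap samples can disagree, but the majority vote smooths out their errors.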

Random Forest is effective at handling large datasets and high-dimensional input spaces. It is
relatively robust to outliers, some implementations can handle missing data, and it is less prone to
overfitting than a single decision tree. Additionally, Random Forest can provide estimates of feature
importance, which can help in understanding the relationships between the input features and the
output.

The main hyperparameters of Random Forest include the number of decision trees, the maximum
depth of each tree, and the number of features considered at each split. These hyperparameters can
be tuned to optimize the performance of the model on a given task.

Mobilenet:

MobileNet is a popular deep learning algorithm used for computer vision tasks such as image
classification and object detection. It is designed to be computationally efficient, making it well-
suited for running on mobile devices and other embedded systems with limited computing
resources.

MobileNet works by using a series of depthwise separable convolution layers, which separate the
spatial and channel-wise operations in a convolution layer. This reduces the number of
computations required and reduces the overall model size, while still maintaining accuracy.

Depthwise separable convolutions consist of two separate layers: a depthwise convolution and a
pointwise convolution. The depthwise convolution applies a single filter to each input channel, while
the pointwise convolution applies a 1x1 convolution to combine the results of the depthwise
convolution across all channels. This process is repeated for multiple layers, and the output of each
layer is passed through a non-linear activation function such as ReLU.
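The efficiency gain from this factorization is easy to quantify by counting parameters. The sketch below compares a standard convolution against the depthwise-plus-pointwise pair described above (filter counts only; biases omitted):

```python
# Parameter counts for a standard convolution versus the depthwise
# separable factorization used by MobileNet (depthwise + 1x1 pointwise).

def standard_conv_params(k, c_in, c_out):
    # One k x k filter spanning all input channels, per output channel.
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    depthwise = k * k * c_in   # one k x k filter per input channel
    pointwise = c_in * c_out   # 1x1 conv mixing channels
    return depthwise + pointwise

k, c_in, c_out = 3, 256, 256
std = standard_conv_params(k, c_in, c_out)   # 589,824 parameters
sep = separable_conv_params(k, c_in, c_out)  # 67,840 parameters
reduction = std / sep                        # roughly 8.7x fewer
```

In general the reduction factor is about 1/c_out + 1/k², so for 3x3 filters the separable form needs close to 9x fewer parameters (and multiply-accumulates) once the channel count is large.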

MobileNet also uses global average pooling, which averages the activations across each channel of
the final feature map, replacing large fully connected layers and reducing the number of
parameters. Later versions of the architecture (MobileNetV2 onward) add inverted residual blocks
with skip connections, which reuse activations from earlier layers in the model.
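Global average pooling itself is a one-liner: each channel's H x W activation map collapses to a single number. A minimal sketch on a tiny 2-channel, 2x2 feature map:

```python
# Global average pooling: collapse each channel's H x W activation map
# to one value by averaging, leaving a single number per channel.

def global_average_pool(feature_map):
    # feature_map: list of channels, each a 2D list (H rows x W cols)
    return [sum(sum(row) for row in ch) / (len(ch) * len(ch[0]))
            for ch in feature_map]

fmap = [
    [[1.0, 3.0], [5.0, 7.0]],   # channel 0 -> mean 4.0
    [[2.0, 2.0], [2.0, 2.0]],   # channel 1 -> mean 2.0
]
pooled = global_average_pool(fmap)  # [4.0, 2.0]
```

The pooled vector (one value per channel) then feeds the final classifier, regardless of the input's spatial resolution.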

MobileNet achieves competitive accuracy on a variety of image classification benchmarks while
using significantly fewer parameters and far less computation than larger deep learning models.
This makes it well-suited for applications that require real-time inference on devices with limited
resources, such as mobile phones and IoT devices.
