Python Algorithms
Building Logistic Regression Algorithm from Scratch in Python
Table of Contents
1. Introduction to Logistic Regression
2. Understanding Logistic Regression
3. Mathematical Formulation
4. The Structure of a Logistic Regression Model
5. Implementing Logistic Regression from Scratch in Python
a. Data Preparation
b. Sigmoid Function
c. Cost Function and Gradient
d. Gradient Descent Algorithm
e. Prediction
f. Logistic Regression Model
g. Visualizing the Results
h. Diagram: Logistic Regression Decision Boundary
6. Conclusion
3. Mathematical Formulation
For a given set of features X and labels y:
1. Hypothesis Function: The hypothesis of logistic regression is defined using the sigmoid function:
$h_w(x) = \sigma(w^T x) = \frac{1}{1 + e^{-w^T x}}$
where $w$ is the weight vector (including the intercept term).
2. Cost Function: The cost function for logistic regression is derived from maximum likelihood estimation:
$J(w) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_w(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_w(x^{(i)})\right) \right]$
where $m$ is the number of training examples.
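3. Gradient: Differentiating the cost with respect to the weights gives the gradient that gradient descent follows:
$\nabla_w J(w) = \frac{1}{m} X^T \left( \sigma(Xw) - y \right)$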
5. Implementation in Python
Let's implement a simple Logistic Regression Algorithm in Python.
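Before Step 4 below, the model needs training data, the sigmoid function, and a routine that returns the cost and its gradient. A minimal sketch of these pieces (Steps 1 through 3) is shown here, assuming a small synthetic two-feature dataset; the exact bodies of sigmoid and compute_cost_and_gradient are written only to match how the later steps call them.

import numpy as np
import matplotlib.pyplot as plt

# Step 1: Data Preparation
# A small synthetic, roughly linearly separable dataset (an assumption made
# here so the remaining steps can be run end to end).
np.random.seed(42)
num_samples = 100
X_pos = np.random.randn(num_samples // 2, 2) + np.array([2, 2])
X_neg = np.random.randn(num_samples // 2, 2) + np.array([-2, -2])
X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(num_samples // 2), np.zeros(num_samples // 2)])
# Prepend a column of ones so the first weight acts as the intercept
# (the plotting code later reads the features from X[:, 1] and X[:, 2]).
X = np.hstack([np.ones((num_samples, 1)), X])

# Step 2: Sigmoid Function
def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Step 3: Cost Function and Gradient (vectorized binary cross-entropy)
def compute_cost_and_gradient(X, y, weights):
    m = len(y)
    predictions = sigmoid(np.dot(X, weights))
    cost = -np.mean(y * np.log(predictions) + (1 - y) * np.log(1 - predictions))
    gradient = np.dot(X.T, predictions - y) / m
    return cost, gradient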
Step 4: Gradient Descent
The gradient descent algorithm iteratively updates the weights to minimize the cost function.
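At each iteration the weights are moved a small step against the gradient of the cost, $w \leftarrow w - \alpha \, \nabla_w J(w)$, where $\alpha$ is the learning rate.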
def gradient_descent(X, y, weights, learning_rate, num_iterations):
    cost_history = []
    for i in range(num_iterations):
        # Evaluate the current cost and its gradient, then take one step.
        cost, gradient = compute_cost_and_gradient(X, y, weights)
        weights -= learning_rate * gradient
        cost_history.append(cost)
    # Return the trained weights and the per-iteration costs for later plotting.
    return weights, cost_history
Step 5: Prediction
The prediction function uses the trained weights to predict the class labels for new data points.
def predict(X, weights, threshold=0.5):
    # Classify as 1 when the predicted probability meets the threshold.
    probabilities = sigmoid(np.dot(X, weights))
    return (probabilities >= threshold).astype(int)
Step 6: Logistic Regression Model
With the helper functions in place, initialize the weights, choose the hyperparameters, and train the model with gradient descent.
# Set hyperparameters
learning_rate = 0.1
num_iterations = 1000
# Initialize the weights (one per column of X, including the intercept)
weights = np.zeros(X.shape[1])
# Train the model
trained_weights, cost_history = gradient_descent(X, y, weights, learning_rate, num_iterations)
# Make predictions on the training data
predictions = predict(X, trained_weights)
# Calculate accuracy
accuracy = np.mean(predictions == y)
print(f'Accuracy: {accuracy * 100:.2f}%')
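As a quick check, the trained model can also classify a single new observation. The point below is hypothetical and follows the convention from the data-preparation sketch above, where the first column is the intercept term.

x_new = np.array([[1.0, 1.5, -0.5]])  # hypothetical point: intercept, feature 1, feature 2
print(predict(x_new, trained_weights))  # prints the predicted class label, [0] or [1]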
Step 7: Visualizing the Results
Plotting the cost over the iterations shows whether gradient descent converged.
plt.plot(cost_history)
plt.xlabel('Iteration')
plt.ylabel('Cost')
plt.title('Cost Function over Iterations')
plt.show()
Step 8: Diagram: Logistic Regression Decision Boundary
To understand the model's decision boundary, let's plot it along with the data points.
def plot_decision_boundary(X, y, weights):
    # Scatter the two features, colored by class label.
    plt.scatter(X[:, 1], X[:, 2], c=y, cmap='bwr', edgecolors='k')
    x1_min, x1_max = X[:, 1].min(), X[:, 1].max()
    x2_min, x2_max = X[:, 2].min(), X[:, 2].max()
    # The boundary is the line where weights[0] + weights[1]*x1 + weights[2]*x2 = 0.
    x1_vals = np.array([x1_min, x1_max])
    plt.plot(x1_vals, -(weights[0] + weights[1] * x1_vals) / weights[2], 'k--')
    plt.xlim(x1_min, x1_max)
    plt.ylim(x2_min, x2_max)
    plt.xlabel('Feature 1')
    plt.ylabel('Feature 2')
    plt.title('Logistic Regression Decision Boundary')
    plt.show()

plot_decision_boundary(X, y, trained_weights)
6. Conclusion
In this article, we built a logistic regression algorithm from scratch in Python. We started with the fundamental
concepts, implemented the sigmoid function, cost function, gradient descent, and prediction function, and finally
visualized the results. This step-by-step approach helps in understanding the inner workings of logistic regression and
provides a solid foundation for more advanced machine learning algorithms.