Assignment4 Supritha

Uploaded by suprithamaleyur7

CAN_33717714

Building, Training, Containerizing, and Deploying a Machine Learning Model
Using Flask, Docker Images, and Kubernetes Services

This project demonstrates the end-to-end process of building, training, and deploying a
machine learning model within a robust DevOps environment. The workflow includes the
following steps:

1. Building and Training the Machine Learning Model

• A simple machine learning model is created using Python and libraries like scikit-learn.
• The model is trained with sample data, serialized into a model.pkl file, and saved for
deployment.

2. Creating a Flask API for Model Serving

• A Flask application is developed to serve the trained model, providing endpoints for
making predictions.
• The API processes input data, loads the serialized model, and returns predictions in real
time.

3. Containerizing the Application Using Docker

• A Docker image is created for the Flask application, bundling the app code,
dependencies, and the trained model.
• Docker commands are used to build, test, and run the container locally to ensure
functionality.

4. Deploying the Containerized Application Using Kubernetes

• Minikube is used to simulate a Kubernetes cluster locally.


• The Docker image is deployed as a Kubernetes Pod, with a Service created to expose
the application.
• kubectl commands manage the deployment, scaling, and health of the Pods.

Supritha mr Maharaja Institute Of Technology Thandavapura



Step 1: Create the ML Model

1. Directory structure:

ml-app/
├── app/
│   ├── model.py
│   ├── app.py
│   └── requirements.txt
├── Dockerfile
├── kubernetes/
│   ├── deployment.yaml
│   └── service.yaml
└── README.md


2. ML Model Code (model.py):

import pickle
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Train and save a model
def train_model():
    try:
        # Load dataset
        data = load_iris()
        X, y = data.data, data.target

        # Train a simple model
        model = RandomForestClassifier()
        model.fit(X, y)

        # Save the model to a file
        with open("model.pkl", "wb") as file:
            pickle.dump(model, file)
        print("Model created and saved as 'model.pkl'")
    except Exception as e:
        print(f"Error: {e}")

if __name__ == "__main__":
    train_model()
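As a quick sanity check (a sketch, not part of the assignment files), the pickling step can be verified by round-tripping a trained model in memory and confirming that serialization preserves its predictions:

```python
import pickle

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Train a model the same way model.py does (fixed seed for repeatability)
data = load_iris()
model = RandomForestClassifier(random_state=0)
model.fit(data.data, data.target)

# Round-trip through pickle, as model.py does with model.pkl on disk
blob = pickle.dumps(model)
restored = pickle.loads(blob)

# The restored model must give identical predictions to the original
assert list(restored.predict(data.data[:5])) == list(model.predict(data.data[:5]))
print("pickle round-trip OK")
```

The same check can be run against the model.pkl file itself by loading it with pickle.load before deployment.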

3. Flask App Code (app.py):

import pickle
from flask import Flask, request, jsonify

app = Flask(__name__)

# Load the trained model
with open("model.pkl", "rb") as file:
    model = pickle.load(file)

@app.route("/")
def home():
    return "Welcome to the ML App!"

@app.route("/predict", methods=["POST"])
def predict():
    data = request.json
    prediction = model.predict([data["features"]])
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
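The /predict route can be exercised without starting a server by using Flask's built-in test client. The sketch below trains a throwaway model in place of loading model.pkl so it runs standalone; the feature values are illustrative iris measurements, not part of the assignment:

```python
from flask import Flask, request, jsonify
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Throwaway model standing in for model.pkl
data = load_iris()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.json
    prediction = model.predict([payload["features"]])
    return jsonify({"prediction": prediction.tolist()})

# Exercise the route in-process, no network needed
client = app.test_client()
resp = client.post("/predict", json={"features": [5.1, 3.5, 1.4, 0.2]})
print(resp.get_json())
```

The response is a JSON object of the form {"prediction": [<class index>]}, matching what the deployed endpoint returns.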


4. Requirements File (requirements.txt):

Flask==2.3.3
scikit-learn==1.3.2

5. Train the Model:

Run the model.py script to generate the model.pkl file:

python app/model.py

Step 2: Create a Docker Image

1. Dockerfile:

FROM python:3.9-slim

WORKDIR /app

# Copy the application and the model file into the Docker image
COPY app/ /app/
COPY model.pkl /app/

RUN pip install -r requirements.txt

EXPOSE 5000

CMD ["python", "app.py"]


2. Build and Push the Docker Image:

docker build -t ml-app .

docker tag ml-app:latest abhishekkn/ml-app:latest


docker push abhishekkn/ml-app:latest


Step 3: Verify Image Availability


Ensure Kubernetes can access the image:

docker images

1. Build the Docker image (if not already built in Step 2):

docker build -t ml-app .

2. Test the Docker container locally:

docker run -p 5000:5000 ml-app
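With the container running, the endpoint can be exercised from another terminal. A sketch of a small Python client, assuming the container from the command above is listening on localhost:5000 (the feature values are illustrative):

```python
import json
import urllib.request
import urllib.error

# Build the request body: one iris sample (illustrative values)
body = json.dumps({"features": [5.1, 3.5, 1.4, 0.2]}).encode()
req = urllib.request.Request(
    "http://localhost:5000/predict",
    data=body,
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print(json.load(resp))  # JSON object of the form {"prediction": [...]}
except (urllib.error.URLError, OSError) as e:
    print(f"Container not reachable: {e}")
```

An equivalent one-liner with curl: curl -X POST http://localhost:5000/predict -H "Content-Type: application/json" -d '{"features": [5.1, 3.5, 1.4, 0.2]}'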


Step 4: Deploy Using Kubernetes

1. Start Minikube:

minikube start

2. Deployment Manifest (deployment.yaml):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: ml-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ml-app
  template:
    metadata:
      labels:
        app: ml-app
    spec:
      containers:
      - name: ml-app
        image: abhishekkn/ml-app:latest
        ports:
        - containerPort: 5000


3. Service Manifest (service.yaml):

apiVersion: v1
kind: Service
metadata:
  name: ml-app-service
spec:
  selector:
    app: ml-app
  ports:
  - protocol: TCP
    port: 80
    targetPort: 5000
  type: NodePort

4. Apply the Kubernetes Manifests:

kubectl apply -f kubernetes/deployment.yaml
kubectl apply -f kubernetes/service.yaml

Check the Deployment and Service

1. Verify the status of the Deployment, Pods, and Service:

kubectl get deployments
kubectl get pods
kubectl get svc


2. Access the service: Get the service URL using Minikube:

minikube service ml-app-service

**************
