IBM Assignment 4
______________________________________________________________________
ml-app/
├── app/
│   ├── model.py
│   ├── app.py
│   └── requirements.txt
├── Dockerfile
└── kubernetes/
    ├── deployment.yaml
    └── service.yaml
1. Write the following Python script (app/model.py) to train and save the model:
import pickle
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

def train_model():
    data = load_iris()
    X, y = data.data, data.target
    model = RandomForestClassifier()
    model.fit(X, y)
    with open("model.pkl", "wb") as file:
        pickle.dump(model, file)
    print("Model saved as 'model.pkl'")

if __name__ == "__main__":
    train_model()
Train the Model
python app/model.py
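Before containerizing, it is worth confirming that the saved model round-trips cleanly through pickle. A minimal sanity check (a sketch that retrains inline so it is self-contained; the sample values are the first iris row, used only for illustration):

```python
# Sketch: train, pickle, reload, and check predictions match.
import pickle
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

# Save and restore the model exactly as model.py does
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)

sample = [[5.1, 3.5, 1.4, 0.2]]  # illustrative feature vector
assert (restored.predict(sample) == model.predict(sample)).all()
print(restored.predict(sample))
```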
2. Write the Flask API (app.py)
import pickle
from flask import Flask, request, jsonify

app = Flask(__name__)

with open("model.pkl", "rb") as file:
    model = pickle.load(file)

@app.route("/")
def home():
    return "Welcome to ML App by samiuddin!"

@app.route("/predict", methods=["POST"])
def predict():
    data = request.json
    prediction = model.predict([data["features"]])
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
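The /predict route can be exercised without starting a server by using Flask's built-in test client. A self-contained sketch (it trains a model inline rather than loading model.pkl, and the feature values are illustrative):

```python
# Sketch: call the /predict route in-process via Flask's test client.
from flask import Flask, request, jsonify
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.json
    prediction = model.predict([payload["features"]])
    return jsonify({"prediction": prediction.tolist()})

# Exercise the route without binding a port
client = app.test_client()
resp = client.post("/predict", json={"features": [5.1, 3.5, 1.4, 0.2]})
print(resp.status_code, resp.get_json())
```

This also documents the request shape the API expects: a JSON body with a "features" list of four numbers.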
Add Dependencies in requirements.txt
Flask==2.3.3
scikit-learn==1.3.2
3. Write the Dockerfile
FROM python:3.9-slim
WORKDIR /app
COPY app/ /app/
COPY model.pkl /app/
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "app.py"]
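The image can also be built and run from the command line instead of Docker Desktop. A sketch, assuming you run it from the project root and that the tag ml-app is an arbitrary local name:

```shell
# Build the image from the directory containing the Dockerfile
docker build -t ml-app .

# Run it, mapping host port 5000 to the container's port 5000
docker run -d -p 5000:5000 --name ml-app ml-app

# Confirm the container is up
docker ps --filter name=ml-app
```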
4. Build and Run the Docker Image
1. In Docker Desktop, click Run on the image you built.
2. Click Optional settings.
3. Set the host port to 5000 and click Run; the container status is then shown in Docker Desktop.
4. Open a browser and go to 127.0.0.1:5000. The containerized application is now running in Docker.
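With the container listening on 127.0.0.1:5000, the prediction endpoint can be exercised from a terminal (the feature values are illustrative):

```shell
curl -X POST http://127.0.0.1:5000/predict \
  -H "Content-Type: application/json" \
  -d '{"features": [5.1, 3.5, 1.4, 0.2]}'
```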
5. Deploy the Application Using Kubernetes
1. Start Minikube:
minikube start
2. Verify the cluster is running:
kubectl cluster-info
Create Kubernetes Manifests
1. Write deployment.yaml:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ml-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ml-app
  template:
    metadata:
      labels:
        app: ml-app
    spec:
      containers:
      - name: ml-app
        image: lazzyxbug/ml-app:latest
        ports:
        - containerPort: 5000
2. Write service.yaml:
apiVersion: v1
kind: Service
metadata:
  name: ml-app-service
spec:
  selector:
    app: ml-app
  ports:
  - protocol: TCP
    port: 80
    targetPort: 5000
  type: NodePort
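The manifests above can then be applied and the service reached through Minikube. A sketch, assuming both files live under kubernetes/ as in the project layout:

```shell
# Apply the Deployment and Service
kubectl apply -f kubernetes/deployment.yaml
kubectl apply -f kubernetes/service.yaml

# Wait for the pod to reach Running
kubectl get pods -l app=ml-app

# Minikube prints a URL for the NodePort service
minikube service ml-app-service --url
```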