OOP (Python) in DevOps
Object-Oriented Programming (OOP) in Python
Object-Oriented Programming (OOP) is a way of organizing code around objects. These objects bundle data (attributes) with actions (methods). Python supports OOP, which makes code simpler to organize, easier to reuse, and more structured.
Basic Example
# Defining a class
class Car:
    def __init__(self, brand, model):  # Constructor
        self.brand = brand             # Attribute
        self.model = model
    def drive(self):                   # Method
        return f"{self.brand} {self.model} is driving."
# Creating an instance
my_car = Car("Toyota", "Camry")
# Accessing methods
print(my_car.drive())  # Output: Toyota Camry is driving.
1. Encapsulation
class BankAccount:
    def __init__(self, balance):
        self.__balance = balance  # Private attribute
    def get_balance(self):
        return self.__balance
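A short usage sketch (the starting balance of 100 is just an illustrative value): the double-underscore prefix name-mangles __balance, so outside code has to go through get_balance().
# Usage
account = BankAccount(100)
print(account.get_balance())  # Output: 100
# print(account.__balance)    # Would raise AttributeError: the attribute is hidden by name mangling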
2. Inheritance
class Vehicle:
    def __init__(self, brand):
        self.brand = brand
    def drive(self):
        return f"{self.brand} is moving."
class Car(Vehicle):  # Car inherits from Vehicle
    def honk(self):
        return "Beep Beep!"
# Creating an instance
my_car = Car("Honda")
print(my_car.drive())  # Output: Honda is moving.
print(my_car.honk())   # Output: Beep Beep!
3. Polymorphism
class Dog:
    def speak(self):
        return "Woof!"
class Cat:
    def speak(self):
        return "Meow!"
# Polymorphism in action
animals = [Dog(), Cat()]
for animal in animals:
    print(animal.speak())
# Output:
# Woof!
# Meow!
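In a DevOps script, the same idea lets different tools share a common interface. A minimal sketch with two illustrative deployer classes (these classes are not part of the examples above):
class DockerDeployer:
    def deploy(self):
        return "Deploying with Docker..."
class KubernetesDeployer:
    def deploy(self):
        return "Deploying with Kubernetes..."
for deployer in [DockerDeployer(), KubernetesDeployer()]:
    print(deployer.deploy())  # Same call, different behaviour per class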
OOP in DevOps
Object-Oriented Programming (OOP) is useful in DevOps for structuring
automation, improving code reusability, and making infrastructure management
more efficient. Here are real-world use cases where OOP is applied in DevOps:
1. Cloud Resource Management (AWS)
● Example: Using Python classes to manage cloud resources like AWS S3, EC2, and Lambda.
● Benefit: Encapsulation of cloud operations in reusable objects.
import boto3
class S3Bucket:
    def __init__(self, bucket_name):
        self.s3 = boto3.client("s3")
        self.bucket_name = bucket_name
    def create_bucket(self):
        self.s3.create_bucket(Bucket=self.bucket_name)
        return f"Bucket {self.bucket_name} created."
# Usage
bucket = S3Bucket("devops-bucket")
print(bucket.create_bucket())
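The bullets above also mention AWS Lambda. A minimal sketch of the same pattern for Lambda, assuming a function has already been deployed (the class name, function name, and payload below are illustrative, not part of the original example):
import json
import boto3
class LambdaManager:
    def __init__(self):
        self.client = boto3.client("lambda")
    def invoke(self, function_name, payload):
        response = self.client.invoke(FunctionName=function_name, Payload=json.dumps(payload))
        return response["Payload"].read()
# Usage
lambda_manager = LambdaManager()
print(lambda_manager.invoke("my-function", {"env": "dev"}))  # "my-function" is an illustrative name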
2. CI/CD Pipelines
import jenkins
class JenkinsPipeline:
    def __init__(self, url, username, password):
        self.server = jenkins.Jenkins(url, username=username, password=password)
    def trigger_build(self, job_name):
        self.server.build_job(job_name)
        return f"Build triggered for {job_name}."
# Usage
jenkins_pipeline = JenkinsPipeline("http://localhost:8080", "admin", "password")  # example URL and credentials
print(jenkins_pipeline.trigger_build("MyJob"))
3. Container Management (Docker)
import docker
class DockerManager:
    def __init__(self):
        self.client = docker.from_env()
    def start_container(self, container_name):
        container = self.client.containers.get(container_name)
        container.start()
        return f"Started container {container_name}."
# Usage
manager = DockerManager()
print(manager.start_container("nginx"))
4. Monitoring (Prometheus)
import requests
class PrometheusMonitor:
    def __init__(self, prometheus_url):
        self.url = prometheus_url
    def get_metric(self, query):
        response = requests.get(f"{self.url}/api/v1/query", params={"query": query})
        return response.json()
# Usage
monitor = PrometheusMonitor("http://localhost:9090")
print(monitor.get_metric("up"))
5. Configuration Management (Ansible)
import subprocess
class AnsibleManager:
    def __init__(self, playbook_path):
        self.playbook = playbook_path
    def run_playbook(self):
        result = subprocess.run(["ansible-playbook", self.playbook], capture_output=True, text=True)
        return result.stdout
# Usage
ansible = AnsibleManager("deploy.yml")
print(ansible.run_playbook())
Goal: Automate AWS EC2 instances, Docker containers, and Jenkins builds with reusable Python classes, then combine them into a single DevOps workflow.
First, we'll create an EC2Manager class to create, stop, and terminate EC2 instances.
import boto3
class EC2Manager:
    def __init__(self):
        self.ec2 = boto3.client("ec2")
    def create_instance(self, ami_id="ami-0abcdef1234567890", instance_type="t2.micro"):
        # The default AMI ID is a placeholder; replace it with a real one for your region
        response = self.ec2.run_instances(
            ImageId=ami_id,
            InstanceType=instance_type,
            MinCount=1,
            MaxCount=1
        )
        instance_id = response["Instances"][0]["InstanceId"]
        return f"Created instance {instance_id}."
    def stop_instance(self, instance_id):
        self.ec2.stop_instances(InstanceIds=[instance_id])
        return f"Stopped instance {instance_id}."
    def terminate_instance(self, instance_id):
        self.ec2.terminate_instances(InstanceIds=[instance_id])
        return f"Terminated instance {instance_id}."
# Usage Example
ec2_manager = EC2Manager()
print(ec2_manager.create_instance())
We'll create a DockerManager class to start, stop, and list Docker containers.
import docker
class DockerManager:
    def __init__(self):
        self.client = docker.from_env()
    def list_containers(self):
        return [container.name for container in self.client.containers.list()]
    def start_container(self, container_name):
        container = self.client.containers.get(container_name)
        container.start()
        return f"Started container {container_name}."
    def stop_container(self, container_name):
        container = self.client.containers.get(container_name)
        container.stop()
        return f"Stopped container {container_name}."
# Usage Example
docker_manager = DockerManager()
print(docker_manager.list_containers())
import jenkins
class JenkinsPipeline:
    def __init__(self, url, username, password):
        self.server = jenkins.Jenkins(url, username=username, password=password)
    def trigger_build(self, job_name):
        self.server.build_job(job_name)
        return f"Build triggered for {job_name}."
# Usage Example
jenkins_pipeline = JenkinsPipeline("http://localhost:8080", "admin", "password")  # example URL and credentials
print(jenkins_pipeline.trigger_build("MyJob"))
Combine all three classes in a single Python script to fully automate AWS,
Docker, and Jenkins in a DevOps workflow.
class DevOpsAutomation:
    def __init__(self):
        self.ec2_manager = EC2Manager()
        self.docker_manager = DockerManager()
        self.jenkins_pipeline = JenkinsPipeline("http://localhost:8080", "admin", "password")
    def deploy_infrastructure(self):
        ec2_instance = self.ec2_manager.create_instance()
        print(ec2_instance)
    def manage_containers(self):
        print(self.docker_manager.start_container("nginx"))
    def trigger_ci_cd(self):
        print(self.jenkins_pipeline.trigger_build("MyJob"))
# Usage Example
automation = DevOpsAutomation()
automation.deploy_infrastructure()
automation.manage_containers()
automation.trigger_ci_cd()
🔹 OOP Benefits:
✅ Encapsulation: Each service (AWS, Docker, Jenkins) is in its own class.
✅ Modularity: We can easily replace or extend functionalities.
✅ Reusability: The same code can be used across multiple projects.
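As a quick sketch of that modularity, an existing class can be extended without touching the original code (the restart_container method below is illustrative, not part of the classes above):
class ExtendedDockerManager(DockerManager):
    def restart_container(self, container_name):
        container = self.client.containers.get(container_name)
        container.restart()
        return f"Restarted container {container_name}."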
Real-World Use Cases