Document For SiteScrapper Automation

The document outlines the creation of a CI/CD pipeline for Site Scrapper Automation using AWS services such as ECR, ECS, and CloudFormation. It emphasizes automation for environment management and details the deployment process, including the use of CodePipeline for continuous integration and delivery. The pipeline is designed to trigger automatically upon updates to the GitHub repository, ensuring efficient deployment and updates.
Site Scrapper Automation

Statement of Confidentiality:
This document is prepared by Meyi Cloud LLP. All information contained in this
document and any accompanying reference documents or files is confidential and proprietary
to Site Scrapper Automation. This information is not to be disclosed, copied, distributed, or
made public without prior written approval from Meyi Cloud LLP.

Introduction:

Scope:
The scope of this project is to create a CI/CD pipeline that provisions ECR (Elastic Container
Registry) and ECS (Elastic Container Service) for SiteScrapper Automation using a CloudFormation
template. This ensures that the infrastructure is consistent across environments, reducing
the risk of errors and downtime.

The goal is to provide a highly available, scalable, reliable, and efficient solution for deploying
the required AWS services.

Automation and Environment Management:

• Our implementation focuses on automation, enabling seamless environment changes
between Development (Dev), Testing (Test), and Production (Prod) stages.
• We use AWS CloudFormation for infrastructure automation; as a native AWS service, it
integrates directly with the rest of the AWS ecosystem.

AWS CloudFormation Deployment:
• AWS CloudFormation is a service that allows you to model and provision AWS
infrastructure resources in a declarative manner.
• By defining infrastructure as code using CloudFormation templates, we ensure
consistency and repeatability across environments.
• We use AWS ECS (Elastic Container Service) on Fargate and ECR (Elastic Container
Registry) for containerized application deployment.
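
As a minimal illustration of the declarative approach, a CloudFormation template that defines an ECR repository as code might look like this (the resource and repository names are illustrative, not taken from the actual pipeline.yml):

```yaml
# Minimal CloudFormation sketch: an ECR repository defined as code.
# The repository name below is illustrative only.
AWSTemplateFormatVersion: '2010-09-09'
Description: Example ECR repository for SiteScrapper Automation
Resources:
  SiteScrapperRepository:
    Type: AWS::ECR::Repository
    Properties:
      RepositoryName: sitescrapper-example
Outputs:
  RepositoryUri:
    Value: !GetAtt SiteScrapperRepository.RepositoryUri
```

Deploying the same template to Dev, Test, and Prod accounts or regions yields identical resources, which is what makes the environment switching described above reliable.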
Architecture for Code Pipeline:

Service explanation:

ECS (Elastic Container Service):

• ECS is a highly scalable, high-performance container orchestration service that supports
Docker containers. It allows you to easily run, stop, and manage containers on a cluster
of EC2 instances or on AWS Fargate.
Fargate:
• AWS Fargate is a serverless compute engine for containers. It allows you to run
containers without managing the underlying infrastructure. With Fargate, you can focus
on deploying and managing your applications without worrying about provisioning or
scaling servers.

ECR (Elastic Container Registry):

• ECR is a fully managed Docker container registry service provided by AWS. It allows you
to store, manage, and deploy Docker container images securely. You can easily push and
pull container images to and from ECR using the Docker CLI or AWS CLI.
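
A typical push sequence looks like the sketch below (the account ID is a placeholder and the repository name is assumed; with the default DRY_RUN=1 the commands are only printed, not executed):

```shell
#!/bin/sh
# Sketch of a typical ECR push sequence. ACCOUNT_ID is a placeholder and
# REPO is an assumed name; set DRY_RUN=0 to actually run the commands.
REGION="me-central-1"
ACCOUNT_ID="123456789012"        # placeholder account ID
REPO="sitescrapper"              # assumed repository name
REGISTRY="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com"

# In a real run, authenticate to the registry first:
#   aws ecr get-login-password --region "$REGION" \
#     | docker login --username AWS --password-stdin "$REGISTRY"

run() {
  # Print each command in dry-run mode; execute it otherwise.
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run docker build -t "${REPO}" .
run docker tag "${REPO}:latest" "${REGISTRY}/${REPO}:latest"
run docker push "${REGISTRY}/${REPO}:latest"
```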

CodePipeline:
• CodePipeline is a continuous integration and continuous delivery (CI/CD) service
provided by AWS. It automates the build, test, and deployment phases of your release
process. You can create custom pipelines to orchestrate the release workflow, integrate
with various AWS services and third-party tools, and automate the software release
process.

Source:

• In this architecture, "Source" refers to the repository where the application code is
stored. This can be a Git hosting service such as GitHub or Bitbucket, or AWS
CodeCommit. CodePipeline integrates with these repositories to pull the source code for
further processing in the pipeline.

Build:

• The "Build" stage is the process of compiling, testing, and packaging the source code
into deployable artifacts. This may involve build tools such as AWS CodeBuild or Jenkins
performing tasks such as compiling code, running unit tests, and generating deployable
artifacts (e.g., Docker images, binaries).

Flow explanation:
Deployment Process:
• Run the deploy.sh script to deploy a SAM (Serverless Application Model) application.
• Deployment uses a CloudFormation template (pipeline.yml) to create or update a
CloudFormation stack.
• The stack is named mio-moda-magento-<stage>-pipeline-stack, where <stage> is 'dev',
'staging', or 'prod'.
• Deployment targets the AWS region me-central-1.
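
Based on the steps above, deploy.sh plausibly looks something like the following sketch (the exact script is not reproduced here, and the Stage parameter name is an assumption; with the default DRY_RUN=1 the command is printed rather than run):

```shell
#!/bin/sh
# Hypothetical sketch of deploy.sh based on the description above.
# The Stage parameter name is an assumption; set DRY_RUN=0 to deploy.
STAGE="${1:-dev}"                # dev, staging, or prod
REGION="me-central-1"
STACK_NAME="mio-moda-magento-${STAGE}-pipeline-stack"

CMD="sam deploy --template-file pipeline.yml \
  --stack-name ${STACK_NAME} \
  --region ${REGION} \
  --capabilities CAPABILITY_IAM \
  --parameter-overrides Stage=${STAGE}"

if [ "${DRY_RUN:-1}" = "1" ]; then
  echo "+ ${CMD}"
else
  ${CMD}
fi
```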
Configuration Creation:
From pipeline.yml, the following components are created:
• S3 (an Amazon S3 bucket used to store the source code obtained from GitHub)
• CodePipeline
• Amazon ECR (Elastic Container Registry)

CodePipeline:

Source Stage:
The pipeline uses a GitHub repository as its source; this stage pulls the latest code from the
specified repository.

• GitHub owner: ludxb
• GitHub repo: py-ubaid-sitescrapper
• GitHub branch: development
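
In a CloudFormation pipeline template, this source stage corresponds roughly to the fragment below (only the owner, repo, and branch come from the details above; the remaining property values are assumptions, and authentication details such as OAuthToken are omitted):

```yaml
# Sketch of a CodePipeline source stage for the GitHub repository above,
# using the ThirdParty GitHub (version 1) action type.
- Name: Source
  Actions:
    - Name: GitHubSource
      ActionTypeId:
        Category: Source
        Owner: ThirdParty
        Provider: GitHub
        Version: '1'
      Configuration:
        Owner: ludxb
        Repo: py-ubaid-sitescrapper
        Branch: development
        PollForSourceChanges: false
      OutputArtifacts:
        - Name: SourceOutput
```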

Build Stage:
This stage uses a devops_buildspec.yml file to define the build and deploy commands.
• pre_build: Log in to Amazon ECR in preparation for the Docker image build.
• build: Build and tag the Docker image.
• post_build: Push the Docker image to Amazon ECR and deploy a SAM stack with the
specified parameters.
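
A buildspec following those three phases might be structured like this (the exact commands in the real devops_buildspec.yml are not reproduced; ECR_REGISTRY, IMAGE_URI, and the stack name are placeholder environment variables and values assumed for illustration):

```yaml
# Sketch of a CodeBuild buildspec with the three phases described above.
# ECR_REGISTRY, IMAGE_URI, and the stack name are illustrative placeholders
# expected to be supplied as CodeBuild environment variables.
version: 0.2
phases:
  pre_build:
    commands:
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $ECR_REGISTRY
  build:
    commands:
      - docker build -t $IMAGE_URI:latest .
  post_build:
    commands:
      - docker push $IMAGE_URI:latest
      - sam deploy --template-file template.yml --stack-name sitescrapper-ecs-stack --capabilities CAPABILITY_IAM --no-fail-on-empty-changeset
```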

template.yml:
This CloudFormation template sets up an ECS (Elastic Container Service) environment on
AWS. The following components are created from template.yml:

• Task Definition
• ECS Cluster
• ECS Service

Task Definition: Specifies the containerized application's requirements, including the image
location in ECR, resource allocation (CPU, memory), and networking details.
ECS Cluster: Sets up the ECS cluster where containers will be deployed and managed.
ECS Service: Defines the ECS service, configuring it to use the specified task definition and
launch type (Fargate). Networking details such as subnets and security groups are also specified.
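
The three resources above correspond to a CloudFormation structure roughly like the following sketch (all names, CPU/memory sizes, image URI, and network IDs are placeholders, not the values from the real template.yml):

```yaml
# Sketch of the template.yml structure described above; all names, sizes,
# the image URI, and network IDs are placeholders. A task execution role
# is also required for Fargate to pull from ECR (omitted here).
Resources:
  TaskDefinition:
    Type: AWS::ECS::TaskDefinition
    Properties:
      Family: sitescrapper-task
      RequiresCompatibilities: [FARGATE]
      NetworkMode: awsvpc
      Cpu: '256'
      Memory: '512'
      ContainerDefinitions:
        - Name: sitescrapper
          Image: 123456789012.dkr.ecr.me-central-1.amazonaws.com/sitescrapper:latest
  Cluster:
    Type: AWS::ECS::Cluster
    Properties:
      ClusterName: sitescrapper-cluster
  Service:
    Type: AWS::ECS::Service
    Properties:
      Cluster: !Ref Cluster
      TaskDefinition: !Ref TaskDefinition
      LaunchType: FARGATE
      DesiredCount: 1
      NetworkConfiguration:
        AwsvpcConfiguration:
          Subnets: [subnet-placeholder]
          SecurityGroups: [sg-placeholder]
```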

Result:
Once deploy.sh has been executed, the pipeline is triggered automatically whenever updates
or changes are pushed to the designated GitHub repository and branch. This ensures seamless
integration and deployment, allowing for efficient and timely updates to the system.
