Document For SiteScrapper Automation
Statement of Confidentiality:
This document is prepared by Meyi Cloud LLP. All the information contained in this document and any accompanying reference documents or files is confidential and proprietary to SiteScrapper Automation. This information is not to be disclosed, copied, distributed, or made public without prior written approval from Meyi Cloud LLP.
Introduction:
Scope:
The scope of this project is to create a CI/CD pipeline that deploys Amazon ECR (Elastic Container Registry) and Amazon ECS (Elastic Container Service) for SiteScrapper Automation using a CloudFormation template. This will ensure that the infrastructure is consistent across environments, reducing the risk of errors and downtime.
The goal is to provide a highly available, scalable, reliable, and efficient solution for deploying
different AWS services.
Service explanation:
ECR:
Amazon ECR is a fully managed Docker container registry service provided by AWS. It allows you to store, manage, and deploy Docker container images securely. You can push and pull container images to and from ECR using the Docker CLI or the AWS CLI.
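For reference, an ECR repository of this kind can be declared in CloudFormation in a few lines. The snippet below is a minimal, illustrative sketch only; the repository name and the scan-on-push setting are assumptions rather than the project's actual configuration.

Resources:
  SiteScrapperRepository:
    Type: AWS::ECR::Repository
    Properties:
      RepositoryName: sitescrapper-automation   # placeholder name, not the project's actual repository
      ImageScanningConfiguration:
        ScanOnPush: true                        # assumption: scan images on push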
CodePipeline:
CodePipeline is a continuous integration and continuous delivery (CI/CD) service
provided by AWS. It automates the build, test, and deployment phases of your release
process. You can create custom pipelines to orchestrate the release workflow, integrate
with various AWS services and third-party tools, and automate the software release
process.
Source:
In this architecture, "Source" refers to the source code repository where the application code is stored. This could be a Git-based hosting service such as GitHub or Bitbucket, or AWS CodeCommit. CodePipeline integrates with these repositories to pull the source code for further processing in the pipeline.
Build:
The "Build" stage in your pipeline refers to the process of compiling, testing, and
packaging your source code into deployable artifacts. This may involve using build tools
such as AWS CodeBuild, Jenkins, or other build automation tools to perform tasks such
as compiling code, running unit tests, and generating deployable artifacts (e.g., Docker
images, binaries).
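To make the Source and Build stages concrete, the sketch below shows roughly how they might be wired together as a CodePipeline resource in a pipeline.yml-style CloudFormation template. It is illustrative only: the IAM role, artifact bucket, CodeStar connection, repository identifier, branch, and CodeBuild project name are all placeholders, and the actual project may use a different source integration.

Resources:
  Pipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !GetAtt PipelineRole.Arn        # IAM role assumed to be defined elsewhere in the template
      ArtifactStore:
        Type: S3
        Location: !Ref ArtifactBucket          # S3 artifact bucket assumed to be defined elsewhere
      Stages:
        - Name: Source
          Actions:
            - Name: GitHubSource
              ActionTypeId:
                Category: Source
                Owner: AWS
                Provider: CodeStarSourceConnection
                Version: "1"
              Configuration:
                ConnectionArn: !Ref GitHubConnectionArn          # parameter assumed for this sketch
                FullRepositoryId: "example-org/sitescrapper"     # placeholder repository
                BranchName: main                                 # placeholder branch
              OutputArtifacts:
                - Name: SourceOutput
        - Name: Build
          Actions:
            - Name: DockerBuild
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: "1"
              Configuration:
                ProjectName: !Ref BuildProject                   # CodeBuild project assumed to be defined elsewhere
              InputArtifacts:
                - Name: SourceOutput
              OutputArtifacts:
                - Name: BuildOutput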
Flow explanation:
Deployment Process:
Run the deploy.sh file to deploy a SAM (Serverless Application Model) application.
Deployment utilizes a CloudFormation template (pipeline.yml) to create or update a
CloudFormation stack.
The stack is named mio-moda-magento-<stage>-pipeline-stack, where the stage is either 'dev', 'staging', or 'prod' (e.g., mio-moda-magento-dev-pipeline-stack).
Deployment is targeted to the AWS region me-central-1.
Configuration Creation:
From pipeline.yml, the following components are created:
Amazon S3 (used to store the source code obtained from GitHub)
CodePipeline
Amazon ECR (Elastic Container Registry)
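The S3 bucket typically acts as the pipeline's artifact store, holding the source code pulled from GitHub between stages. A minimal sketch of how pipeline.yml might declare it is shown below (this is the ArtifactBucket referenced in the pipeline sketch earlier); the versioning and public-access settings are assumptions, not the project's actual configuration.

Resources:
  ArtifactBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled                  # assumption: keep versions of pipeline artifacts
      PublicAccessBlockConfiguration:    # assumption: block all public access to artifacts
        BlockPublicAcls: true
        BlockPublicPolicy: true
        IgnorePublicAcls: true
        RestrictPublicBuckets: true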
CodePipeline:
Source Stage:
The pipeline will use a GitHub repository as the source for the code. The Source stage retrieves the latest code from the specified repository.
Build Stage:
This stage will use a devops_buildspec.yml file to define the build and deploy commands; a sketch of such a buildspec follows this list.
pre_build: logs in to Amazon ECR and prepares for the Docker image build.
build: builds the Docker image and tags it.
post_build: pushes the Docker image to Amazon ECR and deploys a SAM stack with the specified parameters.
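A hedged sketch of what such a devops_buildspec.yml might look like is shown below. The account ID, registry endpoint, repository name, image tag, stack name, and SAM deploy arguments are placeholders, not the project's actual values.

version: 0.2
phases:
  pre_build:
    commands:
      # Log in to Amazon ECR so Docker can push images to the registry
      - aws ecr get-login-password --region me-central-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.me-central-1.amazonaws.com
  build:
    commands:
      # Build the Docker image and tag it for the ECR repository
      - docker build -t sitescrapper-automation .
      - docker tag sitescrapper-automation:latest 123456789012.dkr.ecr.me-central-1.amazonaws.com/sitescrapper-automation:latest
  post_build:
    commands:
      # Push the image to ECR, then deploy the SAM stack defined in template.yml
      - docker push 123456789012.dkr.ecr.me-central-1.amazonaws.com/sitescrapper-automation:latest
      - sam deploy --template-file template.yml --stack-name sitescrapper-automation-dev --region me-central-1 --capabilities CAPABILITY_IAM --no-fail-on-empty-changeset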
template.yml:
This CloudFormation template sets up an ECS (Elastic Container Service) environment on AWS. The following components are created from template.yml (a sketch follows the list):
Task Definition
ECS Cluster
ECS Service
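The sketch below shows the general shape these ECS resources might take in template.yml. It is illustrative only: the resource names, the Fargate launch type, CPU/memory sizes, the container image URI, and the network settings are assumptions, not the project's actual configuration.

Resources:
  Cluster:
    Type: AWS::ECS::Cluster
    Properties:
      ClusterName: sitescrapper-automation-cluster     # placeholder name

  TaskDefinition:
    Type: AWS::ECS::TaskDefinition
    Properties:
      Family: sitescrapper-automation
      RequiresCompatibilities:
        - FARGATE                                      # assumption: Fargate launch type
      NetworkMode: awsvpc
      Cpu: "256"
      Memory: "512"
      ExecutionRoleArn: !Ref TaskExecutionRoleArn      # parameter assumed for this sketch
      ContainerDefinitions:
        - Name: sitescrapper
          Image: 123456789012.dkr.ecr.me-central-1.amazonaws.com/sitescrapper-automation:latest   # placeholder image URI
          Essential: true

  Service:
    Type: AWS::ECS::Service
    Properties:
      Cluster: !Ref Cluster
      TaskDefinition: !Ref TaskDefinition
      DesiredCount: 1
      LaunchType: FARGATE
      NetworkConfiguration:
        AwsvpcConfiguration:
          Subnets:
            - subnet-0123456789abcdef0                 # placeholder subnet ID
          AssignPublicIp: ENABLED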
Result:
Once the deploy.sh file has been executed, the pipeline is automatically triggered whenever updates or changes are pushed to the designated GitHub repository and branch. This ensures a seamless integration and deployment process, allowing for efficient and timely updates to the system.