DevOps - Unit - 4

The document discusses Continuous Delivery (CD) and Continuous Deployment (CD), emphasizing their importance in automating software release processes to enhance efficiency and reduce errors. It introduces containerization with Docker, detailing its components, architecture, and the process of creating and managing Docker images and containers. The document also covers the benefits of using containers and Docker for application deployment, including improved developer productivity and faster delivery of updates.


UNIT-IV

Continuous Delivery
• Importance of Continuous Delivery
• Continuous Deployment: CD Flow
• Containerization with Docker: Introduction to Docker
• Docker Installation
• Docker commands
• Images & Containers
• Dockerfile
• Running containers
• Working with containers and publishing to Docker Hub
Importance of Continuous Delivery
• Continuously delivering value has become a mandatory
requirement for organizations.
• To deliver value to your end users, you must release
continually and without errors.
• Continuous delivery (CD) is the process of automating
build, test, configuration, and deployment from a build to
a production environment.
• A release pipeline can create multiple testing or staging
environments to automate infrastructure creation and
deploy new builds.
• Successive environments support progressively longer-
running integration, load, and user acceptance testing
activities.
Continuous Delivery Approach
Before Continuous Delivery
• Before Continuous Delivery, software release cycles were a
bottleneck for application and operations teams.
• These teams often relied on manual handoffs that resulted in issues
during release cycles.
• Manual processes led to unreliable releases that produced delays
and errors.
After Continuous Delivery
• CD is a lean practice, with the goal to keep production fresh with
the fastest path from new code or component availability to
deployment.
• Automation minimizes the time to deploy and time to mitigate
(TTM) or time to remediate (TTR) production incidents.
• In lean terms, CD optimizes process time and eliminates idle time.
What is Continuous Deployment (CD)?
• Continuous Deployment (CD) is the continuation of
Continuous Integration.
• Once the tests have been validated in the dev
environment, they must be deployed to production.
Continuous deployment, therefore, consists of automating
deployment actions that were previously performed
manually.
• This is why we often talk about CI/CD together.
• To automate deployment actions, tests on the qualification
environment need to be automated to ensure that the new
functionality to be pushed works properly.
Continuous Delivery vs Continuous Deployment
• With continuous delivery, every code change is built, tested, and
then pushed to a non-production testing or staging environment.

• The difference between continuous delivery and continuous
deployment is the presence of a manual approval before updating
production.
• With continuous deployment, deployment to production happens
automatically, without explicit approval.
Continuous Deployment Benefits

• Automate the Software Release Process: Continuous deployment
lets your team automatically build, test, and prepare code changes
for release to production so that your software delivery is more
efficient and rapid.
• Improve Developer Productivity: These practices help your team
be more productive by freeing developers from manual tasks and
encouraging behaviors that help reduce the number of errors and
bugs deployed to customers.
• Find and Address Bugs Quicker: Your team can discover and
address bugs earlier before they grow into larger problems later
with more frequent and comprehensive testing.
• Continuous deployment lets you more easily perform additional
types of tests on your code because the entire process has been
automated.
• Deliver Updates Faster: Continuous deployment helps your team
deliver updates to customers faster and more frequently.
Containerization
• What Is Container(ization)?
• Containerization is a process of packaging your application together
with its dependencies into one package (a container).
• Such a package can then be run pretty much anywhere, no matter if
it’s an on-premises server, a virtual machine in the cloud, or a
developer’s laptop.
• By abstracting the infrastructure, containerization allows you to
make your application truly portable and flexible.
• A single container might be used to run anything from a small
microservice or software process to a larger application.
• Inside a container are all the necessary executables, binary code,
libraries, and configuration files.
• Compared to server or machine virtualization approaches, however,
containers do not contain operating system images.
• This makes them more lightweight and portable, with significantly
less overhead.
Benefits of containers
• Containers are a streamlined way to build, test, deploy, and
redeploy applications on multiple environments from a developer’s
local laptop to an on-premises data center and even the cloud.
Container use cases
Introduction to Docker
• A Docker container image is a lightweight, standalone, executable
package of software that includes everything needed to run an
application: code, runtime, system tools, system libraries and
settings.

• Container images become containers at runtime; in the case of
Docker containers, images become containers when they run
on Docker Engine.

• Available for both Linux and Windows-based applications,
containerized software will always run the same, regardless of the
infrastructure.

• Containers isolate software from its environment and ensure that it


works uniformly despite differences for instance between
development and staging.
• Docker is designed to benefit both developers and system
administrators making it a part of many DevOps tool chains.
• Developers can write code without worrying about the testing and
production environment.
• Sysadmins need not worry about infrastructure as Docker can easily
scale up and scale down the number of systems.
• Docker comes into play at the deployment stage of the software
development cycle.
Docker Architecture

• Docker architecture consists of the Docker client, the Docker Daemon
running on the Docker Host, and the Docker Hub registry.
• Docker has a client-server architecture in which the client
communicates with the Docker Daemon running on the Docker Host.

• If we have to build a Docker image, we use the client to send the
build command to the Docker Daemon; the Daemon then builds an
image based on the given inputs and saves it into the Docker registry.
• If you don’t want to create an image, just execute the pull command
from the client, and the Docker Daemon will pull the image from
Docker Hub.
• Finally, if we want to run the image, execute the run command
from the client, which will create the container.
Docker Architecture Diagram
Components of Docker

The main components of Docker include:
 Docker clients and servers
 Docker images
 Dockerfile
 Docker registries
 Docker containers
1. Docker Clients and Servers – Docker uses a client-server model: the Docker client sends
commands to the Docker Daemon (the server), which builds, runs, and manages containers.
2. Docker Images – Docker images are read-only templates used to build Docker containers.
• The foundation of every image is a base image, e.g. ubuntu:14.04 LTS or fedora:20.
• Base images can also be created from scratch, and the required applications can then be
added to the base image by modifying it; this process of creating a new image is called
“committing the change”.
• 3. Dockerfile – A Dockerfile is a text file that contains a series of
instructions on how to build your Docker image.
• This image contains all the project code and its dependencies.
• The same Docker image can be used to spin up ‘n’ number of
containers, each with modifications to the underlying image.
• The final image can be uploaded to Docker Hub and shared among
various collaborators for testing and deployment.
• Common instructions used in a Dockerfile include FROM, CMD,
ENTRYPOINT, VOLUME, ENV, and many more.
• 4. Docker Registries – A Docker registry is a storage component for
Docker images.
• We can store images in either public or private repositories so that
multiple users can collaborate in building the application.
• Docker Hub is Docker’s cloud-hosted registry.
• Docker Hub is a public registry where everyone can pull available
images and push their own images without creating an image from
scratch.
• 5. Docker Containers – Docker containers are runtime instances of
Docker images.
• Containers contain the whole kit required for an application, so the
application can be run in an isolated way.
• For example, suppose there is an image of Ubuntu OS with an NGINX
server; when this image is run with the docker run command, a
container will be created and the NGINX server will be running on
Ubuntu OS.
Docker Installation and Setup
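The original installation slide is not included here; as a sketch, one common route on Ubuntu uses the distribution's docker.io package (package names differ on other systems, and Docker Desktop is the usual choice on Windows/macOS):

```shell
# Install Docker Engine from Ubuntu's repositories (assumes Ubuntu/Debian)
sudo apt-get update
sudo apt-get install -y docker.io

# Verify the installation
docker --version
sudo docker run hello-world   # pulls and runs a tiny test container
```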
Docker Commands
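The command-reference slide is likewise missing; as a sketch, these are some of the most commonly used Docker CLI commands (image and container names are placeholders):

```shell
# Manage images
docker pull ubuntu:22.04          # download an image from Docker Hub
docker images                     # list local images
docker rmi ubuntu:22.04           # remove a local image

# Manage containers
docker run -it ubuntu:22.04 bash  # create and start a container interactively
docker ps                         # list running containers
docker ps -a                      # list all containers, including stopped ones
docker stop <container>           # stop a running container
docker rm <container>             # remove a stopped container

# Inspect and debug
docker logs <container>           # show a container's output
docker exec -it <container> bash  # open a shell inside a running container
```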
Docker Image
• A Docker image is a read-only template that contains a set of
instructions for creating a container that can run on the Docker
platform.
• A Docker image is made up of a collection of files that bundle
together all the essentials – such as installations, application code,
and dependencies – required to configure a fully operational
container environment.

• It provides a convenient way to package up applications and
preconfigured server environments, which you can use for your
own private use or share publicly with other Docker users.

• Docker images are also the starting point for anyone using Docker
for the first time.
• You can create a Docker image by using one of two methods:
 Interactive: By running a container from an existing Docker image,
manually changing that container environment through a series of live
steps, and saving the resulting state as a new image.
 Dockerfile: By constructing a plain-text file, known as a Dockerfile,
which provides the specifications for creating a Docker image.
• Image Layers :
• Each of the files that make up a Docker image is known as a layer.
• These layers form a series of intermediate images, built one on top of the
other in stages, where each layer is dependent on the layer immediately
below it.
• The hierarchy of your layers is key to efficient lifecycle management of
your Docker images.
• You should organize layers that change most often as high up the stack as
possible.
• This is because, when you make changes to a layer in your image, Docker
not only rebuilds that particular layer, but all layers built from it.
• Therefore, a change to a layer at the top of a stack involves the least
amount of computational work to rebuild the entire image.
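As an illustration of this ordering (a sketch, not from the original), slow-changing dependency layers sit near the base while frequently changing application code goes last, so routine code changes rebuild only the top layers:

```dockerfile
FROM python:3.11-slim                 # base layer: changes rarely
COPY requirements.txt .               # dependency list: changes occasionally
RUN pip install -r requirements.txt   # cached as long as requirements.txt is unchanged
COPY . /app                           # application code: changes most often, so it goes on top
CMD ["python", "/app/main.py"]
```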
• Container Layer
• Each time Docker launches a container from an image, it adds a thin
writable layer, known as the container layer, which stores all
changes to the container throughout its runtime.
How to Create a Docker Image
• The following is a set of simplified steps to creating an image
interactively:
• Install Docker and launch the Docker engine
• Open a terminal session
• Use the following Docker run command to start an interactive shell
session with a container launched from the image specified
by image_name:tag_name:
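The command itself was omitted in the original; it would look like the following, where image_name:tag_name is a placeholder:

```shell
# Start an interactive shell session in a new container
docker run -it image_name:tag_name /bin/bash
```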

• If you omit the tag name, then Docker automatically pulls the most
recent image version, which is identified by the latest tag.
• If Docker cannot find the image locally then it will pull what it
needs to build the container from the appropriate repository on
Docker Hub.
• In our example, we’ll launch a container environment based on the
latest version of Ubuntu:
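The omitted command would be the following; since no tag is given, Docker pulls ubuntu:latest:

```shell
docker run -it ubuntu   # drops you into a shell inside the container
```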
• Now configure your container environment by, for example,
installing all the frameworks, dependencies, libraries, updates, and
application code you need. The following simple example adds an
NGINX server:
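The omitted example commands, run inside the container's shell, would be along these lines (assuming the Ubuntu package name nginx):

```shell
# Inside the running Ubuntu container
apt-get update
apt-get install -y nginx
```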

• Next, you’ll need to know the name or ID of your running container
instance.
• Open another Bash shell and type the following docker command to
list active container processes:
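The listing command is docker ps; since the interactive method ends by saving the container's state as a new image, a hedged sketch of the final step is docker commit (the image name my-ubuntu-nginx is an example):

```shell
docker ps                                      # note the CONTAINER ID or NAME
docker commit <container> my-ubuntu-nginx:1.0  # save the container's state as a new image
```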
Dockerfile
• A Dockerfile is a text document that contains all the commands a
user could call on the command line to assemble an image.
• You can deploy a single container manually with the help of a Docker
image.
• However, what happens if you need to deploy several containers
(each for different tasks) from the same image?
• You can resolve this with the help of the Dockerfile.
• A Dockerfile is a simple text file that contains all the commands a user
could call on the command line to assemble or build an image.
• With the help of docker build, you can easily automate a build that
runs the multiple commands defined in the Dockerfile in succession.
• For example, you might want to download a Docker image from
Docker Hub for your specific development needs, and then update the
image and install some packages for your development process.
Creating the Dockerfile
• We will create a Dockerfile to build a LAMP server image from the
Ubuntu base image.
• First, you will need to create a directory to store the Dockerfile. You
can create it with the following command:
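The command was omitted in the original; the directory name LAMP is assumed here because the build step later changes into a directory of that name:

```shell
mkdir LAMP   # directory to hold the Dockerfile; cd into it before building
```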

• Next, create a file named Dockerfile inside the directory:

• Add the following lines:
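The file contents were omitted in the original; a minimal sketch of a LAMP Dockerfile (package names assume Ubuntu's repositories) might be:

```dockerfile
FROM ubuntu:22.04
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get install -y \
    apache2 \
    mysql-server \
    php libapache2-mod-php php-mysql
EXPOSE 80
CMD ["apache2ctl", "-D", "FOREGROUND"]
```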

• Save and close the file when you are finished.


Building Image with Dockerfile
• After creating the Dockerfile, you can easily create a custom LAMP
image with the help of the Dockerfile.
• First, change the directory to LAMP and run the following
command to build the image from that file:
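The build command was omitted; it would be along these lines (lamp-image is an example tag):

```shell
cd LAMP
docker build -t lamp-image .   # "." tells Docker to use the Dockerfile in this directory
```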

• The above command will start downloading the latest Ubuntu
image from Docker Hub and install the necessary packages
specified in the Dockerfile.
• Once the image has been built successfully, you should see the
following output:
• You can now list your newly built image by running the command:
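The omitted listing command would be:

```shell
docker images
```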
• You should see the following output:
• Now, you have a custom LAMP server image in your hand. You can
also see the history of each command with the following command:
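The omitted history command would be along these lines (lamp-image is the example tag used above):

```shell
docker history lamp-image   # shows each layer and the instruction that created it
```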

• You should see the following output:

Creating a Container from LAMP Image


• Now, you can run a container from your image using the following
command:
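The omitted run command would be something like the following (lamp-container and lamp-image are example names):

```shell
# -d: detached, -i/-t: keep an interactive TTY, -p: map host port 80 to container port 80
docker run -dit --name lamp-container -p 80:80 lamp-image
```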
• Once the container has been started, you should see the following
output:
• You can verify the running container using the following command:
Running Containers
• Containers are instances of Docker images that can be run using the
Docker run command.
• The basic purpose of Docker is to run containers.
Running a Container
• Running of containers is managed with the Docker run command.
• To run a container in an interactive mode, first launch the Docker
container.
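As a sketch of the two common run modes (ubuntu and nginx are example images):

```shell
# Interactive mode: -i keeps STDIN open, -t allocates a terminal
docker run -it ubuntu /bin/bash

# Detached mode: -d runs the container in the background
docker run -d nginx
```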
Pushing a Docker container image to Docker Hub
• To push an image to Docker Hub, you must first name your local
image using your Docker Hub username and the repository name
that you created through Docker Hub on the web.
• You can add multiple images to a repository by adding a
specific :<tag> to them (for example docs/base:testing).
• If it’s not specified, the tag defaults to latest.
• Name your local images using one of these methods:
 When you build them, using docker build -t <hub-user>/<repo-name>[:<tag>]
 By re-tagging an existing local image: docker tag <existing-image> <hub-user>/<repo-name>[:<tag>]
• Now you can push this repository to the registry designated by its
name or tag.
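A sketch of the full tagging-and-push flow, using the placeholder notation above (my-local-image is an example name):

```shell
docker login                                              # authenticate with your Docker Hub account
docker tag my-local-image <hub-user>/<repo-name>:latest   # name the local image for Docker Hub
docker push <hub-user>/<repo-name>:latest                 # upload it to the registry
```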

• The image is then uploaded and made available for use by your
teammates and/or the community.
