
UNIT - IV

INTEGRATING THE SYSTEM


Balike Mahesh
7207030340



• You need a system to build your code, and you need somewhere to build it.
• Jenkins is a flexible open source build server that grows with your needs.
• Some alternatives to Jenkins will be explored as well.

• We will also explore the different build systems and how they affect our DevOps work.

• Why do we build code?
• Most developers are familiar with the process of building code. When we work in the field of DevOps, however, we might face issues that developers who specialize in programming a particular component type won't necessarily experience.
• For the purposes of this book, we define software building as the process of molding code from one form to another. During this process, several things might happen:
• The compilation of source code to native code or virtual machine bytecode, depending on our production platform.
• Linting of the code: checking the code for errors and generating code quality measures by means of static code analysis. The term "Linting" originated with a program called Lint, which started shipping with early versions of the Unix operating system. The purpose of the program was to find bugs in programs that were syntactically correct but contained suspicious code patterns that could be identified with a different process than compiling.
• Unit testing, by running the code in a controlled manner.
• The generation of artifacts suitable for deployment. It's a tall order!



4.1 The many faces of build systems
• There are many build systems that have evolved over the history of
software development. Sometimes, it might feel as if there are more
build systems than there are programming languages.
• Here is a brief list, just to get a feeling for how many there are:
• For Java, there is Maven, Gradle, and Ant
• For C and C++, there is Make in many different flavors
• For Clojure, a language on the JVM, there is Leiningen and Boot
apart from Maven
• For JavaScript, there is Grunt
• For Scala, there is sbt
• For Ruby, we have Rake
• Finally, of course, we have shell scripts of all kinds
• Depending on the size of your organization and the type of product you are building, you
might encounter any number of these tools. To make life even more interesting, it's not
uncommon for organizations to invent their own build tools.
• As a reaction to the complexity of the many build tools, there is also often the idea of
standardizing a particular tool. If you are building complex heterogeneous systems, this is
rarely efficient. For example, building JavaScript software is just easier with Grunt than it
is with Maven or Make, building C code is not very efficient with Maven, and so on. Often,
the tool exists for a reason.
• Normally, organizations standardize on a single ecosystem, such as Java and Maven or
Ruby and Rake. Other build systems besides those that are used for the primary code
base are encountered mainly for native components and third-party components.
• At any rate, we cannot assume that we will encounter only one build system within our
organization's code base, nor can we assume only one programming language.
• I have found this rule useful in practice: it should be possible for a developer to check out the
code and build it with minimal surprises on his or her local developer machine.
• This implies that we should standardize the revision control system and have a single
interface to start builds locally.
• If you have more than one build system to support, this basically means that you
need to wrap one build system in another. The complexities of the build are thus
hidden and more than one build system at the same time are allowed. Developers
not familiar with a particular build can still expect to check it out and build it with
reasonable ease.
• Maven, for example, is good for declarative Java builds. Maven is also capable of
starting other builds from within Maven builds.
• This way, the developer in a Java-centric organization can expect the following
command line to always build one of the organization's components:
• mvn clean install
• One concrete example is creating a Java desktop application installer with the
Nullsoft NSIS Windows installation system. The Java components are built with
Maven. When the Java artifacts are ready, Maven calls the NSIS installer script to
produce a self-contained executable that will install the application on Windows.
• While Java desktop applications are not fashionable these days, they continue to be
popular in some domains.
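• The original POM wiring for the NSIS example is not reproduced in these notes. As a minimal sketch of the same wrapping idea, expressed here in Gradle's Groovy DSL (Gradle is one of the Java build tools listed above); the installer.nsi script name and the availability of makensis on the PATH are assumptions:

// build.gradle (illustrative sketch only)
plugins {
    id 'java'
}

// Hypothetical task: run the NSIS compiler once the Java artifact exists.
task buildInstaller(type: Exec) {
    dependsOn 'jar'                          // build the Java component first
    commandLine 'makensis', 'installer.nsi'  // then hand over to the native installer tool
}

• Whatever tool does the wrapping, the point is that developers keep a single, predictable entry point such as mvn clean install or gradle buildInstaller.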
4.2 The Jenkins build server
• Jenkins is an open-source automation server widely used for continuous integration,
continuous delivery, and continuous deployment (CI/CD) processes. It helps automate
the building, testing, and deployment of software applications. Jenkins is written in Java
and provides a web-based interface for configuring and managing various automation
tasks.
• Here are some key features and components of Jenkins:
• Jobs: Jenkins uses jobs as the building blocks for automation. A job represents a task or
a set of tasks to be executed. Jobs can be configured to perform actions like compiling
source code, running tests, deploying applications, or triggering other jobs.
• Plugins: Jenkins has a vast ecosystem of plugins that extend its functionality. Plugins
can be installed to add support for different programming languages, version control
systems, build tools, testing frameworks, deployment targets, and more.
• Build Steps: Jobs in Jenkins are composed of build steps. Build steps define the actions
that Jenkins should perform during the build process, such as executing shell
commands, running scripts, invoking build tools, or executing tests.
• Pipelines: Jenkins supports the concept of pipelines, which allows the definition of entire
build processes as code. Jenkins pipelines provide a way to express the build, test, and
deployment stages in a declarative or scripted manner, enabling better visibility, reusability, and versioning of the build process (see the sketch at the end of this section).
• Distributed Builds: Jenkins can distribute build workloads across multiple nodes to
improve performance and scalability. It supports the setup of distributed build
environments where different nodes handle different parts of the build process.
• Integration and Extensibility: Jenkins integrates with various tools, version control
systems, and services. It can be easily integrated with popular platforms like Git, SVN,
Docker, JIRA, Slack, and many others. Additionally, Jenkins provides APIs and hooks for
extending its functionality and integrating with custom tools and services.
• Monitoring and Reporting: Jenkins offers comprehensive monitoring and reporting
capabilities. It provides real-time build logs, test results, and trend analysis, allowing
developers and teams to track the progress and health of their builds.
• Security and Authentication: Jenkins provides features for authentication, authorization,
and access control. It supports various security mechanisms, including user
authentication, role-based access control (RBAC), and integration with external
authentication providers like LDAP or Active Directory.
• By leveraging Jenkins, development teams can automate repetitive tasks, ensure code
quality through continuous integration and testing, and streamline the deployment
process, resulting in faster and more reliable software development cycles.
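• As a small illustration of the pipeline-as-code idea mentioned under Pipelines above, here is a minimal declarative Jenkinsfile sketch. The stage names, the Maven command, and the artifact path are illustrative assumptions for a Maven-based project:

// Jenkinsfile (declarative pipeline, minimal sketch)
pipeline {
    agent any                                  // run on any available node
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean install'         // delegate the real work to the build tool
            }
        }
        stage('Archive') {
            steps {
                archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
            }
        }
    }
}

• Because the Jenkinsfile lives in the repository, the build definition is versioned together with the code it builds.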
4.3 Managing build dependencies
• Before running your build command, the buildbot will look for instructions about required
languages and software needed to run your command. These are called dependencies,
and how you declare them depends on the languages and tools used in your build.
• When it comes to managing build dependencies, different build systems have various
approaches. Let's take a look at a few examples:
• Maven (Java):
• Maven uses a Project Object Model (POM) file that specifies the project's
dependencies.
• When you build a Maven project, it automatically fetches the required
dependencies from a central repository if they are not already present on the build
server.
• Grunt (JavaScript):
• Grunt utilizes a build description file (e.g., Gruntfile.js) where you can define the
project's dependencies.
• Similar to Maven, Grunt fetches the specified dependencies automatically if they
are missing from the build server.
• Go:
• Go projects often include links to required GitHub repositories in their
build files.
• When building a Go project, the build system can fetch the necessary
dependencies from these repositories.
• C and C++ (using GNU Autotools):
• GNU Autotools, including Autoconf, adapt to the available dependencies
on the build system rather than explicitly listing them.
• To build projects like Emacs, you typically run a configuration script that
determines which dependencies are present on the build system.
• These examples highlight different approaches to managing build
dependencies. Maven and Grunt explicitly define dependencies, allowing
the build system to handle fetching them. Go projects link to external
repositories, while C and C++ projects using GNU Autotools adapt to
available dependencies on the build system.
• Managing build dependencies can become more complex in real-world
scenarios, but understanding the principles and tools used by different build
systems helps streamline the process and ensure that the necessary
dependencies are available for successful builds.
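• As a sketch of the declare-and-fetch approach, this is what a dependency declaration looks like in Gradle's Groovy DSL (Gradle is mentioned earlier as a Java build tool); the specific coordinates are illustrative:

// build.gradle (sketch)
repositories {
    mavenCentral()                                      // where missing dependencies are fetched from
}

dependencies {
    implementation 'com.google.guava:guava:31.1-jre'    // example compile-time dependency
    testImplementation 'junit:junit:4.13.2'             // example test-only dependency
}

• Maven's POM expresses the same information in XML; in both cases the build server only needs the declaration, and the build tool resolves and downloads the artifacts on demand.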
• In enterprise settings, it is crucial to have full control over the build dependencies and ensure that the software behaves consistently across different environments. Having surprises or missing functionality on production servers is undesirable. To address this, we need to adopt a more deterministic approach to managing build dependencies:
1. Dependency Locking
2. Artifact Repositories
3. Build Pipelines
4. Testing and Validation
5. Release Management
The RPM (Red Hat Package Manager) system (this also falls under managing dependencies)
• The RPM (Red Hat Package Manager) system provides a solution for managing build dependencies and building
software on systems derived from Red Hat. It revolves around a build descriptor file called a spec file (short for
specification file). Here's a simplified explanation:
1. Spec File:
1. The spec file is a build descriptor written in a macro-based shell script format.
2. It contains various sections, including metadata, dependencies, build commands, and configuration options.
3. The spec file defines the requirements for successfully building the software package.
2. Build Dependencies:
1. The spec file lists the build dependencies required to build the software.
2. These dependencies specify the packages or libraries that need to be installed on the build system for a successful
build.
3. Pristine Build Sources:
1. The RPM system encourages keeping build sources pristine, i.e., unmodified.
2. If modifications are needed, the spec file can include patches that modify the source code before the build process
begins.
3. Patches allow you to adapt the source code to specific requirements or fix issues.

4. Building Software:
1. The RPM system utilizes the spec file to build software packages.
2. The spec file specifies the build commands, such as compiling, linking, and packaging the software.
3. The RPM system follows the instructions in the spec file to generate the final binary package.
• By using the RPM system, you can create RPM packages for your software with well-defined build dependencies
and build instructions. The spec file allows for adaptability by patching the source code, ensuring that the build process is customizable while maintaining the ability to reproduce pristine build sources.
4.4 Jenkins plugins
• Plugins are the primary means of enhancing the functionality of a Jenkins environment to suit
organization- or user-specific needs. There are over a thousand different plugins that can be installed on a Jenkins controller to integrate various build tools, cloud providers, analysis tools, and much more.
• Jenkins, as an extensible automation server, offers a wide range of plugins that extend its
functionality and enable integration with various tools and technologies. Here's an overview of
Jenkins plugins:
• Source Code Management (SCM) Plugins:
• SCM plugins provide integration with different version control systems like Git, Subversion (SVN),
Mercurial, etc.
• These plugins enable Jenkins to fetch source code from repositories and trigger builds based on
changes.
• Build Tool Plugins:
• Build tool plugins integrate Jenkins with popular build tools like Apache Maven, Gradle, Ant, and
MSBuild.
• They provide build steps and configurations specific to these tools, allowing seamless integration
within Jenkins pipelines or job configurations.
• Testing Framework Plugins:
• Testing framework plugins enable integration with various testing frameworks like JUnit, NUnit, TestNG, etc.
• Deployment Plugins:
• Deployment plugins help automate the deployment of applications to different
environments, such as application servers, cloud platforms, or containers.
• Plugins like Kubernetes, Docker, AWS Elastic Beanstalk, or Azure App Service enable
seamless deployment workflows.
• Notification Plugins:
• Notification plugins facilitate sending notifications or alerts based on build status, test
results, or other events.
• Plugins such as Email Notification, Slack Notification, or Microsoft Teams Integration
enable communication and collaboration among team members.
• Monitoring and Reporting Plugins:
• Monitoring and reporting plugins integrate Jenkins with monitoring and reporting tools like
SonarQube, JIRA, Grafana, etc.
• They provide insights into code quality, performance metrics, and project management,
enhancing the visibility and management of software projects.
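• Many of these plugins surface as additional pipeline steps. The fragment below is a hedged sketch of that idea, assuming the Git, JUnit, and Slack Notification plugins are installed; the repository URL, report path, and channel name are made up for illustration:

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git 'https://git.example.com/demo.git'       // step contributed by the Git plugin
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
                junit 'target/surefire-reports/*.xml'         // step contributed by the JUnit plugin
            }
        }
    }
    post {
        failure {
            slackSend channel: '#builds', message: "Build failed: ${env.JOB_NAME}"   // Slack plugin
        }
    }
}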



Top Jenkins Plugins
• Git Plugin
• Kubernetes Plugin
• Jira Plugin
• Docker Plugin
• Maven Integration Plugin
• Blue Ocean Plugin
• Amazon EC2 Plugin
• Pipeline Plugin

Jenkins file system layout
• Jenkins has a specific file system layout that organizes its configuration and data. Here
are the key directories typically found in a Jenkins installation:
• Jenkins Home Directory:
•The Jenkins home directory stores the Jenkins configuration, plugins, and job-specific
data.
• It contains files and directories like config.xml (global configuration), plugins/ (plugin files), and jobs/ (job configurations and data).
• Workspace Directory:
•Each Jenkins job has its workspace directory where the source code is checked out
and build artifacts are created.
•The workspace directory is unique to each job and can be accessed during the build
process.
• Logs Directory:
•The logs directory contains log files generated by Jenkins, providing information about
build execution, errors, and other events.
• Logs are essential for troubleshooting and monitoring Jenkins.
• Temporary Directory:
•Jenkins utilizes a temporary directory for storing temporary files
during the build process.
•This directory is typically cleared periodically or after each build to
reclaim disk space.
• Plugins Directory:
•The plugins directory stores the plugin files used by Jenkins.
•Each plugin is contained within a separate subdirectory, and
Jenkins manages the installation, updating, and removal of plugins
within this directory.
• Understanding the Jenkins plugin ecosystem and the file system
layout is essential for effective plugin management, configuring job
workflows, and accessing the necessary data and logs for
troubleshooting and monitoring purposes.



4.5 The host server
• The build server is usually a pretty important machine for the
organization. Building software is processor as well as
memory and disk intensive. Builds shouldn't take too long, so
you will need a server with good specifications for the build
server—with lots of disk space, processor cores, and RAM.
The build server also has a kind of social aspect: it is here that
the code of many different people and roles integrates
properly for the first time. This aspect grows in importance if
the servers are fast enough. Machines are cheaper than
people, so don't let this particular machine be the area you
save money on.



• The build server is a crucial machine for any organization. Its
main purpose is to build software, which requires a lot of
processing power, memory, and disk space. It's important for
builds to complete quickly, so the build server should have
high-performance specifications, including a fast processor,
ample RAM, and a large storage capacity.
• The build server also plays a social role in the organization.
It's where the code from different people and roles comes
together for the first time and integrates properly. This aspect
becomes more significant when the server is fast enough to
handle the workload efficiently. It's worth noting that investing
in machines is generally more cost-effective than hiring
additional people, so it's not an area where you should try to
save money.
4.5.1 Build slaves
• To reduce build queues and improve efficiency, you can add build slaves to your setup.
The build slaves work alongside the master server and handle specific builds assigned to
them.
• One reason for using build slaves is that certain builds have specific requirements for the
operating system they run on. By assigning builds to appropriate build slaves, you ensure
that each build is executed on the appropriate host operating system.
• Build slaves also help in parallel builds, where multiple builds can be processed
simultaneously, increasing efficiency. They can also be used for building software on
different operating systems. For example, you can have a Jenkins master server running
on Linux and assign Windows slaves for components that require Windows build tools.
Similarly, if you need to build software for Apple Mac, having a Mac build slave is useful
due to Apple's rules regarding virtual servers.
• To add build slaves to a Jenkins master, there are various methods available. One
common approach is using the SSH method, where the Jenkins master issues commands
to the build slave through a secure shell (SSH) connection. Jenkins has a built-in SSH
facility to support this. Another method is starting a Jenkins slave by downloading a Java
Network Launch Protocol (JNLP) client from the master to the slave. This method is useful
when the build slave doesn't have an SSH server installed.
• By adding build slaves to your Jenkins setup, you can distribute the workload, reduce build queues, and improve the overall efficiency of your software building process.
• Jenkins Master
• Your main Jenkins server is the Master. The Master’s job is to handle:
 Scheduling build jobs.
 Dispatching builds to the slaves for the actual execution.
 Monitoring the slaves (possibly taking them online and offline as required).
 Recording and presenting the build results.
 A Master instance of Jenkins can also execute build jobs directly.
• Jenkins Slave
• A Slave is a Java executable that runs on a remote machine. Following are the characteristics of Jenkins Slaves:
 It hears requests from the Jenkins Master instance.
 Slaves can run on a variety of operating systems.
 The job of a Slave is to do as they are told to, which involves executing build jobs dispatched by the Master.
 You can configure a project to always run on a particular Slave machine or a particular type of Slave machine, or simply let Jenkins pick the
next available Slave.
• A typical setup consists of a Jenkins Master managing three Jenkins Slaves.
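• In pipeline jobs, tying a build to a particular kind of slave is typically done with labels. A minimal sketch, assuming a slave has been registered with the label windows; the build command is illustrative:

pipeline {
    agent { label 'windows' }          // run only on slaves carrying this label
    stages {
        stage('Build') {
            steps {
                bat 'msbuild App.sln'  // Windows-only build step (illustrative)
            }
        }
    }
}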



4.6 Software on the host
• "Software on the host" refers to the build tools and supporting software installed on the build server itself.
• Depending on the complexity of your builds, you might need to
install many different types of build tool on your build server.
Remember that Jenkins is mostly used to trigger builds, not
perform the builds themselves. That job is delegated to the build
system used, such as Maven or Make.
• In my experience, it's most convenient to have a Linux-based host
operating system. Most of the build systems are available in the
distribution repositories, so it's very convenient to install them
from there.
• To keep your build server up to date, you can use the same
deployment servers that you use to keep your application servers
up to date.
Software on the host
• Depending on how complex your builds are, you may need to install various
build tools on your build server. It's important to note that Jenkins primarily
triggers builds and delegates the actual building process to specific build
systems like Maven or Make.
• In my experience, using a Linux-based host operating system for the build
server is the most convenient option. This is because most build systems
are readily available in the distribution repositories, making it easy to install
them from there.
• To ensure that your build server stays updated, you can utilize the same
deployment servers that you use to keep your application servers up to
date. This way, you can maintain consistency across your infrastructure and
ensure that both your build server and application servers are regularly
updated.
• By having the necessary build tools installed on your build server and
keeping it up to date, you can ensure a smooth and efficient building
process for your software projects.
4.7 Triggers
• You can either use a timer to trigger builds, or you can poll the code
repository for changes and build if there were changes.
• It can be useful to use both methods at the same time:
• Git repository polling can be used most of the time so that every check-in triggers a build.
• Nightly builds can be triggered, which are more stringent than
continuous builds, and thus take a longer time. Since these builds
happen at night when nobody is supposed to work, it doesn't matter if
they are slow.
• An upstream build can trigger a downstream build.
• You can also let the successful build of one job trigger another job.
• Triggers in build systems offer different methods for initiating builds based on various
events or conditions. Two common trigger methods are timer-based triggering and code
repository polling:
1.Timer-Based Trigger: This method involves setting a specific time or interval to trigger
builds automatically. For example, you can schedule builds to run every hour, every day,
or at a specific time. Timer-based triggers are useful for regular or scheduled builds.
2.Code Repository Polling: With this method, the build system periodically checks the code
repository for changes. If there are new commits or updates, a build is triggered. Code
repository polling is commonly used for continuous integration (CI) to ensure that builds
are initiated whenever changes are pushed to the repository.
• Using a combination of these trigger methods can provide more flexibility and control over
the build process. Here are a few scenarios where different triggers can be beneficial:
• Git Repository Polling: This method is ideal for most situations, as it triggers a build for
every code check-in. It ensures that each change undergoes an automated build,
facilitating early detection of integration issues.
• Nightly Builds: Nightly builds are more comprehensive and time-consuming, typically running during off-peak hours when development activity is low. These builds can be triggered by a timer-based trigger, as their slower execution time won't affect daily development work.
• Upstream and Downstream Builds: In complex build processes involving
multiple components or modules, an upstream build's success can trigger
downstream builds. This ensures that dependent components are built and
tested whenever changes are made in the upstream component.
• Job Chaining: Successful completion of one job can be configured to trigger
another job. This can be useful when multiple sequential steps or tasks are
involved in the build process, allowing for a streamlined workflow.
• By utilizing a combination of trigger methods, build systems can be tailored
to specific needs, ensuring efficient and timely execution of builds based on
different events or conditions. This flexibility helps automate the build
process and maintain a reliable and consistent software development
pipeline.
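• In a declarative pipeline, these trigger styles can be combined in a single triggers block. A minimal sketch with illustrative schedules; the upstream job name is hypothetical:

pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')     // poll the repository roughly every five minutes
        cron('H 2 * * *')          // nightly build at around 02:00
        upstream(upstreamProjects: 'component-a', threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean install'
            }
        }
    }
}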



• Triggers in the context of software development and build systems refer to events or
conditions that initiate the execution of a build process. Triggers can be configured in build
systems like Jenkins to automatically start a build when certain events occur or criteria are
met. Here are a few common types of triggers:
1.Manual Trigger: This is the simplest form of trigger where a build is initiated manually by a
user or developer. It requires manual intervention to start the build process.
2.Scheduled Trigger: Builds can be scheduled to run at specific times or intervals, such as
daily, weekly, or at a specific time of the day. Scheduled triggers are useful for automating
regular builds or performing tasks at specific times.
3.Source Code Change Trigger: This trigger starts a build when changes are detected in the
source code repository. It can be configured to monitor specific branches, directories, or
files for changes and automatically initiate a build when updates are made.
4.Continuous Integration (CI) Trigger: In a CI workflow, builds are triggered whenever
changes are pushed to the source code repository. This ensures that each code change is
validated through an automated build process, helping to identify and resolve integration
issues early.
• By utilizing triggers effectively, build systems can automate the build process, ensuring
timely and consistent execution of builds in response to various events or conditions. This helps maintain a reliable and consistent software development pipeline.
4.8 Job chaining and build pipelines
• Job chaining and build pipelines are useful concepts in build systems like Jenkins. Job chaining
allows for the sequential execution of multiple jobs, where the successful completion of one job
triggers the next job in the chain. This creates a logical flow of tasks, ensuring that dependencies
between jobs are met.
• Build pipelines, on the other hand, provide a visual representation of the entire software delivery
process. They allow for the creation of complex workflows with multiple stages, each represented
by a distinct job. Build pipelines offer a graphical view of the flow, making it easy to track the
progress of builds through different stages.
• In Jenkins, the first job in a chain is called the upstream build, while the second one is referred to
as the downstream build. This basic job chaining approach is often sufficient for many purposes.
• However, there are cases where a more advanced and controlled build chain is required. This is
where pipelines or workflows come into play. Pipelines provide a more sophisticated visualization
of the build steps and offer greater control over the details of the chain.
• Jenkins provides various plugins to create improved pipelines, with the multijob plugin and the
workflow plugin being two examples. The workflow plugin, promoted by CloudBees, is more
advanced and allows the pipeline to be described using a Groovy Domain Specific Language
(DSL), providing more flexibility than working solely with the web user interface.
• Overall, job chaining and build pipelines enhance the organization, visualization, and control of
build processes. They improve the management of complex workflows and contribute to efficient
software delivery.
Job chaining and build pipelines
• Job chaining and build pipelines are related concepts that enable the
creation of complex and interconnected build processes in build systems like
Jenkins. They help organize and streamline the execution of multiple jobs or
tasks in a specific sequence, facilitating the automation of software delivery
workflows. Let's explore each concept in more detail:
• Job Chaining: Job chaining refers to the configuration of build jobs in a
sequential manner, where the successful completion of one job triggers the
execution of the next job in the chain. This allows for a logical flow of tasks,
where each job depends on the successful completion of its predecessor.
Job chaining is useful when specific tasks or steps need to be executed in a
particular order, such as building, testing, and deploying software
components.
• For
example, Job A could be responsible for compiling the code, and once it
successfully completes, it triggers Job B, which performs unit tests. Upon
successful completion of Job B, Job C is triggered for integration testing,
and so on. Job chaining ensures that each step is executed in the desired sequence, with dependencies between jobs explicitly defined.
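• In pipeline code, such a chain can be expressed with the build step (provided by the Pipeline Build Step plugin); the job names below mirror the hypothetical Job A/B example above:

// Inside "job-a" (compile): trigger the next link of the chain on success.
pipeline {
    agent any
    stages {
        stage('Compile') {
            steps {
                sh 'mvn -DskipTests clean package'
            }
        }
        stage('Trigger downstream') {
            steps {
                build job: 'job-b-unit-tests', wait: false   // fire and forget; drop wait to block on the result
            }
        }
    }
}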
• Build Pipelines: Build pipelines extend the concept of job chaining by
providing a visual representation of the entire software delivery process. A
build pipeline allows for the creation of complex workflows with multiple
stages, each represented by a distinct job. The pipeline provides a
graphical representation of the flow, including the relationships and
dependencies between different stages or jobs.
• In a build pipeline, each stage represents a specific task or set of tasks,
such as building, testing, deploying, and releasing software. Jobs within
each stage are executed in parallel or sequentially, depending on the
defined dependencies and requirements. The pipeline enables tracking the
progress of builds through different stages, providing visibility into the
software delivery process.
• Build pipelines offer benefits like visual clarity, easy monitoring, and the
ability to track the status of each stage or job. They also facilitate the
management of complex workflows involving multiple teams, environments,
and deployment targets.
• Overall, job chaining and build pipelines help automate and streamline the
software delivery process, ensuring that tasks are executed in the desired
sequence with proper dependencies and visibility. They enhance
collaboration, improve efficiency, and provide traceability in the build and release management workflows.
4.9 Build servers and infrastructure as code
• While we are discussing the Jenkins file structure, it is useful to note an
impedance mismatch that often occurs between GUI-based tools such as Jenkins
and the DevOps axiom that infrastructure should be described as code.
• One way to understand this problem is that while Jenkins job descriptors are text-file-based, these text files are not the primary interface for changing the job descriptors.
• The web interface is the primary interface. This is both a strength and a weakness.
• It is easy to create ad-hoc solutions on top of existing builds with Jenkins. You don't need to be intimately familiar with Jenkins to do useful work.
• On the other hand, the out-of-the-box experience of Jenkins lacks many features that we are used to from the world of programming. Basic features like inheritance and even function definitions take some effort to provide in Jenkins.
• The build server feature in GitLab, for example, takes a different approach. Build
descriptors are just code right from the start. It is worth checking out this feature in
GitLab if you don't need all the possibilities that Jenkins offers.
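• One common way to narrow this gap within Jenkins itself is the Job DSL plugin, where job definitions are Groovy scripts kept under version control. A minimal sketch, assuming the plugin is installed; the repository URL and build command are illustrative:

// Seed job script for the Job DSL plugin
job('component-build') {
    scm {
        git('https://git.example.com/component.git', 'master')   // repository to build
    }
    triggers {
        scm('H/5 * * * *')            // poll for changes
    }
    steps {
        shell('mvn clean install')    // the actual build command
    }
}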
• When it comes to the structure of Jenkins files, there is often a mismatch
between GUI-based tools like Jenkins and the DevOps principle of describing
infrastructure as code.
• In Jenkins, job descriptors are stored as text files, but these files are not typically
the primary means of modifying the job descriptors. Instead, the web interface is
the main way to make changes. This can be seen as both a strength and a
weakness.
• On the positive side, Jenkins allows for easy ad-hoc solutions by building on
existing setups. You don't have to be deeply familiar with Jenkins to be
productive. However, the default experience in Jenkins lacks some features that
we are accustomed to in programming. Even basic features like inheritance and
function definitions require extra effort to implement in Jenkins.
• In contrast, other tools like GitLab's build server feature take a different approach.
In GitLab, build descriptors are treated as code right from the beginning. If you
don't require all the extensive capabilities that Jenkins offers, it's worth exploring
GitLab's build server feature.
• In summary, the Jenkins file structure can be seen as having a mismatch with the
"infrastructure as code" principle. While Jenkins provides flexibility and ease of
use through its web interface, it may lack some programming-like features. Other tools, such as GitLab, treat build descriptors as code from the start.
Building by dependency order
• Many build tools have the concept of a build tree where dependencies are
built in the order required for the build to complete, since parts of the build
might depend on other parts.
• In Make-like tools, this is described explicitly; for instance, like this:
• a.out : b.o c.o
• b.o : b.c
• c.o : c.c
• So, in order to build a.out, b.o and c.o must be built first.
• In tools such as Maven, the build graph is derived from the dependencies we
set for an artifact. Gradle, another Java build tool, also creates a build graph
before building.
• Jenkins has support for visualizing the build order for Maven builds, which
is called the reactor in Maven parlance, in the web user interface.
• This view is not available for Make-style builds, however.
• Many build tools, like Make, Maven, and Gradle, employ a build tree concept to
handle dependencies between different parts of a build. The build tree ensures
that the necessary components are built in the required order for the entire build to
successfully complete.
• In Make-like tools, such as Make itself, dependencies are explicitly described in
the build script. For example, in the code snippet provided:
• a.out : b.o c.o
• b.o : b.c
• c.o : c.c
• To build a.out, b.o and c.o must be built first, as indicated by the
dependencies specified.
• In tools like Maven and Gradle, the build graph is derived from the declared
dependencies of artifacts. These tools analyze the dependencies specified in the
build configuration and automatically construct a build graph that represents the
correct order for building the components.
• Jenkins, being a versatile build system, offers support for
visualizing the build order of Maven builds in its web user
interface. This visualization, referred to as the "reactor" in
Maven terminology, helps developers understand the order
in which Maven modules or artifacts will be built within the
build tree.
• However, it's important to note that Jenkins does not provide
the same built-in visualization for Make-style builds. The
focus on visualizing the build order is primarily tailored
towards Maven builds, where the reactor structure plays a
significant role.



Build phases
• One of the principal benefits of the Maven build tool is that it standardizes
builds.
• This is very useful for a large organization, since it won't need to invent its
own build standards. Other build tools are usually much more lax regarding
how to implement various build phases. The rigidity of Maven has its pros
and cons. Sometimes, people who get started with Maven reminisce about
the freedom that could be had with tools such as Ant.
• You can implement these build phases with any tool, but it's harder to keep the habit going when the tool itself doesn't enforce the standard order: building, testing, and deploying.
• We will examine testing in more detail in a later chapter, but we should note here that the
testing phase is very important. The Continuous Integration server needs to be very good at
catching errors, and automated testing is very important for achieving that goal.
• One of the main advantages of the Maven build tool is its ability to
standardize builds. This standardization is particularly valuable for large
organizations as they don't have to create their own build conventions. In
contrast, other build tools often provide more flexibility in implementing
different build phases. The strictness of Maven has both its benefits and
drawbacks. Some developers, who are accustomed to more flexible tools
like Ant, may miss the freedom they had.
• Although it's possible to implement build phases with any build tool, it can
be challenging to maintain consistent practices without a tool that enforces a
standardized order, such as building, testing, and deploying.
• It's important to highlight the significance of the testing phase in the build
process. Effective error detection is crucial for a Continuous Integration
server, and automated testing plays a vital role in achieving that objective.
• In summary, Maven's ability to standardize builds offers advantages for
organizations by eliminating the need to define their own build conventions.
However, it may feel restrictive compared to more flexible tools. Regardless
of the build tool used, the testing phase holds great importance in ensuring error-free software delivery, especially in the context of Continuous Integration.
Alternative build servers
• While Jenkins appears to be pretty dominant in the build server scene in
my experience, it is by no means alone. Travis CI is a hosted solution that is
popular among open source projects. Buildbot is a build server that is
written in, and configurable with, Python. The Go server is another one,
from ThoughtWorks. Bamboo is an offering from Atlassian. GitLab also
supports build server functionality now.
• Do shop around before deciding on which build server works best for you.
• When evaluating different solutions, be aware of attempts at vendor lock-
in. Also keep in mind that the build server does not in any way replace the
need for builds that are well behaved locally on a developer's machine.
• Also, as a common rule of thumb, see if the tool is configurable via
configuration files. While management tends to be impressed by graphical
configuration, developers and operations personnel rarely like being
forced to use a tool that can only be configured via a graphical user
interface.
• While Jenkins is widely used as a build server, it is not the only option available.
Other popular build server solutions include Travis CI, which is commonly used for open-source projects, Buildbot, written in Python and highly configurable, the Go server from ThoughtWorks, Bamboo by Atlassian, and GitLab, which now supports build server functionality.
• When choosing a build server, it's important to explore different options and
compare them. Avoid vendor lock-in and consider solutions that offer flexibility
and configurability. Remember that the build server should complement, not
replace, the need for well-behaved builds on developers' machines.
• As a general rule, it's beneficial to choose a build server that allows configuration
via configuration files rather than relying solely on a graphical user interface.
While management might appreciate the convenience of a GUI, developers and
operations personnel prefer the flexibility of configuring tools through text-based
configuration files.
• In summary, while Jenkins is a dominant player in the build server scene, there
are other popular options available. It's crucial to evaluate different solutions, be
wary of vendor lock-in, and consider configurability and flexibility when making a
decision. Additionally, prioritizing configuration via files rather than solely relying
on a graphical interface can be advantageous for developers and operations personnel.
Collating quality measures
• A useful thing that a build server can do is the collation of software
quality metrics. Jenkins has some support for this out of the box. Java unit
tests are executed and can be visualized directly on the job page.
• Another more advanced option is using the Sonar code quality visualizer. Sonar tests are run during the build phase and propagated to the Sonar server, where they are stored and visualized.
•A Sonar server can be a great way for a development team to see the
fruits of their efforts at improving the code base.
• The drawback of implementing a Sonar server is that it sometimes slows
down the builds. The recommendation is to perform the Sonar builds in
your nightly builds, once a day.
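• As a sketch of how these measurements are collected from a pipeline, assuming the JUnit plugin plus the SonarQube Scanner plugin with a server configured under the hypothetical name MySonar:

pipeline {
    agent any
    stages {
        stage('Build and test') {
            steps {
                sh 'mvn clean verify'
            }
        }
        stage('Sonar analysis') {          // consider restricting this stage to the nightly build
            steps {
                withSonarQubeEnv('MySonar') {
                    sh 'mvn sonar:sonar'   // pushes the analysis results to the Sonar server
                }
            }
        }
    }
    post {
        always {
            junit 'target/surefire-reports/*.xml'   // unit test results shown on the job page
        }
    }
}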
• A build server can provide valuable features for collating software quality metrics.
Jenkins offers some built-in support for this functionality. It can execute Java unit
tests and display the results directly on the job page, allowing developers to
visualize the test outcomes.
• For more advanced quality measurement, Jenkins integrates with Sonar, a code
quality visualizer. During the build phase, Sonar tests are executed and the results
are sent to a Sonar server, where they are stored and presented in a visual
format. Having a Sonar server can be beneficial for development teams to assess
the progress they make in enhancing the codebase.
• It's important to note that implementing a Sonar server may introduce some build
performance impact. To mitigate this, it is recommended to run Sonar builds
during nightly builds, once a day. This way, the builds are not affected by the
potential slowdown caused by Sonar analysis.
• In summary, a build server like Jenkins can collate software quality metrics. It
provides options for executing unit tests and visualizing results on the job page.
Additionally, integrating with a Sonar server enables more advanced code quality
analysis. However, it's crucial to consider the impact on build performance and
allocate Sonar analysis to nightly builds for better efficiency.
