DevOps Manual
SDLC
The Software Development Life Cycle (SDLC) is a structured process used for designing,
developing, testing, and deploying software. It ensures software is high-quality, meets user
requirements, and is delivered within time and budget constraints. Here’s an overview of the
typical stages in the SDLC:
1. Requirement Gathering and Analysis
Purpose: Understand what the client or end-users need from the software.
Activities:
o Stakeholder interviews, surveys, and meetings.
o Documenting functional and non-functional requirements.
Outcome: A detailed Software Requirements Specification (SRS) document outlining
what the software should do.
2. System Design
Purpose: Create a blueprint for the software’s architecture.
Activities:
o Design both high-level architecture (database structure, technology stack) and
detailed designs (UI/UX design, API endpoints).
Outcome: Detailed design documents, often including UI mockups, ER diagrams, and
data flow diagrams.
3. Implementation (Coding)
Purpose: Actual development or coding of the software.
Activities:
o Programmers write code according to the design documents.
o Code is typically written in small, iterative phases, especially if following Agile
or similar methodologies.
Outcome: The first working version of the software (also known as alpha or beta
versions).
4. Testing
Purpose: Ensure the software functions as expected, is bug-free, and meets the original
requirements.
Activities:
o Unit Testing, Integration Testing, System Testing, and User Acceptance Testing
(UAT).
o Identify and fix bugs, performance issues, or gaps in functionality.
Outcome: A stable version of the software that’s ready for deployment.
5. Deployment
Purpose: Release the software to a live environment for use.
Activities:
o Deploy the software to production servers.
o Provide necessary documentation and training to end users.
o Monitor the system post-deployment for any immediate issues.
Outcome: The software is now live and in use by the end users.
6. Maintenance
Purpose: Ensure the software remains functional over time.
Activities:
o Fix bugs that weren’t caught during testing.
o Implement updates, upgrades, and improvements.
o Perform regular maintenance like optimizing databases, applying security patches,
etc.
Outcome: Ongoing support and updates to keep the software running smoothly.
SDLC Benefits:
Structured Approach: Provides a clear process with defined stages, ensuring
consistency and quality.
Risk Management: Early identification of risks and issues through analysis and
planning.
Cost Efficiency: Better planning and design help avoid costly fixes and scope creep.
Customer Satisfaction: Involving stakeholders in the early stages ensures that the final
product meets their needs.
Exercise 2:
Reference course name: Development & Testing with Agile: Extreme Programming
Get a working knowledge of extreme automation through the XP practices of
test-first development, refactoring, and automated test case writing.
Solve the questions in the "Take test" module given in the reference course to
gauge your understanding of the topic.
AGILE METHODOLOGY
Agile Methodology is a flexible, iterative approach to software development that emphasizes
collaboration, customer feedback, and small, rapid releases. Unlike traditional development
models like Waterfall, Agile breaks down projects into smaller, manageable units called
"iterations" or "sprints," allowing for continuous improvement and adaptation throughout the
development process.
Key Concepts of Agile:
1. Iterative Development: Projects are divided into smaller parts, which are developed,
tested, and delivered in iterations (typically 1-4 weeks).
2. Customer Collaboration: Constant feedback from customers ensures the product meets
their needs.
3. Adaptability: Agile embraces changing requirements, even late in the development
process.
4. Cross-functional Teams: Agile teams include members from different disciplines, like
developers, testers, and business analysts, working together in close collaboration.
5. Continuous Feedback: Frequent reviews and feedback help ensure the project remains
on track and is adjusted as needed.
Agile Practices:
Agile teams use specific practices to promote collaboration, flexibility, and quality:
1. User Stories: Simple, informal descriptions of features told from the perspective of the
user.
o Example: “As a [user], I want [feature] so that [benefit].”
2. Backlog Grooming (Refinement): Regularly reviewing the product backlog to ensure it
is up-to-date and prioritized.
3. Daily Standups (Daily Scrum): A short, daily meeting (usually 15 minutes) where team
members discuss:
o What they did yesterday.
o What they plan to do today.
o Any blockers or impediments.
4. Sprint Planning: A meeting at the start of each sprint to plan what will be delivered and
how the team will approach the work.
5. Sprint Review: At the end of the sprint, the team demonstrates the completed work to
stakeholders and gathers feedback.
6. Sprint Retrospective: A meeting after each sprint to reflect on the process and find ways
to improve in the next sprint.
7. Burndown Chart: A visual representation of the work remaining in the sprint. It tracks
progress over time.
Agile Roles:
1. Product Owner: Owns the product backlog, prioritizes user stories, and ensures that the
team is working on the most valuable features.
2. Scrum Master (in Scrum): Ensures the Agile process is being followed, facilitates
communication, and removes obstacles.
3. Development Team: A group of cross-functional members who develop the product,
including developers, testers, designers, etc.
Benefits of Agile:
Faster Delivery: Regular iterations mean that features can be released frequently and
users see value sooner.
Flexibility: Agile welcomes changes, making it easier to adapt to changing customer
needs.
Customer Satisfaction: Continuous customer feedback ensures that the product aligns
with their expectations.
Improved Quality: Testing is integrated throughout the development process, helping
catch defects early.
Transparency: Frequent updates and reviews keep everyone on the same page about
project progress.
Challenges of Agile:
Changing Requirements: Constant changes may overwhelm the team if not managed
properly.
Team Collaboration: Agile demands a high level of communication and collaboration,
which may be difficult for distributed teams.
Client Involvement: Agile requires constant involvement from stakeholders, which may
not always be feasible.
When to Use Agile:
Projects with changing requirements: Agile works best in environments where
requirements evolve over time.
Customer-centered development: If customer feedback is crucial and expected
throughout the project, Agile provides the flexibility needed.
Fast-moving projects: When rapid delivery of software is a priority, Agile’s iterative
nature helps get early versions out quickly.
EXERCISE-3
DEVOPS LIFE CYCLE
The DevOps lifecycle refers to the continuous and iterative process that combines development
(Dev) and operations (Ops) practices to accelerate the delivery of high-quality software. It
integrates development, testing, deployment, and monitoring processes, allowing for continuous
feedback and improvements. The key objective of DevOps is to bridge the gap between
development and operations teams, enabling more frequent, reliable software releases.
4. Continuous Testing
o Purpose: Automate and execute various types of tests to ensure the software
works as expected.
o Activities:
Automated tests, such as unit tests, integration tests, performance tests,
and security tests, are triggered with every code change.
Tools like Selenium, JUnit, TestNG, or JMeter are commonly used.
Automated test results provide feedback to developers in real-time,
allowing them to fix issues before deployment.
o Outcome: Verified and bug-free code ready for deployment.
5. Continuous Deployment (CD)
o Purpose: Automatically deploy the tested code to production or staging
environments.
o Activities:
The code is deployed using automated pipelines that move it from testing
environments to production.
Tools like Jenkins, GitLab CI/CD, AWS CodePipeline, or Kubernetes
handle deployment processes.
Microservices architectures often use Docker and Kubernetes to manage
containerized applications for consistency across environments.
o Outcome: The code is continuously and automatically released into production or
staging environments.
7. Continuous Feedback
o Purpose: Use feedback from users, system monitoring, and development teams to
improve the software.
o Activities:
Collecting feedback from end-users, stakeholders, and automated
monitoring systems.
Teams analyze logs and user metrics to identify areas of improvement.
Applying feedback in the planning stage to continuously improve future
releases.
o Outcome: Better alignment with user needs, improved product performance, and
more efficient workflows.
8. Continuous Operations
o Purpose: Ensure the system runs smoothly 24/7 with little to no downtime.
o Activities:
Automating infrastructure scaling, backup, and disaster recovery.
Utilizing cloud services like AWS, Azure, or Google Cloud to ensure
high availability and scalability.
Automating system patches, updates, and maintenance tasks to minimize
manual intervention.
o Outcome: A self-sustaining system that is reliable, scalable, and able to handle
high traffic and demand.
DevOps Tools:
Version Control: Git, GitHub, GitLab, Bitbucket.
CI/CD: Jenkins, GitLab CI/CD, Travis CI, CircleCI.
Configuration Management: Ansible, Puppet, Chef.
Containerization: Docker, Kubernetes.
Monitoring: Prometheus, Nagios, ELK Stack (Elasticsearch, Logstash, Kibana),
Grafana.
Cloud Platforms: AWS, Azure, Google Cloud.
Benefits of DevOps:
Faster Time to Market: Shorter development cycles and faster releases ensure quick
delivery of new features.
Higher Quality: Continuous testing and integration lead to higher-quality software with
fewer bugs.
Increased Collaboration: DevOps fosters a culture of collaboration between
development and operations teams, leading to better alignment and communication.
Scalability: Automated infrastructure and containerized applications make scaling easier.
Better Customer Experience: Continuous feedback loops allow quick adjustments to
meet user needs.
Challenges in DevOps:
Cultural Shift: Adopting DevOps requires significant cultural changes, where
development and operations teams work closely together, which may be hard in
traditional environments.
Tool Complexity: Managing the different tools and technologies in the DevOps
ecosystem can be complex.
Security: In a rapid, automated release process, ensuring security at every step of the
pipeline (DevSecOps) becomes critical.
Exercise 4
Module Name: Implementation of CICD with Java and open source stack
Configure the web application and version control using Git commands and
version control operations.
Ex: GIT
==================
mkdir project
cd project
git init
Creating and editing files:
==================
Files can be created and edited with either nano or vi/vim.
For nano:
sudo nano mounika
For vi:
sudo vi sarada
In vi, type :w to save the file and :q to quit the editor (:wq does both).
WORKING WITH THE REPOSITORY:
======================
Check the status of your working directory (modified, staged, or untracked files):
git status
Stage all files for commit:
git add .
Step 14: Push the files from the local repository (git) to the central repository (GitHub):
git remote -v
git push origin master
Step 15: Pull the files from the central repository to the local repository:
git fetch -- checks for changes in the central repository
git pull -- fetches the changes and merges them into the local branch
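The Git steps above can be sketched end-to-end as a small script. This is a minimal walkthrough, not the manual's exact setup: a local bare repository stands in for GitHub so it runs offline, and the file name, identity, and commit message are placeholders.

```shell
#!/bin/sh
# End-to-end sketch of the Git workflow above. A local bare repo
# ("central.git") stands in for the central GitHub repository.
set -e
WORK=$(mktemp -d)
cd "$WORK"
git init --bare central.git                   # stands in for GitHub

mkdir project
cd project
git init
git config user.email "student@example.com"   # placeholder identity
git config user.name "Student"

echo "hello" > sarada                         # file name from the manual
git status                                    # shows sarada as untracked
git add .
git commit -m "first commit"

BRANCH=$(git symbolic-ref --short HEAD)       # master or main
git remote add origin ../central.git
git remote -v                                 # verify the configured remote
git push origin "$BRANCH"                     # local repo -> central repo
git fetch origin                              # check central for changes
git pull origin "$BRANCH"                     # merge changes into local branch
```

After the push, the central repository holds the same commit as the local one, which is exactly what Steps 14 and 15 demonstrate.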
Exercise 5:
Configure a static code analyzer which will perform static analysis of the web
application code and identify coding practices that are not appropriate. Configure
the profiles and dashboard of the static code analysis tool.
Procedure:
Static code analysis: it helps us ensure overall code quality, fix bugs in the early
stages of development, and ensure that each developer uses the same coding
standards when writing code.
There are three basic tools that we are going to use for our static code
analysis: CheckStyle, FindBugs, and PMD.
CheckStyle
CheckStyle is a tool that helps programmers write code that aligns with already agreed
upon coding standards. It automatically checks if the code adheres to the coding
standards used on a project.
FindBugs
The fine people of the University of Maryland built this fantastic tool for us. What it
basically does is, of course, find bugs in our code. FindBugs analyses our code
and generates a report, giving us a list of all the bugs that could cause a program to
misbehave. Good examples of the bugs it can detect are infinite loops, unused
variables, security and threading issues, and many more.
PMD
PMD is another useful tool in our static code analyzer toolbox. Besides reporting many
coding issues (e.g. possible memory leaks), PMD can check whether our code is
commented properly, whether our variables are named properly, and whether a method
contains more than a specified number of lines. PMD is highly configurable, and the
latest releases play quite well with Lombok annotations; previously, we needed to
define custom exclusion rules for PMD to play nice with Lombok.
buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath 'de.aaschmid:gradle-cpd-plugin:1.1'
    }
}
apply plugin: 'checkstyle'
apply plugin: 'findbugs'
apply plugin: 'pmd'
apply plugin: 'cpd'    // provided by the gradle-cpd-plugin on the buildscript classpath
Once the plugins are included in the build script, we can start configuring each of the
plugins. First, we are going to configure the CheckStyle plugin.
Setting CheckStyle
For CheckStyle, we are going to set the ignoreFailures flag, toolVersion, and
configFile, which points to the location of a configuration file. As a base for our
configuration file, we are going to use the Google Code Style settings. The configurations
are basically the same; the only difference is that we use four-space instead of
two-space indentation. And that's it, nothing more needs to be done for
CheckStyle. Let's set up FindBugs next:
checkstyle {
    toolVersion = '8.12'
    ignoreFailures = false
    configFile = file("${rootGradleDir}/static-code-analysis/checkstyle/checkstyle.xml")
}
We are explicitly setting the plugin not to ignore failures, i.e. the ignoreFailures flag is
set to false. This means that our project build will fail if we run into any issue during
the static code analysis check. If we think about it, this makes a lot of sense: our
CI/CD pipeline should fail when we run into any issue in our code base. Whether it is
a compile failure, a unit test failure, or a code analysis issue, as long as we have a
problem we shouldn't be able to continue with the pipeline.
Setting FindBugs
In most cases, we should specify only toolVersion and ignoreFailures. There are other
options we could set here, such as specifying which bug detectors are going to be run
or to include/exclude lists of files that are going to be checked in our code base. For our
example, we will leave the default values here: all default bug detectors will be run, and
we are not going to exclude any file from FindBugs detection.
findbugs {
toolVersion = '3.0.1'
ignoreFailures = false
}
Setting PMD
For PMD, besides toolVersion and ignoreFailures, we are going to set the rule sets for
our code analysis. We can set the rule sets in two ways: we can specify them
directly inside the PMD plugin configuration using the ruleSets array, or we can extract
the rule sets to a separate XML file and reference that file using the
ruleSetFiles configuration parameter. We are going to choose the latter option, since it
is more descriptive and allows us to provide exclusions to the default rule set categories.
For the codestyle category, we are excluding the DefaultPackage and OnlyOneReturn
rules. Check out ruleset.xml for the full setup.
pmd {
    toolVersion = '6.7.0'
    ignoreFailures = false
    ruleSetFiles = files("${rootGradleDir}/static-code-analysis/pmd/ruleset.xml")
    ruleSets = []
    rulePriority = 3
}
Setting CPD
For copy/paste bug detection, we need to configure the CPD plugin. First, let's set the
minimumTokenCount to 100. This means the plugin will report a duplicate-code bug
if it finds roughly 5-10 lines of the same code in separate places; if only four lines
of code match, the bug will not be reported. One useful option, especially if
we are using frameworks, is to set ignoreAnnotations to true. It allows us
to ignore "false positives" where classes or methods have the same 5-6 lines of
annotations. Finally, we'll enable XML report generation by setting xml.enabled
to true.
cpd {
    language = 'java'
    toolVersion = '6.0.0'
    minimumTokenCount = 100
}
cpdCheck {
    reports {
        text.enabled = false
        xml.enabled = true
    }
    ignoreAnnotations = true
    source = sourceSets.main.allJava
}
For the remaining static analysis report plugins, we will enable generation of the
HTML report instead of the XML one.
tasks.withType(Checkstyle) {
    reports {
        xml.enabled false
        html.enabled true
    }
}
tasks.withType(FindBugs) {
    reports {
        xml.enabled false
        html.enabled true
    }
}
tasks.withType(Pmd) {
    reports {
        xml.enabled false
        html.enabled true
    }
}
Great! We are done with the static code analysis configuration. Now we just need to
include staticCodeAnalysis.gradle in our Gradle build script:
apply from: "${rootGradleDir}/staticCodeAnalysis.gradle"
Each plugin will add its own dependencies to the Java plugin's check task (e.g. pmdMain,
cpdMain). Whenever we run ./gradlew clean build, the check task will be triggered
internally and the static analysis steps will run for our project. If any of the
code analysis steps fail, our build will fail as well. Static code analysis reports will be
generated under ./build/reports.
If in some situations we need to loosen the specified static code rules, we can always
suppress static analysis errors by using the @SuppressWarnings annotation. For
example, to suppress the warning for having too many methods in a class, we could
put @SuppressWarnings("PMD.TooManyMethods") on the given class.
We advise keeping static analysis "on" for the test classes as well. We should always
treat tests as an integral part of our project; test code should conform to the same
styles and rules we use throughout the project.
Exercise 6:
AIM:
Module Name: Implementation of CICD with Java and open source stack
Write a build script to build the application using a build automation tool like Maven.
Create a folder structure that will run the build script and invoke the various software
development build stages. This script should invoke the static analysis tool and unit
test cases and deploy the application to a web application server like Tomcat.
Procedure:
What is Maven
Maven is a powerful project management tool that is based on the POM (Project
Object Model). It is used for project builds, dependency management, and
documentation. It simplifies the build process like Ant, but is much more advanced
than Ant.
In short, Maven is a tool that can be used for building and managing any Java-based
project. Maven makes the day-to-day work of Java developers easier and generally
helps with the comprehension of any Java-based project.
POM (Project Object Model) is the key to operating Maven. Maven reads the pom.xml
file to determine its configuration and operations. It is an XML file that contains
information about the project and configuration details such as dependencies, source
directory, plugins, and goals used by Maven to build the project.
A sample pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId> <!-- assumed groupId; a pom requires one -->
<artifactId>LoggerApi</artifactId>
<version>0.0.1-SNAPSHOT</version>
<!-- Add typical dependencies for a web application -->
<dependencies>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>2.11.0</version>
</dependency>
</dependencies>
</project>
Maven Repository
Maven repositories are directories of packaged JAR files with some metadata. The
metadata are POM files related to the projects each packaged JAR file belongs to,
including what external dependencies each packaged JAR has. This metadata
enables Maven to recursively download the dependencies of your dependencies, until
all dependencies are downloaded onto your local machine.
Maven has three types of repositories:
1. Local repository
2. Central repository
3. Remote repository
Maven searches for dependencies in these repositories in order: first the local
repository, then the central repository, and then the remote repository, if a remote
repository is specified in the POM.
When a Java project contains a lot of dependencies, builds, and requirements,
handling all of those things manually is difficult and tiresome, so a tool that can do
this work is very helpful. Maven is such a build management tool: it can add
dependencies, manage the project classpath, generate WAR and JAR files
automatically, and do many other things.
Pros and Cons of using Maven
Pros:
1. Maven can add all the dependencies required for the project automatically by
reading the pom file.
2. One can easily build a project to a jar, war, etc. as per the requirements using
Maven.
3. Maven makes it easy to start a project in different environments, and one doesn't
need to handle dependency injection, builds, processing, etc.
4. Adding a new dependency is very easy: one just has to write the dependency code
in the pom file.
Cons:
1. Maven needs a Maven installation on the system and a Maven plugin for the IDE.
2. If the Maven code for an existing dependency is not available, then one cannot add
that dependency using Maven.
One can use the Maven build tool in the following conditions:
1. When there are a lot of dependencies for the project; it is easy to handle those
dependencies using Maven.
2. When a dependency version updates frequently; then one only has to update the
version ID in the pom file to update the dependency.
3. When continuous builds, integration, and testing need to be handled easily.
4. When one needs an easy way to generate documentation from the source code,
compile source code, or package compiled code into JAR or ZIP files.
Commands:
mvn clean    -- remove output from previous builds (the target directory)
mvn compile  -- compile the sources (produces .class files under target/classes)
mvn package  -- package the compiled code into a JAR/WAR under target/
mvn install  -- install the packaged artifact into the local repository
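The lifecycle commands above can be sketched against a throwaway project. This is a hypothetical minimal setup: the groupId, artifactId, and class name are placeholders, and the mvn run is skipped (or tolerated) when Maven is unavailable or offline.

```shell
#!/bin/sh
# Minimal Maven project to exercise the lifecycle commands above.
# groupId/artifactId below are placeholders, not from the manual.
set -e
WORK=$(mktemp -d)
mkdir -p "$WORK/demo/src/main/java"
cat > "$WORK/demo/pom.xml" <<'EOF'
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>demo</artifactId>
  <version>0.0.1-SNAPSHOT</version>
</project>
EOF
cat > "$WORK/demo/src/main/java/Hello.java" <<'EOF'
public class Hello {
    public static void main(String[] args) {
        System.out.println("hello");
    }
}
EOF
# mvn clean    -- remove previous build output (target/)
# mvn compile  -- compile sources to target/classes
# mvn package  -- build target/demo-0.0.1-SNAPSHOT.jar
# mvn install  -- copy the jar into the local ~/.m2 repository
if command -v mvn >/dev/null 2>&1; then
    (cd "$WORK/demo" && mvn -q clean compile package install) \
        || echo "mvn run failed (offline?); see the commands above"
else
    echo "mvn not installed; see the commands above"
fi
```

Running the chain on a real machine leaves the packaged jar under target/ and a copy in the local ~/.m2 repository, which is what mvn install is for.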
Exercise 7:
AIM:
Module Name: Implementation of CICD with Java and open source stack.
Configure the Jenkins tool with the required paths, path variables, users, and
pipeline views.
Procedure:
Jenkins has a very strong product community and a set of really useful plugins that
suit most software projects: you can build software, deploy software, websites, and
portals to various places (e.g. AWS, DigitalOcean, bare metal servers), or run unit
tests. It can be integrated with communication tools of your choice, like Slack,
HipChat, or email.
If you haven't had a chance to try Jenkins earlier, feel free to use the tutorial below to
get started.
Manual installation
Install Java
The easiest way to install Java is through the apt-get package manager.
You will then need to execute a series of commands, namely: add the Jenkins
signing key, register the Jenkins apt sources, update the package lists, and install
the Jenkins package.
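On Ubuntu/Debian, the steps just described look roughly like the following. The repository URL and keyring path follow the jenkins.io packaging documentation at the time of writing, so treat them as assumptions and verify against the official instructions. The script only writes the commands to a file rather than executing them, since they require root and network access.

```shell
#!/bin/sh
# Writes out the installation commands described in the text; run them
# manually on an Ubuntu/Debian box. Repo/key paths follow jenkins.io
# packaging docs and should be verified before use.
cat > jenkins_install_steps.txt <<'EOF'
# 1. Install Java
sudo apt-get update
sudo apt-get install -y openjdk-11-jdk

# 2. Add the Jenkins signing key
sudo wget -O /usr/share/keyrings/jenkins-keyring.asc \
    https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key

# 3. Register the Jenkins apt sources
echo "deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc]" \
    "https://pkg.jenkins.io/debian-stable binary/" | \
    sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null

# 4. Update package lists and install the Jenkins package
sudo apt-get update
sudo apt-get install -y jenkins
EOF
cat jenkins_install_steps.txt
```

After installation, Jenkins listens on port 8080 by default, which matches the http://jenkins-host:8080/ address used later in this exercise.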
Navigate to Manage Jenkins (on the left) and choose the "Configure Global Security"
item on the page that loads.
Now look below at the Matrix-based security section (select it if it is not already
selected), and make sure Anonymous only has the "Read" right under the View
group.
Click Save at the bottom of the page. After the page reloads, you'll see a login form;
simply ignore it and go to the home page (for example, http://jenkins-host:8080/).
You'll see a signup form, and the first account signed up will be the administrator.
Jenkins would not be so powerful without plugins. Usually, I install these plugins by
default:
Bitbucket:
The BitBucket plugin is designed to offer integration between BitBucket and
Jenkins. BitBucket offers a Jenkins hook, but this one just triggers a build for
a specific job on commit, nothing more. The BitBucket plugin, like the
GitHub plugin, uses the POST hook payload to check which job has to get
triggered based on the changed repository/branch.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/BitBucket+Plugin
bitbucket-pullrequest-builder:
This plugin builds pull requests from Bitbucket.org. This is a must-have
plugin if you perform QA deploy for each submitted pull request.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/Bitbucket+pullrequest+builder+plugin
build-pipeline-plugin:
This plugin provides a Build Pipeline View of upstream and downstream
connected jobs that typically form a build pipeline. In addition, it offers the
ability to define manual triggers for jobs that require intervention prior to
execution, e.g. an approval process outside of Jenkins. Provides nice
visualization of the paths & flows.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/Build+Pipeline+Plugin
copyartifact:
Adds a build step to copy artifacts from another project. The plugin lets you
specify which build to copy artifacts from (e.g. the last successful/stable build,
by build number, or by a build parameter). You can also control the copying
process by filtering the files being copied, specifying a destination directory
within the target project, etc.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/Copy+Artifact+Plugin
credentials:
Provides a centralized store for credentials (usernames and passwords, SSH
keys, certificates) that Jenkins jobs and other plugins can use when they
need to authenticate against external systems.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Credentials+Plugin
delivery-pipeline-plugin:
Visualisation of delivery/build pipelines; renders pipelines based on
upstream/downstream jobs. When using Jenkins as a build server, the
Delivery Pipeline Plugin makes it possible to visualise one or more delivery
pipelines in the same view, even in full screen.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/Delivery+Pipeline+Plugin
environment-script:
Environment Script Plugin allows you to have a script run after SCM
checkout, before the build. If the script fails (exit code isn't zero), the build is
marked as failed.
Any output on standard out is parsed as environment variables that are
applied to the build. It supports "override syntax" to append paths to PATH-
like variables.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/Environment+Script+Plugin
git:
Supports the popular Git version control system.
ghprb:
This plugin builds pull requests in GitHub. It's another must-have plugin if
your software development life cycle includes deploying pull requests to PR
environment to test.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/GitHub+pull+request+builder+plugin
greenballs: The funniest plugin - changes Jenkins to use green balls instead of
blue for successful builds.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Green+Balls
hipchat:
This plugin allows your team to set up build notifications to be sent to
HipChat rooms. To enable notifications, add "HipChat Notifications" as a
post-build step.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/HipChat+Plugin
junit:
Allows JUnit-format test results to be published. Note: a number of tools,
including Karma, PHPUnit, and others, can publish test results in the
JUnit format, so this is a must-have plugin for unit test flows.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/JUnit+Plugin
matrix-auth:
Offers matrix-based security authorization strategies (global and per-project).
This is quite handy if you have a build server shared across several teams.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/Matrix+Authorization+Strategy+Plugin
parameterized-trigger:
This plugin lets you trigger new builds when your build has completed, with
various ways of specifying parameters for the new build.
You can add multiple configurations: each has a list of projects to trigger, a
condition for when to trigger them (based on the result of the current build),
and a parameters section.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/Parameterized+Trigger+Plugin
rebuild:
This plays nice with the parameterized-trigger plugin. The plug-in allows the
user to rebuild a parametrized build without entering the parameters again.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Rebuild+Plugin
ssh: You can use the SSH Plugin to run shell commands on a remote machine
via ssh.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/SSH+plugin
s3: Allows uploading artifacts to S3 with multiple options.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/S3+Plugin
throttle-concurrents:
This plugin allows for throttling the number of concurrent builds of a project
running per node or globally.
Unfortunately, this is also a must-have plugin for Node (0.10-0.12) projects
with NPM - two concurrent npm installs often fail.
Plugin URL: https://wiki.jenkins-
ci.org/display/JENKINS/Throttle+Concurrent+Builds+Plugin
Plugins are installed using the Plugin Manager in the Manage Jenkins section.
server {
    listen 443 ssl;
    server_name jenkins.vagrant.dev;
    ssl_certificate /etc/nginx/jenkins_selfsigned.crt;
    ssl_certificate_key /etc/nginx/jenkins_selfsigned.key;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_redirect off;
        proxy_connect_timeout 150;
        proxy_send_timeout 100;
        proxy_read_timeout 100;
    }
}
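The config above references a self-signed certificate and key. One way to generate that pair is shown below; the file names come from the config itself, while the subject CN and one-year validity are assumptions.

```shell
#!/bin/sh
# Generate the self-signed cert/key referenced by the nginx config.
# File names come from the config above; CN and validity are assumed.
set -e
openssl req -x509 -nodes -newkey rsa:2048 -days 365 \
    -subj "/CN=jenkins.vagrant.dev" \
    -keyout jenkins_selfsigned.key \
    -out jenkins_selfsigned.crt
# Show the certificate subject to confirm generation worked
openssl x509 -in jenkins_selfsigned.crt -noout -subject
```

Copy both files to /etc/nginx/ and run `nginx -t` to validate the configuration before reloading nginx.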
Automated installation
[submodule "public/ansible_developer_recipes"]
path = public/ansible_developer_recipes
url = [email protected]:Voronenko/ansible-developer_recipes.git
[submodule "roles/sa-box-bootstrap"]
path = roles/sa-box-bootstrap
url = [email protected]:softasap/sa-box-bootstrap.git
[submodule "roles/sa-box-jenkins"]
path = roles/sa-box-jenkins
url = [email protected]:softasap/sa-box-jenkins.git
hosts - list the initial box credentials that were provided to you for the server
here. Note: jenkins-bootstrap assumes you have a fresh box with root access
only. If your box is already secured, adjust the credentials appropriately.
[jenkins-bootstrap]
- hosts: jenkins-bootstrap
  vars_files:
    - ./jenkins_vars.yml
  roles:
    - { role: "sa-box-bootstrap",
        root_dir: "{{playbook_dir}}/public/ansible_developer_recipes",
        deploy_user: "{{jenkins_user}}",
        deploy_user_keys: "{{jenkins_authorized_keys}}" }
jenkins.yml - provisioning script that configures Jenkins with a set of plugins and
users.
setup_jenkins.sh - shell script that invokes deployment in two steps: initial box
bootstrapping & Jenkins setup.
#!/bin/sh
ansible-playbook jenkins_bootstrap.yml --limit jenkins_bootstrap
ansible-playbook jenkins.yml
jenkins_authorized_keys:
- "{{playbook_dir}}/components/files/ssh/vyacheslav.pub"
jenkins_domain: "vagrant.dev"
jenkins_host: "jenkins"
java_version: 8
- jenkins_users: list of users with passwords to create. Admin and deploy are required
users.
Admin is used to manage the instance; deploy is used to access the artifacts via
deployment scripts.
If you don't override the passwords, the default ones will be used (per role), which is
not the best for public deployments.
jenkins_users:
  - {
      name: "Admin",
      password: "AAAdmin",
      email: "no-reply@localhost"
    }
  - {
      name: "deploy",
      password: "DeDeDeDeploy",
      email: "no-reply@localhost"
    }
- bitbucket # https://wiki.jenkins-ci.org/display/JENKINS/BitBucket+Plugin
- bitbucket-pullrequest-builder
- build-pipeline-plugin
- copyartifact # https://wiki.jenkins-ci.org/display/JENKINS/Copy+Artifact+Plugin
- credentials # https://wiki.jenkins-ci.org/display/JENKINS/Credentials+Plugin
- delivery-pipeline-plugin # https://wiki.jenkins-ci.org/display/JENKINS/Delivery+Pipeline+Plugin
- environment-script # https://wiki.jenkins-ci.org/display/JENKINS/Environment+Script+Plugin
- git
- ghprb # https://wiki.jenkins-ci.org/display/JENKINS/GitHub+pull+request+builder+plugin
- greenballs # https://wiki.jenkins-ci.org/display/JENKINS/Green+Balls
- hipchat # https://wiki.jenkins-ci.org/display/JENKINS/HipChat+Plugin
- junit # https://wiki.jenkins-ci.org/display/JENKINS/JUnit+Plugin
- matrix-auth # https://wiki.jenkins-ci.org/display/JENKINS/Matrix+Authorization+Strategy+Plugin
- matrix-project # https://wiki.jenkins-ci.org/display/JENKINS/Matrix+Project+Plugin
- parameterized-trigger # https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin
- rebuild # https://wiki.jenkins-ci.org/display/JENKINS/Rebuild+Plugin
- ssh
- s3 # https://wiki.jenkins-ci.org/display/JENKINS/S3+Plugin
- throttle-concurrents # https://wiki.jenkins-ci.org/display/JENKINS/Throttle+Concurrent+Builds+Plugin
You can download this template code from this repository. To use it, fork it,
adjust the parameters to your needs, and run it.
2. Maven installation
MAVEN VERSION CHECK
JENKINS INSTALLATION
JENKINS PATH
Exercise 8:
AIM:
Module name: Implementation of CICD with Java and open source stack
Configure the Jenkins pipeline to call the build script jobs and to run
whenever a change is made to an application in the version control
system. Make a change to the background color of the landing page of the web
application and check whether the configured pipeline runs.
Procedure:
There are six steps to building a pipeline with Jenkins. But, before you begin those six
steps, make sure you have the following in your system.
1. Download Jenkins
Choose Pipeline script as the Definition and paste the Jenkinsfile content
from GitHub into the Script field.
Click on Save to keep the changes.
Now click on the Build Now to process the build.
To check the output, click on any stage and click Log; a message will appear
on the screen.
Some steps will appear on the screen. The next step is to integrate the Jenkins file into
the Version Control system.
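A Jenkinsfile kept in version control can declare an SCM trigger so the pipeline runs whenever a change lands in the repository. A minimal sketch (the polling schedule and the Maven build command are assumptions, not taken from the exercise):

```groovy
pipeline {
    agent any
    triggers {
        // Poll the version control system roughly every 5 minutes;
        // a webhook from GitHub/Bitbucket is the push-based alternative.
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Checkout') {
            steps { checkout scm }  // checks out the repo the Jenkinsfile lives in
        }
        stage('Build') {
            steps { sh 'mvn -B package' }  // assumed: a Maven-built web application
        }
    }
}
```

With this in place, editing the landing page's background color and pushing the change should trigger a new run within the polling interval.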
The credentials can be added with the help of the ‘Add’ option.
You can go to the console output option to check the log that is taking place.
You will soon be able to see that all the segments of the pipeline are completed. The
artifact will be present to download. The war file can be downloaded using that link.
The entire process helps us understand how the whole Pipeline is configured. Using
similar types of steps, different kinds of automation pipelines can be configured.
First, create a new directory for our project and run this command:
git init
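For instance, bootstrapping the sample project locally might look like this (the directory and file names are illustrative):

```shell
set -e
mkdir -p sample-app && cd sample-app
git init -q
# the landing page whose background color we will later change
echo '<body style="background: white"><h1>Hello</h1></body>' > index.html
git add index.html
git -c user.email=dev@localhost -c user.name=dev commit -q -m "initial commit"
git rev-list --count HEAD   # → 1
```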
The “Pipeline create” command lets you specify the template where your code will be
built and tested. The only difference from a local setup is that you won’t need to use
your own build environment to run the tests; the web toolbelt does this for you
automatically.
To use the pipeline to build your sample app, we will open a new terminal window
and type:
$ bin/pipeline build
This command will take a while because it’s building and testing the app on our local
development machine.
After the build process is complete, you can see that the completed project is stored in
our index.html file. We can use these files in other apps or even deploy the app to a
server to test it out.
Exercise 9:
AIM:
Module name: Implementation of CICD with Java and open source stack
Create a pipeline view of the Jenkins pipeline used in Exercise 8 and configure it
with user-defined messages.
Procedure:
Step 1: In your terminal or CLI, start and enable Jenkins and Docker.
Step 2: In your Jenkins console, click on New Item to create your first job.
Step 3: After you click on New Item, choose the Freestyle project option.
Step 4: In the configuration section, select SCM, add the Git repo link, and save it.
Step 5: Then select the Build option and choose Execute shell.
Step 6: Provide the shell commands. Here they will build the archive to produce a
WAR file. After that, the job pulls the code that was already pushed and uses Maven
to install the package; it simply installs the dependencies and compiles the
application.
Step 8: Click on the freestyle project and save it with a proper name.
Step 9: Again repeat step 4: in the configuration section select SCM, add the Git
repo link, and save it.
Step 10: Repeat step 5: select the Build option and choose Execute shell.
Step 11: Now write the shell commands for this phase and build the
container.
Step 12: Again you will create a new job as before in previous steps.
Step 13: Select freestyle project and provide the item name (here I have given Job3)
Step 14: Again repeat step 4: in the configuration section select SCM, add the Git
repo link, and save it.
Step 15: Repeat step 10: select the Build option and choose Execute shell.
Step 16: Write the shell commands, Now it will verify the container files and the
Step 18: From the build actions, choose post-build and click on Build other
projects.
Step 19: Provide the name of the project to build after Job 1 and then click Save.
Step 21: From the build actions, choose post-build and click on Build other
projects.
Step 22: Provide the name of the project to build after Job 2 and then click Save.
Step 24: Now choose and select a Build Pipeline view and add a name for it.
Step 25: Choose Job 1 and click OK.
Step 26: Let's run it and start the CI/CD process now.
Step 27: After you build the job, verify by opening localhost:8180/sample.text in
your browser; this is the port where your app is running.
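The three chained freestyle jobs above can equivalently be sketched as stages of a single declarative pipeline. The image name, port mapping, and build commands below are assumptions for illustration, not taken from the exercise:

```groovy
pipeline {
    agent any
    stages {
        stage('Build WAR') {          // Job 1: compile and package the application
            steps { sh 'mvn -B package' }
        }
        stage('Build container') {    // Job 2: bake the WAR into a Docker image
            steps { sh 'docker build -t sample-app .' }
        }
        stage('Deploy and verify') {  // Job 3: run it on port 8180 as above
            steps { sh 'docker run -d -p 8180:8080 sample-app' }
        }
    }
}
```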
Exercise 10
AIM:
Module name: Implementation of CICD with Java and open source stack
Create a pipeline view of the Jenkins pipeline used in Exercise 8 and configure it
with user-defined messages.
Procedure:
SonarQube is an excellent tool for measuring code quality, using static analysis to find
code smells, bugs, vulnerabilities, and poor test coverage. Rather than manually
analysing the reports, why not automate the process by integrating SonarQube with
your Jenkins continuous integration pipeline? This way, you can configure a quality
gate based on your own requirements, ensuring bad code always fails the build.
You’ll learn exactly how to do that in this article, through a full worked example where
we add SonarQube analysis and SonarQube quality gate stages to a Jenkins pipeline.
SonarQube refresher
The SonarQube server also has a UI where you can browse these reports. They look
like this:
Quality gates
In SonarQube a quality gate is a set of conditions that must be met in order for a project
to be marked as passed. In the above example the project met all the conditions.
Here you can see that a condition failed because the maintainability rating was
a D rather than an A.
Running a SonarQube scan from a build on your local workstation is fine, but a robust
solution needs to include SonarQube as part of the continuous integration process. If
you add SonarQube analysis into a Jenkins pipeline, you can ensure that if the quality
gate fails then the pipeline won’t continue to further stages such as publish or release.
To do this, we can use the SonarQube Scanner plugin for Jenkins. It includes two
features that we’re going to make use of today:
1. SonarQube server configuration – the plugin lets you set your SonarQube server
location and credentials. This information is then used in a SonarQube analysis
pipeline stage to send code analysis reports to that SonarQube server.
2. SonarQube Quality Gate webhook – when a code analysis report is submitted
to SonarQube, unfortunately it doesn’t respond synchronously with the result of
whether the report passed the quality gate or not. To do this, a webhook call
must be configured in SonarQube to call back into Jenkins to allow our pipeline
to continue (or fail). The SonarQube Scanner Jenkins plugin makes this
webhook available.
2. the SonarQube scanner is run against a code project, and the analysis report is
sent to SonarQube server
3. SonarQube finishes analysis and checking the project meets the configured
Quality Gate
4. SonarQube sends a pass or failure result back to the Jenkins webhook exposed
by the plugin
5. the Jenkins pipeline will continue if the analysis result is a pass or optionally
otherwise fail
Let’s get our hands dirty with a worked example. We’ll run through all the steps in
the UI manually as this is the best way to understand the setup.
We'll set up two pipelines: one that runs against a codebase with zero issues (I wish
all my code was like this), and one that runs against code bad enough to fail the
quality gate.
You’ll need to make sure you have Docker installed before carrying on.
Fast track: to get up and running quickly check out this GitHub repository. Everything
is setup through configuration-as-code, except the steps under Configure SonarQube
below.
What better way to start these two services than with Docker Compose? Create the
following file docker-compose.yml:
version: "3"
services:
  sonarqube:
    image: sonarqube:lts
    ports:
      - 9000:9000
    networks:
      - mynetwork
    environment:
      - SONAR_FORCEAUTHENTICATION=false
  jenkins:
    image: jenkins/jenkins:2.319.1-jdk11
    ports:
      - 8080:8080
    networks:
      - mynetwork
networks:
  mynetwork:
Running docker-compose up in the directory containing the file will start Jenkins
on http://localhost:8080 and SonarQube on http://localhost:9000. Awesomeness!
Grab the Jenkins administrator password from the Jenkins logs in the console output
of the Docker Compose command you just ran.
In the final steps you'll have to create a user and confirm the Jenkins URL
of http://localhost:8080.
Once complete head over to Manage Jenkins > Manage Plugins > Available and search
for sonar. Select the SonarQube Scanner plugin and click Install without restart.
Once the plugin is installed, let’s configure it!
Go to Manage Jenkins > Configure System and scroll down to the SonarQube
servers section. This is where we’ll add details of our SonarQube server so Jenkins can
pass its details to our project’s build when we run it.
Click the Add SonarQube button. Now add a Name for the server, such as SonarQube.
The Server URL will be http://sonarqube:9000. Remember to click Save.
Configuring SonarQube
Let’s jump over to SonarQube. Click Log in at the top-right of the page, and log in with
the default credentials of admin/admin. You’ll then have to set a new password.
Now go to Administration > Configuration > Webhooks. This is where we can add
webhooks that get called when project analysis is completed. In our case we need to
configure SonarQube to call Jenkins to let it know the results of the analysis.
Click Create, and in the popup that appears give the webhook a name of Jenkins, set the
URL to http://jenkins:8080/sonarqube-webhook and click Create.
In this case, the URL has the path sonarqube-webhook which is exposed by the
SonarQube Scanner plugin we installed earlier.
SonarQube comes with its own Sonar way quality gate enabled by default. If you click
on Quality Gates you can see the details of this.
It’s all about making sure that new code is of a high quality. In this example we want
to check the quality of existing code, so we need to create a new quality gate.
Click Create, then give the quality gate a name. I’ve called mine Tom Way
Click Save then on the next screen click Add Condition. Select On Overall Code. Search
for the metric Maintainability Rating and choose worse than A. This means that if
existing code is not maintainable then the quality gate will fail. Click Add Condition to
save the condition.
Finally click Set as Default at the top of the page to make sure that this quality gate
will apply to any new code analysis.
1. A pipeline which runs against a code project over at the sonarqube-jacoco- code-
coverage GitHub repository. The code here is decent enough that the pipeline
should pass.
2. A pipeline which runs against the same project, but uses the bad-code branch.
The code here is so bad that the pipeline should fail.
Back in Jenkins click New Item and give it a name of sonarqube-good-code, select
the Pipeline job type, then click OK.
Scroll down to the Pipeline section of the configuration page and enter the following
declarative pipeline script in the Script textbox:
pipeline {
    agent any
    stages {
        stage('Clone sources') {
            steps {
                // clone the sonarqube-jacoco-code-coverage repository (account/URL assumed)
                git url: 'https://github.com/<account>/sonarqube-jacoco-code-coverage.git'
            }
        }
        stage('SonarQube analysis') {
            steps {
                withSonarQubeEnv('SonarQube') {
                    sh "./gradlew sonarqube"
                }
            }
        }
        stage("Quality gate") {
            steps {
                waitForQualityGate abortPipeline: true
            }
        }
    }
}
1. in the Clone sources stage code is cloned from the GitHub repository mentioned
earlier
2. in the SonarQube analysis stage we use
the withSonarQubeEnv('SonarQube') method exposed by the plugin to wrap the
Gradle build of the code repository. This provides all the configuration required
for the build to know where to find SonarQube. Note that the project build itself
must have a way of running SonarQube analysis, which in this case is done by
running ./gradlew sonarqube. For more information about running SonarQube
analysis in a Gradle build see this article
3. in the Quality gate stage we use the waitForQualityGate method exposed by the
plugin to wait until the SonarQube server has called the Jenkins webhook. The
abortPipeline flag means if the SonarQube analysis result is a failure, we abort
the pipeline.
Create another pipeline in the same way, but name it sonarqube-bad-code. The pipeline
script is almost exactly the same, except this time we need to check out the bad-
code branch of the same repository.
pipeline {
    agent any
    stages {
        stage('Clone sources') {
            steps {
                // same repository, but checking out the bad-code branch (account/URL assumed)
                git branch: 'bad-code', url: 'https://github.com/<account>/sonarqube-jacoco-code-coverage.git'
            }
        }
        stage('SonarQube analysis') {
            steps {
                withSonarQubeEnv('SonarQube') {
                    sh "./gradlew sonarqube"
                }
            }
        }
        stage("Quality gate") {
            steps {
                waitForQualityGate abortPipeline: true
            }
        }
    }
}
in the Clone sources stage we’re now also specifying the branch attribute to
point to the bad-code branch
You’ll be able to see that the Quality gate stage of the pipeline has failed. Exactly
what we wanted, blocking any future progress of this pipeline.
In the build’s Console Output you’ll see the message ERROR: Pipeline aborted due to
quality gate failure: ERROR which shows that the pipeline failed for the right reason.
Over in SonarQube you’ll see that this time it’s reporting a Quality Gate failure.
We can see that the maintainability rating has dropped to B because of the two code
smells. This doesn’t meet our quality gate, which requires a minimum A rating.
Final thoughts
You’ve seen that integrating SonarQube quality gates into Jenkins is straightforward
using the SonarQube Scanner Jenkins plugin. To apply this to a production setup,
some further hardening of the configuration is suggested.
For full details about setting up SonarQube analysis in a Gradle code project, see How
To Measure Code Coverage Using SonarQube and Jacoco. If you’re using Maven,
check out this documentation from SonarQube.
Exercise 11
AIM:
Module name: Implementation of CICD with Java and open source stack
Create a pipeline view of the Jenkins pipeline used in Exercise 8 and configure it
with user-defined messages.
Procedure:
Jenkins provides out-of-the-box functionality for JUnit, and provides a host of plugins
for unit testing for other technologies, an example being MSTest for .NET unit tests.
The link https://wiki.jenkins-ci.org/display/JENKINS/xUnit+Plugin gives the list of
unit-testing plugins available.
Example of a Junit Test in Jenkins
Step 5 − Next click the option to Add post-build option and choose the option of
“Publish Junit test result report”
Step 6 − In the Test report XMLs field, enter the location as shown below. Ensure that
Reports is a folder created in the HelloWorld project workspace. The “*.xml” pattern
tells Jenkins to pick up the result XML files produced by running the JUnit test
cases. These XML files will then be converted into reports that can be viewed later.
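For reference, a result file that the “Publish JUnit test result report” step can parse looks roughly like this (the suite and test names are invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="HelloWorldTest" tests="2" failures="1" time="0.05">
  <testcase classname="HelloWorldTest" name="testGreeting" time="0.01"/>
  <testcase classname="HelloWorldTest" name="testFailing" time="0.01">
    <failure message="expected Hello but was Goodbye"/>
  </testcase>
</testsuite>
```

Saving such files under Reports/*.xml in the workspace is what lets Jenkins render the Test Result drill-down described below.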
Once done, click the Save option at the end.
Step 7 − Once saved, you can click on the Build Now option.
Once the build is completed, a status of the build will show if the build was successful
or not. In the Build output information, you will now notice an additional section called
Test Result. In our case, we entered a negative Test case so that the result would fail
just as an example.
One can go to the Console output to see further information. But what’s more
interesting is that if you click on Test Result, you will now see a drill down of the Test
results.
Exercise 12
AIM:
Module name: Implementation of CICD with Java and open source stack
Create a pipeline view of the Jenkins pipeline used in Exercise 8 and configure it
with user-defined messages.
Procedure:
Code analysis in the agile product development cycle is one of the important and
necessary practices for avoiding failures and defects arising from continuous
changes to the source code. There are a few good reasons to include it in our
development lifecycle.
More importantly, you can include it in your build process once and use it always,
without having to do any manual steps.
Challenge
Now let’s talk about the actual challenge. SonarQube does help us gain
visibility into our code base. However, you will soon realize that visibility into the
code isn't enough; to take real advantage of code analysis, we need
to make use of the different data insights that SonarQube provides.
One way was to enforce the standards and regulate them across all teams within the
organization. Quality Gates are exactly what we needed here, and are the best way to
ensure that standards are met and regulated across all the projects in your
organization. A Quality Gate can be defined as a set of threshold measures set on your
project, such as code coverage, technical debt, number of blocker/critical
issues, security rating, and unit test pass rate.
Failing your build jobs when the code doesn’t meet criteria set in Quality Gates
should be the way to go. We were using Jenkins as our CI tool and therefore we
wanted to setup Jenkins job to fail if the code doesn’t meet quality gates.
In this article, we are going to setup following
Prerequisites
Here is a snapshot of the job that is currently passing the build, before the Quality Gates setup.
Let’s set up the quality gate metrics in the SonarQube server. We are going to create
a quality gate only for the “Code coverage” metric for demo purposes, but there are
more metrics available that you should select when creating quality gates.
Select the project from the available list to which you want to associate this quality
gate. We have selected sample miqp project for which we have set up Jenkins job.
Now go to the Jenkins job and configure the quality gate validation. Click on the job
and go to Post-build Actions and provide the project details you have associated with
Quality Gate created in the earlier steps.
Run the Jenkins job again and verify the build status post quality check enabled.
As we can see, the code passed the build; however, it doesn't pass the quality gate
check, so the build fails in the end. We can verify the same with the project
status in the SonarQube server.